
HELLO FUTURE: EPISODE 2: TRUST, OVERREACTION, AND THE SIGNAL-TO-NOISE PROBLEM

In a world of nonstop outrage, the hardest leadership challenge isn’t identifying what’s fake — it’s knowing what not to react to. Kevin Cirilli sits down with Dan Brahmy to examine how coordinated manipulation exploits institutional reflexes, why overcorrection can do more damage than inaction, and how decision-makers can distinguish organic discourse from influence operations at scale. Drawing on real-world cases across governments and global brands, this episode looks at why trust, discernment, and restraint are emerging as core leadership skills — and what happens when they fail.

Meet The Future: https://mtf.tv/



Speaker 1 (00:07):
Nowadays, it seems like everybody's negative online, but you don't know if it's a human, a robot, a bot, someone in their basement. Who knows? But how do you react, and when do you not react? Are you at risk? Are companies, or politicians, or the public, anyone really, at risk of overreacting when they're monitoring their online engagement? This is a very futuristic problem. Hello Future, it's me, Kev. This is a dispatch from the Digital Frontier. The planet is Earth. The year is twenty twenty-six. Remember, you can listen to all of the latest Hello Future episodes wherever you get your podcasts, but particularly on the iHeartMedia app. I'm Kevin Cirilli. You can sign up for our newsletter at MTF dot TV. My guest today, returning to the program, is Dan Brahmy. He is the CEO and co-founder of Cyabra, a company that helps businesses navigate the online trolling community. There are also, obviously, geopolitical implications, because it's like the Wild West of robots out there if you see all of the bizarre stuff that gets posted. So for this conversation, Dan, first of all, welcome back. I really want to focus on when to ignore and when to engage with a bot.

Speaker 2 (01:21):
That's a great question, you know, and thank you for having me again. This is awesome. I'll take it back to what you said at the beginning. I think that overreaction can be very dangerous today. We've seen many use cases, whether it's a government or a brand, and we work with both of these types. The overreaction, or the lack of response, when you don't know how to filter what you're seeing or watching or reading right now, can be incredibly dangerous and can lead to a huge avalanche. And here's the interesting thing, because we talk a lot about, oh, what's the percentage of bots, what's the percentage of spam accounts, and whatever. Let me give you an interesting piece here, because that was our early claim to fame, you know, the bot busters kind of thing. What if I told you there's a conversation about the latest fries or the latest burger of a really big company, and I tell you, hey, it's only one percent inauthentic? Your initial response would be, okay, don't know, don't care, doesn't matter. But the percentage is not that interesting. What really matters is this: what if I told you that the one percent of the conversation that is inauthentic drove eighty-five percent of the conversation? The entire narrative, the entire conversation that you have right in front of your eyes, developing as we speak, about the fries and the burgers. And it could be something more dramatic. What if I told you that the one percent was able to influence the eighty-five or ninety percent? It happens. I can give you the inverse, too. It happens so many times, which is why I'm saying: sometimes it's the underreaction, sometimes it's the overreaction. Let me give you the example of the overreaction.
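The point Dan is making here, that the share of accounts that are inauthentic matters far less than the share of the conversation they drive, can be sketched as a simple metric. The numbers and data shape below are invented for illustration, not Cyabra's actual methodology:

```python
# Hypothetical sketch: compare the share of accounts that are inauthentic
# with the share of total engagement those accounts drive.
# The data format (is_inauthentic flag, engagement count) is an assumption.

def amplification_share(posts):
    """posts: list of (is_inauthentic, engagement) tuples.
    Returns (share of accounts inauthentic, share of engagement they drove)."""
    account_share = sum(1 for flag, _ in posts if flag) / len(posts)
    total_engagement = sum(e for _, e in posts)
    engagement_share = sum(e for flag, e in posts if flag) / total_engagement
    return account_share, engagement_share

# One inauthentic account out of 100, but it seeds a narrative that
# racks up the bulk of the engagement:
posts = [(True, 85_000)] + [(False, 150) for _ in range(99)]
acct, eng = amplification_share(posts)
print(f"{acct:.0%} of accounts drove {eng:.0%} of engagement")
# → 1% of accounts drove 85% of engagement
```

The headline percentage of bots looks negligible; the engagement share is what tells you the conversation has been captured.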

Speaker 1 (03:09):
Story of my life as a human. But keep going.

Speaker 2 (03:12):
I get it, I get it. Sometimes you could say the weirdest shit ever in your life, and then you have nobody caring about anything. Sometimes you could say, this morning I woke up and my muscles were sore, and you get, like, ten thousand people saying, I get it, I feel you. And you are famous. Yeah, here's the flip of the story. If you have... well, I think you are.

Speaker 1 (03:34):
Well, you know, it's not my objective. My objective is
to stay sane for me.

Speaker 2 (03:42):
Yeah, I get it. I'm just saying, you know, in my eyes, I think you're famous in your own right. But I'll tell you this. I'll tell you this. Here's the flip. If you have the fries-and-burger equation, and then you have forty percent bots, you would go nuts. You're the chief communications officer of that company. You're like, oh my god, we've got to do something immediately: come up with a statement, do a press release, put out the fire. But there's no fire.

Speaker 1 (04:05):
The conversation that we're having on the digital frontier about this false fire still requires an emergency responder. Cyabra is the emergency responder in this case, because you need guidance on what's real and what's fake in order to navigate that. What's just the fire alarm, and what's actually a fire? Correct, right? But it's also a very human dilemma, because how often in our own lives does a scenario emerge and we catastrophize, we imagine the worst-case scenario, and we think, oh, I have to do X, Y, Z instantaneously. Really, you don't have to do anything. You really just have to stop, pause, take a breath. And that's a very human quality. So as the trolls and the bots have really emerged, being able to decipher what's real and what's fake... candidly, folks, we've been trying to do that as humans for our whole existence.

Speaker 2 (05:02):
For our whole existence. And I love your Meet the Future moments, because they're good summaries of what we should expect. But you know, you asked me something interesting. You said, what...

Speaker 1 (05:11):
Should we do?

Speaker 2 (05:13):
Right. Like, what should we do? Is there a guideline? Is there a framework? Are there any steps? And so, you know, the way that we look at it usually is something along these lines: before you respond, before you react, before you put something out there, you put yourself out there, there are a couple of questions that you can ask. It usually takes a few seconds, and then it's your own sound, calm judgment before you put yourself in the avalanche. Probably the first one would be: does it seem coordinated? Does it feel like somebody is trying to manipulate me into a shift, putting me into a box, instead of me naturally going into this? That's a very important piece. You don't have to have a PhD in computer science to ask that question, because the human brain, guess what, is beautifully wired, and we have the soft skills to ask that question and get a good enough answer to maybe not fall into the traps, per se. Even if it's fifty percent of the time, trust me, that would be great progress. So I think the first question should be: does it seem coordinated to me or not? I guess the second question you could ask yourself would probably be, let's say that you're a big brand, you're Fortune 500, and you're a pharmaceutical company, right: is it reaching, is it exposing, is it impacting the opinions of the people that matter to me? Because if you are a pharmaceutical company and your audience is in the US, but you're reacting to a storm that's happening in Taiwan, I'm not sure you should be involved.

Speaker 1 (06:45):
I'm gonna argue, if it's Taiwan, you should be involved with that. Maybe. If it's on... if it's on... I get your point.

Speaker 2 (06:53):
Yeah, if it's on Pluto, if it's a completely outside country from a go-to-market standpoint. So: is it reaching the people that matter to me, subjectively speaking? And then I guess the third point would probably be: what is the proportional response? Because we talked about overreacting, or not doing anything at all, which is another thing. What's the proportional response to this thing that I'm seeing right now, so that if I'm going to say or do something, it doesn't do their distribution for them?
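The three questions Dan walks through can be sketched as a pre-response checklist. The function and its decision logic below are my own framing of what he describes, not Cyabra's actual product logic:

```python
# Hypothetical pre-response checklist based on the three questions:
# 1. Does it seem coordinated?
# 2. Is it reaching the people that matter to me?
# 3. Is there a proportional response available?

def should_respond(seems_coordinated, reaches_my_audience, response_is_proportional):
    """Return (respond, reason) for a wave of online criticism."""
    if not reaches_my_audience:
        # The pharma-company-in-Taiwan case: a storm outside your market.
        return False, "not reaching the people who matter to you"
    if not response_is_proportional:
        # Responding would amplify the narrative for the attackers.
        return False, "responding would do their distribution for them"
    if not seems_coordinated:
        # Organic criticism that reaches your audience may still merit
        # ordinary engagement rather than crisis response.
        return True, "organic discourse reaching your audience"
    return True, "coordinated, reaching your audience, measured response exists"

# The Taiwan example: coordinated, but not reaching the US audience.
print(should_respond(True, False, True))
# → (False, 'not reaching the people who matter to you')
```

The ordering matters: reach is checked first, because both a coordinated campaign and organic outrage can safely be ignored if they never touch the audience you care about.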

Speaker 1 (07:24):
And because you don't want to give oxygen to the fire.

Speaker 2 (07:27):
This is kind of the distinction between disinformation and misinformation. People tend to forget, but to be misinformed, by its pure definition, means that it's not your fault. And to create disinformation means that you are actively pushing it. So the malicious and inauthentic forces behind the wave, they are pushing disinformation. But your mother and my mother are getting misinformed. And this is why we need people to make that distinction.

Speaker 1 (08:02):
And to underline that: disinformation is about the negative intent. If the intent is malicious, it's disinformation. The result of disinformation is a misinformed public; misinformation is what someone consumes. That's a really, really important point. And it's really easy to remember which comes first, because it's just alphabetical order: disinformed first, and then misinformed comes next. That's how it's a chain reaction. What I really think is so frustrating, as just the average person, is the fact that there doesn't appear to be metrics or standards for how to navigate it. I say that because bad actors like Russia, China, Iran are purposefully infiltrating our digital spaces, where we as a society value freedom and democracy. And I'm not talking about values in a political sense. I'm talking about the stuff we all agree on, literally all agree on. But those bad actors are weaponizing our freedom of speech. I mean, back when Walter Cronkite was alive, China or Russia or Iran or other bad actors of that era couldn't go on the nightly news and spread disinformation. In twenty twenty-six, they can in our digital spaces. They still can't do it on the nightly news, but not as many people are watching that, and they can infiltrate the conversation of the nightly news by simply peddling this stuff online. As a citizen, it infuriates me, because I've literally witnessed it jumping from URL to IRL, people weaponizing hate. I really think that the work you're doing, Dan, is so incredibly important, to help put out fires, identify the smoke, identify what's real, identify what's fake. And I hope that you are successful, because I hope it then trickles down to kids learning about this in school, because I don't think they are. A young person who's addicted to TikTok (China owns TikTok, or I guess it's a little different now) has a worldview that is way more at risk than whether or not they're watching Fox or MS NOW. Thank you very much, and a shout-out to my friend Mary, who connected us for these episodes. Appreciate that as well. All right, my friend, thank you so much, sir. Have a great tomorrow, today.
