Why Most AI Projects Fail
Beyond the AI hype, we see a stark divide: failed implementations that waste time and resources, and success stories that redefine entire industries. What makes the difference? Vadim Peskov, CEO of DIFFCO and an AI advisor with nearly a decade in the field, joins Melinda Lee on the Speak in Flow podcast to reveal the one critical factor separating leaders from laggards.
This episode is a must-listen for any leader ready to bridge the gap between AI potential and tangible business results.
In This Episode, You Will Learn:
The Gap Between Hype and Execution
Why most AI "failures" stem from a simple misunderstanding, not technical limits. Vadim explains why treating AI like a magic genie with one-line prompts leads to dead ends, and how shifting from a user to a strategic operator mindset is the first step to real impact.
From "Founder Mode" to Strategic Operator
“If you don't know the flows of your processes, you should be replaced as CEO.”
Why dabbling with chatbots isn't a strategy. Learn why executives must deeply understand AI's capabilities and constraints to make informed decisions that protect their business and unlock scalable opportunities.
Building the "10x Business" with AI Augmentation
“You want to enhance humans with AI. You want to give them a tool to be more productive. So you can have the same workforce, but just do more things.”
How AI enables businesses of 10 to outperform legacy players with 1,000. Vadim shares concrete examples, from insurance to manufacturing, showing how AI doesn't replace teams but transforms them into super-reviewers who oversee and refine AI-executed workflows, achieving unprecedented scale and accuracy.
Review, Don't Create
“We can do a lot of research and understanding with AI that a human definitely will not have time to do. But we can learn from the human what we need to look for.”
Discover the paradigm shift where AI prepares reports, generates insights, and flags issues, turning management into a high-value approval loop that takes minutes instead of days.
BLOG:
Crises can arise at any time, and one thing is certain: they will not occur when you expect them to. The way you communicate during this time plays a more significant role than you may realize in overcoming the crisis. Do you have the necessary tools?
Discover more in our latest blog post, "The Art of Crisis Communication."
About the Guest:
Vadim Peskov is the CEO of DIFFCO and an advisor at Alchemist Accelerator, where he helps startups and enterprises translate AI hype into scalable business results. With nearly a decade of expertise in AI, cybersecurity, and product development, he is a sought-after expert and speaker who demystifies AI, translating complex concepts into actionable strategies for growth, efficiency, and competitive disruption.
Social Handles:
Company website: https://diffco.us/
Linkedin: https://www.linkedin.com/in/vadimpeskov/
Fun Facts:
- ✈️ Aviation Enthusiast: A licensed pilot who finds clarity and perspective above the clouds.
- 📸 Exhibited Artist: Pursues landscape photography at a semi-professional level, with work featured in multiple exhibitions.
- 🔧 Hands-On Tinkerer: Loves experimenting with IoT and AI side projects for fun, blending hobby with expertise.
- 🚶 Walking Meeting Advocate: Firmly believes that the best strategic thinking and conversations happen while on the move.
About Melinda:
Melinda Lee is a Presentation Skills Expert, Speaking Coach, and nationally renowned Motivational Speaker. She holds an M.A. in Organizational Psychology, is an Insights Practitioner, and is a Certified Professional in Talent Development as well as Certified in Conflict Resolution. For over a decade, Melinda has researched and studied the state of “flow” and used it as a proven technique to help corporate leaders and business owners amplify their voices, access flow, and present their mission in a more powerful way to achieve results.
She has been the TEDx Berkeley Speaker Coach and has worked with hundreds of executives and teams from Facebook, Google, Microsoft, Caltrans, Bay Area Rapid Transit System, and more. Currently, she lives in San Francisco, California, and is breaking the ancestral lineage of silence.
Website: https://speakinflow.com/
Facebook: https://m.facebook.com/speakinflow
Instagram: https://instagram.com/speakinflow
LinkedIn: https://www.linkedin.com/in/mpowerall
Thanks for listening!
Thanks so much for listening to our podcast! If you enjoyed this episode and think that others could benefit from listening, please share it using the social media buttons on this page.
Do you have some feedback or questions about this episode? Leave a comment in the section below!
Subscribe to the podcast
If you would like to get automatic updates of new podcast episodes, you can subscribe to the podcast on Apple Podcasts or Stitcher. You can also subscribe in your favorite podcast app.
Leave us an Apple Podcast review.
Ratings and reviews from our listeners are extremely valuable to us and greatly appreciated. They help our podcast rank higher on Apple Podcasts, which exposes our show to more awesome listeners like you. If you have a minute, please leave an honest review on Apple Podcasts.
Welcome, dear listeners, to the Speak in Flow Podcast, where we dive into unique strategies and stories
2
00:00:08,000 --> 00:00:15,000
To help you and your team achieve maximum potential and momentum through communication.
3
00:00:15,000 --> 00:00:27,000
Today, I have an amazing leader. He is in the AI space. We're going to be talking about the conversations that are happening with AI. What are the things that we're resistant about, and what are…
4
00:00:27,000 --> 00:00:32,000
Um, the conversations that we need to have as leaders when it comes to AI.
5
00:00:32,000 --> 00:00:35,000
And he is the CEO of DIFFCO.
6
00:00:35,000 --> 00:00:38,000
He is an advisor to
7
00:00:38,000 --> 00:00:47,000
Alchemist Accelerator. He's an expert speaker and leader in the AI space in cybersecurity and also product development.
8
00:00:47,000 --> 00:00:52,000
As an advisor, he likes to help companies scale through AI, and he's
9
00:00:52,000 --> 00:00:57,000
Also, passionate about aviation. Uh, he loves coffee.
10
00:00:57,000 --> 00:01:02,000
And… and so I welcome Vadim Peskov.
11
00:01:02,000 --> 00:01:03,000
Hi, Vadim.
12
00:01:03,000 --> 00:01:08,000
Ah, great to be here, and thanks for inviting me. It, uh… it definitely will be a fun conversation.
13
00:01:08,000 --> 00:01:10,000
How AI will change the world?
14
00:01:10,000 --> 00:01:14,000
how AI will change the world, as long as we don't kill ourselves.
15
00:01:14,000 --> 00:01:16,000
Yeah, that's a second chapter. We'll… we may skip it. We'll see.
16
00:01:16,000 --> 00:01:24,000
I love it, I love it. So tell me, Vadim, what are you excited about when it comes to DIFFCO?
17
00:01:24,000 --> 00:01:25,000
Your company.
18
00:01:25,000 --> 00:01:31,000
Yeah, so, just for the context, again, we have been providing software development services for the last
19
00:01:31,000 --> 00:01:33,000
17 years, and uh…
20
00:01:33,000 --> 00:01:36,000
We're doing a lot of work with AI specifically, uh…
21
00:01:36,000 --> 00:01:41,000
Um, and we see a lot of different startups and a lot of enterprises trying to
22
00:01:41,000 --> 00:01:47,000
change how they work, and uh… how they do things, basically, with the use of
23
00:01:47,000 --> 00:01:50,000
AI, uh, and different ML models.
24
00:01:50,000 --> 00:01:55,000
And I think this is a really interesting, uh, edge to be on, because we…
25
00:01:55,000 --> 00:01:59,000
see how things are progressing and changing in the direction
26
00:01:59,000 --> 00:02:02,000
of what the future will look like in…
27
00:02:02,000 --> 00:02:04,000
like, half a year, a year, two years, etc.
28
00:02:04,000 --> 00:02:07,000
So, I think what I'm really passionate about…
29
00:02:07,000 --> 00:02:11,000
is the use cases, and how we use the current software
30
00:02:11,000 --> 00:02:15,000
And what we can get from the… our current AI models.
31
00:02:15,000 --> 00:02:17,000
As well as, uh, the current…
32
00:02:17,000 --> 00:02:20,000
products that we use in day-to-day life
33
00:02:20,000 --> 00:02:24,000
will be actually much more advanced and much more interesting
34
00:02:24,000 --> 00:02:31,000
in the near future. So, I think it's a really interesting, uh, way to see how we progress as a society.
35
00:02:31,000 --> 00:02:34,000
And the acceleration that we see right now,
36
00:02:34,000 --> 00:02:36,000
It's absolutely crazy.
37
00:02:36,000 --> 00:02:42,000
I don't think we ever saw so many companies growing so freaking fast.
38
00:02:42,000 --> 00:02:46,000
And I think we will have even crazier stories.
39
00:02:46,000 --> 00:02:47,000
Yeah.
40
00:02:47,000 --> 00:02:52,000
So, this is what we're doing, and this is what is keeping me motivated for this.
41
00:02:52,000 --> 00:02:55,000
How long have you been in the AI space?
42
00:02:55,000 --> 00:03:01,000
That's a good question. Uh, right now, it's… I think it's 8, 9 years since we started…
43
00:03:01,000 --> 00:03:05,000
initially doing a lot of, uh, mostly computer vision and ML work.
44
00:03:05,000 --> 00:03:12,000
Back then, um, a lot of those things were called big data, et cetera. It's a…
45
00:03:12,000 --> 00:03:15,000
different kind of naming for the same things, in a way, um…
46
00:03:15,000 --> 00:03:16,000
Ah, I see.
47
00:03:16,000 --> 00:03:20,000
Uh, however, it's been a long time.
48
00:03:20,000 --> 00:03:26,000
That's so… because as an outsider who has just been exposed to AI this year,
49
00:03:26,000 --> 00:03:33,000
And now I see so many people that have been in the AI space for many, many years, like, but they're… all of you have different
50
00:03:33,000 --> 00:03:35,000
expertise in the AI world.
51
00:03:35,000 --> 00:03:36,000
We've been cooking this for some time.
52
00:03:36,000 --> 00:03:45,000
But still, you've been cooking this! It's been brewing! That's so fascinating! And so, thank you so much for joining the conversation today, and…
53
00:03:45,000 --> 00:03:53,000
And so, again, you see how, um, how AI can really change the way that we grow our businesses at such a rapid pace.
54
00:03:53,000 --> 00:04:02,000
And, um, what are you hearing with the conversations of the people that are just resistant to it? What are they saying to you?
55
00:04:02,000 --> 00:04:06,000
To be honest, I don't have a lot of conversations about resistance.
56
00:04:06,000 --> 00:04:07,000
Okay, good.
57
00:04:07,000 --> 00:04:10,000
I do have a lot of conversations about misunderstanding.
58
00:04:10,000 --> 00:04:11,000
Okay.
59
00:04:11,000 --> 00:04:14,000
Um, because in many cases, people…
60
00:04:14,000 --> 00:04:16,000
uh, tried some tools.
61
00:04:16,000 --> 00:04:20,000
And try to see what is possible,
62
00:04:20,000 --> 00:04:23,000
And they weren't able to get the result.
63
00:04:23,000 --> 00:04:29,000
One way or another, and in many cases, they just use the tool in their own way.
64
00:04:29,000 --> 00:04:33,000
So basically, in many cases, they will take a…
65
00:04:33,000 --> 00:04:35,000
large tool, and will try to…
66
00:04:35,000 --> 00:04:41,000
ask it even larger questions sometimes, and do not provide any context.
67
00:04:41,000 --> 00:04:43,000
Um, and…
68
00:04:43,000 --> 00:04:46,000
like, hey, I'm asking ChatGPT for something,
69
00:04:46,000 --> 00:04:49,000
Um, and it provided me a terrible answer.
70
00:04:49,000 --> 00:04:51,000
Which is great. Um…
71
00:04:51,000 --> 00:04:55,000
Uh, and I'm thinking that it's a useless tool. The reality is…
72
00:04:55,000 --> 00:05:02,000
In many cases, you need to explain what the heck you want as a result, how you want to process the data, how you want the things…
73
00:05:02,000 --> 00:05:04,000
Again, in plain English, no, no…
74
00:05:04,000 --> 00:05:06,000
magic here, again.
75
00:05:06,000 --> 00:05:08,000
The problem is that many people…
76
00:05:08,000 --> 00:05:15,000
Um, try to kind of, like, do a prompt with one or two lines, and that's it.
77
00:05:15,000 --> 00:05:17,000
It doesn't explain what to do.
78
00:05:17,000 --> 00:05:24,000
So, if you would ask your coworker that's never worked on this problem before, you maybe will spend 5 minutes.
79
00:05:24,000 --> 00:05:28,000
Maybe more, explaining what exactly you want this person to do.
80
00:05:28,000 --> 00:05:31,000
And after this, you can make this work.
81
00:05:31,000 --> 00:05:32,000
Mm-hmm.
82
00:05:32,000 --> 00:05:34,000
In some cases, we'll have a prompt that
83
00:05:34,000 --> 00:05:38,000
is more than a page long to actually execute on the things that we need.
84
00:05:38,000 --> 00:05:41,000
Because, again, you don't want some…
85
00:05:41,000 --> 00:05:43,000
random result, you want the…
86
00:05:43,000 --> 00:05:46,000
exact result that you're looking for,
87
00:05:46,000 --> 00:05:50,000
And sometimes you need to explain the things that need to be done.
88
00:05:50,000 --> 00:05:55,000
Right, and providing the right context, and you're saying that the…
89
00:05:55,000 --> 00:06:01,000
Having the right information in there sometimes could take up a whole page versus just one sentence.
90
00:06:01,000 --> 00:06:04,000
Yep, absolutely. Again, especially if it's a…
91
00:06:04,000 --> 00:06:06,000
not like, hey, fix the…
92
00:06:06,000 --> 00:06:08,000
grammar in this email, but, like,
93
00:06:08,000 --> 00:06:09,000
Right.
94
00:06:09,000 --> 00:06:12,000
help me actually research some kind of topic.
95
00:06:12,000 --> 00:06:13,000
Right.
96
00:06:13,000 --> 00:06:16,000
Um, so if you ask an AI, help me
97
00:06:16,000 --> 00:06:21,000
find a better future for myself, my career, like, some career advice.
98
00:06:21,000 --> 00:06:23,000
I have no freaking idea who you are.
99
00:06:23,000 --> 00:06:29,000
besides your name, which basically sometimes won't get a lot of information that you can find
100
00:06:29,000 --> 00:06:35,000
online about you, or about your experience, about your strengths. So, there are a lot of angles that you need to add
101
00:06:35,000 --> 00:06:38,000
to get the result. Same for the business as well.
102
00:06:38,000 --> 00:06:45,000
And earlier you said that companies are misunderstanding how to use it.
103
00:06:45,000 --> 00:06:56,000
Have you run into a company that has used it, um, not knowing how to use it well, and lost a lot of money, or a lot of time, productivity, effort?
104
00:06:56,000 --> 00:06:59,000
Do you have an example of that?
105
00:06:59,000 --> 00:07:03,000
I don't have a lot of examples of, like, losing specifically money, I don't think…
106
00:07:03,000 --> 00:07:06,000
um… I've seen those cases that much.
107
00:07:06,000 --> 00:07:08,000
But I've seen a lot of missed opportunities,
108
00:07:08,000 --> 00:07:11,000
where, um, you just…
109
00:07:11,000 --> 00:07:14,000
Having a lot of, uh, guardrails installed.
110
00:07:14,000 --> 00:07:20,000
Uh, and you cannot do these things and this thing, and you're worried about the data so much.
111
00:07:20,000 --> 00:07:23,000
Um, that you will share with some model,
112
00:07:23,000 --> 00:07:30,000
Um, and that the company, not providing you an on-prem solution, is actually sharing your data.
113
00:07:30,000 --> 00:07:36,000
And, again, it's not a complex thing, so it's a, again, um…
114
00:07:36,000 --> 00:07:40,000
But when we do AI transformation sessions, when we help those companies,
115
00:07:40,000 --> 00:07:43,000
In many cases, we see that managers and management
116
00:07:43,000 --> 00:07:47,000
are, at least, like, yeah, let's actually make it work.
117
00:07:47,000 --> 00:07:50,000
Um, but they…
118
00:07:50,000 --> 00:07:55,000
themselves do not have a deep understanding of this. And my strong belief…
119
00:07:55,000 --> 00:07:57,000
And maybe I'm completely wrong here?
120
00:07:57,000 --> 00:08:03,000
But my strong belief is you, as a founder, or you as a C-level exec,
121
00:08:03,000 --> 00:08:06,000
You need to understand what this tool is doing.
122
00:08:06,000 --> 00:08:11,000
Um, again, you don't need to specifically understand the technicalities of this, you don't need to
123
00:08:11,000 --> 00:08:13,000
code anything yourself, sure.
124
00:08:13,000 --> 00:08:15,000
But you need to understand
125
00:08:15,000 --> 00:08:21,000
what are the capabilities of this? Because, let's say you have a construction business and you're…
126
00:08:21,000 --> 00:08:26,000
buying a truck, you want to understand the characteristics of this truck.
127
00:08:26,000 --> 00:08:31,000
Can this truck carry the certain weight, and can it do the X, Y, and Z things that you need?
128
00:08:31,000 --> 00:08:39,000
Sure, you want to understand this, and you don't want to buy a different version that will not be able to hold the weight that you need.
129
00:08:39,000 --> 00:08:42,000
Same story here, but in many cases, people…
130
00:08:42,000 --> 00:08:47,000
Just do not pay enough attention there, and they believe that, hey, it's a…
131
00:08:47,000 --> 00:08:51,000
I played with ChatGPT, or I played with Claude, or I played with Gemini,
132
00:08:51,000 --> 00:08:55,000
Um, and it's good enough for me to understand how this works.
133
00:08:55,000 --> 00:08:57,000
The truth is, probably not.
134
00:08:57,000 --> 00:08:58,000
Got it.
135
00:08:58,000 --> 00:09:03,000
Probably you need to spend a little bit more time understanding the agentic approaches, so you can understand
136
00:09:03,000 --> 00:09:07,000
how you can actually apply to your business, what the actual limitations are,
137
00:09:07,000 --> 00:09:09,000
what functions you can replace, what…
138
00:09:09,000 --> 00:09:11,000
functions you can enhance with AI.
139
00:09:11,000 --> 00:09:22,000
This is interesting. So, as an executive, you're finding that they're not as resistant as before; it's more misunderstanding.
140
00:09:22,000 --> 00:09:30,000
They don't really understand how to use it, the power of it, and how they can even apply it to a business.
141
00:09:30,000 --> 00:09:31,000
Yep.
142
00:09:31,000 --> 00:09:41,000
to be strategic, right? It's more about, like you said, they're just using it to type in some prompts in Claude, or in ChatGPT, and they think that that's
143
00:09:41,000 --> 00:09:43,000
enough to understand it.
144
00:09:43,000 --> 00:09:45,000
Yeah, so, um, and again,
145
00:09:45,000 --> 00:09:47,000
different companies, different execs, different…
146
00:09:47,000 --> 00:09:48,000
Yeah, right.
147
00:09:48,000 --> 00:09:50,000
levels. But that's why we…
148
00:09:50,000 --> 00:09:54,000
again, for the last couple of years, we, like, really, uh…
149
00:09:54,000 --> 00:09:58,000
love to use the founder mode, uh, examples
150
00:09:58,000 --> 00:10:05,000
Uh, of many, many, uh, companies here, and we have a lot of jokes about what founder mode is and why it's gaining acceptance.
151
00:10:05,000 --> 00:10:06,000
Okay.
152
00:10:06,000 --> 00:10:07,000
But in reality,
153
00:10:07,000 --> 00:10:10,000
you need to understand what the heck is going on.
154
00:10:10,000 --> 00:10:11,000
Yes.
155
00:10:11,000 --> 00:10:13,000
Uh, because if you don't…
156
00:10:13,000 --> 00:10:16,000
Um, dive into…
157
00:10:16,000 --> 00:10:20,000
the actual opportunities of what your business can do with this,
158
00:10:20,000 --> 00:10:21,000
Yes.
159
00:10:21,000 --> 00:10:23,000
you will be replaced. And…
160
00:10:23,000 --> 00:10:24,000
What I see, like, in…
161
00:10:24,000 --> 00:10:27,000
The CEO will be replaced?
162
00:10:27,000 --> 00:10:28,000
For the company, the organization.
163
00:10:28,000 --> 00:10:30,000
the businesses will die. The whole company, the whole company, because…
164
00:10:30,000 --> 00:10:32,000
Yeah, correct, correct.
165
00:10:32,000 --> 00:10:39,000
In many cases, I see that, for example, let's take the insurance company's example.
166
00:10:39,000 --> 00:10:45,000
I strongly believe that it's possible to build a large insurance company with, uh,
167
00:10:45,000 --> 00:10:50,000
maybe hundreds of people, not the thousands or tens of thousands that they have
168
00:10:50,000 --> 00:10:53,000
now, and do the same work even better.
169
00:10:53,000 --> 00:10:54,000
Mm-hmm.
170
00:10:54,000 --> 00:10:57,000
Which means, uh, much lower capex.
171
00:10:57,000 --> 00:11:01,000
Which means much better profitability, which means we can provide
172
00:11:01,000 --> 00:11:05,000
better quality of the service, and charge less.
173
00:11:05,000 --> 00:11:11,000
And I don't think it's a problem to build those types of businesses. I think it's easy
174
00:11:11,000 --> 00:11:14,000
enough, so many, uh…
175
00:11:14,000 --> 00:11:19,000
people will have those kinds of examples built, and really replace incumbents.
176
00:11:19,000 --> 00:11:26,000
So, last year we saw… sorry, this year, we saw a company that was acquired for $80 million with one person
177
00:11:26,000 --> 00:11:29,000
Uh, that built this company?
178
00:11:29,000 --> 00:11:32,000
Which is, hey, one person got…
179
00:11:32,000 --> 00:11:36,000
80 mil, uh, for, like, a couple years' worth of work.
180
00:11:36,000 --> 00:11:42,000
It's not really a sophisticated product, but, I mean, it's a good product. The point is,
181
00:11:42,000 --> 00:11:46,000
he had no one else. It's just one person, specifically, that
182
00:11:46,000 --> 00:11:51,000
did everything there. There's not 10 people, not 20,
183
00:11:51,000 --> 00:11:57,000
One. And again, it is a wonderful journey, and I think this is what is possible right now. And I think…
184
00:11:57,000 --> 00:12:00,000
we may have a billion-dollar exit
185
00:12:00,000 --> 00:12:09,000
with folks that are maybe one or two people, etc. I still believe if he had maybe 10 people, it would be a billion bucks,
186
00:12:09,000 --> 00:12:13,000
not, uh, specifically, uh, 80 million, but…
187
00:12:13,000 --> 00:12:15,000
I think it's still a great opportunity.
188
00:12:15,000 --> 00:12:18,000
for the growth. So you have a lot of…
189
00:12:18,000 --> 00:12:20,000
ways to see this, and…
190
00:12:20,000 --> 00:12:23,000
If the one guy can do this,
191
00:12:23,000 --> 00:12:26,000
most likely you can actually…
192
00:12:26,000 --> 00:12:30,000
change the logic of the current business in the same way?
193
00:12:30,000 --> 00:12:35,000
that's much more dramatic, but people are scared because they do not understand.
194
00:12:35,000 --> 00:12:38,000
And they typically don't spend enough time understanding.
195
00:12:38,000 --> 00:12:40,000
And they think it's toys.
196
00:12:40,000 --> 00:12:43,000
While people on the other side are actually earning money.
197
00:12:43,000 --> 00:12:49,000
Yeah, so what advice would you give an entrepreneur, you know, with
198
00:12:49,000 --> 00:12:57,000
some employees, like 200, 300, 400 employees, how can they start to understand
199
00:12:57,000 --> 00:13:01,000
how AI could support this business.
200
00:13:01,000 --> 00:13:02,000
What advice?
201
00:13:02,000 --> 00:13:05,000
I think it's a… it depends on the organization, to be honest.
202
00:13:05,000 --> 00:13:07,000
Because if you have, uh…
203
00:13:07,000 --> 00:13:09,000
the rights to do things
204
00:13:09,000 --> 00:13:17,000
in this organization. Um, I think it's a wonderful opportunity for you to start implementing and learning those things.
205
00:13:17,000 --> 00:13:21,000
And I'm not saying that it's really, uh…
206
00:13:21,000 --> 00:13:24,000
requiring a lot of technical effort.
207
00:13:24,000 --> 00:13:27,000
But it is a basic, uh…
208
00:13:27,000 --> 00:13:31,000
visual programming that you can do with some tools, etc.
209
00:13:31,000 --> 00:13:37,000
Sometimes there's maybe some code, but to be honest, AI can write the code for you, so you don't even need to program.
210
00:13:37,000 --> 00:13:42,000
Uh, yourself, so I think in many cases, it's possible to do a lot of things.
211
00:13:42,000 --> 00:13:46,000
that, um, many people just say, hey, I'm not a programmer.
212
00:13:46,000 --> 00:13:49,000
I was like, sure, but you don't need to be.
213
00:13:49,000 --> 00:13:56,000
It maybe needs to be executed in code in some cases, or it's better executed with code in some cases.
214
00:13:56,000 --> 00:14:03,000
But I think we will have a lot of tools, and we already have them, that, like, require zero coding.
215
00:14:03,000 --> 00:14:06,000
that you can build a great prototype, so you can…
216
00:14:06,000 --> 00:14:08,000
create a lot of internal tools.
217
00:14:08,000 --> 00:14:10,000
that will be used, um…
218
00:14:10,000 --> 00:14:13,000
Uh, for different purposes, etc.
219
00:14:13,000 --> 00:14:17,000
Uh, that can be, uh, much better than the current
220
00:14:17,000 --> 00:14:18,000
flows. Sure,
221
00:14:18,000 --> 00:14:25,000
So how do I know a flow, and what even to program, right? There's probably so many workflows, so many different…
222
00:14:25,000 --> 00:14:28,000
tools, like, how would I, as a CEO,
223
00:14:28,000 --> 00:14:32,000
looking to grow and scale my company, how do I know?
224
00:14:32,000 --> 00:14:35,000
If you don't know your…
225
00:14:35,000 --> 00:14:39,000
flows of your processes in your organization, you should be replaced as CEO.
226
00:14:39,000 --> 00:14:44,000
I don't know, meaning, like, I do know, but how do I know which ones to start with?
227
00:14:44,000 --> 00:14:45,000
Sorry.
228
00:14:45,000 --> 00:14:48,000
Oh, um, which one is taking the, uh…
229
00:14:48,000 --> 00:14:52,000
overall, uh, basically, the most team effort
230
00:14:52,000 --> 00:14:53,000
Okay.
231
00:14:53,000 --> 00:14:56,000
To complete. Um, and my point is,
232
00:14:56,000 --> 00:14:58,000
In many cases, just to be clear,
233
00:14:58,000 --> 00:15:01,000
The right answer may not be in AI.
234
00:15:01,000 --> 00:15:06,000
The right answer may be in a different tool. I'm not saying that AI is a solution for everything.
235
00:15:06,000 --> 00:15:07,000
No, it's not.
236
00:15:07,000 --> 00:15:08,000
It's not? I thought it is!
237
00:15:08,000 --> 00:15:10,000
No, it's not.
238
00:15:10,000 --> 00:15:15,000
It's not so accurate in some cases, and it will have some issues.
239
00:15:15,000 --> 00:15:19,000
And let me kind of maybe, um…
240
00:15:19,000 --> 00:15:24,000
draw a picture of how I think the future interfaces will look.
241
00:15:24,000 --> 00:15:25,000
Okay.
242
00:15:25,000 --> 00:15:27,000
Uh, in many tools that we use on a day-to-day basis,
243
00:15:27,000 --> 00:15:30,000
I think we will not have…
244
00:15:30,000 --> 00:15:32,000
People managing the tools?
245
00:15:32,000 --> 00:15:38,000
Specifically, like, hey, we need to create this report, we need to do this, we need to send this to Joe,
246
00:15:38,000 --> 00:15:45,000
I need to review this, etc. So, I think the interfaces that we will have
247
00:15:45,000 --> 00:15:49,000
are, uh, the tasks that are prepared for us,
248
00:15:49,000 --> 00:15:54,000
already done, the report was already generated, the result was already there,
249
00:15:54,000 --> 00:15:58,000
You just need to review this and click send.
250
00:15:58,000 --> 00:16:00,000
or approve, or whatever.
251
00:16:00,000 --> 00:16:04,000
And in this case, you're going through the tasks,
252
00:16:04,000 --> 00:16:11,000
that will take you maybe 15 minutes, maybe 30 minutes of your day going through those things, versus spending
253
00:16:11,000 --> 00:16:14,000
half a day doing those tasks.
254
00:16:14,000 --> 00:16:18,000
So you're reviewing the results. So, my point is,
255
00:16:18,000 --> 00:16:21,000
You can ask AI to, again, fully execute this flow,
256
00:16:21,000 --> 00:16:28,000
Or you can be much more careful and execute this in a way that humans still approve it.
257
00:16:28,000 --> 00:16:34,000
So, humans review and understand what is going on, and say, yeah, here's a mistake.
258
00:16:34,000 --> 00:16:38,000
We need to fix this, we don't want to disclose this, we want to disclose this.
259
00:16:38,000 --> 00:16:40,000
Etc. So, there are a lot of…
260
00:16:40,000 --> 00:16:44,000
details that you can, uh, add to the flow.
261
00:16:44,000 --> 00:16:46,000
that will be, again, still
262
00:16:46,000 --> 00:16:48,000
uh, involving a human in the loop.
263
00:16:48,000 --> 00:16:51,000
Um, at least in the beginning.
264
00:16:51,000 --> 00:16:58,000
Because, again, I strongly believe that a lot of things in finance and a lot of things in accounting could be really easily automated.
265
00:16:58,000 --> 00:17:07,000
Uh, again, uh, maybe same for, uh, definitely same for marketing, and definitely a lot of things in terms of the management.
266
00:17:07,000 --> 00:17:13,000
Again, uh, why would you ask the same thing…
267
00:17:13,000 --> 00:17:14,000
every day of the…
268
00:17:14,000 --> 00:17:16,000
some team members. Like, hey,
269
00:17:16,000 --> 00:17:18,000
Can you send me this report?
270
00:17:18,000 --> 00:17:22,000
Yeah, AI can do this for you, and can come to you and say, hey,
271
00:17:22,000 --> 00:17:27,000
Here's the report, here's the key things that I saw in the report that
272
00:17:27,000 --> 00:17:33,000
are, um, not right. So, I asked the questions; here are the answers to the questions.
273
00:17:33,000 --> 00:17:36,000
So, again, not complicated at all.
274
00:17:36,000 --> 00:17:39,000
again, different businesses, different flows.
275
00:17:39,000 --> 00:17:42,000
But, again, speaking of example of insurance,
276
00:17:42,000 --> 00:17:48,000
Again, we can profile, we can do a lot of research and understanding with AI that a human,
277
00:17:48,000 --> 00:17:51,000
um, an underwriter, definitely will not have
278
00:17:51,000 --> 00:17:55,000
time to do this. But we can learn from the human
279
00:17:55,000 --> 00:17:59,000
uh, to understand what the things are that we need to look for,
280
00:17:59,000 --> 00:18:04,000
to be able to, again, decline or approve wherever we need to, um, the estimate there.
281
00:18:04,000 --> 00:18:16,000
Right. Right. It sounds like a lot of the functions of a business will still be there, but it might, rather than have 10 people, maybe 1 person.
282
00:18:16,000 --> 00:18:17,000
Yep.
283
00:18:17,000 --> 00:18:25,000
Does it? In marketing, or maybe, like, it could be completely wiped out, but maybe you just have one marketing person overseeing all these things and just saying approved or not approved, or edit.
284
00:18:25,000 --> 00:18:29,000
And same thing with finance, same thing with accounting.
285
00:18:29,000 --> 00:18:30,000
Yeah.
286
00:18:30,000 --> 00:18:32,000
You're absolutely right. So, I don't think it's a…
287
00:18:32,000 --> 00:18:39,000
the case that you would want to replace the humans with AI. In some cases, it's true, but…
288
00:18:39,000 --> 00:18:44,000
Yeah.
289
00:18:44,000 --> 00:18:45,000
Great.
290
00:18:45,000 --> 00:18:48,000
In many cases, it's actually different. You want to enhance humans with AI. You want to give them a tool to be more productive.
291
00:18:48,000 --> 00:18:49,000
Right.
292
00:18:49,000 --> 00:18:50,000
So you can have the same workforce,
293
00:18:50,000 --> 00:18:51,000
Right.
294
00:18:51,000 --> 00:18:53,000
but just do more things?
295
00:18:53,000 --> 00:18:59,000
with the same, uh, things. Again, coming back to our insurance company. We can approve
296
00:18:59,000 --> 00:19:02,000
10 times, 50 times more, uh…
297
00:19:02,000 --> 00:19:05,000
uh… estimates.
298
00:19:05,000 --> 00:19:07,000
really, really quickly, just because…
299
00:19:07,000 --> 00:19:12,000
AI already did the job, and humans maybe just need to approve this. Or maybe it's…
300
00:19:12,000 --> 00:19:17,000
If it's meeting some level of thresholds, it could be approved automatically, and we don't really…
301
00:19:17,000 --> 00:19:20,000
need to have the approval,
302
00:19:20,000 --> 00:19:22,000
uh, of the human.
303
00:19:22,000 --> 00:19:24,000
So, again,
304
00:19:24,000 --> 00:19:29,000
I think we can see the functionality that's done differently,
305
00:19:29,000 --> 00:19:31,000
Uh, rather than trying to, again,
306
00:19:31,000 --> 00:19:35,000
accomplish this using, uh, the traditional methods.
307
00:19:35,000 --> 00:19:42,000
I'm excited. I… I know how it's impacted me and impacted my business, and I could see that at a larger scale.
308
00:19:42,000 --> 00:19:45,000
what you're talking about.
309
00:19:45,000 --> 00:19:53,000
And how it supports that. And so, can you share, um, a client success? You don't have to share names. I know you're probably under NDA, but
310
00:19:53,000 --> 00:19:58,000
what, um, client success have you seen? Results?
311
00:19:58,000 --> 00:20:00,000
From your work.
312
00:20:00,000 --> 00:20:06,000
So, let me kind of give you maybe a couple of quick examples that I think
313
00:20:06,000 --> 00:20:11,000
make sense in this case. Because, uh, like, we can take…
314
00:20:11,000 --> 00:20:13,000
Uh…
315
00:20:13,000 --> 00:20:16,000
regular flows that exist, for example, for…
316
00:20:16,000 --> 00:20:19,000
uh, really boring…
317
00:20:19,000 --> 00:20:23,000
thing, like, again…
318
00:20:23,000 --> 00:20:26,000
accounting, or, again, um, financial planning, etc.
319
00:20:26,000 --> 00:20:28,000
Um…
320
00:20:28,000 --> 00:20:29,000
Some people love it.
321
00:20:29,000 --> 00:20:30,000
Some people love it, but you and I are in agreement. I don't…
322
00:20:30,000 --> 00:20:32,000
Again, I'm a big fan of…
323
00:20:32,000 --> 00:20:36,000
the financial part, but I will probably die if I have to be an accountant.
324
00:20:36,000 --> 00:20:37,000
Yes.
325
00:20:37,000 --> 00:20:44,000
Um, again, I have a lot of respect for those people, because I don't understand how… my brain is just wired differently.
326
00:20:44,000 --> 00:20:51,000
Uh, again, we see… so we built a system right now that's, uh, utilizing AI to
327
00:20:51,000 --> 00:20:53,000
do this, uh, functionality.
328
00:20:53,000 --> 00:20:55,000
And, oh, funny enough,
329
00:20:55,000 --> 00:20:59,000
we built this system for the public sector.
330
00:20:59,000 --> 00:21:02,000
And the core problem that we have…
331
00:21:02,000 --> 00:21:07,000
We need to convince the client, which is a state, uh, that will be using this.
332
00:21:07,000 --> 00:21:14,000
Uh, we need to convince them to actually, um, use this tool in a way that will allow them to
333
00:21:14,000 --> 00:21:16,000
uh… have…
334
00:21:16,000 --> 00:21:19,000
probably 10% of the staff that they currently have.
335
00:21:19,000 --> 00:21:21,000
Um, because…
336
00:21:21,000 --> 00:21:24,000
a lot of things are actually automated.
337
00:21:24,000 --> 00:21:26,000
Um, and they're excited about this.
338
00:21:26,000 --> 00:21:29,000
The unions are not.
339
00:21:29,000 --> 00:21:31,000
But the fun thing is…
340
00:21:31,000 --> 00:21:36,000
This is the logic that, uh, we can bring to the table. This is the…
341
00:21:36,000 --> 00:21:40,000
kind of approach that could be much different.
342
00:21:40,000 --> 00:21:47,000
And, again, same with multiple other things that we build in, um, in day-to-day life.
343
00:21:47,000 --> 00:21:52,000
Uh, where we can basically change the flows and make it so…
344
00:21:52,000 --> 00:21:57,000
that, again, you collect information with AI much faster.
345
00:21:57,000 --> 00:22:05,000
process different kinds of information, or you provide different insights to the end clients, or…
346
00:22:05,000 --> 00:22:08,000
to the team. I think, like, um, in…
347
00:22:08,000 --> 00:22:10,000
I can bring you one example of the…
348
00:22:10,000 --> 00:22:15,000
one enterprise client that we have, uh…
349
00:22:15,000 --> 00:22:19,000
It's a, uh, company that's dealing in the manufacturing and chemical industry.
350
00:22:19,000 --> 00:22:25,000
So, we have a tool that's used to create different types of reports.
351
00:22:25,000 --> 00:22:28,000
And it is a large…
352
00:22:28,000 --> 00:22:32,000
system that we pull in the data from, and we generate reports.
353
00:22:32,000 --> 00:22:41,000
Um, previously it was a team that did this; currently it's done by AI, uh, and we have a human that's reviewing this.
354
00:22:41,000 --> 00:22:47,000
For legal purposes. However, it's all done with AI, with basically minimal intervention
355
00:22:47,000 --> 00:22:55,000
from the human. And, uh, another thing that's important, uh, we have an interface for the end clients not just to
356
00:22:55,000 --> 00:22:58,000
review the PDF reports, which they had.
357
00:22:58,000 --> 00:23:03,000
Now it's a tool where they can actually ask questions.
358
00:23:03,000 --> 00:23:07,000
And they can understand what is going on, etc.
359
00:23:07,000 --> 00:23:14,000
And we've seen examples where they share this not just with the technical folks; they share this with the VPs and C-levels
360
00:23:14,000 --> 00:23:17,000
who can ask a question like, hey,
361
00:23:17,000 --> 00:23:25,000
What about this? How is this progressing? How can we optimize the results? Can we get more out of this turbine?
362
00:23:25,000 --> 00:23:29,000
And the answer is most likely yes, and here's what you can do, etc.
363
00:23:29,000 --> 00:23:34,000
Again, it has all the disclaimers that you need to have a technical review, etc.
364
00:23:34,000 --> 00:23:39,000
Sure. It's really, again, critical manufacturing; we cannot just, like…
365
00:23:39,000 --> 00:23:44,000
assume things. Uh, we need to have multiple confirmations. However, based on data,
366
00:23:44,000 --> 00:23:50,000
we can make this tool that will give the client multiple options for how they can
367
00:23:50,000 --> 00:23:53,000
improve this, and they can get it really, really quickly
368
00:23:53,000 --> 00:23:56,000
in a board meeting, versus, hey, can we send this to
369
00:23:56,000 --> 00:24:03,000
someone and get their response in one month? No, they can do this in, like, 15 seconds.
370
00:24:03,000 --> 00:24:05,000
Wow. Wow.
371
00:24:05,000 --> 00:24:09,000
That's amazing. That is amazing.
372
00:24:09,000 --> 00:24:16,000
Congratulations on that! Congratulations! You're at this really inflection point with the work that you do.
373
00:24:16,000 --> 00:24:19,000
And really seeing the results for your clients.
374
00:24:19,000 --> 00:24:21,000
I think we'll have much more, uh, in this regard.
375
00:24:21,000 --> 00:24:23,000
Yeah.
376
00:24:23,000 --> 00:24:25,000
that, uh, will be much more disruptive
377
00:24:25,000 --> 00:24:28,000
to the current flows.
378
00:24:28,000 --> 00:24:29,000
Yeah.
379
00:24:29,000 --> 00:24:31,000
I strongly believe that if you master
380
00:24:31,000 --> 00:24:35,000
understanding of the current tool, you can do much more.
381
00:24:35,000 --> 00:24:39,000
with this. Because in many cases, um,
382
00:24:39,000 --> 00:24:42,000
Again, if you…
383
00:24:42,000 --> 00:24:49,000
I understand many people aren't really big fans of reading manuals, but in some cases, you actually need to.
384
00:24:49,000 --> 00:24:54,000
Uh, and you can, uh… again, in real life, you can break things and have to read the manual, sure.
385
00:24:54,000 --> 00:24:57,000
for home appliances, sure, please do this.
386
00:24:57,000 --> 00:25:00,000
Um, but for everything else,
387
00:25:00,000 --> 00:25:03,000
you need to understand the capabilities,
388
00:25:03,000 --> 00:25:06,000
And the point is, those capabilities are changing every day.
389
00:25:06,000 --> 00:25:07,000
Yeah, yeah.
390
00:25:07,000 --> 00:25:10,000
Each day, I'm reading something in the news that…
391
00:25:10,000 --> 00:25:16,000
I was saying a year ago that, hey, it's probably 5 years away, it will not be possible, etc.
392
00:25:16,000 --> 00:25:19,000
Yeah, yeah, yeah, right.
393
00:25:19,000 --> 00:25:21,000
Every day.
394
00:25:21,000 --> 00:25:25,000
This is absolutely astonishing, and I don't think we've…
395
00:25:25,000 --> 00:25:28,000
Um, actually seen anything like this.
396
00:25:28,000 --> 00:25:29,000
Right.
397
00:25:29,000 --> 00:25:31,000
I don't think we'll have AGI soon. I don't think…
398
00:25:31,000 --> 00:25:37,000
we can ignore the progress that's currently happening with those tools.
399
00:25:37,000 --> 00:25:40,000
And think, like, hey, yeah, we probably will be okay.
400
00:25:40,000 --> 00:25:42,000
No, we will not. A lot of…
401
00:25:42,000 --> 00:25:45,000
Uh, at least, uh…
402
00:25:45,000 --> 00:25:47,000
Um, thinking, uh…
403
00:25:47,000 --> 00:25:51,000
type of job, so basically anything that knowledge workers do.
404
00:25:51,000 --> 00:25:54,000
I think will be replaced.
405
00:25:54,000 --> 00:25:56,000
Really soon. One way or another…
406
00:25:56,000 --> 00:25:59,000
or enhanced, if you're smart.
407
00:25:59,000 --> 00:26:01,000
or replaced if you're not. So…
408
00:26:01,000 --> 00:26:02,000
Yeah.
409
00:26:02,000 --> 00:26:05,000
Play it how you want.
410
00:26:05,000 --> 00:26:16,000
I love it. I agree. I agree. So, Vadim, so what would you say to these people, these entrepreneurs right now?
411
00:26:16,000 --> 00:26:21,000
Uh, what is the mantra that you would like to share?
412
00:26:21,000 --> 00:26:23,000
I think the core thing is…
413
00:26:23,000 --> 00:26:25,000
They're all the same.
414
00:26:25,000 --> 00:26:34,000
It's the same playbook: experiment more. The people that win in business are not the people that have the most genius ideas, in many cases.
415
00:26:34,000 --> 00:26:38,000
It's the people that are, uh, dumb enough to try this multiple times.
416
00:26:38,000 --> 00:26:39,000
Yeah.
417
00:26:39,000 --> 00:26:41,000
Uh, and I think…
418
00:26:41,000 --> 00:26:44,000
the core point is trying more,
419
00:26:44,000 --> 00:26:48,000
and experimenting more, and seeing opportunities, and being bold with your
420
00:26:48,000 --> 00:26:50,000
own solutions.
421
00:26:50,000 --> 00:26:51,000
Yeah.
422
00:26:51,000 --> 00:26:52,000
Because someone will.
423
00:26:52,000 --> 00:26:56,000
And someone will… will take your share of the market.
424
00:26:56,000 --> 00:27:00,000
If you are slow, and if you think that…
425
00:27:00,000 --> 00:27:02,000
hey, the tool is not there.
426
00:27:02,000 --> 00:27:05,000
The point is, the tool is not there today.
427
00:27:05,000 --> 00:27:10,000
you need to think about how this tool will improve in a couple of years, so…
428
00:27:10,000 --> 00:27:16,000
you will see the opportunity to implement this in your business in a way that will be much more disruptive.
429
00:27:16,000 --> 00:27:18,000
Because someone is actually doing this right now.
430
00:27:18,000 --> 00:27:20,000
And they're changing the way…
431
00:27:20,000 --> 00:27:25,000
they will compete, and you will not be able to outsmart them
432
00:27:25,000 --> 00:27:27,000
and do this much faster than they did.
433
00:27:27,000 --> 00:27:32,000
Just because they're already running at…
434
00:27:32,000 --> 00:27:35,000
basically, the speed of sound. You need to be…
435
00:27:35,000 --> 00:27:36,000
Even faster than this.
436
00:27:36,000 --> 00:27:45,000
Right. Right. Like you mentioned, so take the playbook, read it, understand it, implement it, do it fast, do it quickly.
437
00:27:45,000 --> 00:27:52,000
And I would suggest hiring an expert like Vadim, so you don't fall too hard.
438
00:27:52,000 --> 00:27:54,000
And so how…
439
00:27:54,000 --> 00:27:55,000
Experiment!
440
00:27:55,000 --> 00:27:56,000
Or experiment yourself. I'm really encouraging people to experiment themselves.
441
00:27:56,000 --> 00:28:04,000
Yes, experiment yourselves. But I'm also inviting them to contact you in case they have questions. Would that be okay?
442
00:28:04,000 --> 00:28:05,000
Great, and so…
443
00:28:05,000 --> 00:28:07,000
Yeah, absolutely. Yeah, you can Google me and find me on LinkedIn, or, uh…
444
00:28:07,000 --> 00:28:08,000
X, etc.
445
00:28:08,000 --> 00:28:20,000
Wonderful! So, uh, that's how you… you know, get ahold of Vadim, if you have any questions about how to implement AI into your company to accelerate, grow, and scale.
446
00:28:20,000 --> 00:28:23,000
Uh, yeah, contact him via Google or LinkedIn.
447
00:28:23,000 --> 00:28:29,000
And I… that's it, that's all I have. I really appreciate your time today, Vadim. I think that we had a great conversation.
448
00:28:29,000 --> 00:28:30,000
Thanks for inviting me.
449
00:28:30,000 --> 00:28:38,000
It was really wonderful to hear. I learned so much. Uh, I am so glad that we have people like you at the helm.
450
00:28:38,000 --> 00:28:44,000
Leading the charge in the AI space and helping us to grow with it and learn how to use it better.
451
00:28:44,000 --> 00:28:46,000
Appreciate it, and uh…
452
00:28:46,000 --> 00:28:49,000
Experiment more, and we'll have fun.
453
00:28:49,000 --> 00:28:54,000
We'll have fun, we'll have fun! Just don't kill each other, right?
454
00:28:54,000 --> 00:28:55,000
That's the next chapter. We'll see.
455
00:28:55,000 --> 00:29:10,000
That's the next chapter! Okay. Alright, thank you, audience, for being here. I'm so glad. Um, until next time, I'm your sister in flow. Remember, anytime you have a chance to communicate, you also have a chance to inspire,
456
00:29:10,000 --> 00:29:13,000
and make a positive difference in the world.
457
00:29:13,000 --> 00:29:16,000
Much love, take care. Bye-bye.