Wall Street, Ethical AI, and Order Takers Beware

Intro:

Welcome to Wake Up with AI, the podcast where human-powered meets AI-assisted. Join your hosts, Chris Carolan, Niko Lafakis, and George B. Thomas as we dive deep into the world of artificial intelligence. From the latest AI news to cutting-edge tools and skill sets, we are here to help business owners, marketers, and everyday individuals unlock their full potential with the power of AI. Let's get started.

Chris Carolan:

Good morning, everyone. Happy Thursday. Welcome to Wake Up with AI. I am one of your hosts, Chris Carolan, and here with my cohosts, George B. Thomas and Niko Lafakis. How are you guys doing today?

Niko Lafakis:

Time to wake up. Okay. So, I'm doing pretty good today. There's plenty of news, tons of stuff out there. We're gonna talk about one of the most pertinent and interesting topics today.

George B. Thomas:

Oh, I'm excited about that. I'm doing good because I actually got the opportunity to wake up. So that's a good thing. Any day that I, like, wake and can do something on the planet, I'm happy about that.

Chris Carolan:

Good to take stock of what needs to happen on the daily, which is why we're here. Because, thankfully, it makes sense to do the show, because we're always going to have something super insightful and relevant to talk about, thanks to the blinding speed of AI development. So let's not hold back, Niko. What have we got today?

Niko Lafakis:

Okay. So today is actually, yeah, hate to say it, not so much about development. It kind of is, you know, in a sense.

Niko Lafakis:

But today, the topic is implementation. Implementation in your org.

George B. Thomas:

Okay. This is interesting because once again, just wait till I get to my skill that pays the bills section. Go ahead. Go ahead.

Niko Lafakis:

You guys gotta understand, like, this show is baller to me because we do not, like, converse beforehand, like, hey, I'm gonna talk about this tomorrow, you talk about this. No. So, like, if I'm bringing it and George is matching it, you know that we're on the same wavelength.

Niko Lafakis:

So, yeah, what I am looking at, I'm looking at a lot of studies. I'm looking at LinkedIn studies. I'm looking at Yale studies, Stanford studies, Harvard studies on adoption rates by corporations. And what was very, very interesting was that earlier this year, we got a lot of guff. We got a lot of BS, you know, from me in particular, if you know and you follow what I talk about.

Niko Lafakis:

I've been railing against Wall Street for the last 8 months, because even still to this day, hearing the word hype connected to artificial intelligence throws me for a loop. So here's how much the adoption rate is, you know, happening, though people might be saying it isn't. I posted about this, but earlier, maybe 6 months ago, Goldman Sachs released probably one of the most negatively detrimental articles about artificial intelligence. Everybody ate it up. Stocks started dropping in the AI sector.

Niko Lafakis:

On the same day, which nobody happened to notice, the Wall Street Journal released an article about Goldman Sachs rolling out its generative AI product across the org. Okay. 2 months later, JPMorgan rolling out a generative AI product across the board. 1 month later, the Wall Street Journal rolling out a generative AI product across the board. I started talking about this a year and a half ago.

Niko Lafakis:

I said, look, if you ain't first, you're last. I've been saying it ever since GPT launched. I said, if you ain't first, you're last. If you have a quintessential product, service, or data set and you're not wrapping that up with a bow and putting it underneath a large language model, you're gonna end up last, because the first person to do it is gonna run out the gate. And the first person that does it gets the largest piece of the pie.

Niko Lafakis:

Let's put it that way. So at this point, with these studies, we're seeing what the percentage rates are. So after everybody did the adoption, all of a sudden Wall Street was like, yeah, but we're not seeing any money out of it. And that was because there was no commoditization.

Niko Lafakis:

There was no product. There was nothing to actually put money into. You couldn't short anything. Right? So the Wall Street boys got all upset because they couldn't make their normal monies.

Niko Lafakis:

Then they released Q3 reports. Productivity is up all across the board. Meta, 22% up. On what? Advertising, driven by what? The AI algorithm.

Niko Lafakis:

Right? Google up 10%. Everybody up. Like, just about everybody I follow that's worth following that's in the sector. Right?

Niko Lafakis:

NVIDIA up. Not only was NVIDIA up, it was well beyond expectations. Okay?

George B. Thomas:

Baller status on that one. Like, come on.

Niko Lafakis:

I don't care how much you guys think that these stocks are gonna tank. NVIDIA is a great example of a company that is getting messed with. Okay? That stock should be near 150, 160 right now, just based on faith in the technology itself. There is no one who is moving away from this.

Niko Lafakis:

Eric Schmidt has done 3 talks in the last month that everybody should go watch, because they are all speaking to what is coming, what is already here, how business is reacting to it, and what you should be doing in order to make your way in the world. Right?

George B. Thomas:

Niko, if you have those links, send them to me. I'll make sure they're in the show notes of the podcast for the people.

Niko Lafakis:

Okay. Will do. So that all brings it back and boils it down to: productivity's up, adoption rates are up. And the latest study, I think it was from MAICON, so I wanna say this is a study that was done or brought in by Paul Roetzer and Mike Kaput, found that the adoption rate was, like, 400-plus percent over last year. So earlier this year, leading up to now, pretty much all I heard in media was: enterprise slow to adopt.

Niko Lafakis:

Enterprise not really adopting. Big companies not really taking to it. And now the studies have come out, and it's like, no, they've been doing it this whole time, to the point where productivity numbers are up and no one can really figure out why. Okay.

Niko Lafakis:

So this is what comes back to this subject: the people in your org are using the tool. They're doing cool things. They're probably increasing productivity. Are you talking about it? Are the teams openly talking about it?

Niko Lafakis:

Because let me tell you, if I'm in my silo and I'm doing some seriously cool stuff, but it's not able to get shared with the rest of my team, that's crippling my team. That's not helping anybody. Right? If my team is working on stuff, and we have some really cool stuff going that helps the org within the team, like helps us, you know, keep the busy work easy, and that's not getting shared, well, then there are other teams that could benefit from that that are not able to leverage it.

Niko Lafakis:

Right? For the org to be able to speak to the clients, if you're not using it, if you don't have faith in your team using it, if you don't understand and haven't seen the visual outputs of what your team is doing, how do you best explain that to your clients? And then the biggest question of them all, that nobody wants to answer because it is what it is. I have answers, and the one I have is very good, but I just wanna see what you guys come back with.

George B. Thomas:

Oh, it's a test today? Oh, man.

Niko Lafakis:

A little bit. A little bit.

George B. Thomas:

I did not prepare. I did not prepare.

Niko Lafakis:

What do you think should happen to the pay structure? Meaning, are our clients still billed at the same rate when AI is applied to their account? Because they will know that, underneath, the same time is not equivalent now. It doesn't take as much time to generate content or whatever it might be. Even the portal now offers AI buildout.

Niko Lafakis:

I can build pretty decently complex workflows with Copilot, saving me, I'm not joking, a minimum of 30 minutes of setting up a workflow. Right?

Chris Carolan:

So, need to get out of that billable time business and into the billable value business.

Niko Lafakis:

So, a few questions. How are you handling it in your org? How does your org talk about it? How does your org spread it out? How do you enable your org to be able to adopt this?

Niko Lafakis:

At least the methodology? The tools are one thing. We all know what the tool sets are. We all know what the major ones are, and we're literally talking about GPT being a PC. That's where we're at.

Niko Lafakis:

This is 1980. Right? We brought PCs into the office. Now we're bringing GPT into the office. And again, we're gonna have a 100% accuracy rate if we're not there already.

Niko Lafakis:

I haven't heard anything about hallucinations in a minute.

George B. Thomas:

It's getting real good. It's getting real good.

George B. Thomas:

Chris, do you wanna answer that question first, or do you want me to? Because I feel like

Chris Carolan:

I told you I was bringing the heat. It's simple. Like, you gotta get out of the billable hours business and into the billable value business. If the customer is more concerned about how long it's taking you to do something instead of the value of the output, then you're in a bad place anyways. And the struggle is, because that's what we've been doing, it's so much easier for both sides of the equation to understand: I'm giving you this, I'm getting this back. It can be a lot harder to measure value versus hours in, hours out.

Chris Carolan:

So since it's easier, it's gonna be a lot harder to change away from that for most orgs. It's easier, which means likely prices go down, because they can't defend the fact that it doesn't take people as long and things like that. I mean, it just speaks to: focus on value outcomes instead of the time it takes to do stuff.

Niko Lafakis:

I got good.

Chris Carolan:

Go for it, George.

George B. Thomas:

So here's the deal. Order takers, dead. You're dead. Order takers, you're dead. Let me just give you a little light into what's happening.

Niko Lafakis:

Define it for everybody so that they're not lost. Order takers means what?

George B. Thomas:

If you're just brain dead and you have a we-do-this inbound strategy, that is, 5 blog articles, 2 landing pages, and 1 campaign in the next 3 months for you, yes, we can repeat and rinse that for the fourth

Chris Carolan:

That's what I was gonna say. You're sounding like a robot there. Damn.

George B. Thomas:

Purposely. Purposely. Now I'll tell you, like, the clients that I'm serving, they see that I can get more done, and I spend less time doing because I have an assistant doing. Here's the thing though.

George B. Thomas:

That enables me to do more thinking, which, by the way, thinking is actually doing. It gives me more time to think, AKA strategize, and therefore provide more direction in where they're going. Here's the other thing that less doing and more thinking and strategizing enables me to do as somebody who provides a service: I can do more enabling. Meaning, we're giving our clients their GPT.

George B. Thomas:

We're giving our clients their strategy. And here's the craziest thing. You know a conversation that I it's funny how this all works. I had a conversation last night with a client. He goes, listen.

George B. Thomas:

If you need to charge us more, we understand. Because I'm enabling, and I'm thinking, and I'm strategizing, and I'm not taking orders. I'm not on a conveyor belt. I'm creating an experience that happens to leverage AI technology as part of it. And I think this is what Brian Halligan meant when he said on LinkedIn that anybody that's a HubSpot partner needs to think about becoming an AI-powered agency. Because the agencies that are order takers are done, but the agencies that transform themselves hear, hey, can we pay you more?

George B. Thomas:

Yes. You can. I'll send you a new invoice.

Niko Lafakis:

Yep. Yep. So now, adoption. So now we understand. And I like those answers, because the way I see the future is that we do just that. We move away from the idea of production being the sole value in content, and we move to contribution.

Niko Lafakis:

I think we talked about this before. And that being the case, yeah, the cost of production comes down, because as time goes on, now we have GPT doing it. But we only have GPT doing the thinking and planning for us, and us doing, you know, the checking, right, and the review. But what happens when we have the agents, and the agents are also doing the doing and the planning? Right? And if they have self-taught reasoning, also the reviewing.

Niko Lafakis:

So let's back up from that. We'll cover that another day. We'll still have episodes forever. Forever. Episodes forever.

Niko Lafakis:

Org adoption. How are you guys looking at it?

Chris Carolan:

Do you have a policy? Has it made it into the handbook yet? For all the organizations that live and die by their handbook: if it's not in there yet, AI is making its way very unintentionally into your organization right now, which carries a lot of opportunity for the people who choose to, like, find those tools to help them do their jobs. But like you say, it creates risk in a couple of different ways. They're doing what they need to do for their job, not necessarily considering the health of the organization from a customer experience standpoint or, like, a competitive information standpoint, depending on how they're using AI. But then it also creates, you know, shifts in the perceived value of a person whose productivity has gone through the roof now.

Chris Carolan:

What are you doing, you know, Nico? George is pushing out, like, 15 podcasts a week now. You barely get one out the door. Like, what's going on here?

Chris Carolan:

When in reality, like

George B. Thomas:

Magician.

Chris Carolan:

The organization is not creating that even playing field of: here are the tools that you need to be successful for the expectations that we have, and approaching it from that perspective. You're just gonna create all of these unintentional consequences in the face of probably some really awesome things happening too. And those are the things where it's like, okay, we don't want to try and figure this out, so now everybody just can't use AI.

Chris Carolan:

That's where policies go when you don't wake up with AI, like, honestly.

George B. Thomas:

Yeah. I wanna throw something out here. First of all, you're like, George is over there doing 15 podcast episodes. Like, first of all, it's adopt, adapt, accelerate. Alright?

George B. Thomas:

I just want everybody to write that down. It's adopt, adapt, accelerate. And to answer your question, Niko, I'm gonna say this: soaking it up like a sponge. But the word that I'm gonna use that kind of mirrors or attaches to Chris's, I'm just gonna say the word because, by the way, I haven't even been able to do my skill that pays the bills yet. Ethically. That's the word.

George B. Thomas:

I'll use that word for right now. Yeah.

Niko Lafakis:

I like that. Okay. I like that.

Chris Carolan:

This is where you segue into the

George B. Thomas:

No. You can. You can segue. No, I can. I can go into it.

George B. Thomas:

So listen. It's funny because, Nico, you

Niko Lafakis:

said across the board. Across the board.

George B. Thomas:

Across the board. I also sent a video in our Slack channel that I'll make sure we put in the chat pane. And this phrase hit me like a ton of bricks: a hundred years of change in the next 4 years. And I was like, oh god. Oh god.

George B. Thomas:

What does that mean? Well, that means that it becomes the wild, wild west real quick for a lot of people. Well, what does that mean? That means we need the law, ladies and gentlemen. We need some people that are out there being, like, you know, the Wyatt Earps of the AI landscape, if you will.

George B. Thomas:

And so that got my brain going. Like, what's a skill that equates to that? And so today, ladies and gentlemen, we're diving into something that's not just important. Good golly. It's essential.

George B. Thomas:

Like, we live in a world where AI can make decisions that shape industries (I mean, you just heard Niko talk about Wall Street), influence behavior, and directly impact real people, real humans. With that power comes serious responsibility. Yes.

George B. Thomas:

I could have done the whole Spider-Man Marvel thing, like, with great power. But today's AI skill that pays the bills is ethical AI use. That's what I want people thinking about. Leveraging AI to boost business or smooth processes is not enough. We need to make sure that every AI system that we use is rooted in fairness, rooted in transparency, rooted in accountability.

George B. Thomas:

When we incorporate, or when you, the listener, the viewer, incorporate AI into your marketing, your operations, or any other area of your business, you're choosing how that AI will influence the lives of your customers. Do you understand the power, the weight that that actually breeds? Also, it will influence the lives of your employees. It will influence the lives of the community that you have around your brand. So today, we're focusing on how to use AI responsibly, because if we don't, the ripple effect could be far more damaging than you can actually sit here and realize.

George B. Thomas:

Now I want you to picture this. You're working on a big project using AI to analyze, let's say, some customer data. Things are going great. Your AI predicts behaviors, offers insights, and it feels like you're really onto something. This is awesome.

George B. Thomas:

Let's giddy up. But then you start to notice a pattern. Certain customer groups aren't being represented in the data. It's like they're being left out of the recommendations. Why is that?

George B. Thomas:

So you dig deeper, and what do you find? The AI bases its decisions on historical data that carry some unintentional bias. That bias is now showing up in your campaign, affecting the outputs, accuracy, and fairness of the things that you're actually creating and doing. In that moment, you realize that just because an AI can do something doesn't mean it's doing it fairly or responsibly. And here's the kicker: it's not the AI's fault.

George B. Thomas:

It's about the data fed into it and how we as humans actually use the dang tool. We as the humans need to be prepared for that scenario. AI can only be as ethical and unbiased as the data it's trained on. While AI may be running the show on a technical level, we are still in control. We're the ones that need to be steering the ship.
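
George's bias scenario can be sketched as a quick sanity check on the data itself. Everything here is hypothetical (the `segment` field, the population shares, the 5-point tolerance are all invented for illustration); it's the "dig deeper" step, not a full fairness audit:

```python
from collections import Counter

def representation_gaps(records, group_key, population_shares, tolerance=0.05):
    """Flag groups whose share of `records` trails their expected
    population share by more than `tolerance`. An imbalance like this
    is one cheap early-warning sign that historical data may carry
    unintentional bias."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total if total else 0.0
        if expected - observed > tolerance:
            gaps[group] = round(expected - observed, 3)
    return gaps

# Hypothetical customer records: group "B" is underrepresented
# relative to an assumed 60/40 population split.
records = [{"segment": "A"}] * 80 + [{"segment": "B"}] * 20
print(representation_gaps(records, "segment", {"A": 0.6, "B": 0.4}))  # {'B': 0.2}
```

A check like this only catches missing representation, not skewed outcomes; it's a prompt to ask why a group is underrepresented before you trust the model's recommendations.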

George B. Thomas:

So use it in an ethical way. Make it a nonnegotiable when it comes to your organization and your adoption that it's ethical AI, meaning transparency and accountability, and, like Chris said, documentation and maybe a little governance around the, whoo, wild, wild west. Actually, listen. You gotta stop and think right here before you just go, hey, these guys are cool, let's go use all the tools.

George B. Thomas:

Your people have the right to understand how decisions are being made, especially when those decisions directly impact them. If a customer feels they're being mistreated through price recommendations or how they interact with your product, it's your job to explain the why behind those decisions. And if something goes wrong, you have to own it. You have to be at the helm. Accountability means stepping up, being transparent, and taking responsibility for the outcomes of your AI systems.

George B. Thomas:

Some of us can't even be responsible for ourselves.

Niko Lafakis:

That's what I was thinking about.

George B. Thomas:

Go ahead. Go ahead.

Niko Lafakis:

Yeah. Just something you said there, and along the lines of ethical thinking. Do you give the prompt to the client that you used to produce the content? Do they own that because it maybe used their name? Is it proprietary to the org?

Niko Lafakis:

How does that work?

George B. Thomas:

I don't think people are thinking at that level. And here's the thing, Nico and Chris. Before I wrap this skills-that-pay-the-bills section up today, I wanna leave people with 3 main things to think about and take away today. Ladies and gentlemen, data quality matters more than it ever has in your life. The AI you use is only as good as the data it's built on.

George B. Thomas:

With the prompts that you're trying to engineer, pay attention to the context that you're giving it. So always make sure your data is clean, fair, and free of any bias. If you start with flawed data, you get flawed outcomes. Garbage in, garbage out. The second thing: transparency is key.

George B. Thomas:

Be upfront with your customers, employees, or stakeholders about how AI works. Have an AI policy. You have a privacy policy on your website; does it talk about how you use AI inside of your organization? Whether it's a chatbot, a recommendation engine, or a customer segmentation tool, transparency builds trust.

George B. Thomas:

And if they understand how it works, they're more likely to engage with it positively. Like, this is why I'm handing them the keys to their AI kingdom as the person who's providing the services, so they understand it and engage with it in a positive way. The last thing I'll say is accountability is a nonnegotiable. When things go wrong, and they will, it's your job to take responsibility. Own the mistakes, fix them, and make sure your AI is aligned with your brand's ethical standards, your core values, who you are as a human, as teams, and as a company culture.

George B. Thomas:

Chris, we're gonna talk more about that a little bit later today in our next session. Accountability doesn't just protect your reputation. Ladies and gentlemen, I'm here to tell you that it strengthens it. And that is your skill that pays the bills for today's session.

Niko Lafakis:

Oh, man. I'm telling you, man, like, the day of us doing it

Chris Carolan:

Reminds me of our talk last week about honesty and transparency. Yep. Yep. The reason I get so concerned about Nico's earlier question is that when productivity goes up, cost goes down, and resource need goes down, and that impacts 3 primary things, in my opinion: profitability, margins, and compensation. I've seen so often, more times than I've seen not, that the most unethical decisions are made because of those three things.

Chris Carolan:

And it's gonna be so easy, and either it's the leadership not being honest or it's the individual contributor not being honest with the customer, saying, oh yeah, that took the normal amount of time, but I actually used AI, and I'm not telling either my boss or the customer. That's not gonna last very long. You're not gonna be able to get away with that very long, if at all. And that's where just being upfront as quickly as possible comes in.

Chris Carolan:

One, it just creates this other level of trust that you're just being open, and it gives you the path. Like, I'm being open with you because I'm not changing the price. You're still getting the same value. Like, we're doing this. Now what else could we do?

Chris Carolan:

Because you know we're doing AI. Like, we're happy to, like, get you involved in the prompting. Like, how can we all make this much better to get to the value faster? That doesn't change the value, you know, that we're bringing. Like, if anything, it should increase the value.

Chris Carolan:

And like George says, like, man, you guys are delivering more value of higher quality faster. Can I pay you more? Like, please let me pay you more money.

George B. Thomas:

Yeah, Chris. It almost sounds like you're doing a small call to action for people out there, that you're saying, please be a helpful, honest human. That's what I hear, brother. That's what I hear.

Chris Carolan:

That's what we need. You know, it's hard not to go down that road when you think about Wall Street being involved in Niko's whole news story. Like, it's very easy to control the narrative in today's, you know, media landscape. And if you can do that and be out front of stuff while kind of, like, saying, don't look over here, you know, look over here instead.

Chris Carolan:

Okay. Now we got a 6 month head start. Oh, this is what's going on, guys. Like, that's why I mean, that's why we're here. That's why we're on the show.

Chris Carolan:

All we can do is share knowledge and hope we can help make good decisions.

George B. Thomas:

Yeah. Being the great and powerful Oz is not a viable AI business strategy.

Niko Lafakis:

Yeah. The day for debate is coming. It's so close. I think we touched on it a little bit because I was teasing it, and now we only have a couple of minutes. So it seems like I'm teasing it again.

Niko Lafakis:

I'm not doing this on purpose. At one point during INBOUND, George had a very interesting debate with, I can't remember his name. What was his name?

Chris Carolan:

Dale Bertrand?

Niko Lafakis:

Yes. Dale Bertrand.

Chris Carolan:

Great debate. Fire and spark.

Niko Lafakis:

Yep. I need to do it.

George B. Thomas:

Tip of your tongue, bro. Very well done.

Niko Lafakis:

I still need to post the video I have of that. And during the debate, there was a section mentioned towards the end that I found so interesting and forgot to stick around for. Empathy. Yeah. Empathy in the machines.

Niko Lafakis:

If you follow me on LinkedIn, there's a really interesting post that I'm going back and forth with Clement Horvath on, and on that one, we are discussing intelligence and the definition of intelligence. So what I wanna leave with is this: this is the time in which we have to think about everything differently. This is the greatest disruptor to technology and to human civilization that we have ever experienced. Even greater than the atomic bomb, greater than computers, greater than the Internet, greater than cell phones. It's the largest disruptor.

Niko Lafakis:

Every single industry will be touched by this technology. There isn't a single aspect of human life, or of the work and production we do, that won't incorporate this technology at some point. That's why it is ever so important to wake up to AI. Wake up with it. Come on, guys.

Chris Carolan:

Have a great day, everybody.

Intro:

That's a wrap for this episode of Wake Up with AI. We hope that you feel a little more inspired, a little more informed, and a whole lot more excited about how AI can augment your life and business. Always remember that this journey is just the beginning and that we are right here with you every step of the way. If you love today's episode, don't forget to subscribe, share, and leave a review. You can also connect with us on social media to stay updated with all things AI.

Intro:

Until next time. Stay curious, stay empowered, and wake up with AI.

Creators and Guests

Chris Carolan
Host
Chris Carolan
Chris Carolan is a seasoned expert in digital transformation and emerging technologies, with a passion for AI and its role in reshaping the future of business. His deep knowledge of AI tools and strategies helps businesses optimize their operations and embrace cutting-edge innovations. As a host of Wake Up With AI, Chris brings a practical, no-nonsense approach to understanding how AI can drive success in sales, marketing, and beyond, helping listeners navigate the AI revolution with confidence.
Niko Lafakis
Host
Niko Lafakis
Niko Lafakis is a forward-thinking AI enthusiast with a strong foundation in business transformation and strategy. With experience driving innovation at the intersection of technology and business, Niko brings a wealth of knowledge about leveraging AI to enhance decision-making and operational efficiency. His passion for AI as a force multiplier makes him an essential voice on Wake Up With AI, where he shares insights on how AI is reshaping industries and empowering individuals to work smarter, not harder.