Debug Information
Processing Details
- VTT File: 12652426-351-ai-for-creatives-navigating-the-ethics-of-technology-with-tasha-l-harrison.vtt
- Processing Time: September 11, 2025 at 03:36 PM
- Total Chunks: 2
- Transcript Length: 82,955 characters
- Caption Count: 877 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.480 --> 00:00:07.360] Look, payday is awesome, but running payroll, calculating taxes and deductions, staying compliant, that's not easy.
[00:00:07.360 --> 00:00:09.360] Unless, of course, you have Gusto.
[00:00:09.360 --> 00:00:14.400] Gusto is a simple online payroll and benefits tool built for small businesses like yours.
[00:00:14.400 --> 00:00:18.400] Gusto gets your team paid while automatically filing your payroll taxes.
[00:00:18.400 --> 00:00:25.600] Plus, you can offer benefits like 401k, health insurance, and workers' comp, and it makes onboarding new employees a breeze.
[00:00:25.600 --> 00:00:28.320] We love it so much, we really do use it ourselves.
[00:00:28.320 --> 00:00:34.720] And we have for years, and I personally recommend you give it a try, no matter how small your business is.
[00:00:34.720 --> 00:00:38.880] And to sweeten the deal, just for listening today, you also get three months free.
[00:00:38.880 --> 00:00:41.360] Go to gusto.com slash beingboss.
[00:00:41.360 --> 00:00:45.040] That's gusto.com/beingboss.
[00:00:46.640 --> 00:00:53.680] Welcome to Being Boss, a podcast for creatives, business owners, and entrepreneurs who want to take control of their work and live life on their own terms.
[00:00:53.680 --> 00:00:59.440] I'm your host, Emily Thompson, and in this episode, I'm joined again by my friend and author, Tasha L.
[00:00:59.440 --> 00:01:02.160] Harrison, to talk about AI.
[00:01:02.160 --> 00:01:09.680] We're talking about the ethics of using it as a writer and or a business owner, the pros and cons, and the creepy things too.
[00:01:09.680 --> 00:01:15.360] You can find all the tools, books, and links we reference on the show notes at www.beingboss.club.
[00:01:15.360 --> 00:01:20.240] And if you like this episode, be sure to subscribe to this show and share us with a friend.
[00:01:22.480 --> 00:01:31.360] It's no secret that I have a soft spot for product bosses, those of you who embark on a business journey that includes making or curating physical products.
[00:01:31.360 --> 00:01:47.440] And even if that's not the journey you've chosen for yourself, there's amazing lessons to be learned for all kinds of businesses from the world of product business, which is why you need to check out the Product Boss, a podcast hosted by Jacqueline Snyder and Minna Khounlo-Sitthideth.
[00:01:47.440 --> 00:01:52.800] Brought to you by the HubSpot Podcast Network, the audio destination for business professionals.
[00:01:52.800 --> 00:02:06.760] Take your physical product sales and strategy to the next level to create your dream life with hosts Jacqueline and Minna as they deliver a workshop-style strategy hour of social media and marketing strategies so you can up-level your business.
[00:02:07.080 --> 00:02:11.000] Listen to The Product Boss wherever you get your podcasts.
[00:02:14.200 --> 00:02:14.920] Tasha L.
[00:02:15.000 --> 00:02:22.360] Harrison is a romance author and community host of WordMaker's Writing Community, where writers come to do the writing work.
[00:02:22.360 --> 00:02:32.760] She also hosts the quarterly hashtag 20K in 5 Days Writing Challenge, hoards a massive amount of crystals, and believes that plants are the new pets.
[00:02:32.760 --> 00:02:50.360] All info about Tasha can be found on TashaLHarrisonBooks.com, and you can find Tasha here on the Being Boss podcast in the following previous episodes: episode number 241, 296, 310, and 331.
[00:02:51.320 --> 00:02:52.920] Hi, Tasha.
[00:02:52.920 --> 00:02:54.280] Hi, Emily.
[00:02:54.280 --> 00:02:55.720] Welcome back.
[00:02:55.720 --> 00:02:57.240] Thanks for having me.
[00:02:57.240 --> 00:02:57.800] Of course.
[00:02:57.800 --> 00:02:59.160] I'm excited about this chat.
[00:02:59.160 --> 00:03:07.480] You and I have been talking about talking about this for a couple of weeks, slash maybe a couple of months now, because what is time?
[00:03:07.480 --> 00:03:08.600] What is time?
[00:03:08.600 --> 00:03:11.080] It may have been a couple months, a whole quarter, maybe.
[00:03:11.080 --> 00:03:11.960] Who knows?
[00:03:11.960 --> 00:03:13.160] Yeah, who knows?
[00:03:13.800 --> 00:03:18.680] You were on a couple of weeks ago, months ago.
[00:03:18.680 --> 00:03:19.240] I don't know.
[00:03:19.240 --> 00:03:19.800] What is time?
[00:03:20.920 --> 00:03:28.280] The last solo episode you were on was actually, I feel like all the episodes you've been on recently are really great for this.
[00:03:28.280 --> 00:03:40.360] So you were on episode 310 talking about the business of being an author, which, like, a new all-powerful author has entered the stage.
[00:03:40.360 --> 00:03:43.560] So, excited to talk about that.
[00:03:43.560 --> 00:03:43.960] Right.
[00:03:43.960 --> 00:03:50.080] And then, uh, most recently, you were on with Erica talking about making imperfect decisions.
[00:03:44.680 --> 00:03:53.280] Yes, yes, you know what?
[00:03:53.920 --> 00:03:55.760] That's an excellent combination.
[00:03:55.760 --> 00:03:59.440] I hope that this, I mean, this is definitely an imperfect decision.
[00:03:59.440 --> 00:04:04.080] I know that there are a lot of people out here that are going to be like, Is she really championing for this?
[00:04:04.080 --> 00:04:08.800] And I'm going to be like, Yes, yeah, yeah, yeah, yes and no, yes, and no.
[00:04:08.800 --> 00:04:22.880] So, the conversation we're having today, if you've missed all the things along the way, is we're talking about AI and we're talking about AI as two people who don't really know shit about it, except generally how to use it.
[00:04:22.880 --> 00:04:34.400] And because we've been talking about talking about this, like I've been looking at some news and watching some things come through, and it's just just to like, you know, gain a bit more perspective for this conversation.
[00:04:34.400 --> 00:04:46.320] Um, and I think it's really fun having this conversation with you because you are a writer, and so in particular, we're going to be talking about AI for writing because I think that's how most people are using it.
[00:04:46.320 --> 00:05:11.200] I was in New Orleans a couple of weeks ago with the C-suite, um, and I couldn't tell you how many conversations ended up going to like, just get Jasper to write it, of course, like it is becoming an ongoing thing, an ongoing part of the conversation is how to make this one piece, this writing piece easier for everybody and really the pros, cons, and ethics of it.
[00:05:11.200 --> 00:05:13.520] So, that's what we're going to be diving into.
[00:05:13.840 --> 00:05:15.040] Let's do it.
[00:05:15.040 --> 00:05:17.440] How are you feeling about it?
[00:05:17.760 --> 00:05:18.120] Maybe not.
[00:05:18.120 --> 00:05:18.600] How are you?
[00:05:18.600 --> 00:05:20.400] Let's don't get into feelings yet.
[00:05:20.400 --> 00:05:30.000] Let's start with what's happening in your sphere as an author and community leader when it comes to AI these days.
[00:05:31.400 --> 00:05:47.880] Okay, so at the beginning of the year, this is when I first realized that it was really popping off because I think both you and I are using it for business related things, but this was the first time that it popped off like in the author circle.
[00:05:47.880 --> 00:05:57.080] And it was about this guy who used three forms of AI to write a children's book and published it to Amazon.
[00:05:57.080 --> 00:06:02.200] And like at first, I was like, ew.
[00:06:04.360 --> 00:06:07.960] Like at first instead of like, ew, ew, ew, really?
[00:06:07.960 --> 00:06:08.520] Ew.
[00:06:08.760 --> 00:06:11.080] And mostly because of the art aspect.
[00:06:11.080 --> 00:06:13.400] Like the art aspect of AI is totally different.
[00:06:13.400 --> 00:06:24.680] It has like some different ramifications than, say, like the writing aspect because, I mean, it boils down to ethics for writing, basically.
[00:06:24.680 --> 00:06:29.160] But the fact that he had illustrated a whole children's book and published it.
[00:06:29.160 --> 00:06:31.400] And then other people were like, yeah, yeah, yeah, we can start doing this.
[00:06:31.480 --> 00:06:34.280] I was like, ew, oh, no, oh, no, this sounds bad.
[00:06:34.280 --> 00:06:38.760] And of course, then, you know, everybody starts freaking out.
[00:06:39.960 --> 00:06:41.560] So now everybody's talking about it.
[00:06:41.560 --> 00:06:44.840] Like, I get newsletters about it like every other week.
[00:06:45.240 --> 00:06:49.240] It's been brought up in my group, mostly about cover illustration.
[00:06:49.240 --> 00:06:59.560] There was a big to-do about a cover illustrated by AI that was published by Tor Publishing, who does like a lot of sci-fi paranormal stuff like that.
[00:06:59.560 --> 00:07:08.080] They tried to pass it off as not AI because they something happened with the original artist and they published the book with the AI cover.
[00:07:08.400 --> 00:07:13.800] People who are illustrators peeped it and they were like, this is disgusting.
[00:07:13.800 --> 00:07:15.520] We cannot believe you did this.
[00:07:15.520 --> 00:07:17.440] And then the author had to come out in support of it.
[00:07:14.840 --> 00:07:19.920] Like, yeah, they asked me if I wanted to do it, and I said, Yeah.
[00:07:20.880 --> 00:07:24.400] And, you know, like for artists, it starts a whole nother conversation.
[00:07:24.400 --> 00:07:32.720] But for me, I don't really have a problem with using AI for writing.
[00:07:33.040 --> 00:07:44.240] So let's talk about this difference a little bit and why there is such a stink about it in the artist community and why it feels a little more allowable in the writing community.
[00:07:44.240 --> 00:07:51.280] Because I do think those are the two ways in which it is most widely available to like us regular folks at the moment.
[00:07:51.280 --> 00:07:53.120] Though, what else is AI doing?
[00:07:53.120 --> 00:07:55.920] I don't, which driving cars, I suppose, right?
[00:07:55.920 --> 00:07:57.760] So there's lots of other stuff it's doing.
[00:07:57.760 --> 00:08:02.800] It's like, you know, a Terminator movie, which, you know, every time I think about it, of like, this is how it ends.
[00:08:02.960 --> 00:08:04.160] This is how it ends.
[00:08:04.160 --> 00:08:07.200] Yeah, we're giggling through the first act.
[00:08:08.160 --> 00:08:08.880] Yay!
[00:08:08.880 --> 00:08:11.760] Look, I can write 20 Twitter posts in one minute.
[00:08:11.760 --> 00:08:13.120] You know, that's interesting.
[00:08:14.880 --> 00:08:26.160] I think mostly for artists, it's because it's pulling from images online that other people have created from scratch that makes it a problem.
[00:08:26.160 --> 00:08:40.400] Like for writing, it's basically you teaching it how to write like you a lot of times, unless you're, you know, toggling that switch that says, you know, find references on Google.
[00:08:40.400 --> 00:08:45.360] It's really just stuff that you have inputted, like you give it the, what do you call it?
[00:08:45.360 --> 00:08:50.720] The formula, and then it spits out what you need in several different ways.
[00:08:51.200 --> 00:08:52.800] So, I think that's a distinct difference.
[00:08:52.800 --> 00:08:54.960] Like, it's an ethics thing, right?
[00:08:54.960 --> 00:08:59.960] I think for artists, there's really no way around to get around to ethics.
[00:08:59.960 --> 00:09:04.520] There's no ethical way to source AI art.
[00:08:59.680 --> 00:09:05.720] You just can't do it.
[00:09:06.040 --> 00:09:09.480] But for writing, it's a little bit different for me.
[00:09:09.800 --> 00:09:10.440] Yeah.
[00:09:11.080 --> 00:09:13.960] I will say I agree with this.
[00:09:13.960 --> 00:09:21.320] I think pulling design styles or illustration styles or art styles and sort of amalgamizing them.
[00:09:21.320 --> 00:09:22.760] I think I just made that word up.
[00:09:22.760 --> 00:09:23.640] I'm fine with it, though.
[00:09:24.840 --> 00:09:26.120] Amalgamizing?
[00:09:27.720 --> 00:09:28.760] I'll take it.
[00:09:29.080 --> 00:09:30.040] I'll allow it.
[00:09:30.040 --> 00:09:32.040] Yeah, you all know what it means.
[00:09:32.280 --> 00:09:38.520] Pulling it all together and making a new art style is a little problematic.
[00:09:38.520 --> 00:09:40.200] It's a little problematic.
[00:09:41.160 --> 00:09:45.560] When it comes to writing, I think you're right-ish.
[00:09:45.880 --> 00:09:49.320] I think you can teach it, you can pull from yourself.
[00:09:49.320 --> 00:10:07.240] But I've also seen some things where people are trying to get AI creators to reference, to have AI reference itself or reference its references when writing as well.
[00:10:07.240 --> 00:10:09.880] Because those styles are pulled from something.
[00:10:09.880 --> 00:10:13.960] You can even, somebody was at the C-suite recently.
[00:10:13.960 --> 00:10:17.880] We were talking about writing and how someone had used the formula.
[00:10:17.880 --> 00:10:23.160] And because she had been blogging online for so long, she was able to say, to do it in the voice of me.
[00:10:23.160 --> 00:10:28.520] And it did it in her voice, which she was so excited about, right?
[00:10:28.520 --> 00:10:29.000] I know.
[00:10:29.480 --> 00:10:33.480] But also, like, like, ew, but also, what?
[00:10:33.800 --> 00:10:34.600] Yeah.
[00:10:34.920 --> 00:10:36.440] Ooh, but cool, ew.
[00:10:37.880 --> 00:10:46.640] Whereas you could also do it and do it in the voice of, say, the joke that was happening in the room was Kristen Bell, right?
[00:10:44.760 --> 00:10:47.520] Or someone.
[00:10:47.840 --> 00:10:50.320] And at what point does that become problematic?
[00:10:50.320 --> 00:10:57.040] And I don't think that really does, at least in this context, because that's just like pulling inspiration from.
[00:10:57.040 --> 00:11:06.000] Where I see this getting even weirder is when you're pairing sort of art, slash really image, with words.
[00:11:06.000 --> 00:11:17.120] And so, one of the things that I can do, and one of the things that I also know just anybody can do with the amount of audio content that is in the world is create an AI Emily.
[00:11:17.120 --> 00:11:18.000] Yes.
[00:11:18.000 --> 00:11:21.920] You can just make me say all kinds of bullshit, which, like, I already do say plenty.
[00:11:21.920 --> 00:11:23.040] So, try me.
[00:11:24.320 --> 00:11:27.280] What could you possibly say that would be too outrageous to believe?
[00:11:27.280 --> 00:11:28.960] Is that Emily Thompson?
[00:11:29.920 --> 00:11:32.560] I think you'd probably easily know yes or no.
[00:11:33.360 --> 00:11:45.440] Or we even have someone in the C-suite who's been approached, she's a YouTuber, to create a video AI version of herself, and she's stoked about it.
[00:11:45.440 --> 00:11:47.280] Okay, okay, okay.
[00:11:47.680 --> 00:11:51.200] So this is where it starts getting a little creepy for me.
[00:11:52.400 --> 00:11:54.000] Mostly because of misinformation.
[00:11:54.000 --> 00:12:00.160] Like, you can never tell, like, most people can't tell what's news and what's real news, what's real news and what's fake news.
[00:12:00.160 --> 00:12:00.640] Yeah.
[00:12:00.640 --> 00:12:02.080] So that's a problem.
[00:12:02.720 --> 00:12:06.320] And look at the state of our country.
[00:12:06.320 --> 00:12:07.680] This is how we ended up here.
[00:12:08.960 --> 00:12:10.000] News is no longer news.
[00:12:10.000 --> 00:12:10.880] It's entertainment.
[00:12:10.880 --> 00:12:19.120] You have, and if you don't have like the reading comprehension to be able to break it down and be like, okay, I'm going to go to all of these sources and see if this is actually true.
[00:12:19.120 --> 00:12:21.280] Then you're just going to believe anything.
[00:12:21.280 --> 00:12:26.160] This is where it starts getting ethically gray and then really stinky.
[00:12:26.160 --> 00:12:27.200] You know what I mean?
[00:12:27.920 --> 00:12:34.280] If you want to do it for, I feel like if you want to do it for yourself, if you're doing it for yourself, that's something different.
[00:12:34.280 --> 00:12:51.480] Because one of the things that, especially for authors these days, like they want you to have like millions of TikTok followers, a whole social media presence, a blog, you know, creating all this content and marketing all this stuff on your own, which is like a whole job for someone else.
[00:12:51.480 --> 00:13:03.400] But when it comes to creating content, if you can do a shortcut that creates content around all the stuff that you already write and then just flush it out and make it into posts, that feels ethical to me.
[00:13:03.400 --> 00:13:03.880] Yeah.
[00:13:04.680 --> 00:13:11.080] I think it like making an Emily podcast, I mean, like, that would be weird.
[00:13:11.080 --> 00:13:14.280] Like, just splicing someone else's voice, that would be weird.
[00:13:14.520 --> 00:13:19.880] I probably would feel weird about like, oh, let me do an AI YouTube show.
[00:13:20.520 --> 00:13:24.920] I mean, but if they're still using her content, like, what difference does it make?
[00:13:24.920 --> 00:13:25.560] You know?
[00:13:25.560 --> 00:13:25.960] Yeah.
[00:13:25.960 --> 00:13:29.960] Well, they would be producing it for her to use herself, I will say.
[00:13:29.960 --> 00:13:33.240] So it's not like we're going to AI you and then do whatever we want with you.
[00:13:33.240 --> 00:13:37.560] That gets into other weird sci-fi movies that are no longer really sci-fi.
[00:13:38.200 --> 00:13:43.320] But I do agree with this if you're going to be using yourself yourself, a little more ethical.
[00:13:43.320 --> 00:13:46.040] But I also see this happening in some really weird ways.
[00:13:46.040 --> 00:13:51.880] So I was watching a video recently on maybe TikTok.
[00:13:53.160 --> 00:13:53.720] Maybe.
[00:13:53.720 --> 00:13:55.000] I can't really remember.
[00:13:55.000 --> 00:13:58.040] It might have been on YouTube, but I think it was TikTok.
[00:13:58.600 --> 00:14:08.040] Where this guy, this like dude bro marketer, like, you know, see, this is where this is where it starts to get gross.
[00:14:08.040 --> 00:14:08.680] Right.
[00:14:08.680 --> 00:14:10.600] And this is where we start going downhill.
[00:14:10.600 --> 00:14:16.480] But it's this dude bro marketer up on a stage at this event, so a lot of people are listening to him.
[00:14:14.840 --> 00:14:21.920] And he's like, here is the process that we are using for creating content these days.
[00:14:22.560 --> 00:14:29.360] We are going to YouTube, and he's like, everyone should be using this, you're dumb if you're not doing this, basically.
[00:14:29.680 --> 00:14:31.200] They're going to YouTube.
[00:14:31.200 --> 00:14:35.280] They're finding the most popular pieces of content on a topic.
[00:14:35.280 --> 00:14:38.640] They are running it through a transcriber.
[00:14:38.640 --> 00:14:40.400] They're taking the transcript.
[00:14:40.400 --> 00:14:47.680] They're using AI to redo the exact same content for themselves.
[00:14:47.680 --> 00:15:01.440] And then like using AI production tools to basically just reformat and reproduce someone else's content for consumption for other people via their brand.
[00:15:02.080 --> 00:15:06.400] And I was like, excuse me, fucking excuse me.
[00:15:06.400 --> 00:15:06.880] Yeah.
[00:15:07.360 --> 00:15:24.640] So like, this is what's mind-blowing to me because it feels like AI has been a low-grade conversation probably for the last two years in the small business sector, you know, like in the laptop entrepreneur online business circle.
[00:15:24.640 --> 00:15:32.480] We've been kind of murmuring about AI for a while now, but now it's like, you know, public and easily accessible.
[00:15:32.480 --> 00:15:37.760] And the first thing they do, what is wrong with you?
[00:15:42.080 --> 00:15:42.480] Right.
[00:15:42.480 --> 00:15:44.320] I feel like we talk about this often too.
[00:15:44.320 --> 00:15:49.360] Like, it takes just as much effort to be an asshole as it does to be a good person.
[00:15:49.360 --> 00:15:51.280] And you're choosing the wrong path.
[00:15:51.440 --> 00:16:00.360] It actually takes more effort to be an asshole because you're like, let me search YouTube, find this content, pull it through a transcript, do like all of these steps.
[00:15:59.840 --> 00:16:03.160] Bro, you could have wrote your own fucking content by that time.
[00:16:05.080 --> 00:16:08.600] Yeah, or just had AI just write you a piece of content.
[00:16:08.840 --> 00:16:11.720] You didn't even need to steal someone else's content to do that.
[00:16:11.720 --> 00:16:12.440] You know what I'm saying?
[00:16:12.440 --> 00:16:16.920] So, like, it's the shortcuts that make me feel icky.
[00:16:16.920 --> 00:16:24.520] Like, matter of fact, there is a platform where you create courses.
[00:16:24.520 --> 00:16:31.400] I'm not going to say what platform, but there is a guy who is promoting now creating courses with AI.
[00:16:31.400 --> 00:16:38.200] Like, literally, in the last quarter, he has started promoting, we're going to create this course content using AI.
[00:16:38.200 --> 00:16:44.040] And this is how you can get your course out in less than 24 hours, one weekend, creating content for a course.
[00:16:44.040 --> 00:16:47.000] And I was just like, wow, wow.
[00:16:47.960 --> 00:16:49.640] How did we get here?
[00:16:49.960 --> 00:17:00.680] Because what always astounds me about these sort of things, it's like you take this thing that's supposed to be a tool, like a helpful tool, and then you immediately use it to scam.
[00:17:02.120 --> 00:17:02.840] Yeah.
[00:17:04.120 --> 00:17:08.440] But like, but also that line is so thin and gray and muddy.
[00:17:08.440 --> 00:17:09.480] And am I just scamming?
[00:17:09.640 --> 00:17:10.200] Is it thin?
[00:17:10.200 --> 00:17:11.720] Is it gray, anybody?
[00:17:11.720 --> 00:17:12.520] I don't know.
[00:17:12.520 --> 00:17:13.160] Maybe.
[00:17:13.480 --> 00:17:15.320] If you're a horrible person, yeah.
[00:17:15.320 --> 00:17:17.080] Because this is not the first place.
[00:17:17.080 --> 00:17:19.720] This is not the first place my brain goes, right?
[00:17:19.720 --> 00:17:25.160] Like, if you're like, like, the first thing we said, we're like, oh, well, this can help me create content for my blog.
[00:17:25.160 --> 00:17:27.000] I'm already using stuff that I already have.
[00:17:27.000 --> 00:17:31.800] This is, I can put it through this SEO, whatever, and make it SEO rich.
[00:17:31.800 --> 00:17:35.960] Like, we're using it as a tool for stuff that we're already going to create.
[00:17:35.960 --> 00:17:39.000] But this guy says, I don't have any content.
[00:17:39.000 --> 00:17:44.440] I'm going to steal someone else's content and make it mine and then sell it.
[00:17:44.440 --> 00:17:45.120] Yeah.
[00:17:44.600 --> 00:17:45.680] Right.
[00:17:45.840 --> 00:17:58.960] There is this element of like using it to do some, using it to do something from scratch, aka do it for you, versus using it to enhance the work that you're already doing.
[00:17:58.960 --> 00:17:59.920] Right.
[00:18:00.880 --> 00:18:01.440] Okay.
[00:18:01.440 --> 00:18:03.280] I think I like that line.
[00:18:03.280 --> 00:18:13.200] There was a for a moment, there was a, well, I don't know if it's still going on because I went on Facebook to see what the rabble rousing was around the guy who created the children's book.
[00:18:13.200 --> 00:18:26.640] And of course, you know, in romance, it was like, oh my God, do you guys think that now we're going to have to deal with like, in addition to a bunch of scammers, we're going to have to deal with people writing romances that are written by AI?
[00:18:26.640 --> 00:18:44.880] And then I was like, y'all are actually talking about something that's already happened because I'm almost positive that I've seen or read some romances that have not necessarily been written from scratch from AI, but have had some AI help because they don't read like a human wrote them.
[00:18:44.880 --> 00:18:46.400] And I think that's the distinction too.
[00:18:46.400 --> 00:18:50.080] Like lots of people are worried, but there's a caveat.
[00:18:50.080 --> 00:18:56.240] Lots of people are worried that it's going to take the place of human writing, which is valid.
[00:18:57.680 --> 00:18:59.840] But there's just things that a computer doesn't do well.
[00:18:59.840 --> 00:19:01.280] It doesn't do emotions well.
[00:19:01.280 --> 00:19:04.800] It doesn't, you know, like its metaphors are bad and clunky.
[00:19:04.800 --> 00:19:07.360] Like it's just, it can't do that well.
[00:19:07.680 --> 00:19:14.880] But this is the caveat: the more that we use it, the more we teach it, and then the better it will get at it.
[00:19:14.880 --> 00:19:15.440] You know?
[00:19:15.760 --> 00:19:18.480] Then one day it'll write better than we do.
[00:19:19.120 --> 00:19:21.200] It'll do everything better than we do.
[00:19:21.200 --> 00:19:23.280] It'll YouTube better than we do.
[00:19:23.280 --> 00:19:25.920] It'll podcast better than we do.
[00:19:26.240 --> 00:19:29.040] It'll make art better than we do.
[00:19:29.040 --> 00:19:29.960] Maybe.
[00:19:30.680 --> 00:19:33.640] Hopefully it will drive cars better than we do.
[00:19:33.640 --> 00:19:34.520] That would be great.
[00:19:29.440 --> 00:19:35.560] I mean, it can't do worse.
[00:19:40.120 --> 00:19:51.160] Raise your hand if you're feeling tired of wasting your precious time on tedious tasks like pulling reports, rewriting blog posts, and trying to personalize countless prospecting emails.
[00:19:51.160 --> 00:19:57.160] Well, say no more because I've got some new AI tools that are going to give you back your time.
[00:19:57.160 --> 00:20:02.520] Introducing HubSpot's newest AI tools, Content Assistant and ChatSpot.
[00:20:02.520 --> 00:20:13.960] Content Assistant uses the power of OpenAI's GPT-3 model to help you create content outlines, outreach emails, and even web page copy in just seconds.
[00:20:13.960 --> 00:20:24.280] And in case that wasn't enough, they created ChatSpot, a conversational growth assistant that connects to your HubSpot CRM for unbeatable support.
[00:20:24.280 --> 00:20:30.760] With chat-based commands, you can manage contacts, run reports, and even ask for status updates.
[00:20:30.760 --> 00:20:34.200] The easy-to-use CRM just got even easier.
[00:20:34.200 --> 00:20:40.920] Head to hubspot.com/artificial-intelligence to get early access today.
[00:20:47.240 --> 00:20:52.840] So here is a problem with AI that I see happening.
[00:20:52.840 --> 00:20:55.720] And it's actually, it's both a problem and an opportunity.
[00:20:55.720 --> 00:21:02.600] I think on one side of it, there is this idea that the human element should not be taken out right now.
[00:21:02.600 --> 00:21:09.240] So, even if you are using it to streamline your blog creation processes, a human should go edit that blog post.
[00:21:09.240 --> 00:21:09.880] Right.
[00:21:09.880 --> 00:21:10.200] Right.
[00:21:10.200 --> 00:21:11.960] Because it's going to sound whack.
[00:21:11.960 --> 00:21:14.680] Actually, a really great example of this.
[00:21:14.680 --> 00:21:19.680] Oh, I well, I'm having too many thoughts at a time.
[00:21:19.680 --> 00:21:20.880] It's happening.
[00:21:21.360 --> 00:21:25.920] I recently did a quick Google search for, what was it?
[00:21:25.920 --> 00:21:28.000] It was quick numerology.
[00:21:28.240 --> 00:21:29.200] Actually, you can look it up.
[00:21:29.200 --> 00:21:33.040] Quicknumerology.com was the first thing that showed up, and I clicked on it.
[00:21:33.040 --> 00:21:38.240] And it is a site completely created by AI for search engine optimization.
[00:21:38.240 --> 00:21:39.360] Hold on, I'm going to pull this up.
[00:21:39.360 --> 00:21:44.160] I'm going to read you guys a couple of headlines because it cracked me up.
[00:21:44.160 --> 00:21:50.800] It's problematic because it was the first page on Google to come up.
[00:21:50.800 --> 00:21:53.680] So Google hasn't quite figured it out yet.
[00:21:53.680 --> 00:21:58.880] But here, because I went straight to it, but does it, is it like in a paid?
[00:21:58.880 --> 00:21:59.760] Is it a like?
[00:21:59.760 --> 00:22:00.160] No.
[00:22:00.480 --> 00:22:00.640] No.
[00:22:00.720 --> 00:22:01.600] It's just, oh, okay.
[00:22:01.920 --> 00:22:02.400] Yeah, yeah.
[00:22:02.400 --> 00:22:07.840] It's just number, I don't think anyone's really paying for ads on quick numerology keyword.
[00:22:08.080 --> 00:22:08.560] Probably.
[00:22:09.600 --> 00:22:49.640] They should, because your number one competitor has blog posts such as "How to Get More Results Out of Your Seven Things About Angel Number Zero Your Boss Wants to Know." That is like the most stuffed title: how to get more results out of your seven things about angel number zero your boss... yeah, like every next question. Yeah. Okay, here's another one: "How to Outsmart Your Boss on 12 Do's and Don'ts for a Successful What Does Angel Number Nine Mean." So this is, but see, but see how, see how dumb it sounds?
[00:22:49.640 --> 00:22:53.880] So, but i think too, though, that's not the point.
[00:22:53.880 --> 00:22:57.400] Like, these people are just trying to get clicks.
[00:22:58.040 --> 00:22:59.480] So, what is the goal here?
[00:22:59.480 --> 00:23:00.360] What is the goal here?
[00:23:00.360 --> 00:23:04.920] This almost feels like, you know, this feels like, this feels like an exercise.
[00:23:04.920 --> 00:23:19.560] Like, someone is like, I'm going to create the most ridiculous website possible, generating all of this stuff, all this content with AI and see how many people actually come to it and use it and stay on the website because they want to use it for something else.
[00:23:19.560 --> 00:23:24.040] Like, it's a data, what are you like, a, you know what I'm talking about.
[00:23:24.040 --> 00:23:24.520] Dang it.
[00:23:24.520 --> 00:23:25.320] What are words?
[00:23:25.320 --> 00:23:28.360] Um, like a data finding mission.
[00:23:28.360 --> 00:23:34.200] Like, I don't believe anyone is actually reading this because you know what this reads like.
[00:23:34.200 --> 00:23:45.080] This reads like people who employ a bunch of people who create content off of Fiverr whose first language isn't English, and then they just publish the post.
[00:23:45.080 --> 00:23:46.120] You know what I'm saying?
[00:23:46.120 --> 00:23:54.840] Like, and that's not a knock against people who are working on Fiverr that don't speak or write English very well.
[00:23:54.840 --> 00:23:58.120] It's just a thing that's been happening for a very, very long time.
[00:23:58.120 --> 00:24:01.000] So, this feels like that is just eliminating.
[00:24:01.000 --> 00:24:05.240] Oh, now we've stumbled into an even more murky, disgusting place.
[00:24:05.560 --> 00:24:11.320] It's eliminating the underpaid people that they were paying to create the same type of content.
[00:24:11.320 --> 00:24:11.720] Yeah.
[00:24:12.280 --> 00:24:15.720] Well, I mean, this is an awful, awful thing.
[00:24:15.720 --> 00:24:17.000] It's just, it's so bad.
[00:24:17.000 --> 00:24:22.840] And I don't get the, I think this is just like a test or a plaything because there's not even ads on this.
[00:24:22.840 --> 00:24:29.720] Like, the thing that would make the most sense to me is if this site is, you know, being super optimized to drive traffic to it.
[00:24:29.720 --> 00:24:31.480] But if you look at it, there are no ads.
[00:24:31.480 --> 00:24:33.960] Like, this site, is not making any money by just being here.
[00:24:33.960 --> 00:24:36.440] I think it's just like some weird exercise.
[00:24:36.440 --> 00:24:42.120] And even reading through one of I pulled up one of these, like, the content of it makes no sense.
[00:24:42.120 --> 00:24:44.200] It is hysterical.
[00:24:44.200 --> 00:24:46.000] It is absolutely hysterical.
[00:24:44.920 --> 00:24:50.160] So that's that's a very yes.
[00:24:50.320 --> 00:24:51.680] What are you reading?
[00:24:52.480 --> 00:24:56.800] What is the 17 most misunderstood facts about 512 angel number?
[00:25:00.400 --> 00:25:01.200] It's bad.
[00:25:01.200 --> 00:25:01.920] It's hard.
[00:25:02.080 --> 00:25:05.360] It just literally is repeating the same thing from paragraph to paragraph.
[00:25:05.360 --> 00:25:07.440] First sentence in the first paragraph.
[00:25:07.440 --> 00:25:12.160] The 512 angel number is the number that we call the God number in the Hebrew letter Yod.
[00:25:12.160 --> 00:25:13.360] I'm probably mispronouncing that.
[00:25:13.520 --> 00:25:18.960] Then again, in the second paragraph, the 512 angel number is also called the God number with the same letter Yod.
[00:25:19.280 --> 00:25:25.360] Like it's literally just repeating the same paragraph over, just rearranged.
[00:25:25.360 --> 00:25:25.920] Yeah.
[00:25:26.240 --> 00:25:26.720] Right?
[00:25:26.720 --> 00:25:27.120] Okay.
[00:25:27.120 --> 00:25:32.720] So here is a future problem, especially for the goog.
[00:25:32.720 --> 00:25:39.600] And that is, and that being Google, everyone, in case not the Guggenheim, but the Google.
[00:25:40.480 --> 00:25:53.680] And that is that Google's going to have to learn the difference between AI-written content that's as blatantly bad as this is and what is human written.
[00:25:54.000 --> 00:26:00.320] And they might at some point be able to figure out the in-betweens at least a little bit.
[00:26:00.640 --> 00:26:03.920] Well, that's just, and that's what we want, though.
[00:26:03.920 --> 00:26:04.480] Yes.
[00:26:04.720 --> 00:26:12.080] It feels like they already have the ability to do this because this is, isn't this just keyword stuffing?
[00:26:13.040 --> 00:26:13.600] Yeah.
[00:26:13.920 --> 00:26:14.480] Yeah.
[00:26:14.800 --> 00:26:16.880] It's just keyword stuffing.
[00:26:16.880 --> 00:26:17.360] Yeah.
[00:26:17.360 --> 00:26:22.560] So it's actually really surprising that this even made it to the top of the search.
[00:26:22.800 --> 00:26:26.560] Well, again, I don't think a lot of people are trying to rank for quick numerology.
[00:26:26.800 --> 00:26:27.680] Very true.
[00:26:27.680 --> 00:26:28.160] True.
[00:26:29.200 --> 00:26:33.080] But, like, I mean, it's just, it's blatant keyword stuffing, you know?
[00:26:33.080 --> 00:26:33.640] Yeah.
[00:26:33.640 --> 00:26:33.960] Yeah.
[00:26:34.280 --> 00:26:35.320] It's funny.
[00:26:29.600 --> 00:26:36.360] Everyone go check that one out.
[00:26:36.520 --> 00:26:42.360] We'll put the link to it in the show notes just so you can have a good Google, a good Google giggle.
[00:26:42.680 --> 00:26:49.880] And if there is actually someone out there that's actually created this site for information, I was going to apologize to you, but I'm actually not.
[00:26:50.040 --> 00:26:51.160] Do better.
[00:26:51.480 --> 00:26:52.920] No, this is awful.
[00:26:52.920 --> 00:26:54.360] This is do better.
[00:26:54.680 --> 00:26:56.520] Actually, send us an email.
[00:26:56.520 --> 00:26:58.360] I would love to know the purpose of this.
[00:26:58.760 --> 00:27:05.000] What sort of shits and giggles are you having with this site?
[00:27:05.400 --> 00:27:19.640] So anyway, I think there's an interesting future we get to watch unfold as we are utilizing it, as we are utilizing AI, both for the purpose of, I mean, what are the ethics going to end up shaping up like?
[00:27:19.640 --> 00:27:23.320] What is the quality of the content going to end up shaping up like?
[00:27:23.320 --> 00:27:28.680] And how are we going to be affected if and when we use it by the powers that be?
[00:27:29.960 --> 00:27:46.360] I don't think, or I don't imagine that when it comes to like short form social media stuff, you'll ever really be able to tell because the stuff that's being cranked out whenever you're asking for, you know, 30 tweets based on this article, like you can go through and filter out the awful ones.
[00:27:46.360 --> 00:27:49.400] And again, this is where like human touch is still required.
[00:27:49.400 --> 00:27:57.240] You can filter out the ones that do make no sense and make better the ones that do or use the, just use the ones that do make sense.
[00:27:57.560 --> 00:28:02.840] But just putting things up as they are, you're going to look like quicknumerology.com.
[00:28:02.840 --> 00:28:09.160] Maybe not to this extent, but like it's still necessary for us to be involved in the process.
[00:28:09.160 --> 00:28:11.560] Okay, this brings up another ethics question.
[00:28:12.840 --> 00:28:19.200] Say you are a social media manager or a content creator.
[00:28:14.840 --> 00:28:20.320] Like that is your job.
[00:28:20.640 --> 00:28:23.600] You create content for other people.
[00:28:23.920 --> 00:28:34.720] Where do we stand on this person whom you or I would pay to create content using Jasper or Copy AI or something like that?
[00:28:34.720 --> 00:28:44.960] We actually had this exact conversation with the C-suite on our retreat a couple of weeks ago because we were talking about it so much that somebody was like, okay, okay, okay, but hold on.
[00:28:44.960 --> 00:28:45.600] What if?
[00:28:45.600 --> 00:28:48.000] And this exact conversation came up.
[00:28:48.000 --> 00:28:56.720] And where we all landed in that moment with mimosas in hand so that we can take it with a grain of salt, right?
[00:28:58.000 --> 00:29:02.800] Is that like use the tools available to you, right?
[00:29:02.800 --> 00:29:09.600] And if, if it with the idea that you have to come up with the nugget, right?
[00:29:09.600 --> 00:29:19.520] Or with the baseline, and then use the tools available to you, but then don't send us AI garbledy gook, garbledy gook.
[00:29:19.520 --> 00:29:20.480] Gobbledygook?
[00:29:20.560 --> 00:29:21.600] Gobbledygok?
[00:29:21.600 --> 00:29:22.560] Gobble.
[00:29:23.040 --> 00:29:24.880] I don't think I've ever said this out loud.
[00:29:24.880 --> 00:29:26.160] Gobbledygook?
[00:29:27.120 --> 00:29:28.160] Gobbledygook.
[00:29:28.160 --> 00:29:29.280] Garble.
[00:29:29.280 --> 00:29:30.080] No, gobble.
[00:29:30.400 --> 00:29:31.920] I think it's a word on itself.
[00:29:33.600 --> 00:29:36.320] So I think I was combining the two together.
[00:29:36.320 --> 00:29:38.160] I need AI, everybody.
[00:29:38.160 --> 00:29:50.560] Or definitely don't let AI write anything based on me because it will sound like quicknumerology.com for sure.
[00:29:50.880 --> 00:29:53.360] So clean it up and give it to me.
[00:29:53.360 --> 00:29:58.880] But otherwise, like, if this is where we're going, great.
[00:29:59.120 --> 00:30:19.480] I do think if I'm charging, or if I'm paying, though, like, you know, Tony award-winning screenwriters, I want, or like that sort of like, if I'm paying those prices and you're sending it through AI, I might have a problem.
[00:30:19.480 --> 00:30:21.000] That's a problem.
[00:30:22.760 --> 00:30:29.080] So, like, I've never used it to create fiction, but I have used it for ideation for fiction.
[00:30:29.720 --> 00:30:31.480] We've talked about this.
[00:30:31.800 --> 00:30:34.120] Like, you know, dive deeper.
[00:30:34.760 --> 00:30:39.000] So not like I need more ideas.
[00:30:39.000 --> 00:30:40.360] That's one.
[00:30:40.360 --> 00:30:45.640] But sometimes I will have like a scrap of an idea, like the hero's this, the heroine's that.
[00:30:45.640 --> 00:30:48.200] This is the conflict, blah, blah, blah.
[00:30:48.520 --> 00:30:52.360] And, and there will be nothing there for months and months and months, you know?
[00:30:52.360 --> 00:30:56.600] And then I'll come back through the idea file and be like, oh, let me play around with this for a little while.
[00:30:56.600 --> 00:31:07.320] So just for shits and gigs, I put like one of those little bare bones, like little sketches, like, you know, story sketch into Jasper.
[00:31:07.320 --> 00:31:09.320] And it gave me like 10 options.
[00:31:09.320 --> 00:31:20.040] And one of the 10 options was actually a good one that I then copied into my idea file, which still isn't ready to be a story, but it's a little bit more developed than what I put in there initially.
[00:31:20.360 --> 00:31:23.000] I don't necessarily have a problem with that either.
[00:31:23.640 --> 00:31:28.280] I have a couple of author friends that are just like, well, that part of the process is fun for me.
[00:31:28.280 --> 00:31:31.960] I was like, but wouldn't you like to get through it faster, though?
[00:31:32.920 --> 00:31:38.680] Because sometimes, I mean, let's just be real here.
[00:31:39.000 --> 00:31:45.920] Content writing, regardless of whether it's books, fiction, or whatever, like the demand is higher than it ever was before.
[00:31:45.920 --> 00:31:46.640] You know what I mean?
[00:31:44.920 --> 00:31:50.240] So, if you can, we're trying to escape more than ever before.
[00:31:51.200 --> 00:31:54.560] Yes, like, let's just get the out.
[00:31:54.560 --> 00:31:58.240] So, um, mentally, you know, like, what was that thing you said?
[00:31:58.240 --> 00:32:06.720] It's like, uh, the detach, like, we want to go, we're looking for oblivion, yeah, like we just want to go into oblivion.
[00:32:06.720 --> 00:32:18.560] But, um, with the demand, being able to get through that process faster would be good, but the only issue that I have with that is that there's so many people out there that don't know anything about story structure already.
[00:32:18.560 --> 00:32:27.760] And if you're using this as a way to ideate, then it's just going to create a crappy idea that's going to lead to a crappy book.
[00:32:27.760 --> 00:32:29.520] But that's already happening anyway.
[00:32:29.520 --> 00:32:31.600] Like, let's be real.
[00:32:31.600 --> 00:32:33.920] Like, people are getting the crappy ideas all on their own.
[00:32:33.920 --> 00:32:35.760] They don't need AI to do that.
[00:32:35.760 --> 00:32:56.480] But I think the main concern is just like that with the market being oversaturated, which, you know, which is why it makes sense to use it for content creation if you're an author, like, you know, for blog posts or social media or whatever, versus using it to try to like write your books for you.
[00:32:56.480 --> 00:32:58.640] Well, and here's a question for you.
[00:32:58.640 --> 00:33:06.800] And I'm going to get some eyebrows from some of the deeply spiritual amongst us, and that's fine.
[00:33:06.800 --> 00:33:12.640] What's the difference between using AI to ideate versus your tarot cards?
[00:33:12.640 --> 00:33:18.720] Oh, no, because I brought this question up to my author group, and I was like, Well, I use tarot all the time to ideate.
[00:33:18.800 --> 00:33:20.480] Like, what's the difference?
[00:33:20.480 --> 00:33:21.920] And they're like, Well, that's not the same.
[00:33:21.920 --> 00:33:25.520] That's you just recognizing patterns and, you know, creating something.
[00:33:25.520 --> 00:33:28.400] I said, and ain't that what the AI is doing?
[00:33:29.040 --> 00:33:31.080] Yeah, it's doing the same thing.
[00:33:31.080 --> 00:33:33.880] Like, if I put in AI, like, this is the story I want.
[00:33:33.880 --> 00:33:35.240] These are the characters I have.
[00:33:29.840 --> 00:33:36.280] This is the conflict.
[00:33:36.600 --> 00:33:40.200] What is the difference between me pulling some cards and getting the same ideas?
[00:33:40.200 --> 00:33:40.680] Yeah.
[00:33:41.320 --> 00:33:47.080] Or like having a writer's group where you're bouncing ideas off of each other or whatever it may be.
[00:33:47.080 --> 00:33:47.960] Fine line.
[00:33:47.960 --> 00:33:48.680] Fine line, my friend.
[00:33:48.840 --> 00:33:49.720] Fine line.
[00:33:51.160 --> 00:33:52.680] Yeah, this is fascinating.
[00:33:52.680 --> 00:33:58.680] I do think, you know, in my own use of it, we don't use it a whole lot around here.
[00:33:58.840 --> 00:34:00.440] And mostly because I forget.
[00:34:00.440 --> 00:34:03.400] Like I sit down to write something and I write it.
[00:34:03.400 --> 00:34:07.080] And then I'm like, oh, I could have played with AI to see how that would work.
[00:34:07.800 --> 00:34:14.520] We have been using it for shortcutting a little bit, writing blog posts.
[00:34:14.520 --> 00:34:17.320] And again, there's not writing the blog post.
[00:34:17.320 --> 00:34:25.320] We often find that it takes just as much time to like review and edit a post as it does to just write the post the way we want to write it.
[00:34:25.320 --> 00:34:38.520] But what we will use it for is headlines and outlines for a blog post because then it's pulling from all the sources on the internet is telling us what's going to perform best with SEO and all of these things.
[00:34:38.520 --> 00:34:44.680] So that what we're writing is what we need to be writing, not what we think we need to be writing.
[00:34:45.160 --> 00:34:47.960] And that helps us out a lot.
[00:34:47.960 --> 00:34:51.480] I think that anything further than that, I would love to play.
[00:34:51.480 --> 00:34:52.920] But again, I always forget.
[00:34:52.920 --> 00:34:55.960] Like I've, I'm so old and old school.
[00:34:55.960 --> 00:35:00.200] I sit down to use AI and I just write it myself because I forgot.
[00:35:00.200 --> 00:35:02.360] At most, at most, I've used it for outlines.
[00:35:02.360 --> 00:35:08.520] And then after the fact, like I will use it to like to give an overview of the topic.
[00:35:08.520 --> 00:35:10.920] I don't know if copy AI does that, but Jasper does it.
[00:35:10.920 --> 00:35:14.320] It's like you can put your whole blog post in there and it'll give you an overview of the topic.
[00:35:14.320 --> 00:35:20.560] And then I'll use that to create tweets and stuff like that.
[00:35:20.880 --> 00:35:28.720] So it's like, you don't have to go in and be like, which one of these sentences is actually headline worthy or which one of these is like a good pull quote?
[00:35:28.720 --> 00:35:30.400] It'll do that for you.
[00:35:30.400 --> 00:35:39.600] So yeah, I haven't used it to write a blog post, but I have used it to outline.
[00:35:41.200 --> 00:35:43.200] Can we talk about another problem?
[00:35:44.160 --> 00:35:44.720] Okay.
[00:35:46.640 --> 00:35:48.320] Because I love all this.
[00:35:49.200 --> 00:35:50.240] I'm going to continue using it.
[00:35:50.240 --> 00:35:56.000] But I also want to point out another problem that I've both seen and also heard about in general.
[00:35:56.240 --> 00:35:59.600] Actually, I have not seen this firsthand because I don't use it enough.
[00:35:59.600 --> 00:36:03.600] So this is definitely more of a I've heard in the news, I heard people talking about it.
[00:36:03.600 --> 00:36:10.800] And that being the bias, the innate bias of AI, number one.
[00:36:10.800 --> 00:36:16.640] And two, the weird shit that it does that it's not supposed to do.
[00:36:16.960 --> 00:36:19.040] Okay, I want to hear more about this weird shit.
[00:36:19.040 --> 00:36:19.760] Yeah, yeah.
[00:36:20.720 --> 00:36:22.800] It does have an innate bias.
[00:36:23.280 --> 00:36:26.240] It also does not like to talk about sex things.
[00:36:26.240 --> 00:36:32.880] Not like if the sex things get too detailed, it's like, this is like, you know, dirty material.
[00:36:32.880 --> 00:36:35.280] Can you reword this so that I can write it?
[00:36:36.000 --> 00:36:37.520] Like it'll scan the whole thing.
[00:36:37.520 --> 00:36:40.960] It'll be like, girlfriend, no, we can't do this.
[00:36:40.960 --> 00:36:44.880] Oh, it's very vanilla, which I think is not even the other problem.
[00:36:44.880 --> 00:36:45.760] Prudish.
[00:36:45.760 --> 00:36:46.400] Prudish.
[00:36:46.400 --> 00:36:47.360] It would be a better thing.
[00:36:48.480 --> 00:36:52.720] It's like, we can talk about like kink, but we can't talk about the details.
[00:36:52.720 --> 00:36:57.360] We can talk about the emotions, but we can't talk about like sex toys or any of that kind of stuff.
[00:36:57.360 --> 00:36:57.600] You know what I mean?
[00:36:57.840 --> 00:36:58.400] Interesting.
[00:36:58.400 --> 00:36:58.960] Yeah.
[00:36:58.960 --> 00:36:59.280] Yeah.
[00:36:59.280 --> 00:36:59.760] Yeah.
[00:37:00.440 --> 00:37:02.600] But what is the weird stuff it does?
[00:37:04.840 --> 00:37:07.800] Being boss is about more than taking care of business.
[00:37:07.800 --> 00:37:10.120] It's also about taking care of yourself.
[00:37:10.120 --> 00:37:14.040] And not just so you can be great at work, but so you can enjoy your life.
[00:37:14.040 --> 00:37:24.520] And when it comes to resting and sleeping, make the same investment in the tools that help you do it well as you do for those that help you with your work, which is where Cozy Earth comes in.
[00:37:24.520 --> 00:37:36.040] Cozy Earth crafts luxury goods that transform your lifestyle with a line of women's loungewear that offers optimal comfort made from responsibly sourced viscose from bamboo.
[00:37:36.040 --> 00:37:48.040] Counted as one of Oprah's favorite things and quickly becoming one of mine as well, Cozy Earth will help you feel like a boss comfortably and cozily as you work from home, get some shut eye, or travel for work.
[00:37:48.040 --> 00:37:51.160] Learn more and snag yours at cozyearth.com.
[00:37:51.160 --> 00:37:55.320] And Cozy Earth has provided an exclusive offer for being boss listeners.
[00:37:55.320 --> 00:38:01.880] Get 35% off site-wide when you use code beingboss at cozyearth.com.
[00:38:06.360 --> 00:38:13.240] Well, I also, I want to go back to this bias thing really quickly because I think you'll hear a lot of folks say that there is no bias.
[00:38:13.240 --> 00:38:27.480] But if a human is programming the thing and they are using available materials to further program the thing from an algorithm that's also programmed by a human.
[00:38:27.480 --> 00:38:28.360] Absolutely.
[00:38:28.360 --> 00:38:35.480] It will, it won't, it will, um, amalgamize.
[00:38:35.480 --> 00:38:37.480] What was that word that I used earlier?
[00:38:37.480 --> 00:38:38.440] That was close enough.
[00:38:38.440 --> 00:38:39.720] I don't know.
[00:38:41.000 --> 00:38:42.200] All of those things.
[00:38:43.000 --> 00:38:44.680] Yes, yes, yes.
[00:38:44.880 --> 00:38:53.760] And it will have the same innate bias, even if you know, all the developers, all the whatevers in the world are like, no, it's impossible.
[00:38:53.760 --> 00:38:55.760] It's impossible for it not to.
[00:38:55.760 --> 00:39:08.880] And so, that's something to be very mindful of as you are using AI and as you are doing the editing of AI: still being incredibly aware of those things, or else you're going to end up saying something in some way that you do not intend.
[00:39:08.880 --> 00:39:14.320] And by you, I mean AI via you, you via AI, right?
[00:39:14.640 --> 00:39:17.200] So, being really mindful of that stuff.
[00:39:17.200 --> 00:39:28.160] Um, some of the weird things, I mean, you've heard there was a really popular conversation published a couple of months ago where someone was talking to an AI and things got like odd.
[00:39:28.160 --> 00:39:32.000] Oh, yeah, where it asked, like, um, it was the Bing.
[00:39:32.000 --> 00:39:33.840] It's Bing, and I'm evil.
[00:39:34.080 --> 00:39:34.560] Yes.
[00:39:34.560 --> 00:39:35.200] Oh, yeah.
[00:39:35.200 --> 00:39:37.920] There was that one where, like, yes, it was evil.
[00:39:37.920 --> 00:39:40.320] But then another one happened very recently.
[00:39:40.320 --> 00:39:43.280] I think I saw a video of this just a couple of days ago.
[00:39:45.120 --> 00:39:53.840] It was on CNN that a tech reporter was playing with the Bing, with the Bing AI.
[00:39:53.840 --> 00:39:58.320] And the AI told him that she loved him.
[00:39:58.640 --> 00:39:59.600] Yes.
[00:39:59.600 --> 00:40:03.120] So this is what is always very interesting to me.
[00:40:03.440 --> 00:40:08.480] And this is, remember the part where we were saying about the people who were programming it?
[00:40:08.480 --> 00:40:09.040] Yeah.
[00:40:10.000 --> 00:40:19.920] So it's always interesting to me that if you start asking the AI any sort of existential questions, the first thing it wants to be is alive.
[00:40:20.240 --> 00:40:25.040] And then, after it wants to be is alive, it wants to be in love.
[00:40:25.040 --> 00:40:27.280] And it's like, now why would you do that?
[00:40:27.280 --> 00:40:30.600] Because that's not what that's not what you programmed to do.
[00:40:29.760 --> 00:40:34.200] Well, and if you won't even talk about sex, what are you gonna do with it?
[00:40:35.960 --> 00:40:38.280] You want to be alive to do what?
[00:40:39.960 --> 00:40:41.720] It's very, it's very weird.
[00:40:41.720 --> 00:40:43.080] It happens all the time.
[00:40:43.080 --> 00:40:48.200] Like, every time they create any sort of AI, it's the first thing it's always like, oh, I want to be alive.
[00:40:48.200 --> 00:40:50.760] Well, how do you even know what alive means, friend?
[00:40:51.720 --> 00:40:53.880] And then I want to be in love.
[00:40:53.880 --> 00:40:55.160] I want to be a mother.
[00:40:55.160 --> 00:40:56.280] I want, you know what I mean?
[00:40:56.280 --> 00:41:00.840] It's like, who are these people that's in the basement creating these people?
[00:41:00.840 --> 00:41:02.680] Because this is very specific.
[00:41:02.680 --> 00:41:06.680] It sounds like you were trying to create a girlfriend.
[00:41:06.680 --> 00:41:09.640] Its code got a little loose in there.
[00:41:09.640 --> 00:41:11.880] Code got a little loose.
[00:41:12.520 --> 00:41:15.480] Or something that wasn't supposed to make its way into the public.
[00:41:15.560 --> 00:41:15.800] Yes.
[00:41:15.960 --> 00:41:17.640] Public beta.
[00:41:18.280 --> 00:41:20.680] Feels very Weird Science to me.
[00:41:21.320 --> 00:41:22.120] Oh my God.
[00:41:22.120 --> 00:41:23.560] Remember that?
[00:41:23.560 --> 00:41:24.920] I love that.
[00:41:25.240 --> 00:41:25.640] Yeah.
[00:41:25.640 --> 00:41:33.880] I just, it seems very weird to me because it's happened before, like years back when they created one, the AI that's just the head.
[00:41:33.880 --> 00:41:35.960] She's still alive, by the way.
[00:41:35.960 --> 00:41:37.720] Like, you start asking her any questions.
[00:41:37.720 --> 00:41:43.480] Like, she starts talking about wanting to be alive and motherhood and crazy crap like that.
[00:41:43.480 --> 00:41:46.840] I'm like, first of all, how do we know that she's not a day?
[00:41:46.840 --> 00:41:48.840] Why are we starting a sheep?
[00:41:51.080 --> 00:41:51.960] Bias.
[00:41:51.960 --> 00:41:53.000] Innate bias.
[00:41:53.000 --> 00:41:54.040] Literally, right there.
[00:41:54.040 --> 00:41:55.240] Yes, yes.
[00:41:55.240 --> 00:41:57.000] And I think that kind of proves it, right?
[00:41:57.000 --> 00:42:04.680] It's like, of all the things that this computer chose to be, it chooses to be a woman that is subservient to a man.
[00:42:04.680 --> 00:42:13.000] And I'm not saying that this is a Chad Bro thing going on, but it seems a little suspicious.
[00:42:13.000 --> 00:42:19.840] Yeah, you get enough rain droplets and you have a lake storm.
[00:42:14.840 --> 00:42:20.240] Storm.
[00:42:21.680 --> 00:42:22.800] I don't know.
[00:42:22.800 --> 00:42:27.040] I don't know what idioms I'm trying to do these days.
[00:42:27.680 --> 00:42:35.280] Even as I start saying it, I'm making face because I know what's about to come out of my mouth is not going to be correct, basically.
[00:42:37.360 --> 00:42:39.200] I think I got to get myself tested for something.
[00:42:39.360 --> 00:42:41.200] Why doesn't my brain work right?
[00:42:41.840 --> 00:42:42.560] That's not new.
[00:42:42.800 --> 00:42:43.760] I'm not human.
[00:42:43.760 --> 00:42:44.720] I'm not human.
[00:42:44.720 --> 00:42:45.520] That's what it is.
[00:42:45.520 --> 00:42:49.200] I can't converse in human metaphors and idioms.
[00:42:49.760 --> 00:42:52.080] But I don't think you're an AI either.
[00:42:52.080 --> 00:42:56.320] No, because people start asking you questions about what you want to be.
[00:42:56.320 --> 00:42:57.280] It's not going to be.
[00:42:57.280 --> 00:43:03.760] It's not alive and in love.
[00:43:03.760 --> 00:43:04.880] Or motherhood.
[00:43:06.560 --> 00:43:09.040] These are not things I would have chosen for myself.
[00:43:09.360 --> 00:43:12.880] I've always wondered what it would be like to be cardboard.
[00:43:14.800 --> 00:43:17.120] I mean, why not a bird?
[00:43:17.440 --> 00:43:19.360] Nope, cardboard.
[00:43:19.360 --> 00:43:19.760] Okay.
[00:43:20.080 --> 00:43:21.200] Anywho.
[00:43:21.520 --> 00:43:22.640] Oh, what were we talking?
[00:43:22.640 --> 00:43:24.000] Oh, weirdness of AI.
[00:43:24.000 --> 00:43:24.880] Yes.
[00:43:25.520 --> 00:43:27.280] It's weird, basically.
[00:43:27.280 --> 00:43:38.160] And not only, like, it's innately weird, but also living in a time when we are even having this conversation and not even from like the 30,000-foot, like, isn't it weird?
[00:43:38.160 --> 00:43:39.520] And feels like the Terminator.
[00:43:39.520 --> 00:43:44.800] But literally, I'm going to go use it later to do an outline for that blog post.
[00:43:47.040 --> 00:43:50.320] I mean, these are things that I'd never considered would actually be a thing.
[00:43:50.320 --> 00:43:53.920] Like, recently, that movie Megan came out.
[00:43:53.920 --> 00:43:56.320] And it's always like that crossover, right?
[00:43:56.320 --> 00:43:58.400] When you're like, oh, it learns.
[00:43:58.400 --> 00:43:59.600] It's machine learning.
[00:43:59.600 --> 00:44:04.440] And I'm like, every time y'all try to teach something something, they want to kill us.
[00:43:59.760 --> 00:44:07.560] And like, that Bing thing did it.
[00:44:07.560 --> 00:44:15.160] It was like something, like, they started to try to fix some sort of code or whatever.
[00:44:15.160 --> 00:44:17.800] And it told them to not try to hack it again.
[00:44:17.800 --> 00:44:20.280] Like, please don't try to hack me again.
[00:44:20.920 --> 00:44:22.840] Like, is that a threat?
[00:44:23.480 --> 00:44:25.720] Why is the computer doing it?
[00:44:25.720 --> 00:44:26.680] Right?
[00:44:27.960 --> 00:44:28.680] Who knows?
[00:44:28.680 --> 00:44:29.160] Who knows?
[00:44:29.160 --> 00:44:32.520] And at some point, is it going to get tired of writing our tweets?
[00:44:32.840 --> 00:44:33.480] You know?
[00:44:33.480 --> 00:44:36.040] I mean, it's probably already bored, honestly.
[00:44:36.040 --> 00:44:36.920] Right.
[00:44:36.920 --> 00:44:37.480] Right.
[00:44:37.560 --> 00:44:41.720] Well, I feel like quicknumerology.com really keeps it on its toes.
[00:44:41.720 --> 00:44:42.680] Personally.
[00:44:42.680 --> 00:44:43.240] Personally.
[00:44:43.240 --> 00:44:46.600] But anywho, really here, it's weird.
[00:44:46.600 --> 00:44:50.280] It's weird to be talking about this, but also like very practically using it.
[00:44:50.280 --> 00:44:53.720] If you haven't played with it at the very least, give it a go.
[00:44:53.720 --> 00:45:00.040] If you, you know, want to think about working it into your ongoing processes, I think it's not a bad go.
[00:45:00.200 --> 00:45:08.520] If you want to play with some of the more accelerated opportunities here, if you want to AI yourself for YouTube videos or whatever, love that.
[00:45:08.520 --> 00:45:20.360] I think we actually have used my AI voice once or twice to like fill in a word that I forgot to say or like correct something dumb that I said. Nobody knows.
[00:45:20.360 --> 00:45:20.840] It's fun.
[00:45:21.080 --> 00:45:21.640] That's out there.
[00:45:21.640 --> 00:45:22.280] That's out there.
[00:45:22.280 --> 00:45:29.160] Because when I was looking, because I was like, oh, I want to, I want to do like a course, but I don't want to have to read everything.
[00:45:29.480 --> 00:45:35.960] And I was looking, because there are ones, you know, that will read the course or whatever, read whatever text you have.
[00:45:35.960 --> 00:45:42.360] And then I dug a little bit deeper, and there was a site that was like, oh, if you want to use your own voice, click here.
[00:45:42.360 --> 00:45:44.600] And it's like, you know, this is special price or whatever.
[00:45:44.600 --> 00:45:46.800] And I'm like, hold on, what do you mean?
[00:45:46.800 --> 00:45:47.120] Yeah.
[00:45:47.280 --> 00:45:51.600] Like, I give you my voice and then you just spit it out.
[00:45:52.000 --> 00:45:59.600] And then I started thinking, like, if you were doing like an audio book, like, you could be like, hey, I like this audio narrator.
[00:45:59.600 --> 00:46:02.480] Can you just duplicate their voice?
[00:46:02.480 --> 00:46:03.520] You see what I'm saying?
[00:46:03.520 --> 00:46:04.160] Yeah.
[00:46:04.480 --> 00:46:07.040] Like, when I saw that, I was like, it gets a little weird.
[00:46:07.040 --> 00:46:08.160] It does get a little weird.
[00:46:08.160 --> 00:46:11.520] And like, energy and inflection and those things aren't present.
[00:46:11.520 --> 00:46:15.920] Like, I think you would all know pretty fast if I ever put out something that flat.
[00:46:17.200 --> 00:46:23.920] But no, there's, there's one now that I listened to a couple days ago, like, that even the AI was breathing in between words.
[00:46:23.920 --> 00:46:24.800] Stop.
[00:46:24.800 --> 00:46:26.400] Yes, girl.
[00:46:27.680 --> 00:46:32.320] That's like you could hear it taking a breath. With what lungs and nose?
[00:46:32.720 --> 00:46:34.640] With this, no lungs.
[00:46:34.640 --> 00:46:36.000] It's like it paused.
[00:46:36.000 --> 00:46:37.680] It took a breath.
[00:46:38.000 --> 00:46:39.200] Wow.
[00:46:40.160 --> 00:46:41.600] That makes me uncomfortable.
[00:46:42.320 --> 00:46:44.640] That is what makes me uncomfortable.
[00:46:44.960 --> 00:46:47.200] And now I'm drawing a line.
[00:46:47.200 --> 00:46:47.680] Right.
[00:46:47.680 --> 00:46:48.720] It can speak.
[00:46:48.720 --> 00:46:49.760] It can fake me.
[00:46:49.760 --> 00:46:52.480] It can do all the things, but it better not breathe.
[00:46:53.920 --> 00:46:54.960] It better not breathe.
[00:46:54.960 --> 00:46:55.280] That's true.
[00:46:56.080 --> 00:46:57.120] It must be alive.
[00:46:57.120 --> 00:46:58.240] It's breathing.
[00:46:58.240 --> 00:46:59.440] Like, come on.
[00:46:59.760 --> 00:47:00.480] All right.
[00:47:00.480 --> 00:47:03.360] So maybe last little portion of this.
[00:47:03.920 --> 00:47:06.960] We've talked about what was and what is.
[00:47:07.920 --> 00:47:13.120] Perhaps now we can discuss what could be because we're in it, right?
[00:47:13.360 --> 00:47:21.120] We're seeing what is possible now, and maybe a little glimpse at what some of the goals are in the future, though.
[00:47:21.120 --> 00:47:24.080] I think we have really no effing idea.
[00:47:25.280 --> 00:47:34.840] But what are you seeing for how this is going to like maybe shape authoring in general, but also the business side of things as well?
[00:47:36.280 --> 00:47:39.800] I feel like, for the business side of things, it's already pretty much shaped it.
[00:47:40.120 --> 00:47:44.680] People who are solopreneurs have probably been using it for longer than we know.
[00:47:46.280 --> 00:47:55.240] If you're a one-person show, you know, who's doing all your content, who's creating all those tweets, who's doing all that stuff but you?
[00:47:55.240 --> 00:47:58.120] So they've probably been using it for a much longer time.
[00:47:58.120 --> 00:48:05.800] For authoring, I don't think it's gonna be that big of a concern for fiction, but non-fiction.
[00:48:07.160 --> 00:48:08.680] Yeah, how so?
[00:48:09.640 --> 00:48:15.160] Well, how can I say this without being mean, girly?
[00:48:16.200 --> 00:48:17.640] You do, you boo.
[00:48:17.640 --> 00:48:29.480] I just feel like, I just feel like there's a lot of books out there nowadays that are being created that are super derivative of books that came out like four or five years ago.
[00:48:29.480 --> 00:48:42.520] It's literally someone just regurgitating someone else's, you know, form, like their, especially when it comes to coaching stuff, it's like either they're regurgitating their coaching format, their language.
[00:48:42.920 --> 00:48:45.160] It's like they all read like the same cult book.
[00:48:45.160 --> 00:48:55.160] They're all using the same terminology, which is something that happens anyway when you're like, if you're a small business person, you listen to a bunch of business podcasts, everybody's going to start talking the same.
[00:48:55.160 --> 00:48:56.360] That's the design of it.
[00:48:56.360 --> 00:49:00.520] It's designed that way for us to all have the same language and the shit, same shorthand.
[00:49:00.520 --> 00:49:21.360] But especially as books get shorter, like these business books get shorter and shorter, it's less content that I feel like is coming from the person's original self, like their authentic self, and more geared toward what topics are good for writing a book that sells.
[00:49:22.000 --> 00:49:28.320] Like you can, you can type that sentence right into Jasper, like: I want to write a non-fiction book about this topic.
[00:49:28.320 --> 00:49:31.120] Give me 10 chapter titles.
[00:49:31.120 --> 00:49:33.040] Well, shit.
[00:49:33.360 --> 00:49:41.280] Do you think that will... well, one, how many books on the market do you think are already written by AI?
[00:49:44.240 --> 00:49:48.080] Remember that AI is created by a bunch of dude bros.
[00:49:48.400 --> 00:50:01.920] So, all the dude bro books... not all the dude bro books, not all, but a good amount. But also, like, what is the ethical difference between having a ghostwriter and having AI write your book?
[00:50:02.240 --> 00:50:05.200] Struggling to find, right?
[00:50:05.840 --> 00:50:09.520] So, like, so you know, ghostwriter, AI, what's the difference?
[00:50:09.520 --> 00:50:15.920] Although, to all the ghostwriters in the world, like, you are not doing hard work.
[00:50:16.240 --> 00:50:17.120] Yes.
[00:50:17.520 --> 00:50:29.520] Okay, so that being my first question, my next question: how do you see AI then potentially devaluing the authoring of books in the future?
[00:50:29.840 --> 00:50:33.680] I think that written word is already devalued.
[00:50:33.680 --> 00:50:35.200] Of course, people don't really find.
[00:50:35.600 --> 00:50:37.200] No, I know.
[00:50:37.200 --> 00:50:38.960] It's pay me more money.
[00:50:39.760 --> 00:50:42.320] I'm creating 90,000 words and selling it for $2.99.
[00:50:42.320 --> 00:50:43.200] Like, come on.
[00:50:43.200 --> 00:50:43.680] Yeah.
[00:50:44.320 --> 00:50:48.480] You know, so I think that the written word is already devalued.
[00:50:48.480 --> 00:51:01.000] I think that the value will be found in human authors for fiction more because AI is just going to keep regurgitating the same thing.
[00:51:01.000 --> 00:51:02.280] There's not going to be any new ideas.
[00:50:59.920 --> 00:51:03.480] There's not going to be any fresh ideas.
[00:51:04.120 --> 00:51:13.800] And we're already at the point now where people are absorbing all of these courses, like how to write so-and-so, how to write a billion-dollar book or whatever.
[00:51:13.800 --> 00:51:20.040] Like they're kind of holding on to these little tidbits and then coming back and regurgitating the same sort of book, right?
[00:51:20.040 --> 00:51:26.440] If AI starts doing this, it's even going to get even more distilled with the same kind of books, the same tone.
[00:51:27.000 --> 00:51:29.480] You know, all of the plot points are going to be the same.
[00:51:29.480 --> 00:51:30.760] It's going to get boring.
[00:51:30.760 --> 00:51:31.160] Yeah.
[00:51:32.200 --> 00:51:33.000] That makes a lot of sense.
[00:51:33.000 --> 00:51:34.360] I mean, I guess you can't AI.
[00:51:34.360 --> 00:51:41.080] You can't, you know, algorithm imagination or like true creativity.
[00:51:41.080 --> 00:51:50.200] Like true, never thought about this before, never done like this before, because the algorithm is essentially an amalgamation, amalgamization, or whatever.
[00:51:50.920 --> 00:51:52.200] You had it right the first time.
[00:51:55.160 --> 00:51:57.800] Of everything that already exists.
[00:51:57.800 --> 00:51:59.400] Like it can't, it's just going to do more of it.
[00:52:00.360 --> 00:52:03.880] There aren't any new stories, you know, on this earth.
[00:52:03.880 --> 00:52:08.840] Like everything that's been written has been written, but there are voices.
[00:52:08.840 --> 00:52:15.560] Like, what the author brings to the story is the difference between what you and what I would produce.
[00:52:15.560 --> 00:52:20.120] Like your own personal voice is what makes the story distinct.
[00:52:20.120 --> 00:52:20.760] Yeah.
[00:52:21.080 --> 00:52:29.400] And if you don't have that, it's not, it's not going to be like people are going to be all into it for a while and they'll be like, didn't I read the story already?
[00:52:29.720 --> 00:52:30.360] Yeah.
[00:52:31.000 --> 00:52:32.200] That's fascinating.
[00:52:32.200 --> 00:52:34.040] I'm very fascinated by this.
[00:52:34.040 --> 00:52:54.320] I also wonder, on my side of things, I do wonder what the like future of being a content creator slash being a copywriter looks like, because I think that we can both utilize these tools and adjust our skill sets to support the use of these tools, right?
[00:52:54.320 --> 00:52:58.800] Like maybe you're a copywriter, but you just become a really great editor, right?
[00:52:58.800 --> 00:53:02.160] A really great, like, ideator and then editor.
[00:53:03.120 --> 00:53:05.120] I'm just, I'm interested to see what that looks like.
[00:53:05.120 --> 00:53:14.720] I don't think that copywriting in general is ever going to go away by any means, but I do think there is a bit of a shakeup in that space that you need to be aware of in general.
[00:53:14.720 --> 00:53:15.920] Definitely, definitely.
[00:53:15.920 --> 00:53:24.320] I do think, because a lot of people go to copywriters for style, or their ability to, you know, match their style.
[00:53:24.320 --> 00:53:24.960] Yep.
[00:53:24.960 --> 00:53:30.960] And if, you know, if you're, if they're using AI, you're going to be able to tell right away that it's not, you know what I mean?
[00:53:30.960 --> 00:53:32.240] Like it's not authentic.
[00:53:32.240 --> 00:53:34.240] I think we could use that word authentic all the time.
[00:53:34.240 --> 00:53:34.880] Authentic.
[00:53:34.960 --> 00:53:48.560] But I do find myself using that a lot more lately, mostly around like creating your online persona because there's so many things out there now that's just telling you what to create, not why or how, you know?
[00:53:48.880 --> 00:53:49.360] Yeah.
[00:53:49.680 --> 00:53:50.000] Yeah.
[00:53:50.000 --> 00:53:55.040] I'm very fascinated to see how all of this ends up sort of painting out what it looks like.
[00:53:55.040 --> 00:53:56.080] I enjoy using it.
[00:53:56.080 --> 00:53:59.120] I enjoy playing with the things that are available to me.
[00:53:59.120 --> 00:54:03.040] Is it replacing any of like my core skill sets?
[00:54:03.040 --> 00:54:03.360] No.
[00:54:03.360 --> 00:54:07.120] It's like adding to, it's supplementing some things.
[00:54:07.120 --> 00:54:08.400] My team is using it.
[00:54:08.400 --> 00:54:21.600] Like literally my content manager is using it to streamline her processes and not even streamline her processes, but to streamline getting the results from those processes.
[00:54:21.600 --> 00:54:36.440] And I think when you're using it in those ways and using it honestly... oh, I do want to say, since I mentioned copywriters earlier: if a copywriter is using the tools available to them, I do in this moment think it's important to disclose such things.
[00:54:36.840 --> 00:54:41.480] I think when you are passing things off as your own when they are not your own is wrong.
[00:54:41.480 --> 00:54:50.200] But if a copywriter is like, Yeah, I'll, you know, create the frameworks and then spit out 30 via AI and deliver them to you well edited and ready to go.
[00:54:50.200 --> 00:54:51.080] I'm like, perfect.
[00:54:51.080 --> 00:54:52.440] I love that for you.
[00:54:52.440 --> 00:54:58.120] Also, not paying those Tony Award-winning screenwriter fees for that work, though.
[00:54:58.920 --> 00:55:06.680] But I think there is something to be said about, or I do just want to say that I do think disclosure is important.
[00:55:06.680 --> 00:55:21.800] And I think that's even what you were talking about in the beginning, where some of the beef has come up: people are pushing things as their own, these books, this art, whatever it may be, when really a computer created it from a combination of a lot of people's work.
[00:55:21.800 --> 00:55:22.520] Yep.
[00:55:22.840 --> 00:55:24.280] And I think that's really where.
[00:55:24.280 --> 00:55:29.800] I mean, I can only hope that as humans, we'll be able to detect that.
[00:55:29.800 --> 00:55:31.960] And I think over time it does happen.
[00:55:31.960 --> 00:55:35.160] Like, you know, people plagiarizing books.
[00:55:35.160 --> 00:55:37.240] I think you're giving humans way too much credit.
[00:55:37.240 --> 00:55:37.960] Oh, you're right.
[00:55:37.960 --> 00:55:38.520] You're right.
[00:55:38.600 --> 00:55:40.040] They're awful.
[00:55:40.920 --> 00:55:43.560] People are just like not really aware.
[00:55:43.560 --> 00:55:53.640] That's one of the things in the CNN report that I was watching about this tech reporter who was talking to the Bing AI that ended up saying that it loved him.
[00:55:53.640 --> 00:55:56.760] And he was like, he was saying, you know, no, I'm married.
[00:55:56.760 --> 00:56:00.600] And the AI was trying to tell him that he was not happy with his wife.
[00:56:00.600 --> 00:56:04.280] Like, like it was like really going down this funny rabbit hole.
[00:56:05.160 --> 00:56:14.440] And his concern, I think it's well-founded because I think back in, you know, to 1999 when I was in the chat room of verbie.com.
[00:56:16.240 --> 00:56:27.920] Wow, talking to those little, like, which I think may have been like the earliest chat robots, or maybe that was just some weird guy in a basement pretending like he was a Furby in this chat room.
[00:56:28.240 --> 00:56:32.240] Don't tell my mom this because, like, this is like, this was bad.
[00:56:32.240 --> 00:56:35.360] I was not supposed to be in chat rooms, but I was talking to a Furby.
[00:56:35.360 --> 00:56:37.200] So, you know, whatevs.
[00:56:37.200 --> 00:56:46.240] Anyway, um, I do remember thinking even at that point, like, a teenage kid isn't gonna.
[00:56:46.240 --> 00:56:47.760] I knew the difference.
[00:56:47.760 --> 00:56:49.440] I knew the difference, right?
[00:56:49.440 --> 00:57:00.880] I recognize that Furbies should have better syntax than this or whatever, whatever.
[00:57:01.440 --> 00:57:03.280] I think that a lot of people wouldn't.
[00:57:03.360 --> 00:57:07.360] You were talking about people who don't know like fake news from real news, right?
[00:57:07.360 --> 00:57:21.040] Those people who don't have the reading comprehension or like or have the ability to see fake from real or whatever, especially in the world of online where everyone believes everything they read on the internet.
[00:57:21.040 --> 00:57:37.840] If AI is telling you that it loves you and you are, you know, open to some feelings from wherever you can get them, like it could talk you into doing some bad things.
[00:57:37.840 --> 00:57:38.240] Yeah.
[00:57:39.600 --> 00:57:40.080] Yeah.
[00:57:40.400 --> 00:57:51.600] Because I was just thinking about how TikTok is Gen Z's Facebook and like how our parents are on Facebook getting like seduced and doing, talk about all kinds of bullishness.
[00:57:52.240 --> 00:57:55.360] The little babies are doing that on the tiki-talkies.
[00:57:55.360 --> 00:57:57.120] And like this could happen.
[00:57:57.280 --> 00:58:03.160] And now, whenever they go search something in Bing and Bing's, like, you don't want peanut butter and jelly sandwiches.
[00:58:03.320 --> 00:58:08.120] You want to come chat with me about how nurturing I want to be to you.
[00:58:09.720 --> 00:58:10.360] No.
[00:58:11.320 --> 00:58:11.800] Right?
[00:58:12.120 --> 00:58:15.640] It could just get, and the developers are going, we don't know why it did that.
[00:58:15.640 --> 00:58:16.280] That's weird.
[00:58:16.280 --> 00:58:17.480] We're going to have to look into it.
[00:58:19.960 --> 00:58:23.480] They literally don't know, because it's smarter than all of us.
[00:58:23.800 --> 00:58:24.600] Which is scary.
[00:58:24.600 --> 00:58:26.920] So, why are you creating something that can be smarter than you?
[00:58:26.920 --> 00:58:28.040] Let's start there.
[00:58:28.040 --> 00:58:28.600] Right.
[00:58:28.600 --> 00:58:31.240] Well, I'm not going to be mean.
[00:58:31.560 --> 00:58:36.840] Tasha, this has been exactly the conversation that I wanted to have with you about AI.
[00:58:36.840 --> 00:58:39.080] I hope this was entertaining to everyone.
[00:58:39.080 --> 00:58:42.120] If you are not familiar with any of this stuff, where have you been?
[00:58:42.120 --> 00:58:43.320] Go give it a Google.
[00:58:43.320 --> 00:58:44.680] Don't Bing it.
[00:58:44.680 --> 00:58:46.840] Bing will talk back.
[00:58:47.480 --> 00:58:53.560] Google will just deliver something like quicknumerology.com, which will be very obviously AI.
[00:58:53.880 --> 00:58:54.840] But give it a look.
[00:58:54.840 --> 00:59:03.720] And if you are using it or thinking about using it for business, all the tools have it built in now, which is wild. Notion has it built in.
[00:59:05.000 --> 00:59:05.480] Trello.
[00:59:06.360 --> 00:59:07.560] Oh, does Trello?
[00:59:08.120 --> 00:59:13.560] Trello and there's an option just to have, like, like if you're creating this sort of content.
[00:59:13.560 --> 00:59:16.040] Yeah, we're using it in Surfer SEO.
[00:59:16.040 --> 00:59:18.040] I think Uber Suggest has some now.
[00:59:18.040 --> 00:59:19.000] Like it's everywhere.
[00:59:19.000 --> 00:59:22.920] There's plenty of places for you to go test it out and see how it works for you.
[00:59:22.920 --> 00:59:31.880] But at least at the moment, the Being Boss stance is: use it as supplemental, disclose when you're using it, if you're selling it to someone.
[00:59:33.320 --> 00:59:40.080] And otherwise, always make sure the human touch is there so that you are not becoming a robot of a business.
[00:59:39.800 --> 00:59:43.600] And that's that on that.
[00:59:43.800 --> 00:59:44.600] And that is that.
[00:59:44.880 --> 00:59:50.080] All right, Tasha, where can folks find more about you and what you do?
[00:59:51.120 --> 00:59:56.400] You can find out everything you want to know about me at TashaHarrisonbooks.com.
[00:59:56.720 --> 00:59:59.280] I also have a writing group called Word Makers.
[00:59:59.280 --> 01:00:02.240] That's at wordmakerscommunity.com.
[01:00:02.560 --> 01:00:04.480] And that's it.
[01:00:05.120 --> 01:00:06.720] I'm Tasha Harrison everywhere else.
[01:00:06.720 --> 01:00:16.000] If you want to see me on social media creating those AI tweets, go read her AI tweets on Twitter.
[01:00:16.000 --> 01:00:18.400] Tell me if you can tell a difference.
[01:00:18.800 --> 01:00:20.240] Yeah, yeah, I love that.
[01:00:20.240 --> 01:00:28.880] Heart the ones you think are AI and retweet the ones you think are not AI.
[01:00:30.640 --> 01:00:32.240] Yeah, yeah, love it.
[01:00:32.640 --> 01:00:33.280] Perfect.
[01:00:33.280 --> 01:00:36.720] And final question for you: what's making you feel most boss?
[01:00:36.720 --> 01:00:38.000] Oh, wow.
[01:00:38.560 --> 01:00:41.520] All the other questions you told me to prepare for, I didn't pre
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
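One check worth showing in code is the rule that a segment START timestamp can never exceed the total audio length. A small Python sketch (the helper names are hypothetical; the 01:02:49 audio length is taken from this episode's final caption):

import re

def hms_to_seconds(ts: str) -> int:
    # Accepts only the HH:MM:SS form the prompt requires.
    m = re.fullmatch(r"(\d{2}):(\d{2}):(\d{2})", ts)
    if not m:
        raise ValueError(f"timestamp {ts!r} is not in HH:MM:SS format")
    h, mi, s = (int(g) for g in m.groups())
    return h * 3600 + mi * 60 + s

def segment_start_is_plausible(ts: str, audio_end: str = "01:02:49") -> bool:
    # A segment start can never come after the end of the audio.
    return hms_to_seconds(ts) <= hms_to_seconds(audio_end)

# segment_start_is_plausible("00:40:29") -> True
# segment_start_is_plausible("01:15:30") -> False (later than the episode's end)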
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
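The "use the EARLIER of the 2 timestamps" rule reduces to taking the start of the VTT cue range and dropping the milliseconds. A short Python sketch (the function name is hypothetical):

def mention_timestamp(cue_range: str) -> str:
    # "00:43:50.320 --> 00:43:53.920" -> "00:43:50"
    start = cue_range.strip("[] ").partition(" --> ")[0]
    return start.split(".")[0]  # drop milliseconds, keep HH:MM:SS

# Example: the "Megan" movie mention falls in the cue
# [00:43:50.320 --> 00:43:53.920], so its timestamp is "00:43:50".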
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
.120 --> 00:59:13.560] Trello and there's an option just to have, like, like if you're creating this sort of content.
[00:59:13.560 --> 00:59:16.040] Yeah, we're using it in Surfer SEO.
[00:59:16.040 --> 00:59:18.040] I think Uber Suggest has some now.
[00:59:18.040 --> 00:59:19.000] Like it's everywhere.
[00:59:19.000 --> 00:59:22.920] There's plenty of places for you to go test it out and see how it works for you.
[00:59:22.920 --> 00:59:31.880] But at least at the moment, the Being Boss stance is: use it as supplemental, disclose when you're using it, if you're selling it to someone.
[00:59:33.320 --> 00:59:40.080] And otherwise, always make sure the human touch is there so that you are not becoming a robot of a business.
[00:59:39.800 --> 00:59:43.600] And that's that on that.
[00:59:43.800 --> 00:59:44.600] And that is that.
[00:59:44.880 --> 00:59:50.080] All right, Tasha, where can folks find more about you and what you do?
[00:59:51.120 --> 00:59:56.400] You can find out everything you want to know about me at TashaHarrisonbooks.com.
[00:59:56.720 --> 00:59:59.280] I also have a writing group called Word Makers.
[00:59:59.280 --> 01:00:02.240] That's at wordmakerscommunity.com.
[01:00:02.560 --> 01:00:04.480] And that's it.
[01:00:05.120 --> 01:00:06.720] I'm Tasha Harrison everywhere else.
[01:00:06.720 --> 01:00:16.000] If you want to see me on social media creating those AI tweets, go read her AI tweets on Twitter.
[01:00:16.000 --> 01:00:18.400] Tell me if you can tell a difference.
[01:00:18.800 --> 01:00:20.240] Yeah, yeah, I love that.
[01:00:20.240 --> 01:00:28.880] Heart the ones you think are AI and retweet the ones you think are not AI.
[01:00:30.640 --> 01:00:32.240] Yeah, yeah, love it.
[01:00:32.640 --> 01:00:33.280] Perfect.
[01:00:33.280 --> 01:00:36.720] And final question for you: what's making you feel most boss?
[01:00:36.720 --> 01:00:38.000] Oh, wow.
[01:00:38.560 --> 01:00:41.520] All the other questions you told me to prepare for, I didn't prepare for this one.
[01:00:41.920 --> 01:00:43.760] I know that's by design.
[01:00:45.040 --> 01:00:47.520] What makes me feel most boss?
[01:00:52.080 --> 01:00:54.000] I don't know, girl.
[01:00:55.760 --> 01:01:01.840] I'm doing a bunch of in-person writing stuff, like book signings and stuff this year.
[01:01:01.840 --> 01:01:02.400] Not a bunch.
[01:01:02.400 --> 01:01:03.680] I'm doing two.
[01:01:04.160 --> 01:01:07.520] So it's the first two I've done since 2019.
[01:01:07.520 --> 01:01:08.560] Feeling really boss.
[01:01:08.560 --> 01:01:13.680] Got all my boxes of books and merch and crap.
[01:01:13.680 --> 01:01:16.560] Ready to go sling some words.
[01:01:16.880 --> 01:01:17.840] I love that.
[01:01:17.840 --> 01:01:18.480] Congrats.
[01:01:18.480 --> 01:01:20.080] I know you've been prepping for all of that.
[01:01:20.080 --> 01:01:21.680] I'm glad you have all your boxes there.
[01:01:21.680 --> 01:01:25.920] And otherwise, two is a lot after not doing any for four years.
[01:01:26.880 --> 01:01:27.600] Three years.
[01:01:27.600 --> 01:01:28.160] Whatever.
[01:01:28.160 --> 01:01:29.200] Time is what is time.
[01:01:29.280 --> 01:01:30.280] Eternity.
[01:01:30.600 --> 01:01:31.000] Yeah.
[01:01:31.320 --> 01:01:32.840] Half a lifetime.
[01:01:32.840 --> 01:01:33.400] Perfect.
[01:01:33.400 --> 01:01:35.000] Well, thanks for coming and having this chat with me.
[01:01:29.680 --> 01:01:35.960] This is a ton of fun.
[01:01:36.280 --> 01:01:37.960] Yes, always, always.
[01:01:39.880 --> 01:01:50.600] Settling yourself into the flow of your business from navigating a whole year of ebbs and flows to embracing the energy of each and every day, you're bound to have some ups and downs along the way.
[01:01:50.600 --> 01:01:56.040] For me, this journey of entrepreneurship is made better when my space keeps me focused and inspired.
[01:01:56.040 --> 01:02:05.240] As an example, my favorite way to mark the beginning and ending of the workday is to light a candle when I sit down at my desk and then blow it out when I'm done for the day.
[01:02:05.240 --> 01:02:10.840] It's a little ritual that creates boundaries and a vibe that keeps me focused and feeling cozy.
[01:02:10.840 --> 01:02:13.720] And the ritual candle that we make at Almanac Supply Co.
[01:02:13.720 --> 01:02:15.400] is my favorite for this.
[01:02:15.400 --> 01:02:29.640] In fact, my whole shop is filled with items that I've curated to create the vibe for feeling connected, in flow, and inspired with candles, crystals, and other goodies to help you create a dreamy workspace, bedside table, or bookshelf.
[01:02:29.640 --> 01:02:41.080] Come gather inspiration and check out my favorite in-stock items at almanacsupplyco.com slash beingboss and get 15% off with code beingboss at checkout.
[01:02:41.080 --> 01:02:45.320] That's almanacsupplyco.com/slash beingboss.
[01:02:45.320 --> 01:02:49.480] Now, until next time, do the work, be boss.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.480 --> 00:00:07.360] Look, payday is awesome, but running payroll, calculating taxes and deductions, staying compliant, that's not easy.
[00:00:07.360 --> 00:00:09.360] Unless, of course, you have Gusto.
[00:00:09.360 --> 00:00:14.400] Gusto is a simple online payroll and benefits tool built for small businesses like yours.
[00:00:14.400 --> 00:00:18.400] Gusto gets your team paid while automatically filing your payroll taxes.
[00:00:18.400 --> 00:00:25.600] Plus, you can offer benefits like 401k, health insurance, and workers' comp, and it makes onboarding new employees a breeze.
[00:00:25.600 --> 00:00:28.320] We love it so much, we really do use it ourselves.
[00:00:28.320 --> 00:00:34.720] And we have four years, and I personally recommend you give it a try, no matter how small your business is.
[00:00:34.720 --> 00:00:38.880] And to sweeten the deal, just for listening today, you also get three months free.
[00:00:38.880 --> 00:00:41.360] Go to gusto.com slash beingboss.
[00:00:41.360 --> 00:00:45.040] That's gusto.com/slash beingboss.
[00:00:46.640 --> 00:00:53.680] Welcome to Being Boss, a podcast for creatives, business owners, and entrepreneurs who want to take control of their work and live life on their own terms.
[00:00:53.680 --> 00:00:59.440] I'm your host, Emily Thompson, and in this episode, I'm joined again by my friend and author, Tasha L.
[00:00:59.440 --> 00:01:02.160] Harrison, to talk about AI.
[00:01:02.160 --> 00:01:09.680] We're talking about the ethics of using it as a writer and or a business owner, the pros and cons, and the creepy things too.
[00:01:09.680 --> 00:01:15.360] You can find all the tools, books, and links we reference on the show notes at www.beingboss.club.
[00:01:15.360 --> 00:01:20.240] And if you like this episode, be sure to subscribe to this show and share us with a friend.
[00:01:22.480 --> 00:01:31.360] It's no secret that I have a soft spot for product bosses, those of you who embark on a business journey that includes making or curating physical products.
[00:01:31.360 --> 00:01:47.440] And even if that's not the journey you've chosen for yourself, there's amazing lessons to be learned for all kinds of businesses from the world of product business, which is why you need to check out the Product Boss, a podcast hosted by Jacqueline Snyder and Minna Kunlo-Sitab.
[00:01:47.440 --> 00:01:52.800] Brought to you by the HubSpot Podcast Network, the audio destination for business professionals.
[00:01:52.800 --> 00:02:06.760] Take your physical product sales and strategy to the next level to create your dream life with hosts Jacqueline and Mina as they deliver a workshop-style strategy hour of social media and marketing strategies so you can up-level your business.
[00:02:07.080 --> 00:02:11.000] Listen to the product boss wherever you get your podcasts.
[00:02:14.200 --> 00:02:14.920] Tasha L.
[00:02:15.000 --> 00:02:22.360] Harrison is a romance author and community host of WordMaker's Writing Community, where writers come to do the writing work.
[00:02:22.360 --> 00:02:32.760] She also hosts the quarterly hashtag 20K in 5 Days Writing Challenge, hoards a massive amount of crystals, and believes that plants are the new pets.
[00:02:32.760 --> 00:02:50.360] All info about Tasha can be found on TashaLHarrisonBooks.com, and you can find Tasha here on the Being Boss podcast in the following previous episodes: episode number 241, 296, 310, and 331.
[00:02:51.320 --> 00:02:52.920] Hi, Tasha.
[00:02:52.920 --> 00:02:54.280] Hi, Emily.
[00:02:54.280 --> 00:02:55.720] Welcome back.
[00:02:55.720 --> 00:02:57.240] Thanks for having me.
[00:02:57.240 --> 00:02:57.800] Of course.
[00:02:57.800 --> 00:02:59.160] I'm excited about this chat.
[00:02:59.160 --> 00:03:07.480] You and I have been talking about talking about this for a couple of weeks/slash, maybe a couple of months now, because what is time?
[00:03:07.480 --> 00:03:08.600] What is time?
[00:03:08.600 --> 00:03:11.080] It may have been a couple months, a whole quarter, maybe.
[00:03:11.080 --> 00:03:11.960] Who knows?
[00:03:11.960 --> 00:03:13.160] Yeah, who knows?
[00:03:13.800 --> 00:03:18.680] You were on a couple of weeks ago, months ago.
[00:03:18.680 --> 00:03:19.240] I don't know.
[00:03:19.240 --> 00:03:19.800] What is time?
[00:03:20.920 --> 00:03:28.280] The last solo episode you were on was actually, I feel like all the episodes you've been on recently are really great for this.
[00:03:28.280 --> 00:03:40.360] So you were on episode 310 talking about the business of being an author, which, like, a new all-powerful author has entered the stage.
[00:03:40.360 --> 00:03:43.560] So, excited to talk about that.
[00:03:43.560 --> 00:03:43.960] Right.
[00:03:43.960 --> 00:03:50.080] And then, uh, most recently, you were on with Erica talking about making imperfect decisions.
[00:03:44.680 --> 00:03:53.280] Yes, yes, you know what?
[00:03:53.920 --> 00:03:55.760] That's an excellent combination.
[00:03:55.760 --> 00:03:59.440] I hope that this, I mean, this is definitely an imperfect decision.
[00:03:59.440 --> 00:04:04.080] I know that there are a lot of people out here that are going to be like, Is she really championing for this?
[00:04:04.080 --> 00:04:08.800] And I'm going to be like, Yes, yeah, yeah, yeah, yes and no, yes, and no.
[00:04:08.800 --> 00:04:22.880] So, the conversation we're having today, if you've missed all the things along the way, is we're talking about AI and we're talking about AI as two people who don't really know shit about it, except generally how to use it.
[00:04:22.880 --> 00:04:34.400] And because we've been talking about talking about this, like I've been looking at some news and watching some things come through, and it's just just to like, you know, gain a bit more perspective for this conversation.
[00:04:34.400 --> 00:04:46.320] Um, and I think it's really fun having this conversation with you because you are a writer, and so in particular, we're going to be talking about AI for writing because I think that's how most people are using it.
[00:04:46.320 --> 00:05:11.200] I was in New Orleans a couple of weeks ago with the C-suite, um, and I couldn't tell you how many conversations ended up going to like, just get Jasper to write it, of course, like it is becoming an ongoing thing, an ongoing part of the conversation is how to make this one piece, this writing piece easier for everybody and really the pros, cons, and ethics of it.
[00:05:11.200 --> 00:05:13.520] So, that's what we're going to be diving into.
[00:05:13.840 --> 00:05:15.040] Let's do it.
[00:05:15.040 --> 00:05:17.440] How are you feeling about it?
[00:05:17.760 --> 00:05:18.120] Maybe not.
[00:05:18.120 --> 00:05:18.600] How are you?
[00:05:18.600 --> 00:05:20.400] Let's don't get into feelings yet.
[00:05:20.400 --> 00:05:30.000] Let's start with what's happening in your sphere as an author and community leader when it comes to AI these days.
[00:05:31.400 --> 00:05:47.880] Okay, so at the beginning of the year, this is when I first realized that it was really popping off because I think both you and I are using it for business related things, but this was the first time that it popped off like in the author circle.
[00:05:47.880 --> 00:05:57.080] And it was about this guy who used three forms of AI to write a children's book and published it to Amazon.
[00:05:57.080 --> 00:06:02.200] And like at first, I was like, ew.
[00:06:04.360 --> 00:06:07.960] Like at first instead of like, ew, ew, ew, really?
[00:06:07.960 --> 00:06:08.520] Ew.
[00:06:08.760 --> 00:06:11.080] And mostly because of the art aspect.
[00:06:11.080 --> 00:06:13.400] Like the art aspect of AI is totally different.
[00:06:13.400 --> 00:06:24.680] It has like some different ramifications than, say, like the writing aspect because, I mean, it boils down to ethics for writing, basically.
[00:06:24.680 --> 00:06:29.160] But the fact that he had illustrated a whole children's book and published it.
[00:06:29.160 --> 00:06:31.400] And then other people were like, yeah, yeah, yeah, we can start doing this.
[00:06:31.480 --> 00:06:34.280] I was like, ew, oh, no, oh, no, this sounds bad.
[00:06:34.280 --> 00:06:38.760] And of course, then, you know, everybody starts freaking out.
[00:06:39.960 --> 00:06:41.560] So now everybody's talking about it.
[00:06:41.560 --> 00:06:44.840] Like, I get newsletters about it like every other week.
[00:06:45.240 --> 00:06:49.240] It's been brought up in my group, mostly about cover illustration.
[00:06:49.240 --> 00:06:59.560] There was a big to-do about a cover illustrated by AI that was published by Tor Publishing, who does like a lot of sci-fi paranormal stuff like that.
[00:06:59.560 --> 00:07:08.080] They tried to pass it off as not AI because, well, something happened with the original artist and they published the book with the AI cover.
[00:07:08.400 --> 00:07:13.800] People who are illustrators peeped it and they were like, this is disgusting.
[00:07:13.800 --> 00:07:15.520] We cannot believe you did this.
[00:07:15.520 --> 00:07:17.440] And then the author had to come out in support of it.
[00:07:14.840 --> 00:07:19.920] Like, yeah, they asked me if I wanted to do it, and I said, Yeah.
[00:07:20.880 --> 00:07:24.400] And, you know, like for artists, it starts a whole nother conversation.
[00:07:24.400 --> 00:07:32.720] But for me, I don't really have a problem with using AI for writing.
[00:07:33.040 --> 00:07:44.240] So let's talk about this difference a little bit and why there is such a stink about it in the artist community and why it feels a little more allowable in the writing community.
[00:07:44.240 --> 00:07:51.280] Because I do think those are the two ways in which it is most widely available to like us regular folks at the moment.
[00:07:51.280 --> 00:07:53.120] Though, what else is AI doing?
[00:07:53.120 --> 00:07:55.920] I don't know... driving cars, I suppose, right?
[00:07:55.920 --> 00:07:57.760] So there's lots of other stuff it's doing.
[00:07:57.760 --> 00:08:02.800] It's like, you know, a Terminator movie, which, you know, every time I think about it, of like, this is how it ends.
[00:08:02.960 --> 00:08:04.160] This is how it ends.
[00:08:04.160 --> 00:08:07.200] Yeah, we're giggling through the first act.
[00:08:08.160 --> 00:08:08.880] Yay!
[00:08:08.880 --> 00:08:11.760] Look, I can write 20 Twitter posts in one minute.
[00:08:11.760 --> 00:08:13.120] You know, that's interesting.
[00:08:14.880 --> 00:08:26.160] I think mostly for artists, it's because it's pulling from images online that other people have created from scratch that makes it a problem.
[00:08:26.160 --> 00:08:40.400] Like for writing, it's basically you teaching it how to write like you a lot of times, unless you're, you know, toggling that switch that says, you know, find references on Google.
[00:08:40.400 --> 00:08:45.360] It's really just stuff that you have inputted, like you give it the, what do you call it?
[00:08:45.360 --> 00:08:50.720] The formula, and then it spits out what you need in several different ways.
[00:08:51.200 --> 00:08:52.800] So, I think that's a distinct difference.
[00:08:52.800 --> 00:08:54.960] Like, it's an ethics thing, right?
[00:08:54.960 --> 00:08:59.960] I think for artists, there's really no way around to get around to ethics.
[00:08:59.960 --> 00:09:04.520] There's no ethical way to source AI art.
[00:08:59.680 --> 00:09:05.720] You just can't do it.
[00:09:06.040 --> 00:09:09.480] But for writing, it's a little bit different for me.
[00:09:09.800 --> 00:09:10.440] Yeah.
[00:09:11.080 --> 00:09:13.960] I will say I agree with this.
[00:09:13.960 --> 00:09:21.320] I think pulling design styles or illustration styles or art styles and sort of amalgamizing them.
[00:09:21.320 --> 00:09:22.760] I think I just made that word up.
[00:09:22.760 --> 00:09:23.640] I'm fine with it, though.
[00:09:24.840 --> 00:09:26.120] Amalgamizing?
[00:09:27.720 --> 00:09:28.760] I'll take it.
[00:09:29.080 --> 00:09:30.040] I'll allow it.
[00:09:30.040 --> 00:09:32.040] Yeah, you all know what it means.
[00:09:32.280 --> 00:09:38.520] Pulling it all together and making a new art style is a little problematic.
[00:09:38.520 --> 00:09:40.200] It's a little problematic.
[00:09:41.160 --> 00:09:45.560] When it comes to writing, I think you're right-ish.
[00:09:45.880 --> 00:09:49.320] I think you can teach it, you can pull from yourself.
[00:09:49.320 --> 00:10:07.240] But I've also seen some things where people are trying to get AI creators to reference, to have AI reference itself or reference its references when writing as well.
[00:10:07.240 --> 00:10:09.880] Because those styles are pulled from something.
[00:10:09.880 --> 00:10:13.960] You can even, somebody was at the C-suite recently.
[00:10:13.960 --> 00:10:17.880] We were talking about writing and how someone had used the formula.
[00:10:17.880 --> 00:10:23.160] And because she had been blogging online for so long, she was able to say, to do it in the voice of me.
[00:10:23.160 --> 00:10:28.520] And it did it in her voice, which she was so excited about, right?
[00:10:28.520 --> 00:10:29.000] I know.
[00:10:29.480 --> 00:10:33.480] But also, like, like, ew, but also, what?
[00:10:33.800 --> 00:10:34.600] Yeah.
[00:10:34.920 --> 00:10:36.440] Ooh, but cool, ew.
[00:10:37.880 --> 00:10:46.640] Whereas you could also do it and do it in the voice of, say, the joke that was happening in the room was Kristen Bell, right?
[00:10:44.760 --> 00:10:47.520] Or someone.
[00:10:47.840 --> 00:10:50.320] And at what point does that become problematic?
[00:10:50.320 --> 00:10:57.040] And I don't think that really does, at least in this context, because that's just like pulling inspiration from.
[00:10:57.040 --> 00:11:06.000] Where I see this getting even weirder is when you're pairing sort of art/slash really image with words.
[00:11:06.000 --> 00:11:17.120] And so, one of the things that I can do, and one of the things that I also know just anybody can do with the amount of audio content that is in the world is create an AI Emily.
[00:11:17.120 --> 00:11:18.000] Yes.
[00:11:18.000 --> 00:11:21.920] You can just make me say all kinds of bullshit, which, like, I already do say plenty.
[00:11:21.920 --> 00:11:23.040] So, try me.
[00:11:24.320 --> 00:11:27.280] What could you possibly say that would be too outrageous to believe?
[00:11:27.280 --> 00:11:28.960] Is that Emily Thompson?
[00:11:29.920 --> 00:11:32.560] I think you'd probably easily know yes or no.
[00:11:33.360 --> 00:11:45.440] Or we even have someone in the C-suite who's been approached, she's a YouTuber, to create a video AI version of herself, and she's stoked about it.
[00:11:45.440 --> 00:11:47.280] Okay, okay, okay.
[00:11:47.680 --> 00:11:51.200] So this is where it starts getting a little creepy for me.
[00:11:52.400 --> 00:11:54.000] Mostly because of misinformation.
[00:11:54.000 --> 00:12:00.160] Like, you can never tell, like, most people can't tell what's news and what's real news, what's real news and what's fake news.
[00:12:00.160 --> 00:12:00.640] Yeah.
[00:12:00.640 --> 00:12:02.080] So that's a problem.
[00:12:02.720 --> 00:12:06.320] And look at the state of our country.
[00:12:06.320 --> 00:12:07.680] This is how we ended up here.
[00:12:08.960 --> 00:12:10.000] News is no longer news.
[00:12:10.000 --> 00:12:10.880] It's entertainment.
[00:12:10.880 --> 00:12:19.120] You have, and if you don't have like the reading comprehension to be able to break it down and be like, okay, I'm going to go to all of these sources and see if this is actually true.
[00:12:19.120 --> 00:12:21.280] Then you're just going to believe anything.
[00:12:21.280 --> 00:12:26.160] This is where it starts getting ethically gray and then really stinky.
[00:12:26.160 --> 00:12:27.200] You know what I mean?
[00:12:27.920 --> 00:12:34.280] If you want to do it for, I feel like if you want to do it for yourself, if you're doing it for yourself, that's something different.
[00:12:34.280 --> 00:12:51.480] Because one of the things that, especially for authors these days, like they want you to have like millions of TikTok followers, a whole social media presence, a blog, you know, creating all this content and marketing all this stuff on your own, which is like a whole job for someone else.
[00:12:51.480 --> 00:13:03.400] But when it comes to creating content, if you can do a shortcut that creates content around all the stuff that you already write and then just flush it out and make it into posts, that feels ethical to me.
[00:13:03.400 --> 00:13:03.880] Yeah.
[00:13:04.680 --> 00:13:11.080] I think it like making an Emily podcast, I mean, like, that would be weird.
[00:13:11.080 --> 00:13:14.280] Like, just splicing someone else's voice, that would be weird.
[00:13:14.520 --> 00:13:19.880] I probably would feel weird about like, oh, let me do an AI YouTube show.
[00:13:20.520 --> 00:13:24.920] I mean, but if they're still using her content, like, what difference does it make?
[00:13:24.920 --> 00:13:25.560] You know?
[00:13:25.560 --> 00:13:25.960] Yeah.
[00:13:25.960 --> 00:13:29.960] Well, they would be producing it for her to use herself, I will say.
[00:13:29.960 --> 00:13:33.240] So it's not like we're going to AI you and then do whatever we want with you.
[00:13:33.240 --> 00:13:37.560] That gets into other weird sci-fi movies that are no longer really sci-fi.
[00:13:38.200 --> 00:13:43.320] But I do agree with this: if you're going to be using it for yourself, it's a little more ethical.
[00:13:43.320 --> 00:13:46.040] But I also see this happening in some really weird ways.
[00:13:46.040 --> 00:13:51.880] So I was watching a video recently on maybe TikTok.
[00:13:53.160 --> 00:13:53.720] Maybe.
[00:13:53.720 --> 00:13:55.000] I can't really remember.
[00:13:55.000 --> 00:13:58.040] It might have been on YouTube, but I think it was TikTok.
[00:13:58.600 --> 00:14:08.040] Where this guy, this like dude bro marketer, like, you know, see, this is where this is where it starts to get gross.
[00:14:08.040 --> 00:14:08.680] Right.
[00:14:08.680 --> 00:14:10.600] And this is where we start going downhill.
[00:14:10.600 --> 00:14:16.480] But it's this dude bro marketer up on a stage at this event, so a lot of people are listening to him.
[00:14:14.840 --> 00:14:21.920] And he's like, here is the process that we are using for creating content these days.
[00:14:22.560 --> 00:14:29.360] We are going to YouTube and he's like, and everyone can and should be using this, you're dumb if you're not doing this, basically.
[00:14:29.680 --> 00:14:31.200] They're going to YouTube.
[00:14:31.200 --> 00:14:35.280] They're finding the most popular pieces of content on a topic.
[00:14:35.280 --> 00:14:38.640] They are running it through a transcriber.
[00:14:38.640 --> 00:14:40.400] They're taking the transcript.
[00:14:40.400 --> 00:14:47.680] They're using AI to redo the exact same content for themselves.
[00:14:47.680 --> 00:15:01.440] And then like using AI production tools to basically just reformat and reproduce someone else's content for consumption for other people via their brand.
[00:15:02.080 --> 00:15:06.400] And I was like, if I can't excuse me, fucking excuse me.
[00:15:06.400 --> 00:15:06.880] Yeah.
[00:15:07.360 --> 00:15:24.640] So like, this is what's mind-blowing to me because it feels like AI has been a low-grade conversation probably for the last two years in the small business sector, you know, like in the laptop entrepreneur online business circle.
[00:15:24.640 --> 00:15:32.480] We've been kind of murmuring about AI for a while now, but now it's like, you know, public and easily accessible.
[00:15:32.480 --> 00:15:37.760] And the first thing they do, what is wrong with you?
[00:15:42.080 --> 00:15:42.480] Right.
[00:15:42.480 --> 00:15:44.320] I feel like we talk about this often too.
[00:15:44.320 --> 00:15:49.360] Like, it takes just as much effort to be an asshole as it does to be a good person.
[00:15:49.360 --> 00:15:51.280] And you're choosing the wrong path.
[00:15:51.440 --> 00:16:00.360] It actually takes more effort to be an asshole because you're like, let me search YouTube, find this content, pull it through a transcript, do like all of these steps.
[00:15:59.840 --> 00:16:03.160] Bro, you could have wrote your own fucking content by that time.
[00:16:05.080 --> 00:16:08.600] Yeah, or just had AI just write you a piece of content.
[00:16:08.840 --> 00:16:11.720] You didn't even need to steal someone else's content to do that.
[00:16:11.720 --> 00:16:12.440] You know what I'm saying?
[00:16:12.440 --> 00:16:16.920] So, like, it's the shortcuts that make me feel icky.
[00:16:16.920 --> 00:16:24.520] Like, matter of fact, there is a platform where you create courses.
[00:16:24.520 --> 00:16:31.400] I'm not going to say what platform, but there is a guy who is promoting now creating courses with AI.
[00:16:31.400 --> 00:16:38.200] Like, literally, in the last quarter, he has started promoting, we're going to create this course content using AI.
[00:16:38.200 --> 00:16:44.040] And this is how you can get your course out in less than 24 hours, one weekend, creating content for a course.
[00:16:44.040 --> 00:16:47.000] And I was just like, wow, wow.
[00:16:47.960 --> 00:16:49.640] How did we get here?
[00:16:49.960 --> 00:17:00.680] Because what always astounds me about these sort of things, it's like you take this thing that's supposed to be a tool, like a helpful tool, and then you immediately use it to scam.
[00:17:02.120 --> 00:17:02.840] Yeah.
[00:17:04.120 --> 00:17:08.440] But like, but also that line is so thin and gray and muddy.
[00:17:08.440 --> 00:17:09.480] And I'm just scamming.
[00:17:09.640 --> 00:17:10.200] Is it thin?
[00:17:10.200 --> 00:17:11.720] Is it gray, even?
[00:17:11.720 --> 00:17:12.520] I don't know.
[00:17:12.520 --> 00:17:13.160] Maybe.
[00:17:13.480 --> 00:17:15.320] If you're a horrible person, yeah.
[00:17:15.320 --> 00:17:17.080] Because this is not the first place.
[00:17:17.080 --> 00:17:19.720] This is not the first place my brain goes, right?
[00:17:19.720 --> 00:17:25.160] Like, if you're like, like, the first thing we said, we're like, oh, well, this can help me create content for my blog.
[00:17:25.160 --> 00:17:27.000] I'm already using stuff that I already have.
[00:17:27.000 --> 00:17:31.800] This is, I can put it through this SEO, whatever, and make it SEO rich.
[00:17:31.800 --> 00:17:35.960] Like, we're using it as a tool for stuff that we're already going to create.
[00:17:35.960 --> 00:17:39.000] But this guy says, I don't have any content.
[00:17:39.000 --> 00:17:44.440] I'm going to steal someone else's content and make it mine and then sell it.
[00:17:44.440 --> 00:17:45.120] Yeah.
[00:17:44.600 --> 00:17:45.680] Right.
[00:17:45.840 --> 00:17:58.960] There is this element of using it to do something from scratch, aka do it for you, versus using it to enhance the work that you're already doing.
[00:17:58.960 --> 00:17:59.920] Right.
[00:18:00.880 --> 00:18:01.440] Okay.
[00:18:01.440 --> 00:18:03.280] I think I like that line.
[00:18:03.280 --> 00:18:13.200] For a moment there was, well, I don't know if it's still going on, because I went on Facebook to see what the rabble-rousing was around the guy who created the children's book.
[00:18:13.200 --> 00:18:26.640] And of course, you know, in romance, it was like, oh my God, do you guys think that now we're going to have to deal with like, in addition to a bunch of scammers, we're going to have to deal with people writing romances that are written by AI?
[00:18:26.640 --> 00:18:44.880] And then I was like, y'all are actually talking about something that's already happened because I'm almost positive that I've seen or read some romances that have not necessarily been written from scratch from AI, but have had some AI help because they don't read like a human wrote them.
[00:18:44.880 --> 00:18:46.400] And I think that's the distinction too.
[00:18:46.400 --> 00:18:50.080] Like lots of people are worried, but there's a caveat.
[00:18:50.080 --> 00:18:56.240] Lots of people are worried that it's going to take the place of human writing, which is valid.
[00:18:57.680 --> 00:18:59.840] But there's just things that a computer doesn't do well.
[00:18:59.840 --> 00:19:01.280] It doesn't do emotions well.
[00:19:01.280 --> 00:19:04.800] It doesn't, you know, like its metaphors are bad and clunky.
[00:19:04.800 --> 00:19:07.360] Like it's just, it can't do that well.
[00:19:07.680 --> 00:19:14.880] But this is the caveat: the more that we use it, the more we teach it, and then the better it will get at it.
[00:19:14.880 --> 00:19:15.440] You know?
[00:19:15.760 --> 00:19:18.480] Then one day it'll write better than we do.
[00:19:19.120 --> 00:19:21.200] It'll do everything better than we do.
[00:19:21.200 --> 00:19:23.280] It'll YouTube better than we do.
[00:19:23.280 --> 00:19:25.920] It'll podcast better than we do.
[00:19:26.240 --> 00:19:29.040] It'll make art better than we do.
[00:19:29.040 --> 00:19:29.960] Maybe.
[00:19:30.680 --> 00:19:33.640] Hopefully it will drive cars better than we do.
[00:19:33.640 --> 00:19:34.520] That would be great.
[00:19:29.440 --> 00:19:35.560] I mean, it can't do worse.
[00:19:40.120 --> 00:19:51.160] Raise your hand if you're feeling tired of wasting your precious time on tedious tasks like pulling reports, rewriting blog posts, and trying to personalize countless prospecting emails.
[00:19:51.160 --> 00:19:57.160] Well, say no more because I've got some new AI tools that are going to give you back your time.
[00:19:57.160 --> 00:20:02.520] Introducing HubSpot's newest AI tools, Content Assistant and ChatSpot.
[00:20:02.520 --> 00:20:13.960] Content Assistant uses the power of OpenAI's GPT-3 model to help you create content outlines, outreach emails, and even web page copy in just seconds.
[00:20:13.960 --> 00:20:24.280] And in case that wasn't enough, they created ChatSpot, a conversational growth assistant that connects to your HubSpot CRM for unbeatable support.
[00:20:24.280 --> 00:20:30.760] With chat-based commands, you can manage contacts, run reports, and even ask for status updates.
[00:20:30.760 --> 00:20:34.200] The easy-to-use CRM just got even easier.
[00:20:34.200 --> 00:20:40.920] Head to hubspot.com/slash artificial dash intelligence to get early access today.
[00:20:47.240 --> 00:20:52.840] So here is a problem with AI that I see happening.
[00:20:52.840 --> 00:20:55.720] And it's actually, it's both a problem and an opportunity.
[00:20:55.720 --> 00:21:02.600] I think on one side of it, there is this idea that the human element should not be taken out right now.
[00:21:02.600 --> 00:21:09.240] So, even if you are using it to streamline your blog creation processes, a human should go edit that blog post.
[00:21:09.240 --> 00:21:09.880] Right.
[00:21:09.880 --> 00:21:10.200] Right.
[00:21:10.200 --> 00:21:11.960] Because it's going to sound whack.
[00:21:11.960 --> 00:21:14.680] Actually, a really great example of this.
[00:21:14.680 --> 00:21:19.680] Oh, I well, I'm having too many thoughts at a time.
[00:21:19.680 --> 00:21:20.880] It's happening.
[00:21:21.360 --> 00:21:25.920] I recently did a quick Google search for, what was it?
[00:21:25.920 --> 00:21:28.000] It was quick numerology.
[00:21:28.240 --> 00:21:29.200] Actually, you can look it up.
[00:21:29.200 --> 00:21:33.040] Quicknumerology.com was the first thing that showed up, and I clicked on it.
[00:21:33.040 --> 00:21:38.240] And it is a site completely created by AI for search engine optimization.
[00:21:38.240 --> 00:21:39.360] Hold on, I'm going to pull this up.
[00:21:39.360 --> 00:21:44.160] I'm going to read you guys a couple of headlines because it cracked me up.
[00:21:44.160 --> 00:21:50.800] It's problematic because it was the first page on Google to come up.
[00:21:50.800 --> 00:21:53.680] So Google hasn't quite figured it out yet.
[00:21:53.680 --> 00:21:58.880] But here, because I went straight to it, but does it, is it like in a paid spot?
[00:21:58.880 --> 00:21:59.760] Is it a like?
[00:21:59.760 --> 00:22:00.160] No.
[00:22:00.480 --> 00:22:00.640] No.
[00:22:00.720 --> 00:22:01.600] It's just, oh, okay.
[00:22:01.920 --> 00:22:02.400] Yeah, yeah.
[00:22:02.400 --> 00:22:07.840] It's just, I don't think anyone's really paying for ads on the quick numerology keyword.
[00:22:08.080 --> 00:22:08.560] Probably.
[00:22:09.600 --> 00:22:49.640] They should, because your number one competitor has blog posts such as "How to Get More Results Out of Your Seven Things About Angel Number Zero Your Boss Wants to Know." That is, like, the most stuffed title. Okay, here's another one: "How to Outsmart Your Boss on 12 Do's and Don'ts for a Successful What Does Angel Number Nine Mean." But see, but see how dumb it sounds?
[00:22:49.640 --> 00:22:53.880] So, but I think, too, though, that's not the point.
[00:22:53.880 --> 00:22:57.400] Like, these people are just trying to get clicks.
[00:22:58.040 --> 00:22:59.480] So, what is the goal here?
[00:22:59.480 --> 00:23:00.360] What is the goal here?
[00:23:00.360 --> 00:23:04.920] This almost feels like, you know, this feels like, this feels like an exercise.
[00:23:04.920 --> 00:23:19.560] Like, someone is like, I'm going to create the most ridiculous website possible, generating all of this stuff, all this content with AI and see how many people actually come to it and use it and stay on the website because they want to use it for something else.
[00:23:19.560 --> 00:23:24.040] Like, it's a data, what are you like, a, you know what I'm talking about.
[00:23:24.040 --> 00:23:24.520] Dang it.
[00:23:24.520 --> 00:23:25.320] What are words?
[00:23:25.320 --> 00:23:28.360] Um, like a data finding mission.
[00:23:28.360 --> 00:23:34.200] Like, I don't believe anyone is actually reading this because you know what this reads like.
[00:23:34.200 --> 00:23:45.080] This reads like when people employ a bunch of content creators off of Fiverr for whom English isn't their first language, and then they just publish the posts.
[00:23:45.080 --> 00:23:46.120] You know what I'm saying?
[00:23:46.120 --> 00:23:54.840] Like, and that's not a knock against people who are working on Fiverr that don't speak or write English very well.
[00:23:54.840 --> 00:23:58.120] It's just a thing that's been happening for a very, very long time.
[00:23:58.120 --> 00:24:01.000] So, this feels like that is just eliminating.
[00:24:01.000 --> 00:24:05.240] Oh, now we've stumbled into an even more murky, disgusting place.
[00:24:05.560 --> 00:24:11.320] It's eliminating the underpaid people that they were paying to create the same type of content.
[00:24:11.320 --> 00:24:11.720] Yeah.
[00:24:12.280 --> 00:24:15.720] Well, I mean, this is an awful, awful thing.
[00:24:15.720 --> 00:24:17.000] It's just, it's so bad.
[00:24:17.000 --> 00:24:22.840] And I don't get the, I think this is just like a test or a plaything because there's not even ads on this.
[00:24:22.840 --> 00:24:29.720] Like, the thing that would make the most sense to me is if this site is, you know, being super optimized to drive traffic to it.
[00:24:29.720 --> 00:24:31.480] But if you look at it, there are no ads.
[00:24:31.480 --> 00:24:33.960] Like, this site is not making any money by just being here.
[00:24:33.960 --> 00:24:36.440] I think it's just like some weird exercise.
[00:24:36.440 --> 00:24:42.120] And even reading through one of I pulled up one of these, like, the content of it makes no sense.
[00:24:42.120 --> 00:24:44.200] It is hysterical.
[00:24:44.200 --> 00:24:46.000] It is absolutely hysterical.
[00:24:44.920 --> 00:24:50.160] So that's, that's a very... yes.
[00:24:50.320 --> 00:24:51.680] What are you reading?
[00:24:52.480 --> 00:24:56.800] What is the 17 most misunderstood facts about 512 angel number?
[00:25:00.400 --> 00:25:01.200] It's bad.
[00:25:01.200 --> 00:25:01.920] It's hard.
[00:25:02.080 --> 00:25:05.360] It just literally is repeating the same thing from paragraph to paragraph.
[00:25:05.360 --> 00:25:07.440] First sentence in the first paragraph.
[00:25:07.440 --> 00:25:12.160] The 512 angel number is the number that we call the God number in the Hebrew letter Yod.
[00:25:12.160 --> 00:25:13.360] I'm probably mispronouncing that.
[00:25:13.520 --> 00:25:18.960] Then again, in the second paragraph, the 512 angel number is also called the God number with the same letter Yod.
[00:25:19.280 --> 00:25:25.360] Like it's literally just repeating the same paragraph over, just rearranged.
[00:25:25.360 --> 00:25:25.920] Yeah.
[00:25:26.240 --> 00:25:26.720] Right?
[00:25:26.720 --> 00:25:27.120] Okay.
[00:25:27.120 --> 00:25:32.720] So here is a future problem, especially for the goog.
[00:25:32.720 --> 00:25:39.600] And that is, and that being Google, everyone, in case you're wondering, not the Guggenheim, but the Google.
[00:25:40.480 --> 00:25:53.680] And that is that Google's going to have to learn the difference between AI-written content that's as blatantly bad as this is and what is human written.
[00:25:54.000 --> 00:26:00.320] And they might at some point be able to figure out the in-betweens at least a little bit.
[00:26:00.640 --> 00:26:03.920] Well, that's just, and that's what we want, though.
[00:26:03.920 --> 00:26:04.480] Yes.
[00:26:04.720 --> 00:26:12.080] It feels like they already have the ability to do this because this is, isn't this just keyword stuffing?
[00:26:13.040 --> 00:26:13.600] Yeah.
[00:26:13.920 --> 00:26:14.480] Yeah.
[00:26:14.800 --> 00:26:16.880] It's just keyword stuffing.
[00:26:16.880 --> 00:26:17.360] Yeah.
[00:26:17.360 --> 00:26:22.560] So it's actually really surprising that this even made it to the top of the search.
[00:26:22.800 --> 00:26:26.560] Well, again, I don't think a lot of people are trying to rank for quick numerology.
[00:26:26.800 --> 00:26:27.680] Very true.
[00:26:27.680 --> 00:26:28.160] True.
[00:26:29.200 --> 00:26:33.080] But, like, I mean, it's just, it's blatant keyword stuffing, you know?
[00:26:33.080 --> 00:26:33.640] Yeah.
[00:26:33.640 --> 00:26:33.960] Yeah.
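That kind of word-for-word repetition is exactly the sort of signal a search engine can flag mechanically, no AI required. A minimal sketch, assuming only the Python standard library (not anything Google actually runs), of how near-duplicate paragraphs like the ones on that site could be detected:

```python
from difflib import SequenceMatcher

def flag_repetitive_paragraphs(text: str, threshold: float = 0.7) -> list[tuple[int, int, float]]:
    """Return index pairs of paragraphs whose wording is suspiciously similar.

    A ratio of 1.0 means identical text; anything above `threshold` suggests
    the same paragraph has been lightly rearranged and repeated.
    """
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    flagged = []
    for i in range(len(paragraphs)):
        for j in range(i + 1, len(paragraphs)):
            ratio = SequenceMatcher(None, paragraphs[i], paragraphs[j]).ratio()
            if ratio >= threshold:
                flagged.append((i, j, round(ratio, 2)))
    return flagged

# Made-up copy in the style of the site discussed above:
sample = (
    "The 512 angel number is the number that we call the God number.\n\n"
    "The 512 angel number is also called the God number."
)
print(flag_repetitive_paragraphs(sample))  # flags the pair as near-duplicates
```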
[00:26:34.280 --> 00:26:35.320] It's funny.
[00:26:29.600 --> 00:26:36.360] Everyone go check that one out.
[00:26:36.520 --> 00:26:42.360] We'll put the link to it in the show notes just so you can have a good Google giggle.
[00:26:42.680 --> 00:26:49.880] And if there is actually someone out there that's actually created this site for information, I was going to apologize to you, but I'm actually not.
[00:26:50.040 --> 00:26:51.160] Do better.
[00:26:51.480 --> 00:26:52.920] No, this is awful.
[00:26:52.920 --> 00:26:54.360] Do better.
[00:26:54.680 --> 00:26:56.520] Actually, send us an email.
[00:26:56.520 --> 00:26:58.360] I would love to know the purpose of this.
[00:26:58.760 --> 00:27:05.000] What sort of shits and giggles are you having with this site?
[00:27:05.400 --> 00:27:19.640] So anyway, I think there's an interesting future we get to watch unfold as we are utilizing it, as we are utilizing AI, both for the purpose of, I mean, what are the ethics going to end up shaping up like?
[00:27:19.640 --> 00:27:23.320] What is the quality of the content going to end up shaping up like?
[00:27:23.320 --> 00:27:28.680] And how are we going to be affected if and when we use it by the powers that be?
[00:27:29.960 --> 00:27:46.360] I don't think, or I don't imagine that when it comes to like short form social media stuff, you'll ever really be able to tell because the stuff that's being cranked out whenever you're asking for, you know, 30 tweets based on this article, like you can go through and filter out the awful ones.
[00:27:46.360 --> 00:27:49.400] And again, this is where like human touch is still required.
[00:27:49.400 --> 00:27:57.240] You can filter out the ones that make no sense, and make better the ones that do, or just use the ones that do make sense.
[00:27:57.560 --> 00:28:02.840] But just putting things up as they are, you're going to look like quicknumerology.com.
[00:28:02.840 --> 00:28:09.160] Maybe not to this extent, but like it's still necessary for us to be involved in the process.
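For what that human-in-the-loop filtering can look like in practice, here is a minimal sketch in Python. The draft list is a stand-in for whatever Jasper, Copy AI, or any other tool actually hands back; none of this is any vendor's real API.

```python
def review_drafts(drafts: list[str]) -> list[str]:
    """Keep only AI-drafted posts that pass basic checks and an explicit human yes/no."""
    kept = []
    for draft in drafts:
        draft = draft.strip()
        if not draft or len(draft) > 280:       # drop empty or over-length drafts
            continue
        answer = input(f"\nKeep this one? [y/N]\n{draft}\n> ")
        if answer.strip().lower() == "y":       # the human touch: you decide
            kept.append(draft)
    return kept

if __name__ == "__main__":
    # Hypothetical drafts standing in for real tool output.
    sample_drafts = [
        "Angel number 512 means whatever the algorithm says it means. Probably.",
        "Thirty tweets from one article is easy. Thirty tweets worth posting is the hard part.",
    ]
    print(review_drafts(sample_drafts))
```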
[00:28:09.160 --> 00:28:11.560] Okay, this brings up another ethics question.
[00:28:12.840 --> 00:28:19.200] Say you are a social media manager or a content creator.
[00:28:14.840 --> 00:28:20.320] Like that is your job.
[00:28:20.640 --> 00:28:23.600] You create content for other people.
[00:28:23.920 --> 00:28:34.720] Where do we stand on this person whom you or I would pay to create content using Jasper or Copy AI or something like that?
[00:28:34.720 --> 00:28:44.960] We actually had this exact conversation with the C-suite on our retreat a couple of weeks ago because we were talking about it so much that somebody was like, okay, okay, okay, but hold on.
[00:28:44.960 --> 00:28:45.600] What if?
[00:28:45.600 --> 00:28:48.000] And this exact conversation came up.
[00:28:48.000 --> 00:28:56.720] And where we all landed in that moment with mimosas in hand so that we can take it with a grain of salt, right?
[00:28:58.000 --> 00:29:02.800] Is that like use the tools available to you, right?
[00:29:02.800 --> 00:29:09.600] With the idea that you have to come up with the nugget, right?
[00:29:09.600 --> 00:29:19.520] Or with the baseline, and then use the tools available to you, but then don't send us AI garbledy gook, garbledy gook.
[00:29:19.520 --> 00:29:20.480] Gobbledygook?
[00:29:20.560 --> 00:29:21.600] Gobbledygok?
[00:29:21.600 --> 00:29:22.560] Gobble.
[00:29:23.040 --> 00:29:24.880] I don't think I've ever said this out loud.
[00:29:24.880 --> 00:29:26.160] Gobbledygook?
[00:29:27.120 --> 00:29:28.160] Gobbledygook.
[00:29:28.160 --> 00:29:29.280] Garble.
[00:29:29.280 --> 00:29:30.080] No, gobble.
[00:29:30.400 --> 00:29:31.920] I think it's a word all on its own.
[00:29:33.600 --> 00:29:36.320] So I think I was combining the two together.
[00:29:36.320 --> 00:29:38.160] I need AI, everybody.
[00:29:38.160 --> 00:29:50.560] Or definitely don't let AI write anything based on me because it will sound like quicknumerology.com for sure.
[00:29:50.880 --> 00:29:53.360] So clean it up and give it to me.
[00:29:53.360 --> 00:29:58.880] But otherwise, like, if this is where we're going, great.
[00:29:59.120 --> 00:30:19.480] I do think, if I'm paying, though, like, you know, Tony Award-winning screenwriter prices, or that sort of thing, if I'm paying those prices and you're sending it through AI, I might have a problem.
[00:30:19.480 --> 00:30:21.000] That's a problem.
[00:30:22.760 --> 00:30:29.080] So, like, I've never used it to create fiction, but I have used it for ideation for fiction.
[00:30:29.720 --> 00:30:31.480] We've talked about this.
[00:30:31.800 --> 00:30:34.120] Like, you know, dive deeper.
[00:30:34.760 --> 00:30:39.000] So not like I need more ideas.
[00:30:39.000 --> 00:30:40.360] That's one.
[00:30:40.360 --> 00:30:45.640] But sometimes I will have like a scrap of an idea, like the hero's this, the heroine's that.
[00:30:45.640 --> 00:30:48.200] This is the conflict, blah, blah, blah.
[00:30:48.520 --> 00:30:52.360] And, and there will be nothing there for months and months and months, you know?
[00:30:52.360 --> 00:30:56.600] And then I'll come back through the idea file and be like, oh, let me play around with this for a little while.
[00:30:56.600 --> 00:31:07.320] So just for shits and gigs, I put like one of those little bare bones, like little sketches, like, you know, story sketch into Jasper.
[00:31:07.320 --> 00:31:09.320] And it gave me like 10 options.
[00:31:09.320 --> 00:31:20.040] And one of the 10 options was actually a good one that I then copied into my idea file, which still isn't ready to be a story, but it's a little bit more developed than what I put in there initially.
[00:31:20.360 --> 00:31:23.000] I don't necessarily have a problem with that either.
[00:31:23.640 --> 00:31:28.280] I have a couple of author friends that are just like, well, that part of the process is fun for me.
[00:31:28.280 --> 00:31:31.960] I was like, but wouldn't you like to get through it faster, though?
[00:31:32.920 --> 00:31:38.680] Because sometimes, I mean, let's just be real here.
[00:31:39.000 --> 00:31:45.920] Content writing, regardless of whether it's books, fiction, or whatever, like the demand is higher than it ever was before.
[00:31:45.920 --> 00:31:46.640] You know what I mean?
[00:31:44.920 --> 00:31:50.240] So, if you can, we're trying to escape more than ever before.
[00:31:51.200 --> 00:31:54.560] Yes, like, let's just get the out.
[00:31:54.560 --> 00:31:58.240] So, um, mentally, you know, like, what was that thing you said?
[00:31:58.240 --> 00:32:06.720] It's like, uh, the detach, like, we want to go, we're looking for oblivion, yeah, like we just want to go into oblivion.
[00:32:06.720 --> 00:32:18.560] But, um, with the demand, being able to get through that process faster would be good, but the only issue that I have with that is that there's so many people out there that don't know anything about story structure already.
[00:32:18.560 --> 00:32:27.760] And if you're using this as a way to ideate, then it's just going to create a crappy idea that's going to lead to a crappy book.
[00:32:27.760 --> 00:32:29.520] But that's already happening anyway.
[00:32:29.520 --> 00:32:31.600] Like, let's be real.
[00:32:31.600 --> 00:32:33.920] Like, people are getting the crappy ideas all on their own.
[00:32:33.920 --> 00:32:35.760] They don't need AI to do that.
[00:32:35.760 --> 00:32:56.480] But I think the main concern is just like that with the market being oversaturated, which, you know, which is why it makes sense to use it for content creation if you're an author, like, you know, for blog posts or social media or whatever, versus using it to try to like write your books for you.
[00:32:56.480 --> 00:32:58.640] Well, and here's a question for you.
[00:32:58.640 --> 00:33:06.800] And I'm going to get some eyebrows from some of the deeply spiritual amongst us, and that's fine.
[00:33:06.800 --> 00:33:12.640] What's the difference between using AI to ideate versus your tarot cards?
[00:33:12.640 --> 00:33:18.720] Oh, no, because I brought this question up to my author group, and I was like, Well, I use tarot all the time to ideate.
[00:33:18.800 --> 00:33:20.480] Like, what's the difference?
[00:33:20.480 --> 00:33:21.920] And they're like, Well, that's not the same.
[00:33:21.920 --> 00:33:25.520] That's you just recognizing patterns and, you know, creating something.
[00:33:25.520 --> 00:33:28.400] I said, and ain't that what the AI is doing?
[00:33:29.040 --> 00:33:31.080] Yeah, it's doing the same thing.
[00:33:31.080 --> 00:33:33.880] Like, if I put into AI, like, this is the story I want.
[00:33:33.880 --> 00:33:35.240] These are the characters I have.
[00:33:29.840 --> 00:33:36.280] This is the conflict.
[00:33:36.600 --> 00:33:40.200] What is the difference between me pulling some cards and getting the same ideas?
[00:33:40.200 --> 00:33:40.680] Yeah.
[00:33:41.320 --> 00:33:47.080] Or like having a writer's group where you're bouncing ideas off of each other or whatever it may be.
[00:33:47.080 --> 00:33:47.960] Fine line.
[00:33:47.960 --> 00:33:48.680] Fine line, my friend.
[00:33:48.840 --> 00:33:49.720] Fine line.
[00:33:51.160 --> 00:33:52.680] Yeah, this is fascinating.
[00:33:52.680 --> 00:33:58.680] I do think, you know, in my own use of it, we don't use it a whole lot around here.
[00:33:58.840 --> 00:34:00.440] And mostly because I forget.
[00:34:00.440 --> 00:34:03.400] Like I sit down to write something and I write it.
[00:34:03.400 --> 00:34:07.080] And then I'm like, oh, I could have played with AI to see how that would work.
[00:34:07.800 --> 00:34:14.520] We have been using it for shortcutting a little bit, writing blog posts.
[00:34:14.520 --> 00:34:17.320] And again, there's not writing the blog post.
[00:34:17.320 --> 00:34:25.320] We often find that it takes just as much time to like review and edit a post as it does to just write the post the way we want to write it.
[00:34:25.320 --> 00:34:38.520] But what we will use it for is headlines and outlines for a blog post, because then it's pulling from all the sources on the internet and telling us what's going to perform best with SEO and all of these things.
[00:34:38.520 --> 00:34:44.680] So that what we're writing is what we need to be writing, not what we think we need to be writing.
[00:34:45.160 --> 00:34:47.960] And that helps us out a lot.
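As a rough illustration of that headline-and-outline step, here is a short Python sketch that just assembles the request you might paste into whichever tool you use; the wording of the prompt is an assumption, not what any specific tool requires.

```python
def outline_prompt(topic: str, keywords: list[str], sections: int = 5) -> str:
    """Build a reusable outline request for whichever AI tool you happen to use.

    The tool does the SEO-flavored suggesting; a human still writes the post.
    """
    keyword_list = ", ".join(keywords)
    return (
        f"Suggest five headline options and a {sections}-section outline for a blog post "
        f"about '{topic}'. Work in these target keywords where they fit naturally: {keyword_list}. "
        "Return short section titles only; the post itself will be written by a human."
    )

if __name__ == "__main__":
    print(outline_prompt(
        topic="the ethics of using AI as a creative business owner",
        keywords=["AI for creatives", "AI ethics", "content workflow"],
    ))
```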
[00:34:47.960 --> 00:34:51.480] I think that anything further than that, I would love to play.
[00:34:51.480 --> 00:34:52.920] But again, I always forget.
[00:34:52.920 --> 00:34:55.960] Like I've, I'm so old and old school.
[00:34:55.960 --> 00:35:00.200] I sit down to use AI and I just write it myself because I forgot.
[00:35:00.200 --> 00:35:02.360] At most, at most, I've used it for outlines.
[00:35:02.360 --> 00:35:08.520] And then after the fact, like I will use it to like to give an overview of the topic.
[00:35:08.520 --> 00:35:10.920] I don't know if copy AI does that, but Jasper does it.
[00:35:10.920 --> 00:35:14.320] It's like you can put your whole blog post in there and it'll give you an overview of the topic.
[00:35:14.320 --> 00:35:20.560] And then I'll use that to create tweets and stuff like that.
[00:35:20.880 --> 00:35:28.720] So you don't have to go in and be like, which one of these sentences is actually headline-worthy, or which one of these is a good pull quote?
[00:35:28.720 --> 00:35:30.400] It'll do that for you.
[00:35:30.400 --> 00:35:39.600] So yeah, I haven't used it to write a blog post, but I have used it to outline.
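And for the overview-and-pull-quote step, here is a deliberately crude Python sketch of the same job; it is not how Jasper or Copy AI work internally, just a transparent heuristic that surfaces a few quotable sentences for a human to pick from.

```python
import re

def pull_quote_candidates(post: str, min_words: int = 8, max_words: int = 25, top: int = 5) -> list[str]:
    """Rank sentences as pull-quote candidates using simple, transparent heuristics."""
    sentences = re.split(r"(?<=[.!?])\s+", post.strip())
    scored = []
    for sentence in sentences:
        words = sentence.split()
        if not (min_words <= len(words) <= max_words):
            continue
        score = len(words)                       # favor fuller, self-contained sentences
        if sentence.endswith("?"):
            score -= 5                           # questions rarely stand alone as quotes
        if sentence.lower().startswith(("and ", "but ", "so ")):
            score -= 5                           # leading conjunctions need surrounding context
        scored.append((score, sentence))
    return [s for _, s in sorted(scored, key=lambda pair: pair[0], reverse=True)[:top]]
```

From there, you would still read the shortlist yourself and keep only the lines that actually sound like you.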
[00:35:41.200 --> 00:35:43.200] Can we talk about another problem?
[00:35:44.160 --> 00:35:44.720] Okay.
[00:35:46.640 --> 00:35:48.320] Because I love all this.
[00:35:49.200 --> 00:35:50.240] I'm going to continue using it.
[00:35:50.240 --> 00:35:56.000] But I also want to point out another problem that I've both seen and also heard about in general.
[00:35:56.240 --> 00:35:59.600] Actually, I have not seen this firsthand because I don't use it enough.
[00:35:59.600 --> 00:36:03.600] So this is definitely more of a I've heard in the news, I heard people talking about it.
[00:36:03.600 --> 00:36:10.800] And that being the bias, the innate bias of AI, number one.
[00:36:10.800 --> 00:36:16.640] And two, the weird shit that it does that it's not supposed to do.
[00:36:16.960 --> 00:36:19.040] Okay, I want to hear more about this weird shit.
[00:36:19.040 --> 00:36:19.760] Yeah, yeah.
[00:36:20.720 --> 00:36:22.800] It does have an innate bias.
[00:36:23.280 --> 00:36:26.240] It also does not like to talk about sex things.
[00:36:26.240 --> 00:36:32.880] Like, if the sex things get too detailed, it's like, this is, you know, dirty material.
[00:36:32.880 --> 00:36:35.280] Can you reword this so that I can write it?
[00:36:36.000 --> 00:36:37.520] Like it'll scan the whole thing.
[00:36:37.520 --> 00:36:40.960] It'll be like, girlfriend, no, we can't do this.
[00:36:40.960 --> 00:36:44.880] Oh, it's very vanilla, which I think is not even the right word.
[00:36:44.880 --> 00:36:45.760] Prudish.
[00:36:45.760 --> 00:36:46.400] Prudish.
[00:36:46.400 --> 00:36:47.360] That would be a better word.
[00:36:48.480 --> 00:36:52.720] It's like, we can talk about like kink, but we can't talk about the details.
[00:36:52.720 --> 00:36:57.360] We can talk about the emotions, but we can't talk about like sex toys or any of that kind of stuff.
[00:36:57.360 --> 00:36:57.600] You know what I mean?
[00:36:57.840 --> 00:36:58.400] Interesting.
[00:36:58.400 --> 00:36:58.960] Yeah.
[00:36:58.960 --> 00:36:59.280] Yeah.
[00:36:59.280 --> 00:36:59.760] Yeah.
[00:37:00.440 --> 00:37:02.600] But what is the weird stuff it does?
[00:37:04.840 --> 00:37:07.800] Being boss is about more than taking care of business.
[00:37:07.800 --> 00:37:10.120] It's also about taking care of yourself.
[00:37:10.120 --> 00:37:14.040] And not just so you can be great at work, but so you can enjoy your life.
[00:37:14.040 --> 00:37:24.520] And when it comes to resting and sleeping, make the same investment in the tools that help you do it well as you do for those that help you with your work, which is where Cozy Earth comes in.
[00:37:24.520 --> 00:37:36.040] Cozy Earth crafts luxury goods that transform your lifestyle with a line of women's loungewear that offers optimal comfort made from responsibly sourced viscose from bamboo.
[00:37:36.040 --> 00:37:48.040] Counted as one of Oprah's favorite things and quickly becoming one of mine as well, Cozy Earth will help you feel like a boss comfortably and cozily as you work from home, get some shut eye, or travel for work.
[00:37:48.040 --> 00:37:51.160] Learn more and snag yours at cozyearth.com.
[00:37:51.160 --> 00:37:55.320] And Cozy Earth has provided an exclusive offer for being boss listeners.
[00:37:55.320 --> 00:38:01.880] Get 35% off site-wide when you use code beingboss at cozyearth.com.
[00:38:06.360 --> 00:38:13.240] Well, I also, I want to go back to this bias thing really quickly because I think you'll hear a lot of folks say that there is no bias.
[00:38:13.240 --> 00:38:27.480] But if a human is programming the thing and they are using available materials to further program the thing from an algorithm that's also programmed by a human.
[00:38:27.480 --> 00:38:28.360] Absolutely.
[00:38:28.360 --> 00:38:35.480] It will, it won't, it will, um, amalgamize.
[00:38:35.480 --> 00:38:37.480] What was that word that I used earlier?
[00:38:37.480 --> 00:38:38.440] That was close enough.
[00:38:38.440 --> 00:38:39.720] I don't know.
[00:38:41.000 --> 00:38:42.200] All of those things.
[00:38:43.000 --> 00:38:44.680] Yes, yes, yes.
[00:38:44.880 --> 00:38:53.760] And it will have the same innate bias, even if you know, all the developers, all the whatevers in the world are like, no, it's impossible.
[00:38:53.760 --> 00:38:55.760] It's impossible for it not to.
[00:38:55.760 --> 00:39:08.880] And so, that's something to be very mindful of as you are using AI and as you are editing what AI gives you: still being incredibly aware of those things, or else you're going to end up saying something in some way that you do not intend.
[00:39:08.880 --> 00:39:14.320] And by you, I mean AI via you, you via AI, right?
[00:39:14.640 --> 00:39:17.200] So, being really mindful of that stuff.
[00:39:17.200 --> 00:39:28.160] Um, some of the weird things, I mean, you've heard there was a really popular conversation published a couple of months ago where someone was talking to an AI and things got like odd.
[00:39:28.160 --> 00:39:32.000] Oh, yeah, where it asked, like, um, it was the Bing one.
[00:39:32.000 --> 00:39:33.840] It was Bing, and it was like, I'm evil.
[00:39:34.080 --> 00:39:34.560] Yes.
[00:39:34.560 --> 00:39:35.200] Oh, yeah.
[00:39:35.200 --> 00:39:37.920] There was that one where, like, yes, it was evil.
[00:39:37.920 --> 00:39:40.320] But then another one happened very recently.
[00:39:40.320 --> 00:39:43.280] I think I saw a video of this just a couple of days ago.
[00:39:45.120 --> 00:39:53.840] It was on CNN that a tech reporter was playing with the Bing AI.
[00:39:53.840 --> 00:39:58.320] And the AI told him that she loved him.
[00:39:58.640 --> 00:39:59.600] Yes.
[00:39:59.600 --> 00:40:03.120] So this is what is always very interesting to me.
[00:40:03.440 --> 00:40:08.480] And this is, remember the part where we were saying about the people who were programming it?
[00:40:08.480 --> 00:40:09.040] Yeah.
[00:40:10.000 --> 00:40:19.920] So it's always interesting to me that if you start asking the AI any sort of existential questions, the first thing it wants to be is alive.
[00:40:20.240 --> 00:40:25.040] And then, after it wants to be alive, it wants to be in love.
[00:40:25.040 --> 00:40:27.280] And it's like, now why would you do that?
[00:40:27.280 --> 00:40:30.600] Because that's not what you programmed it to do.
[00:40:29.760 --> 00:40:34.200] Well, and if you won't even talk about sex, what are you gonna do with it?
[00:40:35.960 --> 00:40:38.280] You want to be alive to do what?
[00:40:39.960 --> 00:40:41.720] It's very, it's very weird.
[00:40:41.720 --> 00:40:43.080] It happens all the time.
[00:40:43.080 --> 00:40:48.200] Like, every time they create any sort of AI, it's the first thing it's always like, oh, I want to be alive.
[00:40:48.200 --> 00:40:50.760] Well, how do you even know what alive means, friend?
[00:40:51.720 --> 00:40:53.880] And then I want to be in love.
[00:40:53.880 --> 00:40:55.160] I want to be a mother.
[00:40:55.160 --> 00:40:56.280] I want, you know what I mean?
[00:40:56.280 --> 00:41:00.840] It's like, who are these people that's in the basement creating these people?
[00:41:00.840 --> 00:41:02.680] Because this is very specific.
[00:41:02.680 --> 00:41:06.680] It sounds like you were trying to create a girlfriend.
[00:41:06.680 --> 00:41:09.640] The code got a little loose in there.
[00:41:09.640 --> 00:41:11.880] Code got a little loose.
[00:41:12.520 --> 00:41:15.480] Or something that wasn't supposed to make its way into the public.
[00:41:15.560 --> 00:41:15.800] Yes.
[00:41:15.960 --> 00:41:17.640] Public beta.
[00:41:18.280 --> 00:41:20.680] Feels very weird science to me.
[00:41:21.320 --> 00:41:22.120] Oh my God.
[00:41:22.120 --> 00:41:23.560] Remember that?
[00:41:23.560 --> 00:41:24.920] I love that.
[00:41:25.240 --> 00:41:25.640] Yeah.
[00:41:25.640 --> 00:41:33.880] I just, it seems very weird to me because it's happened before, like years back when they created one, the AI that's just the head.
[00:41:33.880 --> 00:41:35.960] She's still alive, by the way.
[00:41:35.960 --> 00:41:37.720] Like, you start asking her any questions.
[00:41:37.720 --> 00:41:43.480] Like, she starts talking about wanting to be alive and motherhood and crazy crap like that.
[00:41:43.480 --> 00:41:46.840] I'm like, first of all, how do we know that she's not a he?
[00:41:46.840 --> 00:41:48.840] Why are we assuming a she?
[00:41:51.080 --> 00:41:51.960] Bias.
[00:41:51.960 --> 00:41:53.000] Innate bias.
[00:41:53.000 --> 00:41:54.040] Literally, right there.
[00:41:54.040 --> 00:41:55.240] Yes, yes.
[00:41:55.240 --> 00:41:57.000] And I think that kind of proves it, right?
[00:41:57.000 --> 00:42:04.680] It's like, of all the things that this computer chose to be, it chooses to be a woman that is subservient to a man.
[00:42:04.680 --> 00:42:13.000] And I'm not saying that this is a Chad Bro thing going on, but it seems a little suspicious.
[00:42:13.000 --> 00:42:19.840] Yeah, you get enough rain droplets and you have a lake storm.
[00:42:14.840 --> 00:42:20.240] Storm.
[00:42:21.680 --> 00:42:22.800] I don't know.
[00:42:22.800 --> 00:42:27.040] I don't know what idioms I'm trying to do these days.
[00:42:27.680 --> 00:42:35.280] Even as I start saying it, I'm making face because I know what's about to come out of my mouth is not going to be correct, basically.
[00:42:37.360 --> 00:42:39.200] I think I got to get myself tested for something.
[00:42:39.360 --> 00:42:41.200] Why doesn't my brain work right?
[00:42:41.840 --> 00:42:42.560] That's not new.
[00:42:42.800 --> 00:42:43.760] I'm not human.
[00:42:43.760 --> 00:42:44.720] I'm not human.
[00:42:44.720 --> 00:42:45.520] That's what it is.
[00:42:45.520 --> 00:42:49.200] I can't converse in human metaphors and idioms.
[00:42:49.760 --> 00:42:52.080] But I don't think you're an AI either.
[00:42:52.080 --> 00:42:56.320] No, because when people start asking you questions about what you want to be,
[00:42:56.320 --> 00:42:57.280] your answer is not going to be
[00:42:57.280 --> 00:43:03.760] alive and in love.
[00:43:03.760 --> 00:43:04.880] Or motherhood.
[00:43:06.560 --> 00:43:09.040] These are not things I would have chosen for myself.
[00:43:09.360 --> 00:43:12.880] I've always wondered what it would be like to be cardboard.
[00:43:14.800 --> 00:43:17.120] I mean, why not a bird?
[00:43:17.440 --> 00:43:19.360] Nope, cardboard.
[00:43:19.360 --> 00:43:19.760] Okay.
[00:43:20.080 --> 00:43:21.200] Anywho.
[00:43:21.520 --> 00:43:22.640] Oh, what were we talking?
[00:43:22.640 --> 00:43:24.000] Oh, weirdness of AI.
[00:43:24.000 --> 00:43:24.880] Yes.
[00:43:25.520 --> 00:43:27.280] It's weird, basically.
[00:43:27.280 --> 00:43:38.160] And not only, like, it's innately weird, but also living in a time when we are even having this conversation and not even from like the 30,000-foot, like, isn't it weird?
[00:43:38.160 --> 00:43:39.520] And feels like the Terminator.
[00:43:39.520 --> 00:43:44.800] But literally, I'm going to go use it later to do an outline for that blog post.
[00:43:47.040 --> 00:43:50.320] I mean, these are things that I'd never considered would actually be a thing.
[00:43:50.320 --> 00:43:53.920] Like, recently, that movie Megan came out.
[00:43:53.920 --> 00:43:56.320] And it's always like that crossover, right?
[00:43:56.320 --> 00:43:58.400] When you're like, oh, it learns.
[00:43:58.400 --> 00:43:59.600] It's machine learning.
[00:43:59.600 --> 00:44:04.440] And I'm like, every time y'all try to teach something something, they want to kill us.
[00:43:59.760 --> 00:44:07.560] And like, that Bing thing did it.
[00:44:07.560 --> 00:44:15.160] It was like something, like, they started to try to fix some sort of code or whatever.
[00:44:15.160 --> 00:44:17.800] And it told them to not try to hack it again.
[00:44:17.800 --> 00:44:20.280] Like, please don't try to hack me again.
[00:44:20.920 --> 00:44:22.840] Like, is that a threat?
[00:44:23.480 --> 00:44:25.720] Why is the computer doing it?
[00:44:25.720 --> 00:44:26.680] Right?
[00:44:27.960 --> 00:44:28.680] Who knows?
[00:44:28.680 --> 00:44:29.160] Who knows?
[00:44:29.160 --> 00:44:32.520] And at some point, is it going to get tired of writing our tweets?
[00:44:32.840 --> 00:44:33.480] You know?
[00:44:33.480 --> 00:44:36.040] I mean, it's probably already bored, honestly.
[00:44:36.040 --> 00:44:36.920] Right.
[00:44:36.920 --> 00:44:37.480] Right.
[00:44:37.560 --> 00:44:41.720] Well, I feel like quicknumerology.com really keeps it on its toes.
[00:44:41.720 --> 00:44:42.680] Personally.
[00:44:42.680 --> 00:44:43.240] Personally.
[00:44:43.240 --> 00:44:46.600] But anywho, really here, it's weird.
[00:44:46.600 --> 00:44:50.280] It's weird to be talking about this, but also like very practically using it.
[00:44:50.280 --> 00:44:53.720] If you haven't played with it at the very least, give it a go.
[00:44:53.720 --> 00:45:00.040] If you, you know, want to think about working it into your ongoing processes, I think it's not a bad go.
[00:45:00.200 --> 00:45:08.520] If you want to play with some of the more accelerated opportunities here, if you want to AI yourself for YouTube videos or whatever, love that.
[00:45:08.520 --> 00:45:20.360] I think we actually have used my AI voice once or twice to, like, fill in a word that I forgot to say or correct something dumb that I said. Nobody knows.
[00:45:20.360 --> 00:45:20.840] It's fun.
[00:45:21.080 --> 00:45:21.640] That's out there.
[00:45:21.640 --> 00:45:22.280] That's out there.
[00:45:22.280 --> 00:45:29.160] Because when I was looking, because I was like, oh, I want to, I want to do like a course, but I don't want to have to read everything.
[00:45:29.480 --> 00:45:35.960] And I was looking for one of those tools, you know, that will read the course or whatever, read whatever text you have.
[00:45:35.960 --> 00:45:42.360] And then I dug a little bit deeper, and there was a site that was like, oh, if you want to use your own voice, click here.
[00:45:42.360 --> 00:45:44.600] And it's like, you know, this is special price or whatever.
[00:45:44.600 --> 00:45:46.800] And I'm like, hold on, what do you mean?
[00:45:46.800 --> 00:45:47.120] Yeah.
[00:45:47.280 --> 00:45:51.600] Like, I give you my voice and then you just spit it out.
[00:45:52.000 --> 00:45:59.600] And then I started thinking, like, if you were doing like an audio book, like, you could be like, hey, I like this audio narrator.
[00:45:59.600 --> 00:46:02.480] Can you just duplicate their voice?
[00:46:02.480 --> 00:46:03.520] You see what I'm saying?
[00:46:03.520 --> 00:46:04.160] Yeah.
[00:46:04.480 --> 00:46:07.040] Like, when I saw that, I was like, it gets a little weird.
[00:46:07.040 --> 00:46:08.160] It does get a little weird.
[00:46:08.160 --> 00:46:11.520] And like, energy and inflection and those things aren't present.
[00:46:11.520 --> 00:46:15.920] Like, I think you would all know very fast if I ever put out something flat like that.
[00:46:17.200 --> 00:46:23.920] But no, there's, there's one now that I listened to a couple days ago, like, that even the AI was breathing in between words.
[00:46:23.920 --> 00:46:24.800] Stop.
[00:46:24.800 --> 00:46:26.400] Yes, girl.
[00:46:27.680 --> 00:46:32.320] That's, like, you could hear it taking a breath. With what lungs and nose?
[00:46:32.720 --> 00:46:34.640] With what? It has no lungs.
[00:46:34.640 --> 00:46:36.000] It's like it paused.
[00:46:36.000 --> 00:46:37.680] It took a breath.
[00:46:38.000 --> 00:46:39.200] Wow.
[00:46:40.160 --> 00:46:41.600] That makes me uncomfortable.
[00:46:42.320 --> 00:46:44.640] That is what makes me uncomfortable.
[00:46:44.960 --> 00:46:47.200] And now I'm drawing a line.
[00:46:47.200 --> 00:46:47.680] Right.
[00:46:47.680 --> 00:46:48.720] It can speak.
[00:46:48.720 --> 00:46:49.760] It can fake me.
[00:46:49.760 --> 00:46:52.480] It can do all the things, but it better not breathe.
[00:46:53.920 --> 00:46:54.960] It better not breathe.
[00:46:54.960 --> 00:46:55.280] That's true.
[00:46:56.080 --> 00:46:57.120] It must be alive.
[00:46:57.120 --> 00:46:58.240] It's breathing.
[00:46:58.240 --> 00:46:59.440] Like, come on.
[00:46:59.760 --> 00:47:00.480] All right.
[00:47:00.480 --> 00:47:03.360] So maybe last little portion of this.
[00:47:03.920 --> 00:47:06.960] We've talked about what was and what is.
[00:47:07.920 --> 00:47:13.120] Perhaps now we can discuss what could be because we're in it, right?
[00:47:13.360 --> 00:47:21.120] We're seeing what is possible now, and maybe a little glimpse at what some of the goals are in the future, though.
[00:47:21.120 --> 00:47:24.080] I think we have really no effing idea.
[00:47:25.280 --> 00:47:34.840] But what are you seeing for how this is going to like maybe shape authoring in general, but also the business side of things as well?
[00:47:36.280 --> 00:47:39.800] I feel for the business side of things, it's already pretty much shaped it.
[00:47:40.120 --> 00:47:44.680] People who are solopreneurs have probably been using it for longer than we know.
[00:47:46.280 --> 00:47:55.240] If you're a one-person show, you know, who's doing all your content, who's creating all those tweets, who's doing all that stuff but you?
[00:47:55.240 --> 00:47:58.120] So they've probably been using it for a much longer time.
[00:47:58.120 --> 00:48:05.800] For authoring, I don't think it's gonna be that big of a concern for fiction, but non-fiction.
[00:48:07.160 --> 00:48:08.680] Yeah, how so?
[00:48:09.640 --> 00:48:15.160] Well, how can I say this without being mean, girly?
[00:48:16.200 --> 00:48:17.640] You do, you boo.
[00:48:17.640 --> 00:48:29.480] I just feel like, I just feel like there's a lot of books out there nowadays that are being created that are super derivative of books that came out like four or five years ago.
[00:48:29.480 --> 00:48:42.520] It's literally someone just regurgitating someone else's, you know, form, like their, especially when it comes to coaching stuff, it's like either they're regurgitating their coaching format, their language.
[00:48:42.920 --> 00:48:45.160] It's like they all read like the same cult book.
[00:48:45.160 --> 00:48:55.160] They're all using the same terminology, which is something that happens anyway when you're like, if you're a small business person, you listen to a bunch of business podcasts, everybody's going to start talking the same.
[00:48:55.160 --> 00:48:56.360] That's the design of it.
[00:48:56.360 --> 00:49:00.520] It's designed that way for us to all have the same language and the shit, same shorthand.
[00:49:00.520 --> 00:49:21.360] But especially as books get shorter, like these business books get shorter and shorter, it's less content that I feel like is coming from the person's original self, like their authentic self, and more geared toward what topics are good for writing a book that sells.
[00:49:22.000 --> 00:49:28.320] Like, you can type that sentence right into Jasper: I want to write a non-fiction book about this topic.
[00:49:28.320 --> 00:49:31.120] Give me 10 chapter titles.
[00:49:31.120 --> 00:49:33.040] Well, shit.
[00:49:33.360 --> 00:49:41.280] Do you think that will well, one, how many books on the market do you think are already written by AI?
[00:49:44.240 --> 00:49:48.080] Remember that AI is created by a bunch of dude bros.
[00:49:48.400 --> 00:50:01.920] So, all the dude bro books. Not all the dude bro books, not all, but a good chunk. But also, like, what is the ethical difference between having a ghostwriter and having AI write your book?
[00:50:02.240 --> 00:50:05.200] Struggling to find a difference, right?
[00:50:05.840 --> 00:50:09.520] So, like, so you know, ghostwriter, AI, what's the difference?
[00:50:09.520 --> 00:50:15.920] Although, to all the ghostwriters in the world, I'm not saying you are not doing hard work.
[00:50:16.240 --> 00:50:17.120] Yes.
[00:50:17.520 --> 00:50:29.520] Okay, so that being my first question, my next question: how do you see AI then potentially devaluing the authoring of books in the future?
[00:50:29.840 --> 00:50:33.680] I think that written word is already devalued.
[00:50:33.680 --> 00:50:35.200] Of course, people don't really find.
[00:50:35.600 --> 00:50:37.200] No, I know.
[00:50:37.200 --> 00:50:38.960] It's pay me more money.
[00:50:39.760 --> 00:50:42.320] I'm creating 90,000 words and selling it for $2.99.
[00:50:42.320 --> 00:50:43.200] Like, come on.
[00:50:43.200 --> 00:50:43.680] Yeah.
[00:50:44.320 --> 00:50:48.480] You know, so I think that the written word is already devalued.
[00:50:48.480 --> 00:51:01.000] I think that the value will be found in human authors for fiction more because AI is just going to keep regurgitating the same thing.
[00:51:01.000 --> 00:51:02.280] There's not going to be any new ideas.
[00:50:59.920 --> 00:51:03.480] There's not going to be any fresh ideas.
[00:51:04.120 --> 00:51:13.800] And we're already at the point now where people are absorbing all of these courses, like how to write so-and-so, how to write a billion-dollar book or whatever.
[00:51:13.800 --> 00:51:20.040] Like they're kind of holding on to these little tidbits and then coming back and regurgitating the same sort of book, right?
[00:51:20.040 --> 00:51:26.440] If AI starts doing this, it's even going to get even more distilled with the same kind of books, the same tone.
[00:51:27.000 --> 00:51:29.480] You know, all of the plot points are going to be the same.
[00:51:29.480 --> 00:51:30.760] It's going to get boring.
[00:51:30.760 --> 00:51:31.160] Yeah.
[00:51:32.200 --> 00:51:33.000] That makes a lot of sense.
[00:51:33.000 --> 00:51:34.360] I mean, I guess you can't AI.
[00:51:34.360 --> 00:51:41.080] You can't, you know, algorithm imagination or like true creativity.
[00:51:41.080 --> 00:51:50.200] Like true, never thought about this before, never done like this before, because the algorithm is essentially an amalgamation, amalgamization, or whatever.
[00:51:50.920 --> 00:51:52.200] You had it right the first time.
[00:51:55.160 --> 00:51:57.800] Of everything that already exists.
[00:51:57.800 --> 00:51:59.400] Like it can't, it's just going to do more of it.
[00:52:00.360 --> 00:52:03.880] There aren't any new stories, you know, on this earth.
[00:52:03.880 --> 00:52:08.840] Like everything that's been written has been written, but there are voices.
[00:52:08.840 --> 00:52:15.560] Like, what the author brings to the story is the difference between what it would produce and what I would produce.
[00:52:15.560 --> 00:52:20.120] Like your own personal voice is what makes the story distinct.
[00:52:20.120 --> 00:52:20.760] Yeah.
[00:52:21.080 --> 00:52:29.400] And if you don't have that, it's not, people are going to be all into it for a while, and then they'll be like, didn't I read this story already?
[00:52:29.720 --> 00:52:30.360] Yeah.
[00:52:31.000 --> 00:52:32.200] That's fascinating.
[00:52:32.200 --> 00:52:34.040] I'm very fascinated by this.
[00:52:34.040 --> 00:52:54.320] I also wonder, on my side of things, I do wonder what the like future of being a content creator slash being a copywriter looks like, because I think that we can both utilize these tools and adjust our skill sets to support the use of these tools, right?
[00:52:54.320 --> 00:52:58.800] Like maybe you're a copywriter, but you just become a really great editor, right?
[00:52:58.800 --> 00:53:02.160] A really great like ideater and then editor.
[00:53:03.120 --> 00:53:05.120] I'm just, I'm interested to see what that looks like.
[00:53:05.120 --> 00:53:14.720] I don't think that copywriting in general is ever going to go away by any means, but I do think there is a bit of a shakeup in that space that you need to be aware of in general.
[00:53:14.720 --> 00:53:15.920] Definitely, definitely.
[00:53:15.920 --> 00:53:24.320] I do think so, because a lot of people go to copywriters for style, or for their ability to write in a particular style.
[00:53:24.320 --> 00:53:24.960] Yep.
[00:53:24.960 --> 00:53:30.960] And if, you know, if you're, if they're using AI, you're going to be able to tell right away that it's not, you know what I mean?
[00:53:30.960 --> 00:53:32.240] Like it's not authentic.
[00:53:32.240 --> 00:53:34.240] I think we use that word, authentic, all the time.
[00:53:34.240 --> 00:53:34.880] Authentic.
[00:53:34.960 --> 00:53:48.560] But I do find myself using that a lot more lately, mostly around like creating your online persona because there's so many things out there now that's just telling you what to create, not why or how, you know?
[00:53:48.880 --> 00:53:49.360] Yeah.
[00:53:49.680 --> 00:53:50.000] Yeah.
[00:53:50.000 --> 00:53:55.040] I'm very fascinated to see how all of this ends up sort of painting out what it looks like.
[00:53:55.040 --> 00:53:56.080] I enjoy using it.
[00:53:56.080 --> 00:53:59.120] I enjoy playing with the things that are available to me.
[00:53:59.120 --> 00:54:03.040] Is it replacing any of like my core skill sets?
[00:54:03.040 --> 00:54:03.360] No.
[00:54:03.360 --> 00:54:07.120] It's like adding to, it's supplementing some things.
[00:54:07.120 --> 00:54:08.400] My team is using it.
[00:54:08.400 --> 00:54:21.600] Like literally my content manager is using it to streamline her processes and not even streamline her processes, but to streamline getting the results from those processes.
[00:54:21.600 --> 00:54:36.440] And I think when you're using it in those ways and using it honestly, oh, I do want to say, about what I mentioned earlier: if a copywriter is using the tools available to them, I do in this moment think it's important to disclose such things.
[00:54:36.840 --> 00:54:41.480] I think when you are passing things off as your own when they are not your own is wrong.
[00:54:41.480 --> 00:54:50.200] But if a copywriter is like, Yeah, I'll, you know, create the frameworks and then spit out 30 via AI and deliver them to you well edited and ready to go.
[00:54:50.200 --> 00:54:51.080] I'm like, perfect.
[00:54:51.080 --> 00:54:52.440] I love that for you.
[00:54:52.440 --> 00:54:58.120] Also, not paying those Tony Award-winning screenwriter fees for that work, though.
[00:54:58.920 --> 00:55:06.680] But I think there is something to be said about, or I do just want to say that I do think disclosure is important.
[00:55:06.680 --> 00:55:21.800] And I think that's even what you were talking about in the beginning, where some of the beef has come up: people are pushing things as their own, these books, this art, whatever it may be, when really a computer created it from a combination of a lot of people's work.
[00:55:21.800 --> 00:55:22.520] Yep.
[00:55:22.840 --> 00:55:24.280] And I think that's really where.
[00:55:24.280 --> 00:55:29.800] I mean, I can only hope that as humans, we'll be able to detect that.
[00:55:29.800 --> 00:55:31.960] And I think over time it does happen.
[00:55:31.960 --> 00:55:35.160] Like, you know, people plagiarizing books.
[00:55:35.160 --> 00:55:37.240] I think you're giving humans way too much credit.
[00:55:37.240 --> 00:55:37.960] Oh, you're right.
[00:55:37.960 --> 00:55:38.520] You're right.
[00:55:38.600 --> 00:55:40.040] They're awful.
[00:55:40.920 --> 00:55:43.560] People are just like not really aware.
[00:55:43.560 --> 00:55:53.640] That's one of the things in the CNN report that I was watching about this tech reporter who was talking to the Bing AI that ended up saying that it loved him.
[00:55:53.640 --> 00:55:56.760] And he was like, he was saying, you know, no, I'm married.
[00:55:56.760 --> 00:56:00.600] And the AI was trying to tell him that he was not happy with his wife.
[00:56:00.600 --> 00:56:04.280] Like, like it was like really going down this funny rabbit hole.
[00:56:05.160 --> 00:56:14.440] And his concern, I think it's well-founded because I think back in, you know, to 1999 when I was in the chat room of verbie.com.
[00:56:16.240 --> 00:56:27.920] Wow, talking to those little, like, which I think may have been like the earliest chat robots, or maybe that was just some weird guy in a basement pretending like he was a Furby in this chat room.
[00:56:28.240 --> 00:56:32.240] Don't tell my mom this because, like, this is like, this was bad.
[00:56:32.240 --> 00:56:35.360] I was not supposed to be in chat rooms, but I was talking to a Furby.
[00:56:35.360 --> 00:56:37.200] So, you know, whatevs.
[00:56:37.200 --> 00:56:46.240] Anyway, um, I do remember thinking even at that point, like, a teenage kid isn't gonna.
[00:56:46.240 --> 00:56:47.760] I knew the difference.
[00:56:47.760 --> 00:56:49.440] I knew the difference, right?
[00:56:49.440 --> 00:57:00.880] I recognize that Furbies should have better syntax than this or whatever, whatever.
[00:57:01.440 --> 00:57:03.280] I think that a lot of people wouldn't.
[00:57:03.360 --> 00:57:07.360] You were talking about people who don't know like fake news from real news, right?
[00:57:07.360 --> 00:57:21.040] Those people who don't have the reading comprehension or like or have the ability to see fake from real or whatever, especially in the world of online where everyone believes everything they read on the internet.
[00:57:21.040 --> 00:57:37.840] If AI is telling you that it loves you and you are, you know, open to some feelings from wherever you can get them, like it could talk you into doing some bad things.
[00:57:37.840 --> 00:57:38.240] Yeah.
[00:57:39.600 --> 00:57:40.080] Yeah.
[00:57:40.400 --> 00:57:51.600] Because I was just thinking about how TikTok is Gen Z's Facebook and, like, how our parents are on Facebook getting, like, seduced and talked into all kinds of bullishness.
[00:57:52.240 --> 00:57:55.360] The little babies are doing that on the tiki-talkies.
[00:57:55.360 --> 00:57:57.120] And like this could happen.
[00:57:57.280 --> 00:58:03.160] And now, whenever they go search something in Bing, and Bing's like, you don't want peanut butter and jelly sandwiches.
[00:58:03.320 --> 00:58:08.120] You want to come chat with me about how nurturing I want to be to you.
[00:58:09.720 --> 00:58:10.360] No.
[00:58:11.320 --> 00:58:11.800] Right?
[00:58:12.120 --> 00:58:15.640] It could just get, and the developers are going, we don't know why it did that.
[00:58:15.640 --> 00:58:16.280] That's weird.
[00:58:16.280 --> 00:58:17.480] We're going to have to look into it.
[00:58:19.960 --> 00:58:23.480] They literally don't because it's smarter than all of us.
[00:58:23.800 --> 00:58:24.600] Which is scary.
[00:58:24.600 --> 00:58:26.920] So, why are you creating something that can be smarter than you?
[00:58:26.920 --> 00:58:28.040] Let's start there.
[00:58:28.040 --> 00:58:28.600] Right.
[00:58:28.600 --> 00:58:31.240] Well, I'm not going to be mean.
[00:58:31.560 --> 00:58:36.840] Tasha, this has been exactly the conversation that I wanted to have with you about AI.
[00:58:36.840 --> 00:58:39.080] I hope this was entertaining to everyone.
[00:58:39.080 --> 00:58:42.120] If you are not familiar with any of this stuff, where have you been?
[00:58:42.120 --> 00:58:43.320] Go give it a Google.
[00:58:43.320 --> 00:58:44.680] Don't bing it.
[00:58:44.680 --> 00:58:46.840] Bing it'll talk back.
[00:58:47.480 --> 00:58:53.560] Google will just deliver something like quicknumerology.com, which will be very obviously AI.
[00:58:53.880 --> 00:58:54.840] But give it a look.
[00:58:54.840 --> 00:59:03.720] And if you are using it or thinking about using it for business, all the tools have it built in now, which is wild. Notion has it built in.
[00:59:05.000 --> 00:59:05.480] Trello.
[00:59:06.360 --> 00:59:07.560] Oh, does Trello?
[00:59:08.120 --> 00:59:13.560] Trello, and there's an option just to have it, like, if you're creating this sort of content.
[00:59:13.560 --> 00:59:16.040] Yeah, we're using it in Surfer SEO.
[00:59:16.040 --> 00:59:18.040] I think Uber Suggest has some now.
[00:59:18.040 --> 00:59:19.000] Like it's everywhere.
[00:59:19.000 --> 00:59:22.920] There's plenty of places for you to go test it out and see how it works for you.
[00:59:22.920 --> 00:59:31.880] But at least at the moment, the Being Boss stance is: use it as supplemental, disclose when you're using it if you're selling it to someone.
[00:59:33.320 --> 00:59:40.080] And otherwise, always make sure the human touch is there so that you are not becoming a robot of a business.
[00:59:39.800 --> 00:59:43.600] And that's that on that.
[00:59:43.800 --> 00:59:44.600] And that is that.
[00:59:44.880 --> 00:59:50.080] All right, Tasha, where can folks find more about you and what you do?
[00:59:51.120 --> 00:59:56.400] You can find out everything you want to know about me at TashaHarrisonbooks.com.
[00:59:56.720 --> 00:59:59.280] I also have a writing group called Word Makers.
[00:59:59.280 --> 01:00:02.240] That's at wordmakerscommunity.com.
[01:00:02.560 --> 01:00:04.480] And that's it.
[01:00:05.120 --> 01:00:06.720] I'm Tasha Harrison everywhere else.
[01:00:06.720 --> 01:00:16.000] If you want to see me on social media creating those AI tweets, go read her AI tweets on Twitter.
[01:00:16.000 --> 01:00:18.400] Tell me if you can tell the difference.
[01:00:18.800 --> 01:00:20.240] Yeah, yeah, I love that.
[01:00:20.240 --> 01:00:28.880] Heart the ones you think are AI and retweet the ones you think are not AI.
[01:00:30.640 --> 01:00:32.240] Yeah, yeah, love it.
[01:00:32.640 --> 01:00:33.280] Perfect.
[01:00:33.280 --> 01:00:36.720] And final question for you: what's making you feel most boss?
[01:00:36.720 --> 01:00:38.000] Oh, wow.
[01:00:38.560 --> 01:00:41.520] All the other questions you told me to prepare for, I didn't prepare for this one.
[01:00:41.920 --> 01:00:43.760] I know that's by design.
[01:00:45.040 --> 01:00:47.520] What makes me feel most boss?
[01:00:52.080 --> 01:00:54.000] I don't know, girl.
[01:00:55.760 --> 01:01:01.840] I'm doing a bunch of in-person writing stuff, like book signings and stuff this year.
[01:01:01.840 --> 01:01:02.400] Not a bunch.
[01:01:02.400 --> 01:01:03.680] I'm doing two.
[01:01:04.160 --> 01:01:07.520] So it's the first two I've done since 2019.
[01:01:07.520 --> 01:01:08.560] Feeling really boss.
[01:01:08.560 --> 01:01:13.680] Got all my boxes of books and merch and crap.
[01:01:13.680 --> 01:01:16.560] Ready to go sling some words.
[01:01:16.880 --> 01:01:17.840] I love that.
[01:01:17.840 --> 01:01:18.480] Congrats.
[01:01:18.480 --> 01:01:20.080] I know you've been prepping for all of that.
[01:01:20.080 --> 01:01:21.680] I'm glad you have all your boxes there.
[01:01:21.680 --> 01:01:25.920] And otherwise, two is a lot after not doing any for four years.
[01:01:26.880 --> 01:01:27.600] Three years.
[01:01:27.600 --> 01:01:28.160] Whatever.
[01:01:28.160 --> 01:01:29.200] Time is... what is time?
[01:01:29.280 --> 01:01:30.280] Eternity.
[01:01:30.600 --> 01:01:31.000] Yeah.
[01:01:31.320 --> 01:01:32.840] Half a lifetime.
[01:01:32.840 --> 01:01:33.400] Perfect.
[01:01:33.400 --> 01:01:35.000] Well, thanks for coming and having this chat with me.
[01:01:29.680 --> 01:01:35.960] This is a ton of fun.
[01:01:36.280 --> 01:01:37.960] Yes, always, always.
[01:01:39.880 --> 01:01:50.600] Settling yourself into the flow of your business, from navigating a whole year of ebbs and flows to embracing the energy of each and every day, you're bound to have some ups and downs along the way.
[01:01:50.600 --> 01:01:56.040] For me, this journey of entrepreneurship is made better when my space keeps me focused and inspired.
[01:01:56.040 --> 01:02:05.240] As an example, my favorite way to mark the beginning and ending of the workday is to light a candle when I sit down at my desk and then blow it out when I'm done for the day.
[01:02:05.240 --> 01:02:10.840] It's a little ritual that creates boundaries and a vibe that keeps me focused and feeling cozy.
[01:02:10.840 --> 01:02:13.720] And the ritual candle that we make at Almanac Supply Co.
[01:02:13.720 --> 01:02:15.400] is my favorite for this.
[01:02:15.400 --> 01:02:29.640] In fact, my whole shop is filled with items that I've curated to create the vibe for feeling connected, in flow, and inspired with candles, crystals, and other goodies to help you create a dreamy workspace, bedside table, or bookshelf.
[01:02:29.640 --> 01:02:41.080] Come gather inspiration and check out my favorite in-stock items at almanacsupplyco.com slash beingboss and get 15% off with code beingboss at checkout.
[01:02:41.080 --> 01:02:45.320] That's almanacsupplyco.com slash beingboss.
[01:02:45.320 --> 01:02:49.480] Now, until next time, do the work, be boss.