WP Product Talk
AI in Support: What Works, What Doesn’t, and What’s Next

AI is changing how WordPress product teams handle customer support—but not every experiment delivers results. Join us for “AI in Support: What Works, What Doesn’t, and What’s Next” as Aaron Edwards, founder of DocsBot, sits down with co-hosts Matt Cromwell and Zack Katz to share real lessons from building and deploying AI in WordPress support.

We’ll explore what AI does best today, where it still falls short, and what’s coming next for product owners who want to scale smarter without losing the human touch. Tune in October 15 at Noon Eastern Time for this lively, future-focused discussion on the evolving world of AI in WordPress support.

Show notes:

Transcript

Matt Cromwell
00:06-02:10
Hey, everyone. When AI first showed up in the technical support world, which is a world that I love and have been dedicated to for a long time, everyone thought it was going to replace our humans, our support technicians that we love. And I'll admit, as somebody who's been leading a team of a hundred-plus technicians, I was a bit concerned, honestly, which is why I decided to tackle AI in support directly over the past couple of years. I was like, if this thing's going to eat my staff, I'm going to have to figure it out first. But what I learned is not what all the headlines were saying at all. AI is not replacing our humans at all; it's amplifying them. It's making our support sharper and faster and, honestly, even a lot more human than it's ever been before. That's why I love this topic, because it's not about chasing shiny tools. It's about learning how to make our people and our products better. And that's why I wanted to get a guest who sees this topic at scale from both sides, both as a support person and as a business person, and as somebody who's building a tool specifically for this purpose, who's built a product around the idea of AI in support, and who I've come to really depend on a lot personally, actually. No one understands this topic better in WordPress than our guest today, Aaron Edwards, founder of DocsBot. And it's honestly one of the smartest AI support tools in the WordPress space. So excited to have him on today. See you soon. This is WP Product Talk, a place where every week we bring you insights on product marketing, business management and growth, customer experience, product development, and more. It's your go-to podcast for WordPress product owners, by WordPress product owners. And now, enjoy the show. Hey everybody, I am Matt Cromwell at Stellar WP.

Zack Katz
02:11-02:13
Hey everybody, I am Zack Katz from GravityKit.

Matt Cromwell
02:15-02:26
That was a good impression. I totally picked up on that. And today we're excited to have with us Aaron Edwards from DocsBot. Aaron, welcome, welcome.

Aaron Edwards
02:27-02:37
Hey, everyone. Thanks for having me. Thanks for being on the show. And please tell the folks a bit about yourself. Yeah, sure. Well, I'm Aaron Edwards.

Aaron Edwards
02:38-03:45
I started in WordPress at, gosh, version 2.6. I like to go by version numbers rather than years. And I started learning to be a plugin developer, and I had my own WordPress multisite. And then I joined WPMU Dev and was with them for 12 years, starting as a plugin developer when we were only three employees, until there were hundreds of employees, and I was serving as the CTO for almost 10 years there. And then I started building my own WordPress products. The first one I built, with a co-founder, was Infinite Uploads. That was kind of a WordPress cloud storage plugin, and I ended up selling that last year. And now I'm doing DocsBot AI. We've been going for almost three years now, and it was one of the first AI support tools, basically, in the space. And because I was from the WordPress community and know so many people there, we have a lot of customers that are WordPress product owners, whether it's Stellar WP or Automattic or Awesome Motive or a lot of the big names.

Aaron Edwards
03:45-03:48
Even Zack over at GravityKit, I think, is using DocsBot.

Aaron Edwards
03:49-03:58
So, yeah, it's been an exciting time growing that business and learning a lot as I'm building an AI product and as things change so rapidly.

Matt Cromwell
03:59-04:29
Yeah, absolutely. Thanks for being on. Folks, if you are here on YouTube watching live, feel free to use the comments for your questions or feedback. We would love to answer them live on the show. And also, do us one quick favor. Before we jump into everything, could you take a quick second and just click that subscribe button? Why? Because we're trying to get to 1,000 subscribers, and you could really help us out a lot. Thanks a bunch, folks.

Zack Katz
04:30-04:33
Yeah, and I thought I was subscribed to our show.

Matt Cromwell
04:34-04:35
And it turns out I wasn't.

Zack Katz
04:35-05:03
So I subscribed and I'm really happy to be notified whenever a new episode drops because there's so much to learn. And even on episodes I'm not participating in, I love this show and it has so much good stuff. And speaking of good stuff, Aaron, AI and customer support. Can you give us an idea of where we're at? What is the state of things in AI powered support?

Aaron Edwards
05:04-06:58
Yeah, gosh. As Matt was just kind of talking about, when we first came out, it was basically the same week that the ChatGPT API became available publicly at all. You know, this was right after ChatGPT existed. So I was lucky enough to be at the very beginning of it, when this new invention of LLMs and chatbots came along and you could use RAG, which is Retrieval Augmented Generation, to pull content from your documentation as context to provide answers. I was basically one of the first; I think maybe three of us launched within the first few weeks of that becoming a possibility. So that's really helped with the business growing, being one of the first in the space. But now the space is full of a million low-quality competitors and some really high-quality competitors that luckily are much more expensive than me. It's been an interesting growth time, learning that and also trying to keep up with AI technology. I read a stat that basically since we launched, the amount of knowledge or intelligence that you get for your dollar has more than 100x'd. And that was last summer that that stat came out. Now it's, I think, even more, now that we have GPT-5 and things like that, the amount of knowledge and power that we're getting. So that's been good financially, but it also makes it really hard to run a product when the technology is changing so rapidly. And then you have things that make my heart stop, when you see, oh, ChatGPT is releasing AgentKit, is that going to kill DocsBot? Back in the day it was ChatGPT releasing GPTs. And then you have Google coming out with their own products. It's like, how did I get in this place where I'm competing against Google?

Aaron Edwards
06:59-07:00
somewhat directly, right?

Aaron Edwards
07:02-07:41
But somehow we managed to stay ahead by keeping things very, I think, focused on the customer support use case and very opinionated, I think, in some of our architecture decisions to make sure that we're building tools that are actually useful for our ideal customer. So there's been a lot of adjustment and growing in there. And people like Matt have been super helpful as being a customer himself of giving the feedback and showing, okay, what do people who are actually in this customer support space really need? I want to kind of give a summary of what DocsBot does,

Zack Katz
07:41-09:15
because I don't know if people know. What do you mean DocsBot? Is it just ChatGPT or something? Like, well, kind of, but no. For GravityKit DocsBot, for example, you can go in there and configure a bot. You can configure multiple bots, but each bot is, sure, let's call it a custom GPT if you've used that before, where you can have its own instructions for how it should respond, and then you get to provide it with data sources. Now, with ChatGPT, you have to upload files and stuff. One of the amazing things about DocsBot is you can point it at dynamic sources and say, okay, I want you to read all my blog posts on my website. I want you to read, to check out our YouTube videos, to check out our Help Scout documentation articles. That's all cool, right? That's all public stuff. I also want you to read all of our Help Scout tickets and incorporate those into the responses. And I was worried about that, right? Because one of the problems when you're doing support is there's a lot of personally identifiable information. There's a lot of like email footers. I mean, Help Scout is pretty much 90% horrible email footers. I was worried about how well DocsBot would handle that. But one of the things that DocsBot does, and correct me if I'm wrong, Aaron, is that it kind of summarizes each one of the tickets and provides a summary response for each one of the tickets

Aaron Edwards
09:15-09:17
and uses that as the data source,

Zack Katz
09:17-09:56
not the actual raw ticket information. So if your customers are emailing you usernames and passwords, they shouldn't, they should be using TrustedLogin instead, then that information doesn't get integrated into the DocsBot summaries. So you have one bot that is trained on all your different data sources, and then you can communicate with that bot. They have API integrations. They have Help Scout integrations using webhooks to add auto-responders and stuff like that, so you can set that up if you want to. What am I missing in that elevator speech?
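To make the setup Zack just described a little more concrete, here is a minimal sketch of what a bot configuration might look like: response instructions plus a list of data sources, with ticket sources summarized rather than indexed raw. The field names, source kinds, and URLs are hypothetical illustrations, not DocsBot's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    kind: str   # e.g. "sitemap", "rss", "youtube", "helpscout_docs", "helpscout_tickets"
    url: str
    # Only AI-written summaries of tickets get indexed, never raw email bodies or footers.
    summarize_before_indexing: bool = False

@dataclass
class BotConfig:
    name: str
    instructions: str                       # how the bot should respond, like a custom GPT
    sources: list[DataSource] = field(default_factory=list)

# Hypothetical example: one bot trained on several dynamic sources.
support_bot = BotConfig(
    name="Product Support",
    instructions="Answer as our support team. If you are unsure, offer to escalate to a human.",
    sources=[
        DataSource("sitemap", "https://example.com/docs-sitemap.xml"),
        DataSource("youtube", "https://www.youtube.com/@example"),
        DataSource("helpscout_tickets", "helpscout://mailbox/support",
                   summarize_before_indexing=True),
    ],
)
```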

Aaron Edwards
09:57-10:25
You got a screen share here. Should I pull it up? Yeah, you can. I can show just a little bit. Here's my DocsBot that we're using for our own dogfooding and our own customer support. So I have it trained on, like, our latest new-feature blog posts. So it's kind of up to date with the latest features. We don't have it trained on every single blog post, because that's just too much general information and stuff.

Zack Katz
10:26-10:27
Okay, hold on a second, Aaron.

Aaron Edwards
10:28-10:28
Yeah.

Zack Katz
10:28-10:43
Let's get into this. What is the right amount of information to train on? because one of the things with AI is that it gets overwhelmed if you provide it with too much information and not enough context.

Aaron Edwards
10:43-10:47
It needs to have a good information to context ratio there.

Zack Katz
10:47-10:51
So you're only doing the latest news? Why?

Aaron Edwards
10:52-11:38
Yeah, I have it tied to only our feature updates thing because our entire blog, I mean, we have some articles that might be useful, but it's general stuff about AI, general stuff about customer support. You know, you don't want to pull in too much context. You want to kind of keep it focused to what the answers are, you know, for a bot. So we don't do the entire, like, blog. But then I can add, like, specific sitemaps. So, like, here's just, like, pages, specific pages. Like, on our site, we have, I could have it scan our entire website. But we have things like landing pages for specific industries. We have things like our AI model library, which is not really super useful for our customer support use case. But using DocsBot.

Zack Katz
11:39-11:54
Do people want a preset? I would be worried that they're missing out on something we've already written about, like if they're asking for an AI template, or if there's somebody in an industry that you already have a template done for.

Aaron Edwards
11:55-12:29
Yeah, I mean, it varies. You can add anything. My general rule of thumb is: add anything that you think could be useful for responses. But what we found, normally, when people have a blog or something, it has a lot of stuff that is often out of date, especially if you're doing feature announcements, things like that. So limiting it, by using, for example, our feed feature, to only the latest announcements, that's useful in our case, because we have feature announcements that go way back, and those things have changed slightly, you know, since those initial announcements.

Matt Cromwell
12:30-12:30
Yeah.

Aaron Edwards
12:31-13:44
Yeah. So a lot of different ways of doing that. And then we have, just like you're talking about, we have it trained on our Help Scout tickets. So I can see I have it set, I think, going back the latest three months. And that's something that was tricky for us to learn as we were building these features, where we connect to your help desk and train from past tickets: a lot of times your answers change over time, your staff's answers change over time. Even if they are kind of the gold standard, that changes as your product changes and as your team does. So you can automatically say, okay, I only want to index the last X months of tickets, and then we'll go through there. So I can see it actually found 1,100 tickets that it scanned. And then after passing all those through an AI to summarize and try to turn them into FAQs, it turned that into only 750 individual FAQ items that are generic and such. And then we also read the images and things from the ticket content, if that's useful. So I love that.
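For readers who want to see the shape of the technique Aaron describes here, this is a rough sketch of turning past tickets into generic FAQ entries while dropping customer-specific details. It assumes the OpenAI Python SDK; the model name, prompt wording, and the recent_tickets placeholder are illustrative, not DocsBot's actual pipeline.

```python
from openai import OpenAI

client = OpenAI()

SUMMARIZE_PROMPT = (
    "Rewrite this support ticket as a generic FAQ entry. "
    "Return 'Q: ...' and 'A: ...' lines only. "
    "Drop names, email addresses, signatures, and anything customer-specific. "
    "If the thread contains no reusable answer, return 'SKIP'."
)

def ticket_to_faq(ticket_text: str) -> str | None:
    # One model call per ticket; the summary, not the raw thread, gets indexed later.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SUMMARIZE_PROMPT},
            {"role": "user", "content": ticket_text},
        ],
    )
    answer = response.choices[0].message.content.strip()
    return None if answer == "SKIP" else answer

# recent_tickets would be plain-text ticket threads pulled from your help desk.
recent_tickets = ["Customer: How do I roll back to the previous version? ..."]
faqs = [faq for t in recent_tickets if (faq := ticket_to_faq(t))]
```

Roughly the effect Aaron reports: 1,100 recent tickets in, about 750 reusable FAQ items out, because duplicates and one-off conversations get skipped.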

Matt Cromwell
13:44-14:04
I also was, like, hyper-concerned about the Help Scout part, of training the bot on our Help Scout tickets. So I was like, we're actually in the middle of trying to make our technicians provide better answers, and so there's a lot of not-great answers in there. So I don't know, it's a little bit of a garbage in, garbage out concern.

Aaron Edwards
14:04-14:05
You know what I mean?

Matt Cromwell
14:06-14:11
But we first had it also at three months, but then we brought it down to one month.

Aaron Edwards
14:11-14:15
And actually the responses improved a ton when we did one month instead of three.

Matt Cromwell
14:16-14:55
And it maybe is just because we had a better concerted effort at improving our responses overall in that most recent four weeks. But the one thing that I thought was really fascinating was, what we'll do sometimes is, if we notice a particular problem comes up, we'll quickly put together kind of a canned response that we know is accurate and that we know customers have had success with. And so, for example, with LearnDash, it's pro-only. So you can't just roll back to the previous version very well.

Aaron Edwards
14:55-14:58
And so if there's a problem with a current release,

Matt Cromwell
14:58-15:03
we'll just be like, hey, here's a zip of the previous release, roll back to this previous release.

Aaron Edwards
15:03-15:06
So we had a canned response where we were sending folks this thing of like,

Matt Cromwell
15:06-15:32
oh, this is the problem. We know what's going on and we're going to fix it. And here's a link to Dropbox where you can get this zip. The bot totally picked that up and was providing the link to Dropbox. And it was doing it accurately. Like when they were expressing the exact problem that we were trying to solve. And all of our support technicians were like, see, it's putting us out of a job.

Zack Katz
15:34-15:48
Well, and if your job is responding with an autoresponder, I think that's like, not that that's your job. I'm saying like that part of your job, you should be not doing. Nobody wants to do that. It's like, oh, no, I got to do that.

Matt Cromwell
15:48-15:57
Here we go, blah, blah, blah. Like that's not, you know, work that makes you excited. And that's the way I talk with my support folks too. We're kind of getting ahead. We're all support nerds here.

Aaron Edwards
15:58-16:00
So we're getting ahead of the game a little bit.

Aaron Edwards
16:00-16:08
I mean, the point is, if the answer is already known, you know, from your docs or from your recent staff responses, why would you not automate that?

Matt Cromwell
16:09-16:09
You know?

Aaron Edwards
16:09-16:22
Absolutely. That boring rote work that you just do over and over, and that gives you time to focus on better customer conversations, reaching out, dealing with the more complex debugging issues.

Zack Katz
16:24-17:41
One of the things that we've found to be challenging, and we're working on this, and it's something that I want to improve on for the rest of the business's life, this is not just a fixed, set-in-stone process, is how to tell DocsBot, and how to tell these other AI agents like Help Scout's built-in AI chatbot, which ones are the best responses. Which ones are the ones that should be used as the standard response for this question? Because I don't know about y'all, but when I review the chats that our customers have with DocsBot... and I embedded DocsBot on our support page, above our support form, so that people can interact with the chatbot and try to help themselves if they want to. When I review those chats, they are meandering. They are all over the place. They get one answer to one question, and then they just ask the next question. And it becomes about 15 different chats, and it's all in one thread. So how to turn that... there's a lot of noise, but there's a lot of value in each one of those responses and each one of those questions. And yet breaking those into their own independent atomic units

Aaron Edwards
17:42-17:42
is kind of difficult.

Zack Katz
17:43-17:48
I love to show that off. Yeah, that's a very good point.

Aaron Edwards
17:48-18:07
So we kind of have two different ways. So we have like our conversation logs where you can see like the entire conversation user had with their bot. And that can cover multiple topics, as you said. But then we actually break down within that conversation, individual, like when the AI determines something is a question,

Aaron Edwards
18:08-18:11
or it needs to look up information, we break that down separately.

Aaron Edwards
18:11-18:20
So we actually have two log formats. And this, I don't think any competitor has this, where you have your questions and you also have your conversation logs.

Aaron Edwards
18:20-18:23
And so that way we can collect stats differently,

Aaron Edwards
18:23-19:02
depending on if it's conversation-based. So in a conversation-based view, we want to see kind of just overall how well the conversation was able to answer, the sentiment of that conversation, deflection, whether that conversation led to escalating to human support, things like that. And then we also keep stats separately just by question. So you can look at it in two different ways. And that makes it kind of tricky to know exactly what you're looking at. We're trying to figure out how to communicate that better, but it does help to have both of those data points to analyze.
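As a rough illustration of the two log formats Aaron is contrasting, the records might look something like the sketch below. The field names are hypothetical, chosen only to show the question-level versus conversation-level split, not DocsBot's actual schema.

```python
from dataclasses import dataclass

@dataclass
class QuestionLog:
    # One row per lookup the AI decided to run within a chat.
    question: str
    answered: bool          # could the AI answer from the indexed sources?
    rating: str | None      # optional "positive" / "negative" feedback from the user

@dataclass
class ConversationLog:
    # One row per whole chat session, which may wander across many topics.
    questions: list[QuestionLog]
    sentiment: str          # "positive" / "neutral" / "negative"
    escalated: bool         # did it end with a handoff to a human?
    resolved: bool          # AI classification of the overall outcome
```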

Zack Katz
19:03-19:14
Yeah. And understanding support trends is really one of the things that we're trying to do. And I know I believe that DocsBot does some of this. If you have the business plan, perhaps it has the topics.

Aaron Edwards
19:15-19:57
Yeah, we just launched our conversation topics. Basically, like you can let the AI automatically determine topics or you can preset specific like categories. And then basically, after a conversation has been idle for a certain amount of time, we automatically summarize that into a summary phrase. You don't have to read the whole conversation in your logs. And then we also classify it by sentiment and also topics. So that way you can see overall like what your topics are. And then you can also see over time, like how those things are changing. So that's something we're playing with, like figuring out the best way to display that to users. But yeah, that's definitely useful stuff to know.

Aaron Edwards
19:59-19:59
Love it.

Zack Katz
20:03-20:44
So another question I have, Aaron. Okay. So we all have... hey, product owner: if you don't have good docs, you need good docs. That's step one for the AI revolution: have good documentation and allow your AI to be trained on it. But my question for you, Aaron, and for Matt, is: okay, how do we format docs to be best ingested by AI at the moment? What's the latest thinking on how to write docs? Is it one big doc with everything? Is it little docs? What's the best way to improve retrieval and search and things like that for your AI?

Aaron Edwards
20:44-20:51
Yeah. Well, first, let me correct your assumption because now that we can learn from tickets, you don't necessarily need docs.

Zack Katz
20:52-20:59
Well, that's true, but should we not then create docs for, based on the tickets that we have?

Aaron Edwards
21:00-21:06
We should. Yeah. But I think one of the first blog posts I ever wrote after DocsBot was like, customers never read your docs.

Aaron Edwards
21:06-21:10
Right? Yeah. Which is sad, but true.

Aaron Edwards
21:10-23:29
It's a very, very small percent. That's why AI answers are so useful, because they can take the stuff you've written, which is maybe your gold set of truth, and use it to answer specific questions for the user automatically. But as far as how to format your docs, definitely, what we found... well, let me break down what this retrieval augmented generation is, basically. We're not training a whole new AI model on your documentation. Basically, we're indexing the data sources that you add, whether it's your docs, your past tickets, scanning your website, uploading files, whatever it is. We have more than 20 different data sources that you can use, even podcast episodes and videos, YouTube, all that kind of stuff. So you input all that, and we create a vector database, basically. It's just a fancy way of searching those docs. So first of all, we're stripping all the extra junk, whether it's the headers and footers and nav on your website or in email threads, stripping all that information. And then we try to convert it all mostly to markdown format, which is basically a plain text-based format, but it maintains things like headings and lists and tables in a way that's easy for the AI to understand, and it's really token-efficient. So you have that kind of training database. And when a user asks a question, the AI agent basically, in the back end, says, okay, I need to look up information to answer this question. And based on the question... we're actually using agent-based retrieval now, where it's looking at the whole context of the conversation and saying, okay, I'm going to run anywhere between, we have a minimum of two search queries, but sometimes it can run 10 search queries in parallel. So if they have a complex, multi-part question, the AI is actually determining, and this is what something like Perplexity does in the back end that you don't see, okay, to answer this question I need to look up: what is pricing for the pro plan? I need to look up information about this specific feature, and this specific feature, and this integration. So it's performing all these lookups on the back end, and then it's combining all that into a list of results that's provided to the AI so that it can form its answer.

Aaron Edwards
23:31-23:35
So that's a basic description of how it works under the hood.
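To make the retrieval flow Aaron just walked through a bit more concrete, here's a minimal sketch of the general RAG pattern: index cleaned markdown chunks as embeddings, run a few search queries, and answer only from the top hits. It assumes the OpenAI Python SDK and numpy; the chunking rule, model names, and the markdown_docs placeholder are illustrative, not DocsBot's actual internals.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    out = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in out.data])

# Each doc is already cleaned (nav/footers stripped) and converted to markdown.
markdown_docs = ["# Pricing\n## Pro plan\nThe pro plan costs ...\n## Free plan\n..."]
chunks = [c for doc in markdown_docs for c in doc.split("\n## ")]  # naive split on headings
chunk_vectors = embed(chunks)

def search(query: str, k: int = 4) -> list[str]:
    # Cosine similarity between the query and every indexed chunk.
    qv = embed([query])[0]
    scores = chunk_vectors @ qv / (
        np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(qv)
    )
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

def answer(question: str, search_queries: list[str]) -> str:
    # An agent would generate several queries for a multi-part question;
    # here they are passed in directly, and all hits become the context.
    context = "\n\n".join(hit for q in search_queries for hit in search(q))
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer only from the provided documentation excerpts."},
            {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```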

Aaron Edwards
23:36-24:38
But knowing that helps you format your docs. Because sometimes we'll see users trying to use it in a way that it's not really designed for. Maybe they have a huge, long PDF file or something, and they say, OK, I want you to summarize this entire PDF and extract the data points. Or they'll upload a spreadsheet, and they want it to perform math and advanced analysis on it. And it's like, that's not really what we're designed to do, because it's never going to see that entire document at once. It's going to instead try to find the specific chunks of text that are most relevant to provide that answer. So basically, anything that's closer to an FAQ format, or where you have especially well-designed content, with headings and then paragraph bodies that go along with them, that helps separate those chunks and makes it very clear to the AI what the topic is. That's really helpful.

Matt Cromwell
24:39-24:55
Yeah. That last part there about FAQ format, to me, that's been the stuff that I've found the most success with: trying to formulate the content of the issue in a question-and-answer

Aaron Edwards
24:55-25:01
format as much as possible, particularly a question in a way that normal people are asking

Matt Cromwell
25:01-26:20
that question. And that only comes by analyzing a whole bunch of questions, which makes it kind of a circular process. We are monitoring our DocsBot logs every day, and we look at the way that people ask questions, and it's like, oh, they asked this question in a different enough way that it didn't find the answer that all these other people did find. And then we're like, okay, are we going to hone in on the bot? DocsBot also has this really great Q&A feature that allows you to improve the bot responses by basically giving it an FAQ source. Are we going to do that, or are we going to go straight into our docs themselves and tweak the way that we word them so that they're closer to the way that these people are asking that question, or not? That's what we found. But the reason why that works for us is because we don't have to worry about all the things that Aaron said that DocsBot does before it ingests all of our content, which is that he's stripping out all the content on the page that doesn't matter. And that's also, I think, a really big point and a big key: our websites are full of a whole bunch of information that's not necessarily relevant at all.

Aaron Edwards
26:20-26:20
Yeah.

Matt Cromwell
26:21-26:24
So it's a big benefit to what DocsBot does, I think.

Aaron Edwards
26:24-26:31
And I've heard that that's the same for like this GEO or like AI optimized SEO that everyone's talking about now.

Aaron Edwards
26:32-26:32
Yeah.

Aaron Edwards
26:32-26:43
They say a key is like reformatting your content to be much more kind of FAQ condensed question and answer based. And that's a big ranking factor for AI answers now.

Matt Cromwell
26:46-27:19
100%. Google, for a while, has been talking about that E-A-T model, and it's very, very similar. You're looking for the same types of authority and authenticity in voice and tone and all that. I do think Google's general E-A-T model idea, even though it's kind of controversial because a lot of folks felt like their websites tanked from organic results when that rolled out, I do think that it's the right idea overall. And I think it's highly valuable to AI as well.

Zack Katz
27:21-28:28
I do wonder, so I completely agree. My question to Aaron and maybe Matt, if you know this as well. Okay. So do you do an FAQ doc that is one per product? Do you do one per question where the question itself is the title and then the answer is the body. How do you have them grouped? One of the things we've been discussing at GravityKit, okay, I want every single setting that we have in the entire plugin documented. Okay, so that's something we need to have documented just because that should be documented. But then questions related to each one of the settings as an FAQ kind of thing, it seems like maybe each one of the settings should have its own page. If there are any questions about the settings that are related to it, how do you think about structuring all this stuff? And related, because this isn't a big enough question yet, our co-host Ian asks, how does DocsBot handle situations where I'm supporting like 50 products from one inbox? Yep. Yeah, that's good.

Matt Cromwell
28:28-28:31
It's a conversation Aaron and I have had multiple times.

Aaron Edwards
28:32-29:01
- Yeah, so just answering yours first, Zack, is I just think like keeping it, like even just you're having like a webpage and you're just having like a header for the question and then a paragraph for the answer. I mean, that's usually great. We do have ways in DocsBot where you can like more, have more control over how the data is ingested. And we call that like a raw data format. And basically it's just a spreadsheet and it says, okay, here's the content, here's the title, here's if you want it to like link to a certain place.

Zack Katz
29:02-29:07
if it's using this source. That gives you kind of full control. Does that let you say, weight

Aaron Edwards
29:07-29:14
this more than that? Yeah, yeah, yeah. Currently, prioritization is based purely off semantic

Aaron Edwards
29:14-30:44
meaning. So you have the step where it's doing the semantic search, and some weight is given to keywords too. And then after that, you have a re-ranking step where the AI kind of determines, okay, out of this big list of results, what are the most likely to be relevant? And then it actually gets passed to create the answer. So some of that is a little bit of a black box that we have little control over. Something we're going to release soon is metadata filtering. So you can say, and this speaks to that question that I think Ian just asked, oftentimes you may have multiple products, and they have very different segments of your documentation. Right now, you have to train a different bot for each one. And then in your integrations, whether it's a widget or in Help Scout, you can map: okay, if I tag this conversation for this product, then it gets routed to this specific bot to write an answer. So that's the way it works now, because each bot has its own kind of library. But in the near future, we're going to have actual metadata, where you can categorize your sources and say, okay, this set of sources is for this product, this set is for this product. And then the AI, naturally, based on the conversation context or clarifying with the user, can determine, okay, which filters do I need to use to only look up documentation on this specific thing? So that'll be very helpful, but it does work now by just doing different bots per product. Yeah. I will say I've had some success with this. So we

Matt Cromwell
30:44-32:09
earlier this calendar year launched a brand new product called Stellar Sites, which is the culmination of what Stellar has been working towards for a really long time, which is a hosted environment that comes out of the box with Kadence and Kadence Blocks and Kadence Forms and Solid Security and Solid Backups. And often it will also come with Give and LearnDash. So all of these things are all in one environment and one support desk, essentially. So coming out of the gate with that, I wanted to make sure we had a great, strong AI first touch for all of our support there with DocsBot. And Aaron and I have been talking about that subject for quite a while. What I did in the end, which I find to be relatively successful so far, is I did two big things. One is I wrote a knowledge-base doc only for AI, a doc that was written for AI specifically, because we had docs for Stellar Sites, but not a giant ton of docs. And we had a whole bunch more features and little nuances and things that weren't ready for public docs yet, but were great for AI to know about. So I wrote that first. And then in our custom instructions, I explained really clearly that if, from the context of the question, it's not clear

Aaron Edwards
32:09-32:20
to you which product they're talking about, ask them which product they're talking about before you give an answer. And it actually did respond really well to those instructions.

Matt Cromwell
32:21-32:41
Sometimes a little bit too well, like when I feel the context is perfectly clear, like of course this is Kadence because they're asking about a block, but it's not perfectly clear to the AI. And so it would stop them, like, wait a second, are you talking about Give here? What are you talking about? And they're like, no, dummy, Kadence. But then the conversation moves

Aaron Edwards
32:41-33:20
forward from there, and it worked out okay. We have a customer that does this too. They have a whole bunch, more than 100 different products, that are like generators and things like that, and each one has its own separate PDF manual. So in their custom instructions they actually say, okay, before answering, ask them what the model number is. And then it won't answer or perform a search until they confirm what the model number is. And then the instructions also control the search, so they say, now when you search the docs, always include the model number in your search strings. And that helps find the relevant docs in

Matt Cromwell
33:20-33:27
a more agent-based way. Yeah, I don't do that. That's a good tip. This relates to this conversation as

Zack Katz
33:27-34:07
well: Ian asks, what mechanism should we use to identify missing documentation? When I was developing GravityKit's custom instruction set for our DocsBot, we have the, you know, if you don't know what the customer is talking about, ask them what product they're talking about. And then: make sure to only use shortcodes from this specific list of shortcodes, and I've hard-coded a list of the shortcodes that we have available. We have docs with filters. One of the things that AI loves to do is make up things that don't exist that probably should. It's really good at coming up with shortcode names that should exist.

Aaron Edwards
34:07-34:08
And not hallucinating them, yeah.
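Pulling together the two tricks Matt and Zack just described, a custom-instructions block along these lines is one way to express them. The wording and the shortcode allowlist below are illustrative guesses, not the actual prompts used by Stellar or GravityKit.

```python
# Hypothetical allowlist; substitute the shortcodes your product actually ships.
ALLOWED_SHORTCODES = ["[example_form]", "[example_view]"]

CUSTOM_INSTRUCTIONS = f"""
If it is not clear from the conversation which product the user means
(for example Kadence, Give, or LearnDash), ask which product it is before
answering and before running any documentation search.

When you search the docs, include the product name (or model number) in
every search query.

Only mention shortcodes from this exact list: {', '.join(ALLOWED_SHORTCODES)}.
Never invent shortcode names, even if one seems like it should exist.
"""
```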

Zack Katz
34:09-34:53
One of the things that I saw when I was reviewing the questions and the answers that DocsBot generated, and this isn't because of DocsBot, it's because of AI in general, is that it would often include shortcodes in responses that didn't exist. So not only clarifying what the product is that you're talking about, but also, if you do find that you have issues with hallucinations or whatever, make sure to be clear on a given list of items that are available for the AI to respond with. Yeah. So Aaron, do you have some customer stats that you want to show us? I'd love to see more about, you know... Well, let me first answer Ian's question real quick,

Aaron Edwards
34:53-34:54
because I think that was a good one.

Aaron Edwards
34:54-35:45
He asked, how do you know, how do we use it to identify missing documentation? So two ways, the AI automatically classifies whether or not it was able to answer the user's questions. So in your question logs, you can filter to only questions that the AI couldn't find an answer to. And you can even export that as a spreadsheet, so you can analyze it. And then on our business plan, we have a question topics report. And so that actually, once a month, it looks at all the user questions that it couldn't answer or that had a poor rating by the user. And then it uses AI to kind of turn that into topics. So main themes and then sub-themes. You can say, okay, my documentation or maybe just my product user experience needs to be improved in these specific areas.
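As a sketch of the workflow Aaron describes, finding documentation gaps from an exported question log, something like the following would work. The CSV column names are assumptions for illustration, not DocsBot's exact export format, and a real pipeline would cluster topics with an LLM or embeddings rather than counting keywords.

```python
import csv
from collections import Counter

def unanswered_questions(path: str) -> list[str]:
    # Keep only questions the bot could not answer or that got a poor rating.
    with open(path, newline="", encoding="utf-8") as f:
        rows = csv.DictReader(f)
        return [
            r["question"] for r in rows
            if r["answered"].lower() == "no" or r.get("rating") == "negative"
        ]

def rough_topics(questions: list[str], top_n: int = 15) -> list[tuple[str, int]]:
    # Crude keyword count to surface recurring themes worth documenting.
    words = Counter(
        w.lower().strip("?.,!") for q in questions for w in q.split() if len(w) > 4
    )
    return words.most_common(top_n)

gaps = unanswered_questions("questions_export.csv")
print(rough_topics(gaps))
```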

Matt Cromwell
35:46-36:32
Yeah. I want to pause right there and highlight this. something I keep trying to say in a lot of different contexts. I have found so far that the whole AI first touch chat bot, knowledge-based bot experience is not and never should be a set it and forget it kind of situation in my mind. All the things that Aaron was just highlighting are things that my team is doing on a daily basis, going through the logs and looking for the things that the bot can't answer and then trying to decide whether we train the bot deeper or we update our online documentation. And then they go and they update that documentation. And then that cycle keeps going back because now we have better docs and now it informs the bot better.

Aaron Edwards
36:33-36:35
Now hopefully it can answer the right question going forward

Matt Cromwell
36:36-37:00
until we find the next group of questions that it can't answer well. This to me is why I said at the outset that AI is great and amazing. It's helping our customers faster and better. It's not replacing my team so far really at all because it's given my team different and sometimes more meaningful work to be doing. And that's been great and excellent.

Aaron Edwards
37:02-37:05
Yeah. And you wrote a great blog post about that, Matt. I know, too.

Matt Cromwell
37:06-37:13
Yeah. I actually want to walk through that a little bit at some point. But we wanted to first show your customer data.

Aaron Edwards
37:13-37:54
Overall, yeah. So actually I used an AI coding agent to make a script to dump all the customer data, just to figure out overall what we're seeing stats-wise. This is my ChatGPT analysis. But basically, it was hard because we had to filter out people that aren't using it for customer support. We do have a lot of customers that use it for content generation, or for their internal team, or personal research kind of use cases. So it's hard to segment that. And then also we tried to only look at the most active bots that are being used for customer support. So when you look at question-only based performance, we did more than--

Matt Cromwell
37:54-37:59
Can you zoom in a little bit on that, Aaron, just so it's a little bit more legible for folks? Yeah.

Aaron Edwards
37:59-38:02
Yeah. Yeah. Yeah.

Matt Cromwell
38:03-38:03
Nice.

Aaron Edwards
38:03-38:06
That's a 16 email guy. Yeah. Yeah.

Aaron Edwards
38:07-38:25
Yeah. There you go. There you go. Yeah. So more than a million questions processed. It was interesting: the AI determined it could answer 84% of them from the documentation it had been trained on. So that leaves 12% that, based on the training that people have set up, the

Zack Katz
38:25-38:59
AI couldn't answer. And then we also... I absolutely love DocsBot and I'm a huge proponent, but it's a little confident on the answers. And there's not anything wrong with DocsBot; any AI is going to tell you that it answered successfully. In my experience it's more like probably 60% in terms of an actual, really good answer. That's not to say that that's not incredible, because it is. And DocsBot still saves us a ton of time. It's great. And these numbers might be a little higher, I don't know. Like I,

Aaron Edwards
39:00-39:40
that's my experience at least. Yeah, definitely. Possibly. And you can tune how sensitive that determination is from your custom instructions or custom prompt. Oh, interesting. Because by default, I think the prompt is like: only classify it as a good answer if you were able to confidently get an answer from the provided sources. So I mean, that's kind of the best we can do, I think, right? And some of it will depend on what AI model you're using, whether you're using a mini one to save costs and be faster, or you're using full GPT-5, for example, which is better at

Matt Cromwell
39:41-40:00
classifying. That's a really good point. Because when we first did the Help Scout integration part, we immediately were like, well, these aren't great answers. And I was like, well, you know why, folks? Because we aren't giving great answers in the first place. So the problem wasn't the AI in

Aaron Edwards
40:00-42:42
that case, it was the source material. So yeah. And then a big part, as I think you mentioned, is we found that escalation is a huge thing, a smooth escalation flow, because you don't ever want to give a chance for users to be frustrated. So we have it built into our widget, if you're using it as tier-one support, where it can seamlessly detect if they get frustrated, or if it can't answer multiple times, or if the user simply asks to talk to a human because they don't like AI or whatever. It automatically says, okay, would you like me to connect you to a human? And then it just shows a big button. You click it, it tracks that data, and it automatically sends them to your support page or opens up your live chat widget, and it can even pre-fill a ticket based on the conversation that the user has already provided, using AI. And we track all that. So when you're doing it based on questions only, we're seeing 2% escalated. Now, I think that's low. I tried to filter out everyone that doesn't have escalation enabled, but it's hard; there are a lot of different custom integrations people make with APIs and stuff. And then I think conversation-based is the other thing that's interesting, because with our new agent mode, we're recording the entire conversation and then we're doing classification based on the whole conversation, whether it was resolved or not. And there's an automatic kind of feedback process where the AI is naturally asking, did that answer your question? And they can click positive or negative, or it just determines from their response after that whether or not the user is confirming that it answered their question. So that gives us a lot more tracking. Because when it was just a thumbs up, thumbs down thing, users almost only click thumbs down; no one's going to click thumbs up, even if it was a good answer. So with that, you have a pretty high answer rate, 94% for conversations. And for that, we're tracking resolved using AI classification for the whole conversation. And then we also track if a user confirms it's resolved or not, and then escalations and things like that, too. And sentiment is another big thing. So we're seeing like a five-to-one positive-to-negative sentiment when we classify the conversations. That's pretty good. We're going to try to do better, like CSAT scores, so you can actually see, based on AI answers, what the true customer satisfaction is based on their responses.
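A simplified sketch of the escalation flow Aaron describes might look like the following: offer a human handoff on an explicit request, signs of frustration, or repeated failed answers, and pre-fill a ticket from the transcript. The phrase list, thresholds, and helper names are made up for illustration, not DocsBot's actual rules.

```python
FRUSTRATION_HINTS = ("this is useless", "not helping", "speak to a person", "talk to a human")

def should_offer_human(conversation: list[dict], failed_answers: int) -> bool:
    # Trigger the handoff button on an explicit ask or after repeated misses.
    last = conversation[-1]["text"].lower()
    asked_for_human = any(hint in last for hint in FRUSTRATION_HINTS)
    return asked_for_human or failed_answers >= 2

def prefill_ticket(conversation: list[dict]) -> dict:
    # Turn the chat transcript into a draft ticket so the user doesn't retype anything.
    transcript = "\n".join(f"{m['role']}: {m['text']}" for m in conversation)
    return {
        "subject": conversation[0]["text"][:80],   # first user message as a rough subject
        "body": f"Escalated from the chatbot. Transcript:\n{transcript}",
    }
```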

Matt Cromwell
42:43-44:29
Yeah. Love it. We actually got a couple more good comments here from the crowd I'd love to hear about. Let's get this one up here. Yeah, so Nick McLaughlin says: regarding replacing support, it seems more like a change in the interface of your self-service resources. I'd love to hear more about how, or if, this impacts the human support conversations that come after. It's a really excellent question, and that's been a guiding question for me personally as I've rolled this out. Just for context, folks, I've been working with Aaron and DocsBot since shortly after his public launch, almost the whole time, honestly. And we did a real slow rollout. I really tried to refine our docs first, before having a lot of customer-facing interactions, so that the responses from the bot would be really secure and things like that. But once we finally rolled it out, that's when I got the really valuable information. And I agree by and large with the question behind Nick's question here, which is: aren't you just basically making them read your docs? That's what Aaron was saying earlier, nobody reads your docs. It's true. Now they just talk to your docs with a chatbot, essentially. But the truth is, that's what we always wanted. How many times do we get a question where it was just like, well, if you just would have read the docs, it would have been fine. You wouldn't even be here. You wouldn't even be asking this question. You wouldn't have filled out the whole entire support form if you just would have read the docs. Right.

Aaron Edwards
44:30-44:31
So now they don't have to.

Matt Cromwell
44:32-44:51
They get to conversate with the docs. And I found overall that that has been very, very true, that they get answers that are easy to answer a lot quicker than they would have if they would have submitted a support ticket and waited for us to reply to them with essentially saying, here's what it says in our docs.

Zack Katz
44:54-46:05
One of the ways that we're changing our support structure is that, as AI gets better and better at answering support questions, and we're working really hard on that, we also are shifting our support priorities in order to spend more time on each ticket, to make sure that every ticket that comes through the AI answering filter gets addressed with either a feature enhancement or a new doc or a bug report. We want to make sure that each one of the tickets that we are addressing is addressed so thoroughly that it won't need to be answered again. And it's not like the support is going to go away. But I think that it's going from everything starts with tier one to kind of everything starts with tier two, or maybe even tier three a year from now. And so we're trying to transition more so that if a bug report happens, an issue is created automatically, or very easily, by the support technician. And then our AI bot tries to fix the bug automatically. And then our QA bot tries to identify whether it fixes it.

Aaron Edwards
46:05-46:08
And then our QA person verifies whether the QA bot was correct.

Zack Katz
46:09-46:25
And then the customer within minutes even might be able to have a fix to the issue that they had reported. And that kind of experience is coming soon if we do it right. And if we do it wrong, it's coming a little bit later than that.

Matt Cromwell
46:25-46:26
A little bit later.

Aaron Edwards
46:26-46:51
Yeah. I mean, I'm doing that too. That process isn't fully automated yet, but that's a great direction for DocsBot, where, I mean, when a little feature request comes in or a little bug report, I just fire off an AI agent right then. And it's working in the background in the cloud on my code, coding that fix, you know? The bottleneck I've just got to get past now is the actual QA testing part. Yeah.

Matt Cromwell
46:51-47:34
Yeah, Nick's in the comments here affirming what we're saying. He says, yeah, that mentality of how can I prevent this question from coming up again is crucial. And that's the point. That's the purpose here for sure. Another good question we have is from Ian again. For smaller scale WordPress businesses like mine, is the cost savings significant enough to outweigh the potential customer success opportunities given up by resolving without the human touch? You know what? There's a couple parts to that question that I think are important and significant. For one, I don't think the cost is all that significant, Aaron, right? I mean, DocsBot is pretty relatively affordable overall.

Zack Katz
47:36-47:42
Look up other solutions. DocsBot, you'll see, is about 50 times less money than anything else on the market.

Aaron Edwards
47:43-47:44
It's a great value.

Zack Katz
47:44-47:45
Don't raise your prices, Aaron.

Matt Cromwell
47:46-47:51
We have a couple other episodes about pricing smartly according to the market, Aaron. Maybe you should watch those.

Aaron Edwards
47:52-48:07
We did just raise some prices. Sorry. But we do start at $49 a month, just to get started, and that's enough to get you going. Like you mentioned, Zendesk and Intercom, I think they charge a dollar per actual resolution.

Zack Katz
48:08-48:19
And those are still the consumer-grade stuff we're talking about. Maybe not Intercom, but the other, big enterprise-grade stuff is like thousands of dollars per month. And they often, most of them,

Matt Cromwell
48:19-49:21
they require that you adopt their whole system. The reason why I landed on DocsBot, and why I'm still using DocsBot, is because it doesn't require that I buy into a whole entire support desk ecosystem, a whole entire knowledge base ecosystem. It fits into my setup that I have. Looking into the other ones that I was interested in: Sendbird, for example, is a really interesting platform that I've heard a lot of good things about, but in order to get the benefits from it, I have to buy into everything that they offer. If I'm going to have their chatbot interface, I have to have their help desk and their support knowledge base and all of those things. It's not portable and modular the way DocsBot is. But to Ian's question there about potential customer success opportunities given up by resolving without the human touch, the ones that it's resolving without the human touch are the ones that we keep saying, over and over again, are the ones that the humans don't want to touch in the first place.

Aaron Edwards
49:22-49:23
It's really, yeah.

Zack Katz
49:24-49:30
And those products should probably be better to prevent them from even having to be asked. Or our websites could be clearer, yeah.

Aaron Edwards
49:31-49:31
Yeah, exactly.

Aaron Edwards
49:31-50:13
And I think, to be clear, too, we have two different ways of kind of deploying DocsBot. We have kind of Tier 1 and Tier 2. So Tier 1 might be the chat bot that you have deployed in your product or on your website. And then that seamlessly hands off to Tier 2, which may be creating a ticket in your help desk. And in that case, like, I think both of you are using the Tier 2 a lot. Or I'm not sure if Zack is using Tier 1. But Tier 2, basically, DocsBot will write a draft response to the ticket in Help Scout. And so it's still leaving it for a human to review and edit. So you're still kind of getting that personal touch, but it's doing, gosh, 90% of the work. For me, in my case, that's my experience, you know.

Zack Katz
50:13-50:23
So my question about that, right, I've been going, we turned it on for a while, and that was back in the GPT-3 days, and then that was a mess because GPT-3 isn't GPT-5, right?

Aaron Edwards
50:24-50:27
It's not as good as it is now. It's quite good now.

Zack Katz
50:28-50:42
But there are so many nuances to customer support. Like, oh, one of our customers, Craig, he emails every other day, right? We know Craig. Craig is a customer we know.

Matt Cromwell
50:42-50:48
One of those we have a history with. He's going to make it into the custom instructions: if you are Craig...

Zack Katz
50:48-51:09
Yes. So one of the things that I want to incorporate is training DocsBot on our customers themselves and like what type of responses does this customer prefer? Is this customer technical? Is this customer a repeating person who is a frequent user of our support resources?

Aaron Edwards
51:09-51:16
And I'm planning on having DocsBot be part of a chain that is part of that process

Zack Katz
51:16-51:27
where it looks up in our CRM about the customer. How valuable is this customer in terms of historical value? Is this a pre-sale? Have they visited the site a bunch of times but they haven't bought?

Aaron Edwards
51:27-51:32
All of this nuance is something that humans can do really well, and AIs can do really well,

Aaron Edwards
51:32-52:02
but it's not incorporated yet in the flow of many of these types of systems. Interesting. So in our actual native Help Scout integration, for example, it's not getting passed that data. It's getting passed just their name and email, basically, in the metadata. But if you're using our API to integrate, or Zapier or Make or something like that, to integrate it with your help desk, we do have the ability to add metadata about the customer,

Aaron Edwards
52:02-52:05
about their plan that they're on, things like that.

Matt Cromwell
52:06-52:16
Yeah. So you could have the customer even self-identify in terms of their technical aspects and that could be passed then to the bot. Yeah, exactly.

Zack Katz
52:17-52:31
And if they mention the word filter, you know, we automatically update their profile. Or if there's an opening PHP tag, it's like, okay, this person knows code, right? Or is at least able to write an opening PHP tag.

Aaron Edwards
52:31-53:30
Classifying, that's interesting. Generally I find, at least from the conversation context, because each time it answers a ticket, like in Help Scout, it has the whole conversation context. So it can determine a lot about the user just from that, like how they're talking, how they're communicating. We're using GPT-5 now, and that's one of our first models that includes reasoning. And we've found that so powerful. So for our Help Scout integration, we actually turn on, we bump up, reasoning. In a chatbot context, you can't use that, because you don't want them to wait while it's thinking, thinking, thinking, right? But when it's writing a draft to your Help Scout ticket, it has a lot more time to do that deep reasoning. And we also input any screenshots that the user uploads, or the staff member uploads, in that conversation. That's all included as context for the AI model to do reasoning and to answer questions. And that's one of the things that always blows me away, the answers sometimes. And especially, I get a lot of support tickets in Spanish or Japanese or

Aaron Edwards
53:30-53:31
Portuguese.
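The idea Aaron just described, spending little or no extra reasoning in the live chat widget but dialing it up (and passing any screenshots) when drafting an asynchronous ticket reply, can be sketched roughly as follows. It assumes an OpenAI-style client; the reasoning_effort parameter and exact values vary by model and SDK version, so treat this as illustrative rather than DocsBot's implementation.

```python
from openai import OpenAI

client = OpenAI()

def draft_reply(ticket_text: str, screenshot_urls: list[str], live_chat: bool) -> str:
    # Screenshots from the thread become extra context for the model.
    content = [{"type": "text", "text": ticket_text}]
    content += [{"type": "image_url", "image_url": {"url": u}} for u in screenshot_urls]

    resp = client.chat.completions.create(
        model="gpt-5",
        # Keep live chat snappy; let asynchronous drafts think harder.
        reasoning_effort="low" if live_chat else "high",
        messages=[
            {"role": "system", "content": "Draft a support reply from the provided docs."},
            {"role": "user", "content": content},
        ],
    )
    return resp.choices[0].message.content
```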

Zack Katz
53:31-53:35
And it's like crafting the answers in whatever language.

Aaron Edwards
53:35-53:51
It's crafting this amazing answer. And even when I like Google translate their question, like, I don't know what they're talking about, but when I translate the answer that DocsBot made, I'm like, okay, it understands a lot better what they're saying. Maybe that's cultural context. I don't know.

Matt Cromwell
53:52-54:01
Yeah, that's another amazing benefit that folks need to recognize is multilingual support without hiring a whole bunch of additional native speakers.

Zack Katz
54:02-54:42
It's really amazing. One of the tips that we've implemented, that I think others should do, is that in part of our prompt, in the custom instructions for DocsBot, we have it score itself in terms of how deep it thinks the response should be, before then saying, okay, if it's a quick response, respond quickly. That's something that GPT-5 already does, but we enabled GPT-5 and we told the prompt to tell GPT-5 how much thinking it should do. So we incorporated it so that the chat is fast when it's an easy answer and it thinks much longer when it's a hard question.
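An instruction block along these lines is one way to render the trick Zack describes, having the model rate a question's difficulty before deciding how much to think. The wording is a guess at the technique, not GravityKit's actual prompt.

```python
# Illustrative custom-instructions snippet; tune the scale and actions to your own bot.
DEPTH_SCORING_INSTRUCTIONS = """
Before answering, privately rate the question's difficulty from 1 to 5.

- 1 or 2 (simple, covered directly by the docs): answer immediately and briefly.
- 3 (needs a couple of doc lookups): think step by step, then answer.
- 4 or 5 (multi-part, debugging, or ambiguous): reason carefully, consider asking
  one clarifying question, and cite the doc sections you relied on.

Do not show the rating to the user.
"""
```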

Matt Cromwell
54:43-57:22
That's interesting. Another really good tip. I like that. I did want to do a little show and tell myself, actually, real quick, if I could. I wrote an article just recently that I've been meaning to write for forever: the real power of AI in support isn't fewer tickets, it's better answers for more people. And you can check that out on my personal website. But what I want to lean in on is just a little bit of the data that I collected. This is over a short period of time, only about two and a half months after we implemented DocsBot throughout the whole Events Calendar ecosystem. So I'll back up just a tiny bit real quick and say that my rollout plan for each of the Stellar brands was that we had to use the bot internally first, to validate whether or not the documentation was strong enough to give the bot good answers. And we did that for a long period of time. And then we rolled the chatbot out to only the documentation pages, to start getting actual customer interactions, but only in the docs. So we know that they're looking for support in one form or another, and it's on the docs pages themselves. And then we monitored those responses and used those responses to improve the docs even more. Once we felt like there was a good resolution rate shown in those interactions, then we would roll it out either directly in the product itself, which we've done now with The Events Calendar, where you can go into the help tab of The Events Calendar plugin in your WordPress website, and you get to interact with our DocsBot agent right there in your WordPress website. And this is that rollout that you see here. And you can see how the red line in particular is the chats that we have with DocsBot, and the support tickets are the yellow line. You can see how they're already converging a bit. But what you also see is that the total number of interactions is increasing. And that's the part that I think is really important and fascinating to acknowledge: like I've said several times already, it's not reducing the number of team members that I need, but it is greatly increasing the number of customers I'm actually serving, without negatively impacting the capacity of my team. And that's what I really love about it. We actually are getting more answers to more customers without demanding more of our team, essentially.

Zack Katz
57:23-58:03
And when you have a support team, it's always frustrating when I talk to customers one-on-one, or when I'm on a support ticket with a customer that's been with us for years. They've been with us for 10 years and this is the first time they've contacted support. I feel frustrated because I know they've had questions about how to do certain things, or I know they've been frustrated, and they haven't found a way that unlocks the feeling that they can just ask somebody, that they can just ask a support question. DocsBot, automated AI chats, things like that, make it so the customer doesn't feel guilty about asking a question.

Aaron Edwards
58:04-58:05
And I think that that's a huge part of it.

Zack Katz
58:06-58:20
And even searching documentation is a burden. I think having DocsBot, and having it be a conversational thing, unlocks our customers' ability to feel like they can ask a question without causing any problems.

Matt Cromwell
58:20-58:40
Yeah. And the questions they ask: when they talk to a person, they're apologizing. "Oh, man. I mean, I think it's about this. I'm not sure." But you look at the questions they ask the bot, and the cadence is broken. They don't care. They're just like, give me answers. I'll say it however I want to say it. You're a bot.

Aaron Edwards
58:43-58:55
That's funny. Matt, have you found that the way you hire and train your support team is different now that they're handling more tier-two-level stuff than tier one?

Matt Cromwell
58:55-59:11
There were several good, pointed conversations we had where I said very specifically: folks, if you are going through our queue and looking for the easiest questions possible, that's not a long-term strategy for you.

Aaron Edwards
59:12-59:13
No more cherry picking.

Matt Cromwell
59:14-59:22
Yeah, exactly. If you're cherry-picking, that's not going to do you many favors in this job.

Zack Katz
59:23-59:33
If you're a procrastinator, that's a good way to get started on support. So if you need a couple of easy ones to get warmed up, go for it. I'm just saying, any procrastinators out there, I hear you.

Matt Cromwell
59:35-01:01:07
But the ones who really want to level up and start doing more tier-two-type support, that's where all of it is going. So, I mean, I'll be perfectly clear: we're leveling up all of our people along the way right now as we go. We want to keep everybody and keep it going strong. But I do think that means that going forward, we're only hiring tier-two folks. And that does have a bit of a limiting effect on the number of people we hire in the future, for sure. So I think it's important to be realistic about that. But in my mind, it's still what we have been saying: if you're hiring somebody who's really only doing low-hanging-fruit things that they can answer in, like, 20 seconds or less all the time, it's not a fulfilling job. If that's how they spend six to eight hours of their day, they're not going to stay with you long, because who wants to do that all the time? They're going to be looking for new opportunities. Ideally, I train them to be a tier-two technician, and there is more exciting work to be done. But in the long term, I'm really hoping that everybody enjoys their job. They get to monitor the bot. They get to update documentation. They get to create pitches. We create pitches now for the product team based on customer feedback that we get. It's a much more enriching, holistic job than just tier one. I think it aligns nicely with what

Zack Katz
01:01:07-01:02:04
society as a whole is going to need to do. And when our support team is trained in AI, they're able to become a junior developer. They're able to become a junior product person. That's what the future is: okay, I have this bug. I'm not going to have to hand it off to the developer. I'm going to start this process that might be automated, or it might not be. Right now it might be a little more manual than it will be a year from now. But I want everybody to be enabled to solve the problem, not just report it. I want people to be able to fix the website if it's not clear enough, if the language on the site is a little confusing. Like, we should fix that. I think it's a more results- and outcome-oriented future, rather than just spinning the hamster wheel of doing support, pushing that rock up the hill and having it roll down.

Aaron Edwards
01:02:04-01:02:08
That is the experience of opening an inbox every day as a support person. Or it has been.

Zack Katz
01:02:09-01:02:17
Maybe now it's more like working on the product holistically and being more involved in other parts of the business.

Matt Cromwell
01:02:20-01:02:24
Yeah. Well, we have had a very riveting show, honestly.

Zack Katz
01:02:24-01:02:26
I love support. It's a great topic.

Matt Cromwell
01:02:26-01:02:41
I think we all do, which is why it's been so great. But we do need to wrap it up. Aaron, every single episode we conclude by giving folks our best advice. So what is your best advice for anyone thinking about implementing AI in their support operation?

Aaron Edwards
01:02:43-01:03:35
I would say there are a ton of chatbot-building tools or agent-building tools out there now. But what's really important for this use case is the back-end management, as we've talked so much about, whether it's having good logs and filtering and analysis, reports, ways to fine-tune the answers easily over time. I think that's a big thing to focus on. And the other one is making sure there's a very smooth handoff to a human, so that you don't give users a bad experience. I think the worst AI support deployment is the one where they stick up a chatbot and that's all users can talk to, and you just see them saying, "I hate AI," "bot, bot, bot," whatever, because there's no smooth process to hand off to the human team. Those are my two tips, I think, that I've learned from our customers.

Matt Cromwell
01:03:36-01:03:38
Love it. Zack, what about you?

Zack Katz
01:03:39-01:04:18
Well, one of the things we had customers asking about frequently was our changelog. They'd see an email come in where we talked about a new product, they wanted to know more about it, and they would ask the bot. And because EDD is set up in a certain way, the changelog wasn't exposed publicly. So I added a changelog endpoint to our products that is nicely formatted and has all the latest information, and then I added that to the Yoast sitemap, so DocsBot can scrape our changelogs now and has the latest information on all our releases. So my best advice is: don't forget your changelogs. Make sure they're being indexed as well.
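
For anyone implementing the same thing, here is a small, hypothetical sketch of how you might verify that a public changelog URL actually shows up in the sitemap a crawler like DocsBot would follow. The URLs, and the assumption of a standard Yoast-style XML sitemap index, are illustrative; the changelog endpoint itself would live in your product's own code.

```python
# Hypothetical check: is the public changelog listed in the sitemap a crawler
# like DocsBot would follow? The URLs below are placeholders.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap_index.xml"  # assumed Yoast sitemap index
CHANGELOG_URL = "https://example.com/changelog/"       # assumed public changelog page


def changelog_is_indexed(sitemap_url: str, changelog_url: str) -> bool:
    """Return True if the changelog URL appears in the sitemap or its children."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).text)
    locs = [el.text for el in root.iter() if str(el.tag).endswith("loc") and el.text]

    # A sitemap index points at child sitemaps; follow one level of nesting.
    for loc in list(locs):
        if loc.endswith(".xml"):
            child = ET.fromstring(requests.get(loc, timeout=10).text)
            locs.extend(el.text for el in child.iter()
                        if str(el.tag).endswith("loc") and el.text)

    target = changelog_url.rstrip("/")
    return any(loc.rstrip("/") == target for loc in locs)


if __name__ == "__main__":
    print("changelog indexed:", changelog_is_indexed(SITEMAP_URL, CHANGELOG_URL))
```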

Aaron Edwards
01:04:18-01:04:25
That's good. We do that as a feed, as a category on our blog that we use to index.

Zack Katz
01:04:26-01:04:27
How about you, Matt?

Matt Cromwell
01:04:28-01:05:36
Yeah. It's hard to narrow it down to one, but my tip is to focus on your docs as much as you can. I agree with Aaron, like you said at the outset, people don't read your docs, but you focus on them because they should be the backbone of any support interaction you create through AI. You also create them because they're still really great for gaining organic traffic. And it's a really excellent exercise for your team to increase their knowledge of your product. There are just so many benefits of online documentation for your support team. I know some teams have the marketing team or the product team do online documentation. I really think it's got to be support. You've got to get them involved in writing and editing and updating and tweaking and reading all of the docs as much as possible, because there are lots of benefits that just pile up on top of each other.

Zack Katz
01:05:37-01:06:06
Ian and I had a great conversation where he was saying there's a concept called knowledge-centered support, or knowledge-centered service, where every interaction somebody has with customer support should result in docs changes or product enhancements or some action. And that's really tough, but that's an approach to support: every single time there's a question, make sure that your docs are

Matt Cromwell
01:06:06-01:06:14
updated or added. Yeah, yeah, love it. Love it. Great. Well, Aaron, how about you tell everybody where they

Aaron Edwards
01:06:15-01:06:43
can find you and where you are on the web. Yeah, I'm really active on Twitter. My handle is UglyRobotDev, and I'm kind of building in public there, so I'm always sharing the stuff I'm learning, whether it's marketing or support or AI technology things. And of course, DocsBot.ai, if you want to sign up and give it a try with your own support. And for those of you who have AI

Zack Katz
01:06:43-01:06:48
transcription enabled: it's D-O-C-S-B-O-T dot AI. Yes.

Matt Cromwell
01:06:48-01:06:54
It is not a dog spot, right? Dog spot. Yeah.

Aaron Edwards
01:06:55-01:06:59
I had to put that in my keywords to autocorrect for me.

Zack Katz
01:07:00-01:07:11
Thanks again, Aaron. And next week we're going to be talking about the art of handling feature requests, which is a very related topic, with Robbie McCullough from Beaver Builder. So make sure to tune in next week.

Matt Cromwell
01:07:12-01:07:55
And if you're enjoying these shows, do us a favor and hit that subscribe button. Literally, it's the freest, easiest way you can support the work that we're doing. Build something awesome this week, and we will see you all next week. Thanks so much. Bye-bye.
