So you've got a website, right? And you really wanna know how it's doing on Google for the keywords that actually matter to your audience. Sounds simple.

Yeah, sounds simple, but then you try to figure out where you actually land on that search engine results page, the SERP, for maybe dozens, even hundreds of keywords, and checking manually every day? Forget it.

Oh, absolutely. It's just not feasible. Way too time consuming.

Exactly, and finding a tool that does it without costing a fortune or overwhelming you with a million features you'll never use, that's tough too. It's easy to get buried in information.

It really is. I mean, understanding your ranking is so basic to knowing if your website's even working, if people are finding you. But just getting that info reliably, efficiently...

Yeah, that's often the tricky part.

And that difficulty, that's exactly why we do the deep dive. We take the source material that you, the listener, share with us, and for this one we've got notes from a GitHub repo and some intro docs about a particular tool.

All right, and we act as your guides, basically. We pull out the main points, explain how it all hangs together, and help you get up to speed fast, sticking really closely to what's actually in those documents.

So our mission for this deep dive: we're unpacking an open source tool called SerpBear. Based only on the sources we have, we're gonna look at what it is, what features the docs say it has, get a handle on how it actually works behind the scenes, and walk through the setup steps they lay out. We really wanna make this clear, especially if maybe you're just starting out with this kind of SERP tracking software, trying to keep it beginner friendly.

Absolutely. Think of this as your quick guide to understanding the specific approach to tracking your search rankings, all based on the info provided.

Okay, let's do it.
But first, before we jump in, a huge thank you to the supporter of this deep dive, SafeServer.

Yes, thanks, SafeServer.

SafeServer is a fantastic partner. They can handle the hosting for software just like the one we're discussing, SerpBear, and really support you in your whole digital transformation. You can find out more over at www.safeserver.de. That's www.safeserver.de. Check them out.

Okay, so, SerpBear, let's unpack it. Right at the top, the sources use two key phrases: "open source" and "search engine position tracking and keyword research app." So the main idea seems to be, well, tracking where your keywords rank on Google and telling you when things change.

It's the core description from the sources, yeah. It's built to be open source, so the code's out there for anyone to see or tweak, and it's really focused on Google position tracking and helping with keyword research.

Okay, so not just tracking where you are now, but also helping find new keywords. That's interesting. Now, the docs list several features. One jumps out immediately: unlimited keywords.

Yeah, that sounds pretty good, doesn't it?

Yeah. Especially compared to a lot of paid tools that have strict limits.

It really does. The sources say, word for word almost, unlimited domains and unlimited keywords for tracking SERP positions. But, and this is important, we need to connect that claim with other bits in the documentation. There's like a little asterisk footnote.

Ah, okay. The fine print.

Sort of. It comes up again in the getting started bit and a comparison table they include. It clarifies things.

Okay, so based on the sources, what's the real deal with unlimited?
Well, the sources explain that while SerpBear itself, the software, doesn't count the number of keywords you put in, the actual number of searches it can run each day or month depends entirely on the external scraping service you hook it up to.

Ah, okay. So SerpBear doesn't do the searching itself.

Not directly, no. And that asterisk footnote, it qualifies unlimited lookups by saying something like free up to a limit, and points to the free plan of a service like Scraping Robot as an example, which apparently gives you 5,000 lookups a month.

Right, I see. So the software can handle unlimited, but the work of checking the ranks is done by another service, and that service might have limits or costs. That's a really key distinction.

Exactly. It really highlights the whole model of this tool, doesn't it? You get the software, which is flexible, but you need to bring your own engine, so to speak, to actually do the Google checking.

Yeah, it does. And that ties right into how it works, which the sources also cover. So how does SerpBear check those rankings, according to the docs?

It uses third-party website scrapers or proxy IPs. The sources explain it like this: think of a scraper as a sort of automated mini browser. It goes to Google, types in your keyword, and then scans the results page to find your website. SerpBear itself just tells one of these external services to do that. The docs list examples like Scraping Robot and SerpApi, plus a couple of other scraper services, or they mention you could use your own set of proxies if you have them.

Gotcha. So SerpBear is like the control panel, and it sends the orders out to one of these external scraping services. That makes total sense now, why the number of checks depends on that service.

Precisely. That's the core way it gets the ranking info.
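To make that control-panel-plus-engine model a bit more concrete, here's a minimal TypeScript sketch of what asking an external scraping service for a keyword's results and scanning them for your domain could look like. The endpoint, query parameters, and response shape are illustrative assumptions, not Scraping Robot's or SerpBear's actual API.

```typescript
// Hedged sketch of the "bring your own engine" model described above.
// The service URL, parameters, and response shape are assumptions for
// illustration only; real scraping providers each have their own API.

interface OrganicResult {
  position: number; // 1-based rank on the results page
  url: string;      // landing page URL for that result
}

async function checkPosition(
  keyword: string,
  domain: string,
  apiKey: string
): Promise<number | null> {
  // Ask a hypothetical scraping service to fetch Google's results page.
  const endpoint =
    `https://scraper.example.com/v1/search?q=${encodeURIComponent(keyword)}` +
    `&key=${encodeURIComponent(apiKey)}`;
  const response = await fetch(endpoint);
  if (!response.ok) {
    throw new Error(`Scraper request failed: ${response.status}`);
  }

  // Assume the provider returns the organic results already parsed.
  const results: OrganicResult[] = await response.json();

  // Scan for the first result that belongs to the tracked domain.
  const match = results.find((r) => new URL(r.url).hostname.endsWith(domain));
  return match ? match.position : null; // null means "not found on the page"
}
```

The point of the sketch is simply that the tool orchestrating calls like this is separate from the service that actually fetches Google, which is why the daily or monthly quota lives with the scraping provider rather than with the tracking software.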
OK. What about some of the other features mentioned? You said notifications earlier?

Right. Email notifications. The sources say you can get alerts about position changes, keywords moving up or down, and you can choose how often: daily, weekly, or monthly.

Useful, but how does it send those emails?

Ah, well, the sources say you need to set up SMTP details.

SMTP. For someone maybe not familiar, what's that in simple terms?

Sure. SMTP just stands for Simple Mail Transfer Protocol. It's basically the standard internet language computers use to send emails to each other. So you need to tell SerpBear how to talk to an email sending service.

Like connecting it to Mailgun or something similar.

Exactly. The sources actually mention Elastic Email or SendPulse as examples, and note that they have free options to get started. So you plug those details into SerpBear's settings.

OK, makes sense. So you need an email service for the alerts. The docs also mention an API.

They do, yeah. A built-in API. The documentation suggests this is useful if you want to pull your ranking data out of SerpBear and maybe feed it into other tools you use, like custom dashboards or marketing reports.

Right, getting the data out for other purposes. Cool.
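As a rough idea of what pulling data out through an API like that might look like, here's a hedged TypeScript sketch. The endpoint path, auth header, and response fields are placeholders for illustration; they are not SerpBear's documented API.

```typescript
// Hedged sketch: fetching tracked keyword data from a self-hosted instance
// so it can feed a dashboard or report. The endpoint path, auth header, and
// response fields below are illustrative assumptions, not a documented API.

interface TrackedKeyword {
  keyword: string;
  position: number | null;
  lastUpdated: string;
}

async function fetchKeywords(
  baseUrl: string, // e.g. wherever your own instance happens to be running
  apiKey: string   // whatever credential your instance expects
): Promise<TrackedKeyword[]> {
  const response = await fetch(`${baseUrl}/api/keywords`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return (await response.json()) as TrackedKeyword[];
}

// Example usage: print keywords that currently rank in the top ten.
fetchKeywords("http://localhost:3000", "changeme").then((rows) => {
  const topTen = rows.filter((r) => r.position !== null && r.position <= 10);
  console.log(topTen);
});
```

The takeaway is the pattern, not the exact URL: once the data is reachable over HTTP, any dashboard or reporting script can consume it.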
What about the keyword research part? How does that work according to the sources?

So for this, the sources say it integrates with a Google Ads test account.

That's an interesting detail, the test part. Yeah, why a test account, do they say?

The docs don't really elaborate on the why. They just specify using a test account. But through that connection, SerpBear can apparently auto-suggest keyword ideas based on your site's content, and it can also show you monthly search volume data for keywords. It helps you find what people are actually searching for.

Interesting. Using a test ads account for volume data. OK. And then there's also integration with Google Search Console, GSC. That's Google's own thing for site owners.

Yeah, and the sources suggest this is a pretty powerful one. By linking SerpBear to your GSC account, you can apparently pull in actual data, like how many clicks and impressions your site got for specific keywords, directly from Google's own reporting.

Oh, wow. So that goes beyond just rank, right? That's real traffic data.

Exactly. So let's just say it helps you see real visit counts, find new keywords you didn't even know you were ranking for, and spot your top performing pages and keywords by country, all using that real GSC data.

Nice. So you get the rank position from the scrapers, maybe search volume estimates from the Google Ads test account, and then the actual performance data from GSC. That paints a much fuller picture, doesn't it?

It does, according to the sources. It brings different data points together in one interface.
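As a small illustration of what bringing those data points together can mean in practice, here's a hedged TypeScript sketch that merges scraped positions with GSC-style clicks and impressions by keyword. The field names are assumptions made for the example, not SerpBear's internal schema.

```typescript
// Hedged sketch: combining a scraped rank with GSC-style performance data.
// Field names are illustrative assumptions, not SerpBear's actual schema.

interface RankEntry {
  keyword: string;
  position: number | null; // null = not found by the scraper
}

interface GscEntry {
  keyword: string;
  clicks: number;
  impressions: number;
}

interface CombinedRow extends RankEntry {
  clicks?: number;
  impressions?: number;
}

function combine(ranks: RankEntry[], gsc: GscEntry[]): CombinedRow[] {
  const byKeyword = new Map(gsc.map((g) => [g.keyword, g]));
  return ranks.map((r) => {
    const g = byKeyword.get(r.keyword);
    return { ...r, clicks: g?.clicks, impressions: g?.impressions };
  });
}

// Example: a keyword can rank well yet earn few clicks, which is exactly the
// kind of gap a combined rank-plus-GSC view is meant to surface.
console.log(
  combine(
    [{ keyword: "serp tracker", position: 4 }],
    [{ keyword: "serp tracker", clicks: 12, impressions: 900 }]
  )
);
```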
What about using this on your phone? Any mention of that?

Yep, the sources mention a mobile app. Specifically, they call it a PWA, a Progressive Web App.

Ah, OK. So not a native App Store app, but one that works well in a mobile browser.

Pretty much. It means you can access SerpBear on your phone or tablet through the browser, but it's designed to feel more like using a dedicated app. Smoother experience on mobile.

Good to know. And let's circle back to cost for a second. The sources make a point about zero cost to run.

Yes, that's highlighted. The documentation specifically says you can run SerpBear on platforms like mogenius.com or Fly.io for free.

So that's about the hosting cost for the SerpBear software itself, right? Not the scraping costs we talked about.

Exactly. It refers to the cost of actually running the application, the infrastructure for SerpBear. On those specific platforms, the sources say that cost can be zero. It just reinforces that open source, self-hosted model. You deploy it, you run it, and here are some potentially free places to do that.

OK. So zero cost to host the app itself on certain platforms, and then separately the scraping service, which is potentially free up to a point, or paid. It's definitely a different kind of cost structure than just paying one monthly fee for everything.

Precisely. It's a key difference in the approach described in the source material. You trade off convenience for control and potentially lower direct software costs.

And one last feature mentioned, exporting.

Oh, yeah. Simple but important. Export to CSV lets you download your keyword data into a spreadsheet file, which is always handy for doing your own analysis or making reports.

OK, great. So that seems to cover the main features the sources lay out: tracking via external scrapers, the notifications, the API, keyword research using that Google Ads test account link, the really valuable GSC integration for real data, mobile access via PWA, potentially free hosting, and data export.

Seems like a good summary based on the docs.
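For a concrete picture of that CSV export idea, here's a tiny hedged sketch of turning keyword rows into CSV text. The column names are illustrative assumptions; the app's actual export columns may differ.

```typescript
// Hedged sketch: serializing keyword data to CSV for spreadsheets or reports.
// Column names are illustrative assumptions, not the exact export format.

interface KeywordRow {
  keyword: string;
  device: string;
  position: number | null;
}

function toCsv(rows: KeywordRow[]): string {
  const escape = (value: unknown): string =>
    `"${String(value ?? "").replace(/"/g, '""')}"`; // quote and escape fields
  const header = ["keyword", "device", "position"].map(escape).join(",");
  const lines = rows.map((r) =>
    [r.keyword, r.device, r.position].map(escape).join(",")
  );
  return [header, ...lines].join("\n");
}

console.log(
  toCsv([
    { keyword: "serp tracker", device: "desktop", position: 7 },
    { keyword: "rank checker", device: "mobile", position: null },
  ])
);
```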
Now, let's say someone listening is thinking, OK, this sounds kind of interesting, maybe worth trying. How do you actually get started? The sources give a step-by-step path, right?

They do. They lay out a sequence. First step: deploy and run the app.

Right. This isn't just a website you sign up for. You actually have to get the software code and run it somewhere, self-hosted.

Exactly. The sources mention using Docker, which is a popular way to package and run software, or maybe running it without Docker, depending on your setup. But yeah, step one is getting it running.

OK, you get it running. Then what?

Step two: access the app and log in. Once it's running, you go to its web address and log in. Step three: add your first domain. Tell it which website you actually want to start tracking.

Pretty straightforward so far. Add your site. What's next?

Step four is key, and it loops back to how it works. You need to get a free API key from a provider like Scraping Robot, or choose a paid one.

This is getting that engine we talked about, the scraping service.

Exactly. The sources say you can skip this if you're using your own proxies, but for most people it sounds like grabbing an API key from one of these services is the way to go.

And then you have to tell SerpBear to use that key.

Correct. Step five: set up the scraping API or proxy details inside SerpBear's settings. This is where you connect your running SerpBear app to that external service so it can actually start checking the rankings.

OK. Connect the software to the data source. Makes sense. Then you add your keywords.

Yep. Step six: add your keywords, the specific search terms you want SerpBear to monitor for the domain you added earlier.

And there were some optional steps, too.

Right. Step seven is optional: set up those SMTP details in the settings if you want the email notifications. Like we said, connect it to an email sending service, maybe one of the free ones the sources mentioned.

Got it. For the alerts.

And step eight, also optional: integrate the Google Ads test account and Google Search Console. That's if you want the keyword research data and those deeper insights about actual clicks and impressions from GSC.

OK. So the basic setup is: get the app running, add your site, connect it to a scraper, add your keywords. The emails and the deeper Google integrations are extras you can configure after that core setup.
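To give a feel for what those SMTP details in step seven amount to, here's a minimal hedged sketch that uses the nodemailer library purely as a stand-in. The host, port, and credentials are placeholders you'd replace with whatever your email provider gives you; the sources only say you supply SMTP details in the settings, not how the app sends mail internally.

```typescript
// Hedged sketch of what "SMTP details" boil down to: a host, a port, and
// credentials for a mail service, shown here with nodemailer as a stand-in.
// All values are placeholders for illustration.
import nodemailer from "nodemailer";

const transporter = nodemailer.createTransport({
  host: "smtp.example-provider.com", // SMTP host from your email service
  port: 587,                         // common submission port (STARTTLS)
  secure: false,                     // true for port 465, false for 587
  auth: {
    user: "alerts@example.com",      // SMTP username
    pass: "app-password-here",       // SMTP password or app password
  },
});

async function sendRankAlert(to: string, body: string): Promise<void> {
  await transporter.sendMail({
    from: '"Rank Alerts" <alerts@example.com>',
    to,
    subject: "Keyword position changes",
    text: body,
  });
}

sendRankAlert("you@example.com", "3 keywords moved up, 1 moved down.");
```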
That seems to be the flow outlined in the sources, yeah. It really walks you through that self-hosted process and highlights the need for that external scraping component.

It definitely shows you're more hands-on with this kind of tool. You have control, but also the responsibility for setting up these different pieces.

And finally, just a quick note. The sources briefly mentioned the tech stack.

Oh, yeah. Anything interesting?

It's built with Next.js, which is a popular web framework, and it uses SQLite for the database. Just a little technical detail included.

OK, good to know. So we've kind of taken SerpBear apart now, based purely on the GitHub notes and docs you shared. If we boil it all down, what's the main takeaway from this deep dive?

Well, based strictly on these sources, SerpBear looks like a potentially powerful open source option if you want to host your own Google ranking tracker. It seems to offer the core features you'd need: tracking, alerts, unlimited keyword capacity, anyway.

Right, capacity being the keyword there.

Exactly, plus those useful integrations with GSC and Google Ads for deeper data. But the big thing is the model. You deploy it, you run it, and crucially, you need to connect it to, and potentially pay for, a separate scraping service to actually do the rank checking, even if the SerpBear software itself, and maybe its hosting on certain platforms, can be free.

So it puts you in the driver's seat for your data, maybe offers a different cost model than typical subscriptions, but it definitely requires more setup. It's a trade-off, isn't it?

It really is. And that brings up a good thought for you, the listener, to ponder. Given this model, open source software, self-hosted, relying on external scrapers with their own free or paid tiers, how does that really stack up against the more traditional all-in-one paid SEO software, where they handle everything behind one subscription fee?
Yeah, what are the real trade-offs for you? Think about control versus convenience, cost structure versus simplicity, and maybe the technical skills needed versus ease of use. It's a different way to get that crucial SERP data.

Definitely something to consider based on your own needs and resources. And if you do want to dig into the nitty gritty of deploying SerpBear or compare those scraping services the sources mentioned, then checking out the actual source documentation in the GitHub repository would be your next step.

Absolutely. The sources are always the place to go for the full details.

And one last time, a big thank you to SafeServer for supporting this deep dive. Remember, they can help with hosting software like SerpBear and generally support your digital transformation efforts. Find out more at www.safeserver.de. That's www.safeserver.de.

Thanks again to SafeServer, and thank you for joining us on this deep dive into SerpBear. We hope unpacking these sources helped clarify what this tool is all about.

I believe it was useful. We'll catch you on the next one.