Dan Leibson, VP of Search, Local SEO Guide
Dan is a data-driven marketer with over a decade of experience in search. Having worked both in-house and agency-side, Dan works to drive business results for Local SEO Guide’s clients. Dan spends his spare time hanging out with his wife and daughters. He enjoys organic gardening, good whiskey and beer, nerd culture, and amazing food. Weirdo.
Noah Learner: Welcome, everybody, to another episode of Agency Automators. I'm one of your hosts, Noah Learner. I'm here with my compatriot, Jordan Choo. What's up?
Jordan Choo: Well, today, yeah, I'm battling a cold, so I'm drinking lots of tea.
Noah Learner: I'm incredibly stoked to have Dan Leibson from Local SEO Guide with us to talk all about how they use automation and all kinds of crazy tools. We're going to get into machine learning and NLP, and we're going to talk about how they use all kinds of different tools, whether node-based or Python-based. He just released a crazy cool tool that we're going to get into. Dan, how's it going today?
Dan Leibson: Awesome and awesome. I was just in Austin, Tuesday through Thursday. Austin is way colder than I thought it would be, so it's nice to be back in SoCal.
Noah Learner: I'm trying to come to grips with which I like more: your hair now, or your hair in the headshot.
Dan Leibson: Oh, it's way longer now.
Noah Learner: So for folks who don't know, can you give us some insight into Local SEO Guide: what your niche is, how you help huge enterprise firms, and what you're really stoked on in the next year or two?
Dan Leibson: Yeah, Local SEO Guide is a boutique SEO consultancy. You wouldn't really get it from the name, but we primarily do enterprise and startup consulting. Most of the local component comes either from multi-location retail companies or, traditionally, from directories and other local organic sites that are trying to rank outside of local pack results. So we have a very technical SEO and analytics-based background in our shop, very different from the GMB-based focus you'd think of in terms of traditional local search.
Noah Learner: Cool. I think that's really neat. And what have you been seeing in the past six months that's been impacting your clients in a way that might or might not be getting a lot of press, in terms of new algorithm changes or new posts on the Webmasters blog?
Dan Leibson: We haven't done anything new, man. We're just keeping calm and carrying on. In all honesty, this whole new thing about the deduping of featured snippets has the potential to be really impactful to a segment of our business, only because we do a lot of work with B2B SaaS companies and startups like that. The way their content works, double dipping on featured snippets (ranking in position five and getting the featured snippet, because you can't get above government entities in positions one or two) has been a really solid strategy. But it's really early to even comment there, right? It's only a week old. I don't have any meaningful data yet to talk about where I would want to pivot our content strategies.
Noah Learner: Cool. Cool. Let's talk a little bit about automation. On the pre-call, what I heard was a lot of tension around how you approach automation, and a lot of thought about when it makes sense and when it doesn't. Can you jump into how you evaluate opportunities and whether to go down one path or another?
Dan Leibson: Yeah, I'll get there in a roundabout way. The first half of my career I worked in-house for DTC startups and stuff like that, hence the technical SEO and analytics background. My first and only agency gig before this was a scalable local SEO shop where the leadership team was constantly fixated on bright, shiny new automation examples, and would constantly push development priorities away from core business objectives toward shiny new automation projects. As a result, it really impacted product development and made us choose automation paths over new core products that we could launch and monetize. I thought that was really problematic. So the way we approach automation is very particular: it's around issues of scale, or things that are meaningful bottlenecks in SEO operations. At a core level, we think about where we can create meaningful value out of automation. Where can we implement some piece of automation that adds meaningful value and lets an analyst take it and deliver a better insight, rather than just what saves us time? Tons of SEO is gathering and collating data, just as an example. If you can do all of that via automation, it lets the person you're paying to do the analysis and drive the strategy use their brain for that, rather than the lookups and joining data tables together and all that stuff.
Noah Learner: Yeah, yeah. So for folks who don't understand, when you're talking about scale, you're dealing with scale unlike what almost anyone else in our industry deals with, right? What's the range of locations that you're dealing with for different clients?
Dan Leibson: Yeah, it's not even locations. Just as an example, we were doing a content strategy for an electric scooter company, and one of the things I wanted to do was figure out what the content, and the opportunity, looks like in the electric scooter space. To do that, I looked at a million keywords, just straight up. The process of how you get those million keywords, where you get them from, how you pare that down to something useful: that's what I'm talking about in terms of figuring out the strategy. It's not that the scale of the site or business is astronomical; it's how you find where the most meaningful opportunities are in an entire topic space. And the only way to do that is to get the data.
Noah Learner: So walk us through the process at a high level. That's fascinating.
Dan Leibson: I'm actually going to have a post about this in a couple of weeks on how we pull out the data at scale. But basically, we have a scripting server, and we can rotate between API calls. One API pulls keyword-level data, so the top 10 pages that are ranking for a particular keyword. Then you flip it back around and pull the URL-level data: all the other keywords those pages are also ranking for. Then you pull the top 10 URLs for the new, refactored list of keywords that you have, flip around again, and pull all the additional ranking keywords. So it's just a scripting process to be able to dump keyword-level data from APIs super quick. Then we roll it through some ngrams, look at it for high-level concepts, and back into what the content strategy is.
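The flip-flop expansion Dan describes, keyword to ranking URLs and URLs back to more keywords, could be sketched roughly like this. `fetch_top_urls` and `fetch_ranking_keywords` are hypothetical stand-ins for a SERP data provider's API, stubbed here with toy data:

```python
# Illustrative sketch of the expand-and-flip keyword research loop.
# The two fetch_* functions are hypothetical stand-ins for API calls.

def fetch_top_urls(keyword):
    """Return the top-ranking URLs for a keyword (stubbed)."""
    return SERP_DATA.get(keyword, [])

def fetch_ranking_keywords(url):
    """Return other keywords a URL ranks for (stubbed)."""
    return URL_DATA.get(url, [])

def expand_keywords(seed_keywords, rounds=2):
    """Alternate keyword->URLs and URL->keywords for a few rounds."""
    keywords = set(seed_keywords)
    frontier = set(seed_keywords)
    for _ in range(rounds):
        urls = {u for kw in frontier for u in fetch_top_urls(kw)}
        new_keywords = {k for u in urls for k in fetch_ranking_keywords(u)}
        frontier = new_keywords - keywords  # only expand what's new
        keywords |= new_keywords
        if not frontier:
            break
    return keywords

# Toy data standing in for API responses
SERP_DATA = {
    "electric scooter": ["site-a.com/scooters", "site-b.com/guide"],
    "scooter laws": ["site-c.com/laws"],
}
URL_DATA = {
    "site-a.com/scooters": ["best electric scooter", "scooter laws"],
    "site-b.com/guide": ["electric scooter range"],
    "site-c.com/laws": ["scooter helmet law"],
}

print(sorted(expand_keywords(["electric scooter"])))
```

In practice the frontier would be pruned each round (by volume, relevance, and deduping) so the "ball of interrelation" stays usable at a million-keyword scale.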
Noah Learner: So it's arrays inside of arrays inside of arrays when you're processing stuff. I love that.
Dan Leibson: You're constantly refactoring SERP data and expanding it out, out, out, right? It's the same philosophy as seed sets in the way search engines crawl: you start with a thing, then you find all the places it's connected to, until you have this little ball of interrelation of all of these topics.
Noah Learner: So what are you losing in the process? Because on the pre-call you were talking about how there are times when we just don't want to automate; it's just the wrong execution. I read a lot of SEOs posting about how important it is to look at the SERPs, and you're kind of skipping that by scripting the keyword research.
Dan Leibson: This is keyword research; this is not looking at SERPs. This is purely for understanding the keyword opportunity, the topic opportunities. Looking at SERPs, I think, is about understanding intent and SERP features, looking at the different things in each of the positions. It's not about what all the keywords out there are and what pages are ranking for them so that you can see what kind of content to create, if that makes sense.
Noah Learner: Have you thought of building a tool to do that too?
Dan Leibson: On the intent thing, we actually have a super crazy, top-secret thing that may be out in a month or two. It's very local-specific, and it should be really interesting.
Noah Learner: I saw you scratching the back of your neck like shit. I can't tell them about this now.
Dan Leibson: You can find some stuff on Twitter about it between me and David Cohen. It's not explicit, but you can find it if you want to look.
Jordan Choo: With a lot of these processes, are you doing them by hand first? Or are you diving right into automation?
Dan Leibson: No, this is how we do SEO. So we're looking at where we can automate things, and where it makes sense, to make these processes more accessible to a greater audience, so that we can have more analysts doing better SEO.
Noah Learner: Got it. Okay, that's super cool. So share with us some losses, and what you learned along the way that made you want to not automate things.
Dan Leibson: Yeah, the only loss I can really tell you about is when we tried to launch our software product, Locadium, to do GMB change notifications. We didn't think about how the system we were architecting on top of could change: the Google Places API, right? How what you see on the front end of Google Maps relates to what you see on the back end in GMB. Then Google changed their API pricing, and it made it insane for anybody that wasn't a large app making millions of calls, where the API budget is a huge part of their infrastructure costs, which wasn't us. It took all the margin out of it. So: don't automate if you can't understand the dependencies of the systems you're architecting, because that can get you screwed. And don't do it if the human brain is better at it. That's just a general philosophy; there's a lot of stuff the human brain is just better at.
Noah Learner: I'm reading a really good book called Range by David Epstein right now. Apparently machines can beat humans at chess because machines can memorize millions and millions of positions on a board, whereas humans are really good at understanding the higher-level strategy on top of that; a human can process the data a machine gives you better than the machine can. How have you thought through that? What buckets are you putting humans on, and what buckets are you not?
Dan Leibson: Yeah, so I want humans to do strategic work. SEO to me, the high-level stuff, is all strategy; it's being able to see the forest through the trees. And there's so much stuff that's not that, that's part and parcel of day-to-day SEO work and the business of running an agency. When people aren't automating, they're doing all of that by hiring really low-wage labor, either exploiting them, outsourcing to the Philippines, or doing all this other stuff to account for the cost. Whatever your personal opinions are on that, it wasn't what we wanted to do. So the solution is to figure out what we're talking about, right? Figure out what the humans do, and automate the rest of it.
Noah Learner: Hmm, love it. Tell us about the tools you're using in-house and the technologies you can share.
Dan Leibson: Yeah, so we're on AWS, EC2, Docker. Language-wise, we use Node and we use Python. We do a lot of stuff via Slack integration. We use Google Sheets for some things, not Apps Script or anything like that, but as an interface, a front end. The first version of the at-scale API keyword extractor used a Google Sheet as an input, just because passing flat files of large-scale data back and forth doesn't make a lot of sense. We have some stuff on BigQuery and some stuff not, and we're moving everything over to BigQuery for data warehousing, because if you have a lot of data, the storage costs for active and cold storage are really, really good, and it makes Google Data Studio reporting faster. If we need to automate any report visualizations, we'll generally do that in Data Studio. Occasionally we'll do some rendering via Python with Matplotlib.
Noah Learner: And you were saying you love pandas? Right?
Noah Learner: Vanilla JS and node.
Dan Leibson: Yeah.
Noah Learner: Okay, cool. Sweet.
Dan Leibson: BigQuery, right? It was primarily front-end stuff back then.
Noah Learner: Yeah. We've already gone over the loss a bit, but what's your biggest automation fail? Can you share any?
Dan Leibson: Yeah, we launched a software product and then shut it down because the infrastructure costs killed the margins. That's a pretty fair one.
Noah Learner: I'm going to cut that out. Don't worry.
Dan Leibson: I'm happy to make fun of myself again. It's okay.
Noah Learner: I want to learn about how and when you got into machine learning and NLP. When did that all start for you? Was it in the past two years, or farther back?
Dan Leibson: Farther back, for sure. Not the NLP part of it, not the way we talk about those things in 2020, but the way Google understands topics and content structure was becoming very apparent post-Panda. Through all the Panda update rollouts and a lot of their core updates, the change in how they understand content has been very clear for a long time. They've been publishing about it; they speak at symposiums on information retrieval. It's there to be seen. I think the reason it's recently become something I've been more focused on is that the barriers to entry have gotten a lot lower, and there are a lot of easy ways to break into this stuff because a lot of the libraries have already been written. The NLTK library for Python is a good example of an NLP library that's just there, open source, and maintained by people doing good stuff. Or spaCy, a machine learning and NLP library; it's a little more expensive, but it's supposed to be faster and easy to use, stuff like that.
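As a toy illustration of the ngram step Dan keeps coming back to, here's a minimal pure-Python version of what NLTK's ngram utilities do: slicing token lists into n-grams and counting the most common phrases across a set of documents. The example documents are made up.

```python
from collections import Counter

def ngrams(tokens, n):
    """Return successive n-grams from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def top_ngrams(texts, n=2, k=3):
    """Count the most common n-grams across a list of documents."""
    counts = Counter()
    for text in texts:
        counts.update(ngrams(text.lower().split(), n))
    return counts.most_common(k)

docs = [
    "best electric scooter for commuting",
    "electric scooter range and battery life",
]
print(top_ngrams(docs, n=2, k=2))
```

With NLTK installed, `nltk.util.ngrams` and proper tokenization would replace the `split()` call; the counting idea is the same.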
We can talk about BERT too. One of my big things right now is to find out how easy it is to deploy BERT models for some of our stuff. The thing you connected me to Andrea for, the Wikilinks corpus: we're going to take the Wikilinks corpus, spend a month ngramming the whole thing on an EC2 instance, and then start scoring it. We're going to figure out how we can incorporate that as one end of a BERT model, with the content in a content silo on a site at the other end. So how does Google's core understanding of the structured data around the ngrams for a topic relate to the content in that silo on the site? I can only talk about this theoretically right now because we haven't kicked anything off, but people keep telling me it should be doable.
Noah Learner: that sounds kind of similar to how his WordPress plugin works, right, Jordan?
Jordan Choo: Um, I want to say similar. I think it allowed, like...
Dan Leibson: A WordPress plugin? Can you hit me up with a link via email? We don't really do WordPress or whatever, but I'm totally interested in how it works.
Noah Learner: He also built a custom implementation so that you wouldn't need to go through WordPress. But basically, it enables you to build up a knowledge graph around you and your entity.
Dan Leibson: It's the custom knowledge graph stuff. That's very cool. I know some people that are doing that. That makes sense with the paper he sent me, actually.
Noah Learner: Okay, cool, because it...
Dan Leibson: It was on deep understanding and deep ontologies, which is very much a knowledge graph thing.
Noah Learner: How do you use these to drive ROI for clients?
Dan Leibson: Yeah, so our content strategy is almost exclusively based around the way we think about content and language. Almost all of the wins I share on Twitter are from our content strategy, versus technical SEO; content and links. One of the next things we're going to do is apply some automation to our link building. Not the outreach or the prospecting or anything like that, but the research. We do a lot of automation around research; we do a lot of research. I don't know about y'all or other people doing SEO, but SEO for us is a research discipline.
Noah Learner: Can you share with us how you think about that? When you started that answer, you said it's all tied to how you think about content.
Dan Leibson: Our wins, how we use NLP, are all tied to how we deploy our content strategies, right. Are you familiar with the tool Clearscope.io?
Noah Learner: yeah.
Dan Leibson: Do you know Bernard?
Noah Learner: Yep. I don't have it yet.
Dan Leibson: Yeah, Bernard is great.
I was able to pick his brain over some beers on Wednesday. The core understanding of sophisticated NLP for me, in content strategy, actually started with talking to Bernard, before he moved Clearscope to Austin, when I met him in San Francisco at a Bay Area Search event. So we use Clearscope for a lot of our content strategy. And the reason I want to build the BERT Wikilinks thing, looking at a content silo on a site, is to do the same thing on the research side, where Clearscope is on the content creation side. Clearscope is how to write about something very specific; this is how to understand how other people are talking about something very specific. If you were to use Clearscope credits or MarketMuse or something for that, it would be thousands of dollars, because they have a much more sophisticated process and their costs are really high. I want something quick and dirty that can scale.
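One quick-and-dirty, scalable approach along the lines Dan describes (this is an illustrative assumption, not his actual method) is simple ngram-set overlap between a reference corpus and a site's content silo, scored with Jaccard similarity:

```python
# Quick-and-dirty topic overlap: how closely does a content silo's
# language match how a reference corpus talks about the topic?
# The two text snippets below are invented examples.

def ngram_set(text, n=2):
    """Lowercase, whitespace-tokenize, and collect the set of n-grams."""
    toks = text.lower().split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def jaccard(a, b):
    """Overlap between two n-gram sets: |A ∩ B| / |A ∪ B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

corpus_text = "electric scooter battery range electric scooter laws"
silo_text = "electric scooter battery tips and scooter laws"
score = jaccard(ngram_set(corpus_text), ngram_set(silo_text))
print(round(score, 3))
```

This is obviously far cruder than a BERT model's contextual embeddings, but set operations like these scale cheaply to a corpus the size of Wikilinks, which is the trade-off being described.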
Noah Learner: Do you tend to lean that way when you're building internal tools? You want something robust, in that it won't break; simple, so you can jump in and tweak it if you need to; efficient and cheap. Is that your decision-making matrix, or how do you think about it?
Dan Leibson: Oh, yeah, I want to keep infrastructure costs down, right? We don't want technical debt. I don't want to be a developer shop. We need to think about how we scale the resources to do some of this stuff, because these aren't operational or marketing-driven functions; it's custom development, right? Actual hiring of engineers.
Jordan Choo: Awesome
Dan Leibson: That was a big thing at the tech ops off-site we just had: what's our approach going to be to feature improvement and technical debt?
Noah Learner: Jordan and I often think of things in terms of our execution: we go as simple as possible. Oftentimes we try to keep the processes as discrete as possible, so it's a simple process, integrated and built with the simplest tools at the lowest cost possible, and it doesn't have huge scope. Typically it's a simple tool that does a simple thing, but it might save us an hour to four hours a month, and the more of those we can stack up, the more time we can spend on strategy and less time just doing the work. Which I think is the goal for most SEOs, right? Is that how you think? Although you were saying that you don't like using automation to automate processes; you like to apply it in other ways, like...
Dan Leibson: Research and stuff, right. We don't do a lot of repetitive tasks, and the ones we do, we definitely want to automate away.
Noah Learner: Do you want to share your screen and talk us through the tool? I have both your blog post and your GitHub repo open.
Dan Leibson: You just call it a GitHub repo?
Noah Learner: Yeah, I'm lame, aren't I? Isn't this a repo? What did I do wrong?
Dan Leibson: Man, it's GIF.
Noah Learner: No, I said GitHub repo.
Dan Leibson: It's GitHub, not GIF-Hub, bro. This is a deal breaker.
Noah Learner: So I had no idea that there was a GIF versus JIF versus Git pronunciation argument, or a potential flame war. Apparently Dan has feelings about this. GitHub. Okay, sorry. Mic drop. So tell us all about your tool. Where do you want me to go?
Jordan Choo: So on our pre-call you were talking about Lighthouse versus PageSpeed Insights. Do you care to elaborate on why you decided to go with the Lighthouse reports?
Dan Leibson: Yeah, so Lighthouse powers PageSpeed Insights. My understanding is that you get much more granular data back on the Lighthouse side. But even if that weren't the case, the reason we built it the way we did is that we wanted to build a Node.js plus Puppeteer back-end performance testing stack, because that's what Google uses for rendering the web: Node.js plus Puppeteer. So we wanted the excuse to build that infrastructure component. We can use it for running a lot of this stuff, rendering the scripts, et cetera, and we can also hit URLs with Puppeteer and measure performance from the Puppeteer render of that URL. That was the reason we made that determination: just to get the Puppeteer and Node.js stack up.
We like working with APIs in general.
Noah Learner: So you built this as an excuse to have a Puppeteer stack built up. I did the same thing, because I knew I had to go down that road: I needed to do crawls behind logins, in a weird situation where the sites had cookies that reset every night. So I knew I needed Puppeteer, because I couldn't rely on the other tools I was considering.
Dan Leibson: So building our Node.js Puppeteer instance was essentially free for us, because building this tool means we're going to amortize the cost of doing performance testing across any future audits we ever do. Now we have the thing, we can use it for other things, and the business case is already taken care of.
Noah Learner: I love that. I woke up with an idea, and I don't think it's original at all, but again, you're dealing with way different scale than I am. Most of my clients have sites with only, whatever, 50 to 70,000 pages. They're not...
Dan Leibson: You're full stack. So it's slightly different.
Noah Learner: Yeah. So I was thinking about using Screaming Frog, exporting the results to Drive, and then using Apps Script to pull the results into a sheet, applying timestamps as it goes, so you'd have periodic tracking of performance. It would pull in all the PageSpeed Insights data at the same time. I wouldn't want to do it on all the content, clearly, but for different... although for you... I love how you're thinking about this.
Dan Leibson: You can set this up on a cron job, man. This can run monthly. It specifically has the ability to run monthly.
Noah Learner: Yeah, I love that. That's super cool. Do you want to talk us through setup and how to use it?
Dan Leibson: Yeah, let's go to the other side of it, actually. So first there's a back end... oh, sorry, no, go to the report. I think it's easier to talk about how it works when looking at the end result of it.
Noah Learner: Okay
Dan Leibson: So this is the end result of the report. These are the roll-ups of all the individual templates; you can drill down into any of these to see the particular URLs that make up an aggregated metric, and drill down further to see the individual URL performance reports. It's a 10,000-foot view that you can drill all the way down on. Where this started is what we were talking about with what not to automate. The main thing this uses to get anywhere is a series of URLs associated with a template, a category in this case. Doing that with a person takes five to ten minutes, versus figuring out how to scrape a website. Like we were talking about on the pre-call, being able to determine different templates when the URL structure is the same is a big problem; it's a problem for Target in this example report, actually. So this wouldn't have been able to happen without doing it manually, and that manual creation takes a person way less time and way less infrastructure than automating it would. Then we have a Slack bot that takes any commands or info, a file in this case, via our Slack instance, because we're a remote agency and that's where we do most of our stuff. The bot takes it over to our scripting server, our EC2 instance, which runs the Lighthouse scripts to grab all the information and stores it in Cloud SQL in a series of data tables. The data table setup is designed that way so we can append this to other stuff: we want to be able to add native Puppeteer render metrics, like I was talking about, and potentially other things from GSC. We want this warehoused so it can be bolted onto things. And then we visualize it in Data Studio. That's the net-net of it at a high level.
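As a hedged sketch of the warehousing step: a full Lighthouse JSON report can run tens of thousands of lines per URL, so a pipeline like this would typically flatten a handful of metrics per URL before writing rows to Cloud SQL or BigQuery. The field names below follow the standard Lighthouse report structure, but which metrics to keep is purely illustrative, and the sample report is invented:

```python
import json

# Minimal fake Lighthouse report containing only the fields we read.
# Real reports carry hundreds of audits; this mirrors their shape.
sample_report = json.dumps({
    "finalUrl": "https://example.com/category/shoes",
    "categories": {"performance": {"score": 0.42}},
    "audits": {
        "first-contentful-paint": {"numericValue": 1834.5},
        "interactive": {"numericValue": 11300.0},
    },
})

def flatten_report(raw):
    """Pull a handful of warehouse-ready metrics out of a full report."""
    report = json.loads(raw)
    audits = report["audits"]
    return {
        "url": report["finalUrl"],
        "performance_score": report["categories"]["performance"]["score"],
        "fcp_ms": audits["first-contentful-paint"]["numericValue"],
        "tti_ms": audits["interactive"]["numericValue"],
    }

print(flatten_report(sample_report))
```

One flat row per URL per run is what makes the roll-up-by-template and drill-down views in Data Studio cheap to build, and time-series charts later are just a date column away.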
Jordan Choo: Out of curiosity, what prompted you to go with a traditional SQL database rather than BigQuery?
Dan Leibson: Yeah, it just wasn't a hard requirement on my end. It's a good example of how, if I'm going to be hands-off and let my engineer build things however they want, I need to be hyper-specific about what I'm looking for in our database and data warehousing. This is actually the only thing we have that's not on BigQuery, so in that vein, I have a user story out right now to move the infrastructure.
Noah Learner: Got it. I feel like such a tool for asking all these granular questions, but wait, your Slack bot is called Jarvis? I love the name. Can you explain how it works? I don't understand how the file upload into Jarvis turns into Jarvis pushing the data. Is it directly connecting the Slack API to...
Dan Leibson: Jarvis is Dockerized. Okay, so this is where we use Docker, for scripting.
Noah Learner: Can you talk about that? Because this is the edge of my understanding. I'm pretty comfortable with the "and then magic happens" part.
Dan Leibson: The net-net on the Docker instance is that it lets Jarvis communicate with our scripting instance, so it can run the scripts and pass input and output back from the server into the Slack bot.
Noah Learner: Have you thought of turning this into a time series?
Dan Leibson: So it is a time series, right? This setup has two days' worth of data in it; you can run it chronologically. We just don't have the visualizations done for the time series yet, because I released it as soon as we built it.
Noah Learner: Yeah, yeah.
Dan Leibson: We're just implementing this into our processes right now.
Noah Learner: Because I see that as super, super easy: just copy the chart, change the breakdown to the category you want to visualize, maybe pick three or four of them, and then you could show a couple at a time and overlay them to see if there are trends.
Dan Leibson: Yeah, for the time series I would actually probably do an individual one per template, only because that way we could get the time series for all the various metrics of an individual template. But I haven't really put any thought into why one would be better than the other, so there's nothing to that in any way, shape, or form, and I have no good answer to "why not a time series?"
Noah Learner: Yeah
Dan Leibson: Just like, why is everything a bar chart? Because it's easy to do drill-downs and measure a bucket in there. They're all histograms, more or less, because it's a distribution of performance across a bunch of different things.
Noah Learner: Yeah, I love it. I think it's super cool. Let's get back to our questions. What did you and the team learn along the way building the tool?
Dan Leibson: That our sprints for development are awesome, and that we can do them really well if we're really nimble. We're just working out having a dedicated tech ops team, so a lot of what we worked out is around user stories and scoping, not really the development process or how it works. The thing that I learned is that I got some good feedback on the user experience of the report, the GDS report, which I thought was kind of funny, only because to me it's just a data visualization, right? It's the output, and I'm happy to struggle to use it however possible, because getting there was the hard part. But I can see how that's a really poor experience when embedding this report into audits and stuff like that. So one of the things I'm going to do, since all of this text here was written by engineers, just as an example, is have our content team, or myself, go in and do the page-one help page for some of our GDS reports and visualizations, and rewrite the copy to make it more palatable for a non-engineer user group.
Noah Learner: I didn't really get into this, and I apologize, I forgot to keep going through the report. So it's basically pulling through all of the Lighthouse metrics, just visualized page by page?
Dan Leibson: Not all of them. So, like, for Target, for a single URL, the JSON blob of the Lighthouse report is like 50,000 lines or something.
Noah Learner: Oh, my God, one page, one URL?
Dan Leibson: Yeah, one URL. So we did not take everything there is. This is what I'm saying, I don't think the PageSpeed Insights API gives you all of that granular reporting. So go all the way to the end, go to page seven or eight.
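The subset-extraction step Dan describes, keeping a handful of metrics out of a huge Lighthouse JSON blob, could be sketched in Python like this. The `KEEP` list and the `lighthouseResult.audits` path follow the public PSI v5 response shape; treat both as assumptions, not a description of Local SEO Guide's actual pipeline:

```python
# Sketch: keep a few numeric metrics from a PageSpeed Insights-style
# response instead of storing the whole ~50,000-line blob.
KEEP = ["first-contentful-paint", "interactive", "bootup-time"]

def extract_metrics(psi_response: dict) -> dict:
    """Return {audit_name: numericValue} for just the audits we care about."""
    audits = psi_response.get("lighthouseResult", {}).get("audits", {})
    return {
        name: audits[name].get("numericValue")
        for name in KEEP
        if name in audits
    }
```

Feeding this a parsed PSI response yields a small dict that is easy to append to a BigQuery table or a Sheets row for the GDS report.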
Noah Learner: Yeah, sorry.
Noah Learner: So is this in your wheelhouse, where you'll look at 11.3 seconds, look at the JS used, and within a couple days be able to get them to a very different place? Or is it...
Dan Leibson: I mean, we would just call out the resource hogs, right? This is where we can't be the decision makers on what you pull in or not, because there could be meaningful business reasons that they have all of those things, and they have to run their own internal engineering sprints to sort that all out with their stakeholders, right? But we 100% care about the timings on script evaluation and render time and all of that stuff, very deeply. We're way deep into client-side rendering, and the crawlability and renderability of sites is a core part of what we work with on a week-to-week basis. And I don't know if you paid attention to this, but Martin Splitt, a developer advocate at Google on the webmasters team, has been explicit at conferences about how there's a render budget to go along with crawl budget, just as an example. And so that 11-second script evaluation is an insane amount of time to have Google evaluating scripts before it renders them, right? You could be hitting a possible timeout.
Noah Learner: Yeah, okay, that's insane. I feel like when I look at Lighthouse reports, on some platforms I have a sense of how to improve, and on others it's super hard and beyond the scope of what's possible, meaning the platform's locked down, which is super frustrating. And every time I look at one of these where that's the case, it's just heartbreaking.
Dan Leibson: Yeah, so this is probably a proprietary platform, right? This is Target. They're definitely not rolling their own enterprise CMS, right? They may have custom connectors and stuff, but they're using off-the-shelf stuff that they're customizing, so they probably don't have the ability to change some of this. Actually, go back to that page one, I can tell you a funny story. Yeah. These two charts are a perfect example of why it's important to look at performance. If you notice, the e-commerce side of the site is screaming.
Noah Learner: Yes
Dan Leibson: It's screaming, but boy, those locator and location pages are dragging. And that's because in mixed retail businesses like Target, right, the e-commerce side really controls the digital experience. The digital execs on the e-com side, with PDPs, shelf pages and all that stuff, it's faster, faster, faster, right, more links, better internal links; it's a core part of their business. And on the other side it's just, "we're local SEOs, you know, we're just going to get our citations and have our NAP and do all of this stuff." At the end of the day, their locator and location pages are just dragging ass, man.
Noah Learner: yeah. So
Dan Leibson: It's like, that's likely a business problem, right? Not a technical problem, likely a business problem.
Noah Learner: I get it. Can I ask you an e-commerce question? I don't know if you deal with this stuff. An SEO told me a strategy the other day that I thought was really interesting. He was saying that one tactic you can do to improve indexation inside categories is: you have a category, it's got 12 products, and on each of those PDPs you do internal linking, treating each of the products inside the category almost like it's paginated content. So you go from product A to product B, from B to C, C to D, and so on, and when you get to the end you link back up to the parent. And you have breadcrumbs doing that too, breadcrumbs going up and down the categorization chain. I had not heard of that as a strategy before. Is that something you've ever employed?
Dan Leibson: I mean, it just seems like what you would be getting out of employing a widget, right, like a cross-sell/upsell widget. If you check out somebody like Best Buy, any of their PDPs, right, they have "people also bought," "you may also like," related products. Those drive a lot of revenue, but they're also internal links placed explicitly in a lot of instances.
Noah Learner: And are you pro or con?
Dan Leibson: You've got to have them. Internal links, always, on an e-commerce site. I mean, think about this at the scale of Walmart. Walmart has, I forget the number and I'm going to say something stupid, I want to say something like 6 million SKUs, and maybe I'm under; maybe it's 60 million. So whatever that number of millions of SKUs is, right? It's hard to get Google to crawl all those things.
Noah Learner: Sure
Dan Leibson: Right? Because they're deep in the taxonomy of categories and subcategories, and not everything is linked to the same. It's hard to get internal links to every page of the site if you want to be relevant and you're trying to generate revenue, right? Getting all the pages on a large site indexed, particularly for large e-commerce retailers, is very challenging.
Noah Learner: Yeah. Fascinating.
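For what it's worth, the chained-pagination idea Noah describes, each product linking to the next with the last one pointing back up to the parent, is simple to generate mechanically. A minimal sketch in Python (the URL shapes are invented for illustration):

```python
def chain_links(category, products):
    """For each product URL, list the internal links it should carry:
    the next product in the chain (the last product points back to the
    category parent) plus a breadcrumb link up to the category."""
    links = {}
    for i, page in enumerate(products):
        nxt = products[i + 1] if i + 1 < len(products) else category
        targets = [nxt]
        if category not in targets:
            targets.append(category)  # breadcrumb back to the parent
        links[page] = targets
    return links
```

In practice a cross-sell/upsell widget, as Dan suggests, achieves the same internal-link coverage while also driving revenue.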
Let's see. So what else can we jump into with the tool? What did you think you were going to be able to do that, through the dev process, became beyond the realm of possible? Was there anything that fell into that category, or anything you were surprised you were able to pull off that you didn't plan on up front?
Dan Leibson: I was totally surprised at the speed that we were able to crank this thing out, for sure. The thing that became too much to get implemented, and this is about writing user stories and making sure everyone's on the same page, is the GDS report creation. You have to set up some of the data sources in GDS for it to populate the template, and we wanted to have that all be automated. But that's, again, part of the Puppeteer process, right? You have to hit the trigger pages and stuff and then share the URL. It just became its own process that wasn't worth keeping in the sprint, even though it's core to making it work.
Noah Learner: I see. So what else are you using Puppeteer for?
Dan Leibson: We're gonna use it for a lot. We're gonna use it for a lot. Um, pulling stuff out of Search Console, right, not keyword data or anything you can get via the API, but a lot of their other reports, like the Index Status report and stuff like that. Let's see... performance testing on its own. Oh, and using the URL Inspector: getting it to run bulk URL Inspector reports in GSC and then pulling that data out. I think that'll be really interesting, because the problem with the URL Inspector right now is that for canonicals, or "Google chose different canonical," you can see that it's happening in the report, but you can't see it across a type of URL, and you can't look at a URL and the one it shows as canonical side by side, right? You can't do any comparisons with the data at all; it just sits in a vacuum. So figuring out a way to make that actionable is, I think, going to be a really important thing. And again, since it's behind Search Console, it's a logged-in Search Console view, it requires Puppeteer or Selenium. Yeah.
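Since GSC offers no bulk export of URL Inspector results, the comparison Dan wants, "Google chose different canonical" counted per URL template, would have to be built from whatever rows you collect yourself. A rough sketch, where the row shape and the regex templates are both assumptions:

```python
from collections import Counter
import re

def mismatches_by_template(rows, template_patterns):
    """Count 'Google chose different canonical' cases per URL template.

    rows: iterable of (url, declared_canonical, google_chosen_canonical),
    a hypothetical shape for scraped URL Inspector results.
    template_patterns: {template_name: regex}, hand-written per site."""
    counts = Counter()
    for url, declared, chosen in rows:
        if declared == chosen:
            continue  # Google agreed with the declared canonical
        for name, pattern in template_patterns.items():
            if re.search(pattern, url):
                counts[name] += 1
                break
    return counts
```

Rolling mismatches up by template is what turns isolated URL Inspector checks into something actionable, which is exactly the gap Dan is describing.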
Noah Learner: And I, you know, I shared with you that I was building a tool to do it with puppeteer as well and I ran into captcha issues. And I would love to connect with you about that because I would love to do something similar to what you're talking about. I don't have
Dan Leibson: Was it captchas for IP address or was it captchas for like the crawling, like the automation?
Noah Learner: So I logged in, I typed in a URL, you know, Puppeteer typed the URL into the inspect tool, clicked the inspect button to...
Dan Leibson: How did the mouse move?
Noah Learner: it then it
Dan Leibson: Did it move the mouse and click the inspect button?
Noah Learner: Yeah
Dan Leibson: okay
Noah Learner: Yeah. And, oh, I didn't move the mouse to that spot it just
Dan Leibson: Puppeteer's good because you can mimic a human, right? That's why I was asking whether it was calling out, like you need proxies, or whether it was the human mimicry.
Noah Learner: I wasn't sure, and I had it do timeouts too. I implemented a whole bunch of, you know, delays along the way, thinking that it was doing things too quickly.
Dan Leibson: Oh, yeah, of course.
Noah Learner: Yeah, I had a bunch of timeouts. And there are also these Puppeteer libraries, I'm sure you found these, that are extra safe or extra secure and in theory can get around detection, but they didn't work for me. Basically what it would do is submit the first URL and go through the whole process: you know, "this is in the index, are you sure you want to request again?", click that button, submit it, then show me the report. And then when I tried to do another URL, maybe it was five deep, it started to throw up captchas, and those wouldn't go away. So I started trying to figure out if I could loop through different user agents and use a different one each time, and I just couldn't figure out the solution. So I'd love to pick your brain on that off the call. And of course, we at Agency Automators don't recommend that you break Google's Terms of Service with anything you do in the realm of SEO or otherwise. But damn, isn't it fun to play with a dummy login?
Unknown Speaker: Okay. Python, Node, Apps Script: how do you pick a tool?
Noah Learner: And are you not using Apps Script because timeouts and runtimes won't work at an enterprise level for the types of stuff you do?
Dan Leibson: So we have one thing that uses Apps Script that I'm going to release next week, I think. It does a bunch of branded searches: you use a Google Sheet, it runs the searches and then lets you know if the image URL has changed. It's just to check whether the main Knowledge Graph photo has changed for a local business. So you can put, like, 1,000 to 5,000 branded searches or whatever in the sheet, run the script, and it'll give you the image URL and alert you to changes. That makes sense there. But we tried to do an n-gram script in Apps Script and it timed out, for sure; it just couldn't do it fast enough.
Jordan Choo: Must have been a lot of keywords then?
Dan Leibson: No, it wasn't that many
Noah Learner: Jordan and I have both done that, but our tool isn't sophisticated enough for what you need. Is that a fair description, Jordan?
Jordan Choo: Yeah
Noah Learner: I don't know how many rows of data we were dealing with. How many rows are you dealing with? Is it thousands? Is it millions?
Dan Leibson: When we moved it off of Sheets, it was in the tens of thousands.
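At tens of thousands of rows, an n-gram count is trivial in Python even where Apps Script times out. A minimal sketch of the kind of script that would replace the Sheets version (the exact tokenization is an assumption):

```python
from collections import Counter

def ngram_counts(phrases, n=2):
    """Count word n-grams across a list of keyword phrases."""
    counts = Counter()
    for phrase in phrases:
        words = phrase.lower().split()
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts
```

`ngram_counts(keywords, 2).most_common(20)` over tens of thousands of Search Console queries runs in well under a second, which is why moving this workload out of Apps Script makes sense.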
Noah Learner: Tens of thousands, okay. Cool. We've already talked about Jarvis, so we don't really need to go through that.
Dan Leibson: So
Noah Learner: This was a question from another guest of ours, Dale. He's the founder of a brand, newly released on Product Hunt, called Jepto.
Dan Leibson: Yeah, I know about their Data Studio. GMB insights connector.
Noah Learner: Yeah, yes. Great dude, fantastic dude. It's been crazy helpful to both of us. His question was: can you talk about how to identify pages that differ from the average of their URL classifications? I'm not sure...
Dan Leibson: Human brain, man. Human brain.
Noah Learner: Okay. All right.
Dan Leibson: This is one of those things, like automated ways to write title tags and stuff like that, that you could theoretically automate, but it would require you to analyze all of the content on the page and compare it to other pages of the same URL type, right? It's just like, wow, that seems like a lot of work, versus a human looking at two similar URLs and going, "no, those are different templates."
Noah Learner: But then how do you scale dealing with those types of challenges? I mean, when you're talking about 6 million SKUs at Walmart...
Dan Leibson: They probably have like 10 templates, yeah. Okay, 6 million SKUs is one template: it's one PDP. The rest of the categories that make that up is three templates, right? A category, a shelf, and a subcategory. That's why looking at multiple examples per template is the solution, and why a human can do it pretty easily, right? Because there are not millions of URL templates. Man, imagine being the engineers that have to manage that. Whoo.
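Once a human has eyeballed the handful of templates, as Dan describes, bucketing millions of URLs into them is mechanical. A sketch with invented path rules:

```python
import re

# Invented path rules for illustration; real rules would come from a
# human eyeballing a few URLs per section of the site.
TEMPLATES = [
    ("pdp", r"^/p/"),
    ("category", r"^/c/[^/]+$"),
    ("subcategory", r"^/c/[^/]+/[^/]+"),
    ("locator", r"^/store-locator"),
]

def classify(path):
    """Bucket a URL path into the first template whose pattern matches."""
    for name, pattern in TEMPLATES:
        if re.match(pattern, path):
            return name
    return "other"
```

With a classifier like this, per-template aggregates (the bar charts in the report) fall out of a single group-by over the crawl data.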
Noah Learner: So if you're a small site and you've got 100 SKUs, and you've put a huge amount of effort in, everything is manual in terms of how you're doing your page titles and meta descriptions, the way the site is built is crazy performant, your internal linking is perfect... Shadow, get down, dude. Sorry, he's being obnoxious; this is the first time he's acted up in any of our Hangouts. He's a great dog, but he's kind of a pain in the ass. But, um, can small sites win? What I'm hearing is that everything's templatized on a big site. I know it's a link game, but...
Dan Leibson: Everything is templatized on a big site because they can't manually enter a product; that's not a thing, right? They have to get product inventory from an ERP system, and their merchandisers have to work in an independent system to make their supply chain work. The only way they can make the business work and keep live inventory up is through a connector between the enterprise resource planning thing, the inventory management thing, and the e-commerce shop, right? SMBs, even medium-sized businesses, don't have that challenge. Their opportunity is not to try to index a million SKUs; their opportunity is to have a really good template that can go after a subset of those. They don't need to make it so you can adequately link to 6 million pages in the taxonomy, right? For them it's really easy, and they have to go build links.
Noah Learner: Yeah, that's fascinating. So I'm in bike land, right? And in bike land, REI started selling a product line called Bontrager in March of 2019. They're kind of like the Amazon of outdoor sports, and at the time I was pretty afraid it would negatively impact a number of my clients who sell that specific brand. Throughout 2019 we saw revenue grow for pretty much all of our clients, but then we found that Bontrager revenue has in fact been negatively impacted. So I'm looking for ways of chiseling in to get that revenue back. This is a challenge I'm facing right now, day to day, week to week.
Dan Leibson: But you have products that solve that, right? This is where you being full-stack makes that solvable. You can do nurture campaigns on cart abandonment and stuff like that. The ability to easily connect the entire lifecycle, the social and email marketing together, is not something that REI can do, right? They have a bigger list, for sure, and they can develop prettier graphics and all of that stuff. But it just takes them longer, they have to build a process, and when their process breaks they can't do anything.
Noah Learner: Yeah. And I found that to be the case: when we built Bike Basket, it pretty much instantly grew revenues for clients in the 6% to 8% range. Bike Basket is a cart-abandonment tool we built that's bike-industry specific and works on the platform most of the e-commerce shops in the bike space are using.
Cool. How are we doing for time? Jordan, are you tracking?
Jordan Choo: I am tracking. We're just coming up on time.
Noah Learner: We're coming up on time, okay. Dan, this has been amazing having you on. What are you working on now? What can you share? What are you excited about? I don't want to do the "what's coming up in 2020" blog post with you, looking at 17,000 different concepts, but what are you stoked on?
Dan Leibson: Yeah, so we have an NLP process that we're kicking off next week, and then we have a really interesting public-facing thing that's going to be a marketing push for us. Both are getting kicked off next week, and both are on the tech ops side, the automation and development side.
Noah Learner: that's super cool. When you release it, can you ping me on Twitter?
Dan Leibson: Oh yeah, it'll be dope. There will be big pushes around both. The NLP thing is an internal thing; I don't know that we're going to end up releasing it until we have something there, right? I don't know that we're going to open-source it; we have to figure that out, we're just moving on building it. We have the investigation done and we're just getting the first parts of it going next week. So I can't tell you whether it's going to be a free thing, a paid product, or something we just use internally. But the public-facing thing will be a big launch. I'll definitely share it with you before it goes live so you can tell me what you think.
Noah Learner: Cool. Jordan any Any last questions?
Jordan Choo: I don't think so. It was all very insightful. Dan, thank you very much.
Dan Leibson: Oh, my pleasure. My pleasure. Thanks for having me, guys.
Jordan Choo: Yeah. And if people want to get in touch with you, how would they do that?
Dan Leibson: At Dan Leibson on the Twitters.
Unknown Speaker: And I'll just show everybody the proper spelling.
Dan Leibson: Yes, D-A-N,
Unknown Speaker: L-E-I-B as in boy, S-O-N.
Noah Learner: Awesome. Dan, thanks so much. Everyone out there, we really appreciate you watching this, and we've got a new episode coming up soon. We're not going to tell you what it is, because we don't know yet. And Dan, thanks so much; have an awesome weekend. For those of you on Twitter, Dan has some amazing gardening photos that he's been sharing. I know it's way off topic, but if you're looking for inspiration to build an organic garden, that's like reason number 2,000 to follow him on Twitter. It's been amazing watching you learn how to garden. Yeah, he's great.
Dan Leibson: I'm actually sending, you know Jamie Alberico, right? Yeah, so I'm sending Jamie some seeds. I told her I would send her a list of all my open seed packs. I just have a ton of seed packs in my office, as you can see.
Noah Learner: Yeah, I'm coming out to LA in March. Are you going to start to see stuff popping up yet at that point?
Dan Leibson: I probably will. I'll just put them in the ground then, like March 1.
Noah Learner: Yeah, and I'm coming out at the end of the month. I've got some clients to see and we've got a bunch of stuff going on. Are you near the city? Don't worry, I'm not going to ask to, like, bump into you, but...
Dan Leibson: Oh, I mean, you can bump into me if you try. If you want to get a beer or whatever, depending on how it works, it could potentially happen. I'm like 30 miles away, but depending on traffic that could be, yeah, like two hours.
Noah Learner: And I'm going to Rialto and Redlands. Is that the opposite direction?
Dan Leibson: Redlands is Inland. Yeah, the Inland Empire. Neither of those is LA, man.
Noah Learner: but it's like an hour and a half, right?
Dan Leibson: Yeah, dude, I know. Everywhere else the metro areas are distinct, and everybody treats Southern California like it's all Los Angeles, but it's not all Los Angeles.
Noah Learner: Yeah, no I know it's a different like Yeah.
Dan Leibson: I think it's a different County. I think it's in I think it's in San Bernardino.
Noah Learner: Oh yeah, nice. Okay. So everybody, thanks for watching, and we're going to tie it up now. Until next time, thanks for watching Agency Automators. Dan, you rock. Jordan, you rock. Everybody take care. Thanks, everybody.