Google’s 2024 Ranking Promotions and Demotions Factors
Google’s algorithms change all the time. What worked yesterday may leave you on Page 2 today. Let’s talk Google boosts and demotions.
My love affair with SEO started way back in 2003. SEO has become a time-honored way to attract your ideal prospects to your business through search engines like Google and Bing. Research, regular content publishing and the latest engagement strategies will lure your ideal prospect at the precise moment they’re searching for what you offer.
In the beginning, all websites were created equal, and no one knew where any good websites actually were.
You needed to actually know the URL of the website that you wanted to visit!
Yikes! That’s like having to know your mom’s cell phone number instead of just hitting “Mom”.
The earliest search helpers were the plethora of “directories” that popped up in the early days of the web, where a list of supposedly curated website links would be presented to the user.
Using a directory, a user could work their way through a hierarchy of categories in order to arrive at the “answer” (a website) they were after.
Wow, that sounds so olde-tymey.
You can see an example of this on Wikipedia, which also includes simple search functionality that searches Wikipedia only.
Of course, directories had their limits: namely, they only knew about whatever was listed in their own directory.
What about everything else on the internet?
The bright idea of creating a searchable index of the content available anywhere on the internet was conceived and executed by a fella named Alan Emtage, and he never made a penny off of his brilliant invention, “Archie” (which actually indexed FTP file archives, before the web as we know it existed).
This searchable index idea and the functionality to search it became known as a search engine.
Simply stated, a search engine is a tool for finding the most relevant and accurate answers from the millions of websites on the internet.
There have been just a few inventions that truly transformed the way I live my life. Google was one of those (as were drum machines, but I’ll leave that for another article).
From 1993-1994, Excite, WebCrawler and Yahoo quickly burst on the scene, and the race was on to build and maintain the web’s #1 search engine.
Of course, a search engine would be nearly useless if it had to re-scan the entire internet each time someone asked a question. Thus, search engines use search indexes. The index is where the discovered web page information is stored.
But how is a search index built?
By using bazillions of web “crawlers”, also known as “bots” or “spiders”, which are little programs that go out and actually fetch the content of a web page and the entire website.
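The heart of a crawler is link discovery: fetch a page, pull out its links, and queue them up for fetching next. Here's a minimal sketch of just the link-extraction step, using only the Python standard library (the page content is made up for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag -- the raw material a
    crawler uses to discover new pages to fetch."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page content a crawler might have fetched
page = '<p>See <a href="https://example.com/menu">our menu</a> and <a href="/contact">contact us</a>.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['https://example.com/menu', '/contact']
```

A real crawler repeats this loop endlessly: fetch, extract, queue, fetch again.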
Next, the search engine must attempt to understand what each web page is about.
This process involves more and more factors each year, as machines get “smarter” and better at recognizing the relevancy of a particular web page to the search query that was posed.
Finally, the web pages and websites that contain them are ranked.
Search Ranking refers to the ordering of results that happen in the Search Engine Results Pages (or SERPs) for a given query.
As with golf or hit records, a “lower” numbered ranking (e.g., 1, 2 or 3) is better than a “higher” numbered ranking (20, 46 or 1093) because a rank of 1 means the first result, a rank of 2 is the second result, etc.
Social Media has become such a prevalent term in society, there is a temptation to think that when you have a “social media person”, you’re covering all your bases for your business or organization.
Nothing could be further from the truth.
Searching on a search engine is largely a solo activity, while Social Media websites attempt to foster interaction among their users.
Social media interactions are something humans are used to in “real life”, and these interactions are very often off-the-cuff, stream-of-consciousness, un-categorized, and largely random.
Many Social Media interactions are the antithesis of “finding an answer” — people love to share their opinions and debate. However, certain interactions may be more valuable to the users of social media than others.
On social media, someone might post: “What’s your favorite Italian restaurant in San Ramon?”
This is different than typing “Italian restaurant San Ramon” into a search engine like Google. Google will factor in reviews, the ability of a particular website to respond, mobile-friendliness, and a variety of other factors which may or may not be interesting to the user.
Most social media websites like Facebook or Instagram are pretty mediocre with regards to the type of search-to-find interactions that we’re all used to on search engine websites.
Showing users search results is not the raison d’être of social media.
Search engines are far better at answering direct questions.
A search engine is a question-answering machine.
Delivering the best answer for a given question — that’s Google’s (or any search engine’s) mission. Pretty much.
A “search query” is just a question, typed in by a searcher.
Search queries typically go from broad to more specific, because the more generic the query, the more generic the answers a search engine will produce. A quick example will help to illustrate.
Someone searching for a good Italian restaurant might simply type “Italian” into Google.
But that query produces generic results about the country of Italy, how to speak Italian, etc.
So the searcher types the more specific “Italian restaurants”.
That search produces a list of Italian restaurants that are the highest-rated, and (more recently) closest to the searcher.
This “narrowing” of the search happens naturally, as a searcher becomes more accustomed to using a search engine.
“Organic” search traffic is visitors coming to your website who have been referred by the unpaid results from a search engine.
“Un-paid?”, you say?
Why yes, not all search engine results are created equal.
Google learned early in their lifespan that advertisers would happily fund their company, and just delivering great free search results was — well, not a business model.
So, paid Search Engine Marketing (SEM) was born.
Google tries to balance the results it returns between the best paid results and the best organic results. Of course, paid results are sorted to the very tippy-top of the page.
However, Google has trained experienced searchers to know that the best results are actually sorted under all the paid results.
That’s okay. There are plenty of inexperienced searchers to fuel Alphabet and its “baby”, Google.
We touched on web crawlers above. Their job is to ingest all the data from a website and send it back to the search engine.
On many occasions, certain websites or even individual pages elect not to be crawled, and make this known to the crawling agent via a robots.txt file. This file can request bots to index only parts of a website, or nothing at all.
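Here's what a hypothetical robots.txt might look like (the paths and bot name below are made up for illustration):

```
# Allow all bots, but keep them out of two sections
User-agent: *
Disallow: /private/
Disallow: /drafts/

# Block one misbehaving bot entirely
User-agent: BadBot
Disallow: /
```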
It is estimated that approximately 4.2 billion web pages exist in 2020. Even the largest crawlers from the major search engines cannot create a complete index of the entire web.
Once the data is received from the web crawler, the search engine attempts to understand what the web page is actually about, and how well it does its job of conveying useful information for a particular search query.
In the early days of Google, a ranking strategy was created called “PageRank”.
PageRank is a methodology for calculating the importance of specific website pages.
PageRank works by counting the number and “quality” of links to a page to make a rough estimate of how important that page is. The underlying assumption is that more important pages are likely to receive more links from other websites.
In this way, a link from a website to a target website (or “backlink”) can be seen as a “vote” as to the target website’s authority.
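The “votes” idea can be sketched as a toy power iteration, loosely in the spirit of the original PageRank idea. The graph and damping factor below are illustrative assumptions, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to.
    Pages with more (and better-ranked) inbound links score higher."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Page "a" receives a "vote" (backlink) from both "b" and "c"
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # "a" ends up with the highest score
```

Notice that "c" links out but receives no votes, so it ends up with the lowest score: authority flows along the links.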
Google no longer places quite as much emphasis on the number of backlinks pointing to a particular website, as they learned that people would create lots and lots of meaningless links, simply to encourage Google to rank a particular website.
Backlink quality has much more to do with the power of a given link than the number of links. One kickass backlink can contribute more authority to its target than a million crappy/spammy backlinks.
In fact, there is actually a way to “disavow” a crappy backlink that’s pointing to your website. It’s as if you’re telling Google: “Hey, I know this is a bad link, and I had nothing to do with putting it there!”
Everyone wants to get their website’s pages to the top of the Google search results. Google’s algorithm looks at a myriad of factors in order to rank pages. They do not publish any specific information regarding their algorithm, lest certain people try to “game” the system.
Instead, Google suggests that every website do very specific things right in order to rank well. Some of those things follow below.
Meta tags were one of the first ways invented that gave search engines a “hint” about what the web page was about.
Meta tags are not directly visible to the user of the website. It’s sort of “insider” information.
Early on, meta tags were more heavily relied-upon, as early search engines were not nearly as good at figuring out what a website was about.
Soon people became talented at embedding misleading meta tag information in order to get more search traffic, even when the website was in no way about what the searcher had searched for.
At the time of this writing, the title tag and “meta description” are the main tags that a search engine will look at for relevance, and they’d better jibe with the rest of the content on the website, or the website will disappear from the rankings.
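Both of those tags live in the page's head. A hypothetical example for an imaginary restaurant:

```html
<head>
  <!-- Hypothetical example: both tags should jibe with the page content -->
  <title>Luigi's Italian Restaurant | San Ramon, CA</title>
  <meta name="description" content="Family-owned Italian restaurant in San Ramon. Fresh pasta, wood-fired pizza, full bar. Reserve a table tonight!">
</head>
```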
As I wrote above, SERP is an acronym for Search Engine Result Page, the blizzard of text, images, video and whatnot that pours forth from Google when you execute a search.
It’s just easier and faster to say SERP.
A recent Google innovation is Search Snippets: larger pieces of relevant content (some might call them “answers”) that address a searcher’s query immediately within the SERPs.
Search snippets attempt to directly answer questions, without the user needing to click through to the web page that provided those answers.
This zero click search has not been met with great enthusiasm by the websites providing those answers, as their businesses rely on searchers finding the website, and clicking through to learn more.
Fortunately, most of these search snippets are for very direct search queries, like “What’s the weather in San Ramon, CA?”
When a searcher searches for “Italian restaurants San Ramon” in Google, Google will note the first search result that the searcher clicks on in the SERPs, and ascribe a bit more authority to that website the next time this search is performed.
If the searcher clicks on a particular result, Google takes that to mean that result has relevancy to the search.
The act of clicking on a search result is called click-through, and the number of times a result is clicked vs. the number of times a certain search is performed, is known as the click-through rate (or CTR).
The term Conversion Rate as it applies to Google refers to the number of times a searcher found a result, then continued all the way through to complete a conversion goal.
The conversion rate is simply the percentage of times that a conversion happens relative to the number of times the search result was surfaced.
A conversion goal could mean a purchase of a product or service, or simply the filling out of a contact form, or downloading an e-book.
Conversion rates are something that can be tracked in Google Analytics, a very deep analysis tool used to track and report on all kinds of statistics associated with running a website.
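Both metrics are simple ratios. A quick sketch with made-up numbers:

```python
def click_through_rate(clicks, impressions):
    """CTR: clicks divided by the number of times the result was shown."""
    return clicks / impressions

def conversion_rate(conversions, impressions):
    """Conversions (purchases, form fills, downloads) per result shown."""
    return conversions / impressions

# Hypothetical numbers: result shown 1,000 times, clicked 80 times,
# leading to 4 e-book downloads (the conversion goal).
impressions, clicks, conversions = 1000, 80, 4
print(f"CTR: {click_through_rate(clicks, impressions):.1%}")                # CTR: 8.0%
print(f"Conversion rate: {conversion_rate(conversions, impressions):.1%}")  # Conversion rate: 0.4%
```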
PPC (Pay-Per-Click) is a common colloquialism for Google’s paid advertising platform, now called Google Ads.
Google Ads offers services using the PPC auction price model. This means that when someone clicks on your ad, Google takes a fee from your account. This is called a Cost-Per-Click, or CPC.
The fee taken on each click varies depending on what your particular market is bidding for those keywords and position. If no one else is bidding, even a small fee will suffice to put your listing at the very top of the Google SERP.
However, unless you’re in a very blue ocean, you’re likely bidding against other businesses who also want their ad to show first.
Not everyone can be #1!
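The classic textbook model behind PPC bidding is a second-price auction: the highest bidder wins the slot but pays just a penny more than the runner-up's bid. This is a deliberate simplification for illustration; Google's real auction also weighs ad quality and other factors:

```python
def second_price_auction(bids):
    """Simplified second-price model (NOT Google's actual mechanics):
    the highest bidder wins the top slot but pays one cent more than
    the runner-up's bid. With no competition, a penny suffices."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] + 0.01 if len(ranked) > 1 else 0.01
    return winner, round(price, 2)

# Hypothetical law firms bidding on "attorney san ramon"
bids = {"Firm A": 6.50, "Firm B": 5.00, "Firm C": 2.25}
print(second_price_auction(bids))  # ('Firm A', 5.01)
```

Note how Firm A's CPC is set by Firm B's bid, not its own: in this model you never pay your full bid unless someone else bids right up against it.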
Law firms have some of the highest CPCs of all Google Ads bidders. “Lawyer” and “Attorney” are in the Top 10 most expensive keywords on Google. The average CPC in the legal industry is over $6.
CPC is not the only way to pay for Google Ads. An advanced bidding strategy called Cost-Per-Acquisition can be used to automatically reach a predefined cost for a particular user action.
Google Ads is Google’s main source of revenue, earning nearly $135 billion in 2019.
It’s no wonder Google isn’t as hot on organic search traffic as it once was.
Bing offers search advertising as well. If you’re already maxed out on your efforts on Google, in 2020, Bing is a great place to go to get that extra 9% of available search traffic.
Search volume is a measurement of the popularity of a particular search term. This is particularly valuable to know when doing keyword research.
The gist of search volume is: if a term is popular, it’s worth having on your website. Otherwise, the term is not something that people are searching for, and thus, not interested in.
Why optimize your site for “Blue Klarnflarb” when no one is searching for that term?
The domain authority of a website is a heuristic invented by Moz that helps to quickly determine the strength of the website vis-a-vis Google.
In general, the higher the DA, the better the website ranks in Google.
However, Google themselves do not recognize this metric; it’s externally calculated.
In fact, this metric has become fairly easy for unscrupulous SEO experts to manipulate; so much so that it’s rarely used among the more experienced SEOs. Ahrefs’ DR (domain rating) is a bit tougher to game, and is a more reliable indication of what Google will “think” when looking at a website.
There’s no question that Google is the biggest, baddest search engine on the block, and the one to which you should pay the most attention.
Usually, whatever is good for Google is good for the other search engines as well.
Bing is Microsoft’s search engine, and for a while, it seemed like it might make a dent in Google’s dominance.
Now that most of the search dust has settled, it’s clear that, while Bing does account for a decent amount of search traffic, they’re no real threat to Google’s dominance.
However, if you’re looking to score 8% more search traffic (which can be a lot, if you’re a larger or national brand), you should definitely look into Bing and its Webmaster Tools.
Yahoo! Search is the search engine owned by Yahoo. In 2018, it was the fourth largest search engine worldwide across all platforms, as long as you’re only measuring proper “search engines” (see YouTube, below).
In an ongoing and confusing twist, Yahoo’s search results are actually provided by Bing!
Even though it’s not technically a “search engine”, YouTube is actually the site that logs the second largest number of search queries of any web search provider.
Over the years, YouTube’s content has improved dramatically, and “how-to” videos, news, training and entertainment of all sorts can be found on the site.
YouTube is actually owned by Google, which is why Google is so terribly dominant in overall search.
The art of “getting found” on the internet is constantly evolving and changing.
New strategies and tactics are discovered constantly, mostly based on reverse-engineering search results. In other words, if a page ranks, someone is looking at the reasons why and trying to formulate recommendations.
An inexact science, to be sure, as it’s all post-hoc analysis.
Nevertheless, let’s look at some of the common guides and strategies that people use for search engine optimization.
In order to create a page that attracts searches, you first must determine what people are actually searching for.
Fortunately, there are many tools out there that help you accomplish this mission.
This discovery task is known as Keyword Research.
Some of the free tools you can use to discover what people are searching for include Google Keyword Planner (covered below), Google Trends, and Google’s own autocomplete suggestions.
Search Snippets are bite-sized “answers” to the queries that people have typed in, which in general, should not require delving into a different website.
A good example is “Weather san ramon ca”. You really just want to know the weather in San Ramon, CA. Not dive into a website with cookies and pop-ups and a bunch of crapola.
Search snippets are great from a user’s perspective, but thorny from a web business perspective, as it cuts both ways: better for users, worse for website owners.
If you can create your own search snippets to quickly answer questions, Google will reward your website with increased search authority. After all, it is pulling its own results from your website.
Search snippets can of course lead to click-throughs to the underlying websites if the snippets lead to more questions.
B2B business websites can be perfect as providers of search snippets.
With complex, multi-touchpoint B2B services, users will often click through to determine the origin of the “answer” to their question, thus becoming ensnared in the top of the sales funnel.
Creating high-quality content is another key to getting found on the internet.
Content these days is not simply written blog articles like this one. Google loves infographics (any kind of chart with numbers included!), custom research and especially YouTube videos.
Making videos used to be hard. Now it’s ridiculously easy, with everyone carrying around full video production capabilities in their pockets via smartphones.
You must strive to produce varied types of content for your website in order to capture as much traffic as possible.
You may see the word HTML sprinkled about when it comes to the topic of search engine optimization.
HTML is short for Hypertext Markup Language, and it’s the lingua franca of the web.
Fortunately, website builders like the amazing WordPress have made learning much HTML far less important than it once was.
An exact match domain simply means that the domain name itself contains your most important keywords.
A simple example for an Italian restaurant in San Ramon would be:
italianrestaurantsanramon.com
This EMD technique is still relatively impactful today, if you’ve chosen the right keywords. If you’re just starting your website, it’s a nice way to gain some quick relevance with your URL.
Obviously, most choose branding in their URL. For example, “Boomcycle Digital Marketing” chose boomcycle.com.
Branded searches are the most powerful searches, by the way.
Link building is the process of getting other websites to link to a target (a client’s website, or your own website). A link back to your website is called a backlink.
Since Google’s earliest ranking system was heavily weighted to favor websites that had lots of links pointing to them, link building has traditionally been a very effective way to get your site to rank quickly. It’s still effective today.
Now, strictly speaking, Google doesn’t want you to “build” links — they want them to happen naturally.
But how do links happen naturally? That’s a great question.
Theoretically, links happen because you’ve posted great content, someone found your website, and thought you had written such a great article that they just had to reference it in their article or post.
But how does someone find your website if Google won’t rank it, because your website doesn’t have links pointing to it?
As you can see, this is something of a chicken-and-egg game.
The task of “link building” is so game-able, that Google now includes hundreds of other factors in ranking your website so as not to over-emphasize the number of links pointing to your website.
Having said all that, what if you really, really want some links pointing to your website so that you can at least signal to Google that you’re important and relevant?
You have to build the links by creating decent-to-great content, then asking other webmasters to link to your website. You know, because of your great content and all.
Here’s the rub: no one is super-motivated, much less required, to link to your website. It’s a pretty thankless task for the other webmaster, who creates a link for no compensation. You’ve just wasted her time, and she has nothing to show for it.
So you think “Hey, why don’t I just pay this webmaster for a link?”
Well, Google has you covered there too: they don’t want you to pay for a link because it’s considered “unnatural”.
Well, sure.
Paying for links is considered spammy and is certainly an unreliable “authority signal” as far as Google is concerned.
Okay, so I can’t pay for a link, no one will see my website without links, so what do I do?
There once was a theory that following a link to another website sort of detracted from your authority. The term “link juice” was even invented to somehow quantify what you were “giving away” by pointing at another website.
“Hey learn more about this subject that I’m talking about at this much more authoritative website here!”
And to be fair to the link-juice-leak theory, how does pointing at another website help your website’s authority on the topic?
So “nofollow” was invented as a signal to search engines: “Yeah, this is a link, but don’t give it any authority if it would detract from my own authority.”
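The hint is just an attribute on the link itself (the URL below is hypothetical):

```html
<!-- rel="nofollow" asks search engines not to count this link
     as a "vote" for the target site -->
<a href="https://example.com/resource" rel="nofollow">a much more authoritative website</a>
```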
Google gives a somewhat vague answer here on the usage of nofollow.
Just to show them, I nofollowed that link. Haha!
More recently, the strategy of internal linking has gained momentum and credibility.
Internal linking is simply linking your own pages on your own website to each other, based on topical relevance.
You don’t simply link every sentence to every other sentence, as that does not help the search engines to understand what your content is about.
For example, this sub-topic about internal linking in this article about SEO can (and eventually, will) link to a deeper article about only the topic of internal linking.
As you might imagine, it takes more internal links to rank your pages than external (incoming) links from other websites. No one really knows, but one metric that has been floated in SEO circles is that it is about a 2.5 to 1 ratio: 2-3 internal links is the equivalent of one external link.
But knowing how SEO works, even if it was true yesterday, it’s probably not true today. Google changes their algorithms a lot.
In any case, internal linking is a good strategy for building authority and relevance in the eyes of the search engines, and it’s completely free.
Anchor text is the text displayed to the reader for a given link.
An example of anchor text is this link to Boomcycle Digital Marketing.
The “anchor text” is “Boomcycle Digital Marketing”, and the actual URL is https://boomcycle.com.
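In raw HTML, that example link looks like this:

```html
<a href="https://boomcycle.com">Boomcycle Digital Marketing</a>
```

The text between the opening and closing tags is the anchor text; the href is where the click actually goes.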
Anchor text, in general, should strongly correlate to where you’re sending the person clicking the link.
In SEO lore, mixing your anchor text so that it appears “natural” (what’s “natural” about any of this, you may ask) is a ranking factor, and many tools show the types of anchor text you have on your backlinks (again, the links from other websites that point to your site).
Directly related to the topic of linking is the strategy of site architecture.
A website’s architecture is essentially the way that the information is organized on the website.
Boomcycle’s website features a simple menu structure that directs the visitor to the sections they are interested in.
If the visitor is in a “learning” mode, we have topics (like this one!) that deepen their understanding of a given subject.
If the visitor wants to engage with our services, we have sections that allow that process to happen smoothly.
Ultimately, you want to weight your website towards the needs of a visitor, with the rest of the website serving the “needs” of a search engine.
After all, what good is it to serve the needs of a visitor 100%, if no one can ever find your website? And no one can find your website, because you didn’t give the search engines enough of a clue as to what your website was about.
Thus, no searcher was ever directed to your website because you don’t rank in the search engines!
The optimal site architectural model is an understandable and navigable website, with clear textural signals to help visitors use, and search engines understand, your website.
XML Sitemaps are something of a propeller-head topic.
An XML sitemap is simply a file that lists your website’s pages, so that search engine crawlers can discover and index them all efficiently.
The easier you make it for search engines, the more “friendly” they’ll be towards your site.
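For the propeller-heads, a minimal XML sitemap following the sitemaps.org protocol looks like this (URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
  </url>
</urlset>
```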
A robots.txt file is a file that tells search engine crawlers what to index and what not to index on a particular website.
It’s up to each individual crawler to decide whether or not to take this “advice”.
A stronger way to let crawlers know that you don’t want your page in the search results is to use a noindex tag.
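The noindex tag goes in the page’s head:

```html
<meta name="robots" content="noindex">
```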
However, the whole reason for this article is to talk about how to get found via search engine optimization, so this “don’t look here” tag has been used somewhat sparingly in my career.
After the domain keywords, (see EMD, above), your “title tag” is your best SEO weapon.
Your title tag is what will show up in blue bold in the SERPs.
It’s the first thing that will catch the eye of the searcher.
Heading tags, or “h-tags”, are the standard HTML way of giving a search engine a clue about the organizational hierarchy of the subject matter on the page.
In short, heading tags are a great way to tell a search engine “here’s what’s up on this page”.
Heading tags work from the lowest number to the highest, with the lowest (H1) being the most important.
H1 tags should be used only once per page. An H1 tag is essentially the “title” of the page, at least as far as your visitor is concerned (who can read your actual title amongst all those tabs once they’re off the SERPs?)
An H2 tag is a subtitle or subtopic. H3, H4, H5 and H6 go down the topical hierarchy, which each tag being a sub-topic of the tag before it.
The subtopic you are reading right now “Title Tags & Meta Descriptions” is an H3, since it’s a subtopic of the H2, “How to Get Found on the Internet”.
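A hypothetical page outline using h-tags might look like this (indentation added only for readability):

```html
<h1>Italian Restaurants in San Ramon</h1>   <!-- one H1 per page: its "title" -->
  <h2>Our Menu</h2>                         <!-- subtopic -->
    <h3>Fresh Pasta</h3>                    <!-- sub-subtopic of "Our Menu" -->
    <h3>Wood-Fired Pizza</h3>
  <h2>Reservations</h2>                     <!-- another subtopic -->
```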
A meta description is the suggestion to the search engine as to the web page summary to display to the searcher.
This is the place most businesses will make their first impression, as it’s the first text the searcher sees after finding your website in the SERPs.
Your meta description is a great place for a quick, punchy advertisement and call-to-action!
You should always create custom meta descriptions for each page. If you just copy-and-paste your same meta description, you are losing credibility with Google.
Make sure each of your web pages has a bespoke meta description of less than 160 characters.
Check your meta descriptions on the Google SERPs by typing in search terms to display your particular pages, and make sure your meta description displays exactly as you want it to.
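A tiny sanity check for that 160-character rule of thumb (the limit is a convention, not an official Google spec, and the description below is made up):

```python
def check_meta_description(text, limit=160):
    """Flag meta descriptions that Google is likely to truncate.
    The 160-character limit is a rule of thumb, not an official spec."""
    return len(text) <= limit

desc = ("Family-owned Italian restaurant in San Ramon. "
        "Fresh pasta, wood-fired pizza, full bar. Reserve a table tonight!")
print(check_meta_description(desc))  # True
```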
On-Page SEO is the umbrella term for the process of making sure your web page and website are correctly set up to help the search engines best understand what the page is about.
On-Page SEO includes ensuring you have optimal page titles, meta descriptions and h-tags, all discussed above.
In addition, On-Page SEO includes deeper technical issues such as page speed, image alt text, structured data and URL structure.
Suffice it to say, on-page SEO is all about what’s on the page.
Your on-page SEO needs to help Google understand what your website is pitching or talking about.
How does User Experience (UX) on mobile devices count vis-a-vis search engine optimization?
Google now typically uses the mobile device experience as the primary benchmark as to whether or not your website is “mobile friendly”. And “mobile friendly” usually means optimally search friendly.
However, not all websites are created equal! Many sites, such as educational, scientific research, deep B2B subjects or really any subject that requires a great deal of reading (like this article!) will typically be consumed on a desktop (or laptop) computer.
So who cares about being mobile-friendly?
Any website where the primary visitor is coming from a mobile device.
As of this writing, mobile (smartphone) usage is at about 52%, and desktop usage is a little over 45%. Tablets have been holding steady, down in the dumps at about 3%.
If Google sees that most of your traffic is coming from smartphones, it will inform you that it is switching to “mobile-first indexing”, meaning, get your mobile experience together soon, or risk losing search traffic.
A wide variety of tools exist for the purpose of analyzing search traffic, as well as conversion rates and goals.
In truth, your search data can be sliced and diced and served up in a variety of ways that make your job as a webmaster completely data-driven.
Google Keyword Planner is the tool that folks using Google Ads work with primarily to decide upon which keywords they should bid, and how much they should bid.
However, the Keyword Planner also shows interesting statistics for those interested in “organic” search as well, displaying search volume and the level of competition.
This can be a good first swipe in an SEO campaign.
Most importantly, the data is coming directly from Google, so the confidence in the data is high, even if the data itself is a bit vague.
For example, instead of giving you the exact number of Monthly Searches, the Keyword Planner will display a range like “10K – 100K”. This is pretty damn wide.
“I’m sorry officer, I thought I was going 10 MPH, but you say I was doing 100 MPH? I guess we’ll just call it somewhere between 10-100 MPH.”
Again, the Keyword Planner is but one data point you can use.
Google Analytics is the granddaddy of all search engine tools.
GA allows you to see reports, set up conversion goals, figure out what devices people are using when they visit your website, how long they stay on your website, whether they “bounce” away after visiting just one page, etc.
Analytics has retained its usefulness and improved in many respects, but the tool is very deep, and it’s easy to get tunnel-vision.
While Google Analytics tells you what visitors do once they arrive at your website, Google Search Console actually provides the information about how they found you.
Google Analytics used to provide this information as well, but ever since search results were encrypted for reasons of user privacy, Analytics no longer provides the actual search terms used.
Fortunately, Search Console does, though only “site wide”, not for specific pages.
It’s better than nothing. Fortunately, there are tools that are even better than better than nothing.
Fortunately for SEOs, a whole “cottage industry” has sprouted, featuring tools to help make the SEO’s job easier.
One of the best (and priciest) is Ahrefs.
Ahrefs gets its data from its own search crawlers. It’s essentially a third-party search engine that does its best to analyze search traffic and it allows the user to slice and dice that data in ways that Google never will.
Ahrefs lets you see which search terms are the most popular, as well as the difficulty to rank for the term.
Of course, since the data does not come from Google, Ahrefs must do its best to guesstimate what Google is “thinking”.
They do a pretty good job, and they have an astonishing array of videos online that helps you use, naturally, Ahrefs.
SEMRush is the direct competitor to Ahrefs, and it provides even more analysis tools than Ahrefs. Ahrefs is slightly better in its layout and simplicity.
SEMRush also runs its own crawler and produces its own results in this manner. SEMRush does a bit more than Ahrefs in terms of alerting you to SEO problems, such as “keyword cannibalization” and toxic links (those nasty spammy backlinks we spoke of earlier).
In my personal usage, it feels like SEMRush likes to upsell you to a more expensive service level just about every time you push any button.
They just went public in 2021, so maybe that has something to do with it.
SEO for small business is not terribly different from SEO for large business. The same principles apply, though the tactics employed may center more around a concept known as Google Maps Marketing than traditional search-to-website strategies.
This is because Google Maps, and more specifically, Google My Business (GMB, since renamed Google Business Profile), has become an important ranking signal to, you guessed it, Google!
As GMB has matured, Google has added new features to allow business owners to take more control over their listings.
And Google Maps listings appear ahead of the organic search rankings when there are relevant local businesses to show to the searchers.
Blogging is an olde-tymey (in web-years, anyway) term that essentially means writing articles on a regular basis and publishing them on your website.
Search engines love fresh, timely content, so blogging is definitely an effective technique. More recently, the definition of blogging has expanded to include audio blogging (essentially, podcasting) and video blogging, or “vlogging”.
Adding new pages and fresh, high-quality content to your website is always going to be an authority signal in the eyes of the search engines.
Adding fresh new video content to your YouTube channel doesn’t hurt either. In fact, you should be doing both on a regular basis.
A content silo, or content hub, is an advanced SEO strategy.
A content hub starts with creating a main reference page (like this one), then creating and linking sub-topics to the main page.
The idea is to create sub-topic pages on terms that are extremely niche, with decent search volume but low competition.
In doing so, you create authority around niche topics, and when those sub-topic pages link back to your main page, your main page gets the benefit of that niche authority.
In other words, you become a high priest of a low cult.
This strategy is highly-effective, but requires hard work, patience and time to bear fruit.
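The hub-and-spoke link pattern described above can be sketched as a tiny link graph. The page names here are hypothetical; the point is the structure — the hub links out to every sub-topic, and every sub-topic links back to the hub:

```python
# A hypothetical content hub's internal link structure.
# All page slugs are invented for illustration.

hub = "ultimate-guide-to-seo"

# Map each page to the pages it links to internally.
pages = {
    "ultimate-guide-to-seo": [
        "keyword-research-for-dentists",
        "local-seo-for-plumbers",
        "schema-markup-basics",
    ],
    "keyword-research-for-dentists": ["ultimate-guide-to-seo"],
    "local-seo-for-plumbers": ["ultimate-guide-to-seo"],
    "schema-markup-basics": ["ultimate-guide-to-seo"],
}

def is_valid_hub(pages, hub):
    """True if the hub links to every spoke and every spoke links back."""
    spokes = {p for p in pages if p != hub}
    hub_links = set(pages.get(hub, []))
    return hub_links == spokes and all(hub in pages[s] for s in spokes)

print(is_valid_hub(pages, hub))  # True for the structure above
```

A quick check like this scales to real sites: crawl your own internal links, and any sub-topic page that fails to link back to its hub is leaking the niche authority the strategy is designed to concentrate.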
Several companies compete head-to-head for the mantle of The #1 Authority on the subject of search engine optimization.
All of these websites provide valuable insights, though their insights come with varying price tags.
Moz.com is a SaaS company that was one of the earliest providers of SEO tools and insights, and it’s still one of the best. Moz sells inbound marketing and marketing analytics subscription services.
Their online community includes contributions from more than one million digital marketers from around the world.
Search Engine Land is a fabulously dense website that leaves no stone unturned with regards to search engine optimization, and search topics in general.
They even analyze the latest and greatest places to find eyeballs, like TikTok, and other social media properties. The idea is to study where businesses can grab attention, no matter where that attention is focused.
Last but certainly not least, Neil Patel is a sort of “SEO-lebrity”, and he and his team provide some great articles and tools to help you learn about marketing and search topics.
Boomcycle helps your business increase visibility in your niche, resulting in more customers and sales.
As expert SEO consultants since 2003, Boomcycle Digital Marketing knows what you need to do to push your website to the top of the Google search results.
Click the button to book your FREE consultation today!