Blog

Recapture Lost SEO Leads Using This Ninja Method

If you’ve ever lost a prospect to your competition even though you poured your heart and soul into a fantastic proposal, you are not alone. All hope may not be lost: SEO is a moving target, and you can use that to your advantage. Quite often, SEO companies will charge competitive pricing just to keep the cash flowing, regardless of whether what they’ve promised to deliver is even possible. If this describes your competition, here are the steps you can follow to recapture the lost lead:

1) Wipe your tears right away. It’s not the end of the world. Get even by creating a neat database of niche-related keywords. If the seed keyword is ‘personal injury lawyer’, use Google Keyword Planner or a site like mangools.com to find 100 related keyword phrases. For this example, I’ll use 10 keywords to save space.

injury lawyer
personal injury attorney
accident lawyer
car accident lawyer
personal injury lawyers
slip and fall attorneys
accident attorney
personal injury claims lawyer
car accident attorney
personal injury law firm

2) If the lost prospect targets multiple cities, combine them with the keywords. Below are the combined keyword phrases using Dallas and Plano; a short script for generating such combinations follows the lists.

Dallas injury lawyer
Dallas personal injury attorney
Dallas accident lawyer
Dallas car accident lawyer
Dallas personal injury lawyers
Dallas slip and fall attorneys
Dallas accident attorney
Dallas personal injury claims lawyer
Dallas car accident attorney
Dallas personal injury law firm

Plano injury lawyer
Plano personal injury attorney
Plano accident lawyer
Plano car accident lawyer
Plano personal injury lawyers
Plano slip and fall attorneys
Plano accident attorney
Plano personal injury claims lawyer
Plano car accident attorney
Plano personal injury law firm
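
If your keyword and city lists are long, generating the combinations by hand gets tedious. Here is a minimal Python sketch that produces the combined list from the example data above; the lists are, of course, placeholders for your own niche and target cities.

```python
# Combine seed keywords with target cities (example data from above).
keywords = [
    "injury lawyer",
    "personal injury attorney",
    "accident lawyer",
    "car accident lawyer",
    "personal injury lawyers",
    "slip and fall attorneys",
    "accident attorney",
    "personal injury claims lawyer",
    "car accident attorney",
    "personal injury law firm",
]
cities = ["Dallas", "Plano"]

# Prefix every keyword with every city to build the combined list.
combined = [f"{city} {kw}" for city in cities for kw in keywords]

for phrase in combined:
    print(phrase)
```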

3) Go to SerpBook.com and create an account. I mention SerpBook only because they accommodated my request by adding a bulk upload format, which is handy. You can, however, use any rank tracking software.

4) Create the bulk upload list by prefixing each keyword phrase with the domain followed by “:”. Below is the combined list; a short script that produces this format follows it.

jrlawfirm.com:Dallas injury lawyer
jrlawfirm.com:Dallas personal injury attorney
jrlawfirm.com:Dallas accident lawyer
jrlawfirm.com:Dallas car accident lawyer
jrlawfirm.com:Dallas personal injury lawyers
jrlawfirm.com:Dallas slip and fall attorneys
jrlawfirm.com:Dallas accident attorney
jrlawfirm.com:Dallas personal injury claims lawyer
jrlawfirm.com:Dallas car accident attorney
jrlawfirm.com:Dallas personal injury law firm
jrlawfirm.com:Plano injury lawyer
jrlawfirm.com:Plano personal injury attorney
jrlawfirm.com:Plano accident lawyer
jrlawfirm.com:Plano car accident lawyer
jrlawfirm.com:Plano personal injury lawyers
jrlawfirm.com:Plano slip and fall attorneys
jrlawfirm.com:Plano accident attorney
jrlawfirm.com:Plano personal injury claims lawyer
jrlawfirm.com:Plano car accident attorney
jrlawfirm.com:Plano personal injury law firm
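
The same approach extends to the bulk format itself. Below is a hypothetical continuation of the sketch from step 2 that prefixes each combined phrase with the domain and writes the result to a text file you can store for later (step 5). The truncated list stands in for the full output of the earlier sketch.

```python
domain = "jrlawfirm.com"  # the lost prospect's domain

# 'combined' is the full city + keyword list built in the earlier sketch;
# it is shown truncated here for brevity.
combined = [
    "Dallas injury lawyer",
    "Dallas personal injury attorney",
    "Plano injury lawyer",
    "Plano personal injury attorney",
]

# Build "domain:keyword" lines in the bulk-upload format.
bulk_lines = [f"{domain}:{phrase}" for phrase in combined]

# Save the list so it can be re-uploaded months later (step 6).
with open("bulk_upload.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(bulk_lines))
```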

5) Store the information in an Excel spreadsheet or any other database-driven program for later access.

6) Wait about four to six months. Then upload the bulk data to your rank tracker and sort by the lowest Google rankings first.

7) Remove the first-page rankings. Download the updated PDF rankings report and send it to the prospect. Be sure to approach them with a helpful, consultative attitude, not in a way that makes them feel they made a stupid mistake by not hiring you.

8) If the attempt fails, try again a few months later. Keep in mind that people like to work with a person who cares and remembers.

Steven Kang

………………………………………………………………………………………………….

Get the latest foot-in-the-door ideas for your digital marketing agency and scale. http://footinthedoor.io/

For an honest discussion about data-driven SEO, you can join my non-spammy group. https://www.facebook.com/groups/SEOSignalsLab/

Join the discussion on digital marketing services sales strategies. https://www.facebook.com/groups/LeadStacking/

Learn to systematize your business and scale.  https://www.facebook.com/groups/seocheatguides/

Does Google Stacking Work, or Is It Hype?

If you are a seasoned SEO, you know that Google stacking has been a buzzword in the SEO community for some time. What is Google stacking? It’s a strategy that builds pages on Google properties in hopes that their high authority status will improve rankings. To test whether the strategy can influence rankings, I decided to order gigs from the Konker SEO marketplace and try them on various keywords.

Since each gig costs $40 and I didn’t want to spend thousands on the test, I decided to invest $400 across keyword terms with different search volumes.

On June 5th, I checked the rankings and didn’t see much improvement. To ensure all the stack pages had been indexed by Google, I manually typed site:GOOGLE PROPERTY URL into Google and found all of them listed in the SERP. Once I realized that indexation wasn’t the cause of the rankings stall, I decided to turn to Money Robot for tier 2 links.
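
If you run this kind of check often, you could script it instead of typing each site: query by hand. The sketch below is one hypothetical way to do it with Google’s Custom Search JSON API; the API key, engine ID, and stack URLs are placeholders you would replace, and the API’s result counts are estimates that may differ from a manual site: search.

```python
import requests

API_KEY = "YOUR_API_KEY"       # hypothetical placeholder
SEARCH_ENGINE_ID = "YOUR_CX"   # hypothetical: a CSE configured to search the web

# Hypothetical stack-page URLs; substitute your own.
stack_urls = [
    "https://sites.google.com/view/example-stack-page",
    "https://docs.google.com/document/d/example",
]

for url in stack_urls:
    # Ask the Custom Search API for results matching site:<url>.
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID, "q": f"site:{url}"},
        timeout=10,
    )
    resp.raise_for_status()
    total = int(resp.json()["searchInformation"]["totalResults"])
    print(f"{url}: {'indexed' if total > 0 else 'NOT indexed'}")
```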

On June 6th, I applied Money Robot tier 2 links to each Google stack page.

On June 19th, I decided to pull the plug and checked the rankings. Here are my findings.

Rankings Improved: 6

Rankings Worsened: 3

Same Rankings: 1

Verdict: So far, I’ve spent $460 on the test and can’t say I’ve seen conclusive evidence that this is a solid strategy that provides a positive ROI. To be fair, more time may be needed to test the long-term effects, but I’ve decided to move on to other tests.

……………………………………………………………………………………………………………….

If you are interested in learning about advanced strategies and want to access custom tools, you can join http://www.RelevancyStacking.com.

For information about white label partnership, please visit http://www.contentblognetwork.com/. Please note that there is a long wait for this service.

For an honest discussion about marketing and SEO, you can join my non-spammy group: https://www.facebook.com/groups/SEOSignalsLab/

To access my latest SEO tests, you can register at http://www.SEOSignalsLab.com.

How a Long-tail SEO Strategy Leveraging Relevancy Clustering Can Fuel Your Agency’s Growth

On April 22, my client sent me a list of keywords to rank in a medium-competition manufacturing niche. As soon as I received them, I grouped the keywords using the relevancy clustering method I’ve used for years and sent him a file containing the page structure information. Per my instructions, the client created pages and sent me 3 URLs a week later. Upon realizing that the URLs weren’t indexed by Google, I submitted them by typing ‘submit url to Google’ in Google. After taking a break for a couple of hours, I entered the keywords into a rank tracking software. Lo and behold, the keywords were already ranking in the top 100 positions. I immediately sent a screenshot of the rankings to my client, who was excited by the instant traction the new pages had gained.

Imagine being able to rank a large number of keywords right away and keep them ranked while you work on off-page for stronger positions. While most SEO agencies struggle to keep 10 or so competitive keywords ranked and often get fired by clients within 6 months, I’ve been leveraging the long-tail strategy on a large scale to grow my agency with hundreds of clients and a retention rate of over 95% for years. To be fair, long-tail SEO is a really effective strategy, but I am not going to claim it is the solution to end all SEO issues. The truth is that it works better in some industries than others. It is, however, an important weapon to have in your arsenal if you run a digital marketing agency. Let’s look at some aspects of the long-tail SEO strategy and relevancy clustering.

The Basis for Relevancy Clustering

While most SEO strategies are based on a single-keyword-per-page silo structure, I started lumping keywords together for stronger relevancy when I learned that Google had purchased multiple semantic web technology companies. The theory behind relevancy clustering is that Google favors relevancy, and you can leverage this by placing closely related keywords in all SEO elements such as the URL, title, description, H tags, and content. Using this approach, you get a head start in rankings.

If you are an AdWords user, it’s no secret that Google spits out hundreds of related terms based on a seed keyword. You might ask, “How does Google know all these related terms?” The answer is that for every keyword, there are related words in Google’s database, each with a relational numeric value assigned. By organizing related terms into clusters across all SEO elements, both on- and off-page, you form a stronger relational presence and help Google connect the relevancy dots.
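
To make the clustering idea concrete, here is a deliberately crude Python sketch that groups keyword phrases by shared core tokens. It is meant only to illustrate the concept; a production tool would rely on semantic relatedness data rather than simple string overlap, and the head-term list is an assumption for this example.

```python
# Crude illustration of grouping keyword phrases into relevancy clusters
# by shared core tokens (not a production relevancy clustering method).
HEAD_TERMS = {"lawyer", "lawyers", "attorney", "attorneys"}  # generic head words

def core_tokens(phrase: str) -> set:
    """Tokens that carry the phrase's topical meaning."""
    return set(phrase.lower().split()) - HEAD_TERMS

keywords = [
    "injury lawyer", "personal injury attorney", "car accident lawyer",
    "accident attorney", "slip and fall attorneys", "personal injury law firm",
]

clusters: list[list[str]] = []
for kw in keywords:
    # Join the first cluster sharing a core token with any member; else start a new one.
    for cluster in clusters:
        if any(core_tokens(kw) & core_tokens(member) for member in cluster):
            cluster.append(kw)
            break
    else:
        clusters.append([kw])

for i, cluster in enumerate(clusters, start=1):
    print(f"Cluster {i}: {cluster}")
```

Running this groups the injury phrases, the accident phrases, and ‘slip and fall attorneys’ into three separate clusters, each of which could then feed one page.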

Matt Diggity recently shared his observation that larger sites do better in rankings than smaller sites. This reinforces the strategy of relevancy clustering. From an algorithmic perspective, larger sites have more opportunity to present themselves in a topically organized way than smaller sites with thin content. I’m also currently witnessing smaller sites losing ground in the rankings day by day.

An Example of Relevancy Clustering

For local markets, you need to treat each keyword phrase with a geo-modifier in it as one phrase and come up with multiple variations to form a relevancy cluster. Here is an example of related keywords for ‘New York lawyer’:

NY attorney, Lawyer in Manhattan, Legal counselor in NY, NYC lawyer, etc…

One major benefit of creating a cluster is that it helps you avoid an over-optimization penalty. In all the years I’ve used this technique, I haven’t had a single website penalized.

Strategy Comparison

Not all SEO strategies are created equal, and as an agency you must think like a strategist and decide which strategy best serves your client’s needs. Since running an agency is a business, you must weigh the likelihood of achieving and maintaining rankings against the budget, as there is no such thing as an SEO project with an unlimited budget. So, what’s the ideal strategy? It really depends on your client’s needs.

For a lead-gen market, mass page builders can work well, as they are designed to go after lots of super easy-to-rank keywords on a large scale. Platforms like Lead Gadget can launch sites with a large number of pages effortlessly. For highly competitive niches, you need a few competitive keywords to gain traction, and there is no shortcut for it. For local markets with multiple locations and lots of niche-specific keywords, a somewhat competitive, user-intent-based long-tail keyword strategy works well. Since pursuing all three strategies simultaneously is not practical from a budget standpoint, I’d personally go with the long-tail with relevancy clustering strategy.

For many agency businesses, long-tail leveraging relevancy clustering is often a golden strategy, since it’s not too difficult to maintain rankings and you can target large sets of intent-based keywords with relative ease.

Ideal Client Candidates

From my experience, ideal clients are local businesses with lots of niche keyword variations serving multiple locations with populations of 100,000 or more. I have a client in the home care niche serving multiple counties and towns, and they have become the number one producer in their franchise group thanks to long-tail organic SEO. For this particular client, I identified over 100 niche-related keyword variations. Since they service 20 locations, the number of combined keyword phrases becomes 20 times 100, which is 2,000. Once you’ve demonstrated that you can rank a good chunk of the keywords from the start, they’ll see the value and get hooked.

When done right, on-page alone will rank a large number of keywords as soon as the pages get indexed, thanks to relevancy clustering. Even if you were to rank only 10% in the first month for the example client above, that’s 200 ranked keywords you can present to your client. With proper backlinking, the number of ranked keywords will grow continuously, and you can always show the results. In most cases, traffic grows as long as the market size supports demand for user-intent keywords.

The Logistical Challenges of Relevancy Clustering

Although each intent-based keyword is easier to rank than a short-tail niche keyword, the list can easily swell to hundreds of keywords, and in many cases to thousands of variations. Easy rankability then becomes a logistical challenge, as you don’t want to manually optimize hundreds of pages for each client that comes along. To alleviate this, I’ve come up with several techniques and tools to handle the process: one is relevancy layering and the other is the master content strategy. Both are designed to quickly launch pages optimized for relevancy clustering while helping with rapid content deployment.

The relevancy layering tool is designed to quickly create pages by forming relevancy clusters from a large set of related keyword phrases. I currently use it to launch hundreds of pages and help rank thousands of keywords for my clients. The master content strategy systematizes the content creation process by leveraging writers who are good at creating relevant articles. To help with off-page, I’ve built a sizeable network for contextual backlinks as well as image syndication and map embeds. As a result, I can keep a large percentage of keywords ranked, which assures clients that the approach they are paying for is working.
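
To illustrate how a cluster can feed page creation, here is a hypothetical sketch that turns one cluster plus a geo-modifier into basic on-page elements (URL slug, title, H1, meta description). It is a simplified stand-in for the relevancy layering idea described above; the naming conventions and templates are made up for this example.

```python
# Hypothetical illustration: generate on-page elements from one relevancy
# cluster. The templates below are examples, not a definitive format.
def build_page(city: str, cluster: list[str]) -> dict:
    primary = f"{city} {cluster[0]}"                     # lead keyword phrase
    slug = primary.lower().replace(" ", "-")             # URL slug
    variants = ", ".join(f"{city} {kw}" for kw in cluster[1:3])
    return {
        "url": f"/{slug}/",
        "title": f"{primary.title()} | {variants.title()}",
        "h1": primary.title(),
        "meta_description": (
            f"Looking for a {primary}? We also serve clients searching "
            f"for {variants}."
        ),
    }

cluster = ["injury lawyer", "personal injury attorney", "accident lawyer"]
page = build_page("Dallas", cluster)
for element, value in page.items():
    print(f"{element}: {value}")
```

Repeating this over every cluster and location is how a few hundred keyword phrases turn into a full page structure without hand-building each page.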

Recommended Strategy Going Forward

Since a long-tail SEO strategy is only a means to acquire highly targeted traffic, consider maximizing all traffic sources such as paid, social, retargeting, and other digital marketing channels. Whatever SEO strategy you opt for, it’s only a method of generating traffic. After all, digital marketing is about creating a well-oiled marketing and sales machine for a business. It’s not about preference or ego; it’s about what delivers an ROI.

If you are interested in learning about advanced strategies and want to access custom tools, you can join http://www.RelevancyStacking.com.

For information about white label partnership, please visit http://www.contentblognetwork.com/. Please note that there is a long wait for this service.

For an honest discussion about marketing and SEO, you can join my non-spammy group: https://www.facebook.com/groups/SEOSignalsLab/

To access my latest SEO tests, you can register at http://www.SEOSignalsLab.com.

Can Trusting the TF (Trust Flow) Metric Hurt Your SEO?

One of the biggest problems with SEO these days is that the entire SEO community is fixated on third-party metrics for evaluating a website’s trustworthiness. Unfortunately, this can hurt your rankings if you don’t have the right SEO process in place and can’t read between the lines. In other words, your SEO could suffer greatly if you don’t have the ability to decipher all the right signals. After all, what Google thinks of your site matters most, not third-party metrics or crawlers.

Recently, I had an opportunity to audit a website that had been doing well for months but lost rankings during Google’s Fred update. Upon closer inspection of the backlinks, I ran into a site that stuck out like a sore thumb. According to Majestic, its TF (Trust Flow) value was 20. Most SEOs would tell you that’s a decent metric. In fact, you’ll see lots of link peddlers selling link packages and touting it as one of the safest metrics out there. But when I examined other elements, there were red flags.

One of the metrics I use for evaluating a site is its index count, as it comes directly from Google’s database. Majestic reported an index count of 68. I went to Google, and it showed 37. For whatever reason, Google had de-indexed 31 pages.

When I visited the homepage, everything became clearer: the homepage relevancy was out of whack. Let’s look at why this could cause an issue. If you are using the homepage to send backlinks, its page relevancy matters. Since the domain name is holy-redeemer-mcc.org, the homepage content should be relevant to religion in order to maximize its relevancy signal.

Let’s scroll down and see if any content pieces or outbound links are related to the domain name.

[Screenshots: the articles and outbound links on the homepage]

As you can see, all the articles are totally irrelevant to the domain. Relevancy dilution is real, and it is a serious disease, but one you can easily cure.

Conclusion

Does this mean you should ignore TF entirely? TF can be useful as long as you understand how it was derived. Third-party crawlers try to crawl everything and have a long memory. Unfortunately, that memory may not coincide with Google’s. Perhaps the site was once a good ol’ trusted site; the problem is that adding irrelevant content can devalue its SEO standing with Google. If you are buying links from link vendors, beware of what they are doing. Page irrelevancy can hurt you in the long run.

————-

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/. We have great discussions going and I always add my insights gained from 20 years of SEO experience and 30 years of business and marketing experience.

If you are interested in receiving advanced SEO training, you can visit http://www.relevancystacking.com. The course is designed to help you scale your SEO using cutting edge strategies.

To get the latest updates on SEO experiments, please fill out the form at http://www.SEOSignalsLab.com.

Here is Proof That Outbound Links Make Google Nervous

One of the benefits of owning a network of digital properties is that you can carefully monitor and perform SEO tests using various methods and parameters. Coming from a programming background, I can set up my own metrics and test them for numerous cause-and-effect relationships. With over 1,200 sites in my network and growing, I can gain insights no other SEO can provide.

For years, I’ve been leveraging expired domains as part of my overall SEO strategy. In most cases, expired domains carry some SEO value if you have the right vetting process. But here is one major problem: even with the best third-party metric data in hand, there’s no guarantee that Google will like the domain. With so many expired domains coming onto the scene, Google has reason to be nervous, as there are literally millions of expired domains that could affect the overall SEO landscape in an unwanted way.

Since links play a crucial role in SEO, I decided to test a hypothesis: Google is cautious with outbound links added to an expired domain. To test this, I programmed revived expired domains to wait until the number of indexed pages reached at least 10 before adding outbound links. When the count dropped below ten, I made the links disappear. I figured it would be safer to send links from a domain with a certain index count.
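
The gating logic described above is simple. Here is a minimal Python sketch of it; get_index_count is a stub standing in for however you actually measure the domain’s index count (for example, a scheduled site: check), and the threshold mirrors the one used in the test.

```python
# Minimal sketch: only render outbound links once the domain's Google
# index count reaches a threshold; hide them again if it drops below.
INDEX_THRESHOLD = 10

def get_index_count(domain: str) -> int:
    # Stub: replace with a real lookup of Google's index count for the domain.
    return 12

def render_outbound_links(domain: str, links: list[str]) -> str:
    """Return link HTML only when the domain looks sufficiently indexed."""
    if get_index_count(domain) < INDEX_THRESHOLD:
        return ""  # hide outbound links until the index count recovers
    return "\n".join(f'<a href="{url}">{url}</a>' for url in links)

print(render_outbound_links("example-expired-domain.com",
                            ["https://client-site.com/"]))
```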

I minimized external SEO factors by making sure the domain had no social shares or other inbound links. If all other variables remain the same, Google should allow more pages to get indexed when outbound links are not present, as pages without links should pose no threat. Conversely, Google should start reducing the index count once it detects outbound links.

Here is the result of Google reacting to outbound links.

[Chart: the site’s index count over time versus its outbound link status]

As you can see, the index count changed over time based on the site’s outbound link status. The result has significant implications, and below are some conclusions based on the observed pattern.

1) Generally speaking, Google is cautious about outbound links coming from a new site or a revived expired domain.

2) Many people are adding links without considering Google’s reaction.

3) Link exchange is not necessarily a good idea.

4) You can’t simply rely on third-party metric data. The domain used in the example above had a DA of 21, which didn’t change during the entire test.

Steven Kang

…………………………………………………………………………………………………………

If you are interested in getting the latest updates on SEO experiments, please fill out the form at http://www.SEOSignalsLab.com.

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/.

If you want to grow your SEO business by implementing my cutting edge SEO framework, you can visit http://www.relevancystacking.com/.

How to Safely Dodge Google’s Algorithm Updates Using the TRAP Framework

For years, the ultimate challenge in SEO has been how to deal with Google updates. Many unnecessary wars broke out between black hatters and white hatters over this subject. Quite often, white hatters accused black hatters of causing Google updates, and black hatters accused the other side of spreading false rumors for profit. What’s my verdict? It’s neither. Google’s algorithm looks at signals, not at whether your links resulted from outreach efforts or were bought.

Don’t get me wrong. If you are doing content marketing, you should reach out to sites with readership; getting a link back to your converting page is a good marketing strategy. But from a pure algorithmic standpoint, you may be satisfying unintended objectives whether you realize it or not.

After many sleepless nights, I came up with a unique framework by sidestepping how everyone else looks at the SEO process. The conceptual framework is the result of almost 20 years of SEO experience, observation, and testing.

SEO blogs and marketers often mention 200 ranking factors at play. The number was derived from a Google patent filed in 2008. If we were to account for 50 variations of each factor, the count could easily swell to 10,000. From a logistics perspective, it doesn’t make sense for an SEO marketer to go through a 10,000-line checklist every time they launch an SEO campaign for a website.

One of the major challenges I faced was how to deal with an ever-growing number of ranking factors and algorithm updates. I realized that if I could describe the entire ranking process with a simpler model, I would not only have a way to scale but also a better way to create preventive measures against Google updates. Just like the concept of Yin and Yang, which describes the forces of nature in a simplified way, I was on a quest to discover such a model for SEO.

Several years ago, I came up with the concept of 4 major signals. By condensing all SEO processes into four, I was able to track and describe how Google behaved every time it came up with an update. The 4 major SEO signals are Technical, Relevancy, Authority, and Popularity, which I now call the TRAP framework.

Let’s look at each signal and roughly define it.

Relevancy – For every keyword, there are related keywords in Google’s database. Two keywords can be related in various ways, such as semantically, geographically, by categorization, or by brand.

Authority – Authority is a trust signal. Google looks at how often other trusted sites reference your site.

Popularity – Popularity has changed over the years as the web has evolved, and Google has changed its algorithm to detect the signal from different places. Popularity signals can be divided into link popularity, social popularity, and search popularity.

Technical – Site security (such as HTTPS) and speed fall under this signal. Although some of these signals do not play a major role in rankings now, they are expected to play a bigger role in the future.

Now that we have roughly defined the 4 major signals, let’s look at why understanding them has major benefits.

1) You now have a conceptual model to describe Google’s behavior and understand what it is looking for after each algorithm update.

2) You can easily catalog whatever SEO activity or link building scheme some SEO guru comes up with.

3) It gives you a better way to identify deficiencies in SEO activities you may not have known about before.

4) You can plan a preventative measure and guard against Google updates.

Dissecting TRAP Framework

In order to tame Google’s algorithm, which reacts to the four major signals, we first need to trap and dissect it. It’s similar in concept to colliding particles in the Large Hadron Collider to understand the inner workings of subatomic matter. The more we can dissect, the more tools we’ll have at our disposal to create preventative measures against Google’s updates. TRAP is the acronym for the four major signals: Technical, Relevancy, Authority, and Popularity. From now on, I’ll refer to the four major signals as TRAP, as it covers all the components we’ve discussed.

Characteristics of TRAP

Each component of TRAP has a specific contributing role for the set.

Technical – It has a qualifying role in TRAP. To even be considered for a Google crawler visit, your site needs to load within a reasonable time. Remember, ranking starts with indexation, whether on-page or off-page.

Relevancy – It has a relational role in TRAP. Google values content that caters to users’ search intent and its relational value.

Authority – It has an amplifying role in TRAP. The authority signal amplifies the site’s overall standing with Google, often called site authority, which translates to faster and higher rankings.

Popularity – It has a validating role in TRAP. Once pages and links are built, Google attempts to validate a site’s off-page activity by looking at popularity metrics, which are directly associated with human activity levels.

Using TRAP Framework to Shield Against Google Updates

One of the major benefits of using TRAP is that we now have a conceptual model to describe everything related to SERP dynamics. We no longer have to describe cause-and-effect relationships with complex diagrams or a long checklist. From a strategist’s perspective, it allows an efficient way to interpret ranking patterns and create preventative measures against future algorithm updates. The term ‘holistic SEO’ no longer needs to be associated with fluff words like ‘synergy’ and ‘natural’: holistic SEO means using TRAP to fulfill SEO requirements.

One of the major issues with Google’s algorithm updates is that no one can predict what the changes will look like, as the algorithm is manufactured and maintained by a corporation. Its motive is simple: Google wants to maximize profit for its shareholders and keep marketers from figuring out its algorithm with a high degree of accuracy. This, however, doesn’t mean we can’t prepare ourselves better than any other SEO marketer on the planet.

By recognizing that a healthy SEO campaign needs to contain all the elements of TRAP, we now have a way to monitor any SEO campaign and look for deficiencies. Here is an analogy. Think of owning a car with a set of 4 wheels. You can technically drive the car with fewer than all four wheels intact; you can even make it move with only one. This, however, can have bad consequences, as the car can come to a halt, which is equivalent to losing all rankings and possibly even deindexation. But if you maintain all four wheels well, you can ride out any bumps headed your way with relative ease.

One of the best ways to survive a Google update is to make sure that TRAP is in good standing. Here are the steps I’ve used for years to shield my clients from major Google updates.

1) Create a checklist for the TRAP components and do your best to keep up with activities that satisfy each component per defined period. Not every page needs to satisfy all TRAP components, but the site as a whole needs to satisfy the TRAP checklist. (A minimal sketch of such a checklist follows this list.)

2) Monitor ranking progress and use the TRAP checklist to augment or minimize signals.

3) Whenever there is news about a major update, wait for the update to finish and identify how Google’s TRAP requirements have shifted. A Google algorithm update usually ends with a shift in the technical, popularity, authority, or relevancy signals.

4) Recognize the shift in Google’s algorithm and make up for deficiencies using the TRAP checklist.
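
For step 1, the checklist can be as simple as a data structure that tracks activities per signal and reports the gaps. Below is a minimal Python sketch; the activities listed are illustrative examples I’ve made up for this sketch, not a definitive TRAP checklist.

```python
# A minimal sketch of a TRAP checklist: track illustrative activities per
# signal and report which ones are unsatisfied.
from dataclasses import dataclass, field

@dataclass
class TrapChecklist:
    technical: dict = field(default_factory=lambda: {
        "https_enabled": False, "page_speed_ok": False})
    relevancy: dict = field(default_factory=lambda: {
        "keyword_clusters_mapped": False, "intent_content_published": False})
    authority: dict = field(default_factory=lambda: {
        "trusted_referring_links": False})
    popularity: dict = field(default_factory=lambda: {
        "social_shares_active": False, "branded_search_growing": False})

    def deficiencies(self) -> dict:
        """Report every unsatisfied item, grouped by signal."""
        report = {}
        for signal in ("technical", "relevancy", "authority", "popularity"):
            missing = [k for k, done in getattr(self, signal).items() if not done]
            if missing:
                report[signal] = missing
        return report

site = TrapChecklist()
site.technical["https_enabled"] = True
print(site.deficiencies())  # everything still unsatisfied except HTTPS
```

After an update, re-running the deficiency report per site shows which signal needs augmenting, which is the whole point of step 4.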

After many years of implementing the TRAP framework in my SEO business, here are some results.

1) One of my clients is part of a franchise group with more than 2,000 franchisees. After years of applying the TRAP framework, the client became the number one producer in the organization. His secret weapon? Consistent organic traffic from SEO. All the franchisees in the group look up to him, and he brags about how well my SEO generates traffic. As a result, I have a continuous supply of new clients.

2) My marketing became easier as I was able to develop methodologies not currently available in the marketplace, and I became a strategic partner to numerous niche agencies.

3) I was able to drastically increase SEO fulfillment capacity and efficiency while lowering expenses by leveraging the TRAP framework. I no longer spend countless hours on unnecessary activities, since I can classify each SEO activity under the TRAP framework.

If you are interested in receiving in-depth training, you can visit http://www.relevancystacking.com/. On the homepage, you can scroll down and read the course objectives and member reviews.

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/. We have many great discussions going and I always add my insights gained from 20 years of SEO experience and 30 years of business and marketing experience.

To get the latest updates on SEO experiments, please fill out the form at http://www.SEOSignalsLab.com. I promise I will not bombard your email with affiliate offers or sell your email to a North Korean spam camp.