Blog

How to Get Your Foot in the Door With Yelp Enhanced Profile

Unlike a basic Yelp profile, a Yelp Enhanced Profile helps a brand present its listing page with advanced features, including an expanded “About Us” section and the ability to block competitors’ ads. Consumers use Yelp for more than just finding a good restaurant, so home improvement stores and retail shops can also gain visibility there. Put bluntly, nearly any local business can benefit from Yelp, and an Enhanced Profile can significantly increase its marketing ROI.


This Yelp profile option includes all of the features of the Basic profile plus the following advanced features:

Agency access
Multiple logins
Customized user access
Bulk claiming locations
A dedicated support manager
Enhanced reporting abilities
Single location and bulk update
Continuous location data verification
Performance metric reporting by location
Ability to remove or merge duplicate profiles
Individual, regional or master location access
Aggregated reporting covering multiple locations
A centralized dashboard for multi-location management
Video and photo slideshow features that let you add a promotional video or images in any preferred order

Why Yelp Enhanced Profiles Are So Important

Brands that upgrade to a Yelp Enhanced Profile significantly increase their ability to rank higher, engage consumers, and convert their target audience. By blocking competitors’ ads, businesses can keep visitors on their profile page instead of losing them to ads that lead elsewhere.

Using Yelp Enhanced Profile to Get Your Foot in the Door

You can hire a VA to comb through Yelp and pull a list of businesses with only a basic profile. Contact each business via its contact form, cold email, or direct mail. You can use the template below to craft a compelling offer. Keep in mind that an SEO or PPC upsell can wait until your foot-in-the-door offer gains traction.

—————— sample email copy ——————

Hi (Prospect),

I’ve noticed you have a basic business listing on Yelp, and there is a way to drastically increase targeted traffic to that page.

If you have a few minutes this week, I can go over how a Yelp Enhanced Profile can drive more customers and sales.

Each new Yelp Enhanced Profile account comes with $300 in Yelp Ads credit at no charge, so you can see it working right away.

Thank you for your time.

Best,

(Sender)

 

………………………………………………

For an honest discussion about data-driven SEO, you can join my non-spammy group. https://www.facebook.com/groups/SEOSignalsLab/

Join the discussion on digital marketing services sales and marketing strategies. https://www.facebook.com/groups/LeadStacking/

Get the latest foot in the door ideas for your digital marketing agency and scale. http://footinthedoor.io/

Recapture Lost SEO Leads Using This Ninja Method

If you’ve ever lost a prospect to a competitor even though you poured your heart and soul into a fantastic proposal, you are not alone. Hope may not be lost: SEO is always a moving target, and you can use that to your advantage. Quite often, SEO companies will charge competitive pricing just to keep cash flow going, regardless of whether what they’ve promised to deliver is even possible. If that describes your competition, here are the steps you can follow to recapture the lost lead:

1) Wipe your tears right away. It’s not the end of the world. Get even by building a neat database of niche-related keywords. If the seed keyword happens to be ‘personal injury lawyer’, use Google Keyword Planner or a site like mangools.com to find 100 related keyword phrases. For this example, I’ll use 10 keywords to save space.

injury lawyer
personal injury attorney
accident lawyer
car accident lawyer
personal injury lawyers
slip and fall attorneys
accident attorney
personal injury claims lawyer
car accident attorney
personal injury law firm

2) If the lost prospect targets multiple cities, combine them with the keywords. Below are the combined keyword phrases using Dallas and Plano.

Dallas injury lawyer
Dallas personal injury attorney
Dallas accident lawyer
Dallas car accident lawyer
Dallas personal injury lawyers
Dallas slip and fall attorneys
Dallas accident attorney
Dallas personal injury claims lawyer
Dallas car accident attorney
Dallas personal injury law firm

Plano injury lawyer
Plano personal injury attorney
Plano accident lawyer
Plano car accident lawyer
Plano personal injury lawyers
Plano slip and fall attorneys
Plano accident attorney
Plano personal injury claims lawyer
Plano car accident attorney
Plano personal injury law firm
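Step 2’s city-keyword combination is easy to script. Here is a minimal Python sketch, using a trimmed version of the keyword list above:

```python
# Combine each target city with each niche keyword to build the tracking list.
cities = ["Dallas", "Plano"]
keywords = [
    "injury lawyer",
    "personal injury attorney",
    "accident lawyer",
]

combined = [f"{city} {kw}" for city in cities for kw in keywords]
for phrase in combined:
    print(phrase)
```

With the full 100-keyword list and a handful of cities, the same two lines of logic generate the entire database in one pass.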

3) Go to SerpBook.com and create an account. I am mentioning SerpBook only because they’ve accommodated my request by adding a bulk upload format which is handy. You can, however, use any rank tracking software.

4) Create the bulk upload format by prefixing each phrase with the domain followed by ‘:’. Below is the combined list.


jrlawfirm.com:Dallas injury lawyer
jrlawfirm.com:Dallas personal injury attorney
jrlawfirm.com:Dallas accident lawyer
jrlawfirm.com:Dallas car accident lawyer
jrlawfirm.com:Dallas personal injury lawyers
jrlawfirm.com:Dallas slip and fall attorneys
jrlawfirm.com:Dallas accident attorney
jrlawfirm.com:Dallas personal injury claims lawyer
jrlawfirm.com:Dallas car accident attorney
jrlawfirm.com:Dallas personal injury law firm
jrlawfirm.com:Plano injury lawyer
jrlawfirm.com:Plano personal injury attorney
jrlawfirm.com:Plano accident lawyer
jrlawfirm.com:Plano car accident lawyer
jrlawfirm.com:Plano personal injury lawyers
jrlawfirm.com:Plano slip and fall attorneys
jrlawfirm.com:Plano accident attorney
jrlawfirm.com:Plano personal injury claims lawyer
jrlawfirm.com:Plano car accident attorney
jrlawfirm.com:Plano personal injury law firm
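The domain-prefixing in step 4 can likewise be scripted; a short sketch, shown with just two of the example phrases:

```python
# Build SerpBook-style bulk upload lines in the "domain:keyword phrase" format.
domain = "jrlawfirm.com"
phrases = ["Dallas injury lawyer", "Plano injury lawyer"]

bulk_lines = [f"{domain}:{p}" for p in phrases]
print("\n".join(bulk_lines))
```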

5) Store the information in an Excel spreadsheet or any other database-driven program for later access.

6) Wait about four to six months. Upload the bulk data and sort by lowest Google rankings first.


7) Remove the keywords already ranking on the first page. Download the updated PDF rankings report and send it to the prospect. Be sure to approach them with a helpful, consultative attitude, not in a way that makes them feel they made a mistake by not hiring you.


8) If the attempt fails, try again in later months. Keep in mind that people like to work with a person who cares and remembers.

Steven Kang


Learn to systematize your business and scale.  https://www.facebook.com/groups/seocheatguides/

How to Quickly Check For Domain Index Count in Bulk Using SerpBook

If you are doing SEO on a large scale, you need a systematic approach or you will run into serious bottlenecks. Since I use a lot of expired domains, keeping track of indexation is important, as it’s an indication of how Google gauges each site. To make the process quick and painless, I use a rank tracking site called SerpBook.com to efficiently track index counts using its nifty bulk upload feature. If you are using SerpBook, you can follow these easy steps.

1) Log in to SerpBook.com and create a category. Using the bulk upload feature, enter the following format for multiple domains.

domain1.com:site:domain1.com
domain2.com:site:domain2.com
domain3.com:site:domain3.com
domain4.com:site:domain4.com
domain5.com:site:domain5.com
domain6.com:site:domain6.com
domain7.com:site:domain7.com
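Generating those lines for a large batch of domains can be scripted; a minimal sketch (the helper name is mine, not SerpBook’s):

```python
# Emit "label:query" lines where the label is the domain and the tracked
# query is Google's site: operator for that domain.
def bulk_index_lines(domains):
    return [f"{d}:site:{d}" for d in domains]

for line in bulk_index_lines(["domain1.com", "domain2.com", "domain3.com"]):
    print(line)
```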


2) Once added, you’ll instantly see whether the domain has been indexed by Google or not.


3) If you want to extract the exact index count for each domain, you can download the data in JSON format. You can hire a programmer from Upwork to build a database and generate a custom report.


by Steven Kang

……………………………………………………………………………………….

If you are interested in learning large scale SEO strategies and want to access custom tools, you can join http://www.RelevancyStacking.com.

Are White Hat Links From Real Sites Worth Getting?

White hat links are the buzzword these days, and demand for them is soaring. What’s fueling that demand? More and more SEOs are discovering that gray and black hat links are less effective than they used to be. The evidence is all around us. You hear SEOs fretting that manufactured links are no longer passing link juice like they used to, and many forums and SEO groups are full of comments indicating that PBNs are getting harder to justify in terms of ROI.

Here is a recent message sent by a member of my Facebook SEO group.

 


I personally have done well for years by spending tens of thousands of dollars per month on links built via my network and networks built by other vendors. I am, however, seeing the effectiveness slow down. It’s no secret that SEOs are now forced to incorporate links from real sites into the off-page SEO mix. To find out whether links from real sites, aka white hat links, affect rankings in a positive way, I decided to run a dedicated test.

On May 25th, I took a snapshot of rankings for a dental site I’ve been working on. I dedicated a page to a group of select keywords and decided to send 20 white hat links and observe what happens. The target city has a population of over 250,000, and the keywords have monthly search volumes ranging from 50 to 1,300.


After taking the snapshot, I targeted the page with varying anchors assigned as best as I could. A vendor I partnered with did his best to accommodate the requested anchors.

Links From Real Sites vs. PBN Links

Before I go on, you may be wondering about how to distinguish real sites from PBN sites. The metric I use is Google index count. While most PBN sites will return pages with a low index count due to everyone leveraging homepage links, real sites have anywhere from hundreds to hundreds of thousands of pages indexed. The theory behind it is that Google will not de-index pages on real sites as long as their pages have real traffic.
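That index-count heuristic can be expressed as a simple threshold check. The cutoff value below is illustrative; the text above only says real sites tend to have hundreds of pages or more indexed:

```python
# Heuristic: PBN sites tend to have few pages indexed (everyone leverages
# homepage links), while real sites keep hundreds or more pages indexed.
def looks_like_real_site(indexed_page_count: int, min_indexed: int = 100) -> bool:
    return indexed_page_count >= min_indexed

print(looks_like_real_site(13000))
print(looks_like_real_site(12))
```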

After receiving the links, I checked the domains for their DA values and indexed page counts. They had varying DA values and index counts ranging from several hundred to tens of thousands; one site had over 13,000 pages indexed. This was a good indication that the links were placed on real sites, not a PBN.


For about a month, rankings steadily climbed and finally settled. Other than ‘emergency dentist + city,’ all keywords gained significant ground. The test clearly produced positive upward movement in the rankings.

Now that the test has produced a positive result, should you go all in? My answer would be to hold your horses, as there are some things you need to consider.

1) Cost – Most white hat link vendors charge anywhere from $60 to hundreds of dollars per link. Unless your client has a huge budget, the cost of building links can add up quickly and drain your profit margin. Plus, there is no guarantee that the amount spent on links will produce a positive ROI. I know a friend who spent over $40,000 on white hat links and had no rankings to show for it. Unfortunately, what you pay doesn’t always equate to what you get back in rankings.

If you are doing outreach on your own or have a team who can do the work for you, each link acquisition can cost $30 and up. I’ve recently hired a team and I am averaging between $30 and $35 per link acquisition.

2) Turnaround Time – Unless you are working with a vendor who happens to have a relationship with webmasters and bloggers, it can take a long time to acquire a link. When your client is expecting to see the result within a reasonable time, a long wait time can cause an issue.

3) No Guarantee That It Will Work – Unless you work with a vendor with a large database of established relationships with webmasters, it’s often hard to estimate where your next link will come from. You just have to live with whatever links you can get.

Conclusion

Although white hat links can be powerful and effective, it is by no means a solution to end all SEO issues. I typically rank thousands of keywords per client and switching to entirely white hat links for off-page SEO is out of the question. The budget required to achieve what I normally achieve with gray hat links would be astronomical. Based on the test result and the amount I had to spend, it would be a wise strategy to incorporate white hat links for select keywords and leverage gray hat links for less competitive keywords.

……………………………………………………………………………………….

If you are interested in learning about advanced strategies and want to access custom tools, you can join http://www.RelevancyStacking.com.

For information about white label partnership, please visit http://www.contentblognetwork.com/. Please note that there is a long wait for this service.

For an honest discussion about marketing and SEO, you can join my non-spammy group: https://www.facebook.com/groups/SEOSignalsLab/

To access my latest SEO tests, you can register at http://www.SEOSignalsLab.com.

Does Google Stacking Work, or Is It Hype?

If you are a seasoned SEO, you know that Google stacking has been a buzzword in the SEO community for some time. What is Google stacking? It’s an SEO strategy that leverages Google properties in hopes of improving rankings by riding on their high authority status. To see whether the strategy can influence rankings, I decided to order gigs from the Konker SEO marketplace and try them on various keywords.

Since each gig costs $40 and I didn’t want to spend thousands on the test, I’ve decided to invest $400 on various keyword terms with different search volumes.

On June 5th, I checked the rankings and didn’t see much improvement. To ensure all the stack pages had been indexed by Google, I manually typed site:GOOGLE PROPERTY URL into Google and found all of them listed in the SERP. Once I realized that indexation wasn’t the cause of the rankings stall, I decided to turn to Money Robot for tier 2 links.

On June 6th, I applied Money Robot tier 2 links to each Google stack page.

On June 19th, I decided to pull the plug and check the rankings. Here are my findings.


Rankings Improved: 6

Rankings Worsened: 3

Same Rankings: 1

Verdict: So far, I’ve spent $460 on the test and can’t say I’ve seen conclusive evidence that this is a solid strategy with a positive ROI. To be fair, more time may be needed to test the long-term effects, but I’ve decided to move on to other tests.


Does Google Maps Embed in Blogs Help With Local Rankings?

Ever since Google reduced local listings from a 7-pack to a 3-pack, the competition has gotten stiffer for everyone. If you are doing local SEO for clients, you want every competitive edge you can get. One of the hottest trends in the local SEO space has been leveraging maps embeds via blog syndication to influence local rankings.


To test whether the trend has any merit, I decided to add a maps embed tool to my mastermind course website. Since I have over 1 million indexed pages and growing, I thought I would put them to use for a good cause. I selected a client site and entered a keyword that wasn’t ranking in the map pack but was sitting at position 9 on the first page of Google.


A month after adding the maps embed code into the tool, I found the website on top of the 3 pack for the test keyword.


If you are an SEO guy, you understand that seeing the needle move is like winning a jackpot in a poker tournament. To confirm my finding, I’ve decided to double check with the mastermind group members before sending the result to my client. Below are the responses I received.


Analysis and Conclusion

Based on the responses I received, the result gave me some insights.

1) Maps embed syndication does influence local rankings. If you are creating an external blog post, be sure to include your client’s maps embed code.

2) The rankings will not be consistent because Google’s data centers in different geographic locations see different numbers of embeds. It takes time for Google to recrawl the pages that have been updated with the embed code.

3) If you use rank tracking software that collects data through proxy IPs, you’ll rarely see consistent rankings data. It’s no wonder people are complaining about ranking inconsistencies in rank tracking services these days.


How Long-tail SEO Strategy Leveraging Relevancy Clustering Can Fuel Your Agency Growth

On April 22, my client sent me a list of keywords to rank in a medium-competition manufacturing niche. As soon as I received them, I grouped the keywords using the relevancy clustering method I’ve used for years and sent him a file containing the page structure information. Per my instructions, the client created pages and sent me 3 URLs a week later. Upon realizing that the URLs weren’t indexed by Google, I submitted them by typing ‘submit url to Google’ in Google. After taking a break for a couple of hours, I entered the keywords into a rank tracking tool. Lo and behold, the keywords were already ranking in the top 100 positions. I immediately sent a screenshot of the rankings to my client, who was excited about the instant traction the new pages had gained.


Imagine being able to rank a large number of keywords and keep them ranked while you work on off-page for stronger positions. While most SEO agencies struggle to keep 10 or so competitive keywords ranked and often get fired by clients within 6 months, I’ve been leveraging the long-tail strategy at scale for years, growing my agency to hundreds of clients with a retention rate of over 95%. To be fair, long-tail SEO is a really effective strategy, but I am not going to claim it is the solution to end all SEO issues. The truth is it works better in some industries than others. It is, however, an important weapon to have in your arsenal if you are running a digital marketing agency. Let’s look at some aspects of the long-tail SEO strategy and relevancy clustering.

The Basis for Relevancy Clustering

While most SEO strategies are based on a single-keyword-per-page silo structure, I started lumping keywords together for stronger relevancy when I learned that Google had purchased multiple semantic web technology companies. The theory behind relevancy stacking is that Google favors relevancy, and you can leverage this by placing closely related keywords in all SEO elements: URL, title, description, H tags, and content. Using this approach, you get a head start in rankings.

If you are an AdWords user, it’s no secret that Google spits out hundreds of related terms based on a seed keyword. You might ask, “How does Google know all these related terms?” The answer is that for every keyword, there are related terms in Google’s database, each with a relational numeric value assigned. By organizing related terms in clusters across all SEO elements, both on- and off-page, you can form a stronger relational presence and help Google connect the relevancy dots.

Matt Diggity recently shared his observation that larger sites do better in rankings than smaller sites, which reinforces the strategy of relevancy clustering. From an algorithmic perspective, larger sites have more opportunity to present themselves in a topically organized way than smaller sites with thin content. I’m also currently witnessing smaller sites losing ground in the rankings day by day.

An Example of Relevancy Clustering

For local markets, treat each keyword phrase with a geo-modifier in it as one phrase and come up with multiple variations to form a relevancy cluster. Here is an example of related keywords for ‘New York Lawyer.’

NY attorney, Lawyer in Manhattan, Legal counselor in NY, NYC lawyer, etc…
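One way to model a cluster in code is a mapping from a canonical phrase to its variants. This sketch is purely illustrative, not the author’s actual tooling:

```python
# Model a relevancy cluster: one canonical geo-modified phrase mapped to
# its related variants, which get distributed across the page's SEO
# elements (URL, title, description, H tags, body copy).
clusters = {
    "New York lawyer": [
        "NY attorney",
        "Lawyer in Manhattan",
        "Legal counselor in NY",
        "NYC lawyer",
    ],
}

for canonical, variants in clusters.items():
    print(canonical, "->", ", ".join(variants))
```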

One major benefit of creating a cluster is that it helps you avoid an over-optimization penalty. In years of using this technique, I haven’t had a single website penalized.

Strategy Comparison

Not all SEO strategies are created equal, and as an agency you must think like a strategist and decide which strategy best serves your client’s needs. Since running an agency is a business, you must weigh the likelihood of achieving and maintaining rankings while staying within budget; there is no such thing as an SEO project with an unlimited budget. So, what’s the ideal strategy? It really depends on your client’s needs.

For a lead gen market, mass page builders can work well, as they are designed to go after lots of super easy-to-rank keywords at scale. Platforms like Lead Gadget can launch sites with a large number of pages effortlessly. For highly competitive niches, you need a few competitive keywords to gain traction, and there is no shortcut. For local markets with multiple locations and lots of niche-specific keywords, a somewhat competitive, user-intent-based long-tail keyword strategy works well. Since pursuing all three strategies simultaneously is not practical from a budget standpoint, I’d personally go with long-tail plus relevancy clustering.

For many agency businesses, long-tail leveraging relevancy clustering is often a golden strategy since it’s not too difficult to maintain rankings and you can target large intent-based keywords with relative ease.

Ideal Client Candidates

From my experience, ideal clients are local businesses with lots of niche keyword variations serving multiple locations with populations of 100,000 or more. I have a client serving multiple counties and towns in the home care niche, and they have become the number one producer in their franchise group thanks to long-tail organic SEO. For this client, I identified over 100 niche-related keyword variations. Since they service 20 locations, the number of combined keyword phrases is 20 times 100, or 2,000. Once you’ve demonstrated that you can rank a good chunk of these keywords from the start, they’ll see the value and get hooked.

When done right, on-page alone will rank a large number of keywords as soon as the pages get indexed, thanks to relevancy clustering. Even if you were to rank only 10% of them in the first month for the example client above, that’s 200 ranked keywords you can present. With proper backlinking, the number of ranked keywords grows continuously, and you can always show results. In most cases, traffic grows as long as the market size supports demand for user-intent keywords.

The Logistical Challenges of Relevancy Clustering

Although it’s true that each intent-based keyword is easier to rank than a short-tail niche keyword, the list can easily swell to hundreds, or even thousands, of keyword variations. Easy rankability becomes a logistical challenge, because you don’t want to optimize hundreds of pages manually for every client that comes along. To alleviate this, I’ve developed several techniques and tools to handle the process: one is relevancy layering and the other is the master content strategy. Both are designed to quickly launch pages optimized for relevancy clustering while helping with rapid content deployment.

The relevancy layering tool quickly creates pages by forming relevancy clusters from a large set of related keyword phrases; I currently use it to launch hundreds of pages and help rank thousands of keywords for my clients. The master content strategy systematizes content creation by leveraging writers who are good at producing relevant articles. For off-page, I’ve built a sizeable network that supplies contextual backlinks as well as image syndication and maps embeds. As a result, I can keep a large percentage of keywords ranked, which assures clients that the approach they are paying for is working.

Recommended Strategy Going Forward

Since long-tail SEO strategy is only a means to acquire highly targeted traffic, it is recommended that you consider maximizing all traffic potentials such as paid, social, retargeting, and other digital marketing channels. Whatever SEO strategy you opt for, it’s only a method of generating traffic. After all, digital marketing is about creating a well-oiled marketing and sales machine for a business. It’s not about preference or ego, it’s about what delivers an ROI.


Can Trusting TF (Trust Flow) Metric Hurt Your SEO?

One of the biggest problems with SEO these days is that the entire community is fixated on third-party metrics for evaluating a website’s trustworthiness. Unfortunately, this can hurt your rankings if you don’t have the right SEO process in place and can’t read between the lines. In other words, your SEO could suffer greatly if you can’t decipher the right signals. After all, what Google thinks of your site matters most, not third-party metrics or crawlers.

Recently, I had an opportunity to audit a website that had been doing well for months but lost rankings during Google’s Fred update. Upon closer inspection of the backlinks, I ran into a site that stuck out like a sore thumb. According to Majestic, its TF (Trust Flow) value was 20. Most SEOs would tell you that’s a decent metric; in fact, you’ll see plenty of link peddlers selling link packages and touting TF as one of the safest metrics out there. But when I examined other elements, I saw red flags.


One of the metrics I use for evaluating a site is its index count, as it comes directly from Google’s database. Majestic showed an index count of 68, while Google showed 37. For whatever reason, Google has de-indexed 31 pages.


When I visited the homepage, everything became clearer: the homepage relevancy was out of whack. Let’s look at why this can cause an issue. If you are using a homepage to send backlinks, its page relevancy matters. Since the domain name is holy-redeemer-mcc.org, the homepage content should be relevant to religion in order to maximize its relevancy signal.

Let’s scroll down and see if any content piece or outbound links are related to the domain name.


As you can see, all the articles are totally irrelevant to the domain. Relevancy dilution is real, and it is a serious disease, but one that is easy to avoid.

Conclusion

Does this mean you should ignore TF entirely? TF can be useful as long as you understand how it was derived. Third-party crawlers try to crawl everything and have a long memory; unfortunately, that memory doesn’t necessarily coincide with Google’s. Perhaps this was once a good ol’ trusted site, but adding irrelevant content can devalue its SEO standing with Google. If you are buying links from link vendors, beware of what they are doing. Page irrelevancy can hurt you in the long run.

————-

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/. We have great discussions going and I always add my insights gained from 20 years of SEO experience and 30 years of business and marketing experience.


Here is a Proof That Outbound Link Makes Google Nervous


One of the benefits of owning a network of digital properties is that you can carefully monitor and perform SEO tests using various methods and parameters. Coming from a programming background, I can set up my own metrics and test them for numerous cause-and-effect relationships. With over 1,200 sites in my network and growing, I can gain insights no other SEO can provide.

For years, I’ve been leveraging expired domains as part of my overall SEO strategy. In most cases, expired domains carry some SEO value if you have the right vetting process. But here is one major problem: even with the best third-party metric data in hand, there is no guarantee Google will like the domain. With millions of expired domains coming onto the scene, Google has reason to be nervous, as they can affect the overall SEO landscape in unwanted ways.

Since links play a crucial role in SEO, I decided to test a hypothesis: Google is cautious with outbound links added to an expired domain. To test this, I programmed revived expired domains to wait until the number of indexed pages reached at least 10 before outbound links were added; when the count dropped below ten, the links were removed. I figured it would be safer to send links from a domain with a certain minimum index count.

I minimized external SEO factors by making sure the domains had no social shares or other inbound links. If all other variables remain the same, Google should allow more pages to get indexed when outbound links are absent, since pages without links pose no threat. Conversely, Google should start reducing the index count once it detects outbound links.
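The index-count toggle described above can be sketched as follows. The function name and structure are illustrative, and how the index count is obtained (e.g., from a site: query) is left abstract:

```python
# Toggle outbound links on a revived expired domain based on its index count.
MIN_INDEXED_PAGES = 10

def should_show_outbound_links(index_count: int) -> bool:
    """Show outbound links only while at least ten pages stay indexed."""
    return index_count >= MIN_INDEXED_PAGES

# Simulated index counts observed over time for one domain:
for count in [12, 11, 9, 14]:
    state = "show" if should_show_outbound_links(count) else "hide"
    print(count, state)
```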

Here is the result of Google reacting to outbound links.


As you can see, the index count changed over time based on the site’s outbound link status. The result has significant implications; below are some conclusions based on the observed pattern.

1) Generally speaking, Google is cautious about outbound links coming from a new site or a revived expired domain.

2) Many people are adding links without considering Google’s reaction.

3) Link exchange is not necessarily a good idea.

4) You can’t simply rely on third-party metric data. The domain used in the example above had a DA of 21, which didn’t change during the entire test.

Steven Kang

…………………………………………………………………………………………………………

If you are interested in getting the latest updates on SEO experiments, please fill out the form at http://www.SEOSignalsLab.com.

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/.

If you want to grow your SEO business by implementing my cutting edge SEO framework, you can visit http://www.relevancystacking.com/.

How to Safely Dodge Google’s Algorithm Updates Using TRAP Framework

post-template-trap

For years, the ultimate challenge in SEO has been how to deal with Google updates. Many unnecessary wars broke out between black hatters and white hatters over this subject. Quite often, white hatters accused black hatters of causing Google updates, and black hatters accused the other side of spreading false rumors for profit. What’s my verdict? It’s neither. Google’s algorithm looks at signals, not at whether your links resulted from outreach efforts or were bought.

Don’t get me wrong. If you are doing content marketing, you should reach out to sites with readership. Getting a link back to your converting page is a good strategy for marketing purposes. But from a pure algorithmic standpoint, you may be satisfying unintended objectives, whether you realize it or not.

After many sleepless nights, I came up with a unique framework by stepping away from how everyone else looks at the SEO process. The conceptual framework is the result of almost 20 years of SEO experience, observation, and testing.

SEO blogs and marketers often mention 200 factors at play in rankings. The number was derived from a Google patent filed in 2008. If we were to account for 50 variations of each factor, it could easily swell to 10,000. From a logistics perspective, it doesn’t make sense for an SEO marketer to go through a 10,000-line checklist every time they launch an SEO campaign for a website.

One of the major challenges I faced was how to deal with an ever-growing number of ranking factors and algorithm updates. I realized that if I could describe the entire ranking process with a simpler model, I would not only have a way to scale, but also a better way to create a preventive measure against Google updates. Just like the concept of Yin and Yang, which describes the forces of nature in a simplified way, I was on a quest to discover one for SEO.

Several years ago, I came up with the concept of 4 major signals. By condensing all SEO processes into four, I was able to track and describe how Google behaved every time it rolled out an update. The 4 major SEO signals are Technical, Relevancy, Authority, and Popularity, which I now call the TRAP framework.

Let’s look at each signal and roughly define what they are.

Relevancy – For every keyword, there are related keywords in Google’s database. Two keywords can be related in various ways: semantically, geographically, by category, or by brand.

Authority – Authority is a trust signal. Google looks at how often other trusted sites reference your site.

Popularity – Popularity has changed over the years as the web has evolved, and Google has adjusted its algorithm to detect the signal in different places. Popularity signals can be divided into link popularity, social popularity, and search popularity.

Technical – Site security (such as HTTPS) and speed fall under this signal. Although some of these signals do not play a major role in rankings now, they are expected to play a bigger role in the future.

Now that we have roughly defined 4 major signals, let’s look at why understanding them has major benefits.

1) You now have a conceptual model to describe Google’s behavior and understand what Google is looking for after each algorithm update.

2) You can easily catalog whatever SEO activity or link building scheme some SEO guru comes up with.

3) It gives you a better way to identify deficiencies in your SEO activities that you may not have noticed before.

4) You can plan a preventative measure and guard against Google updates.

4-signals
Dissecting TRAP Framework

In order to tame Google’s algorithm, which reacts to the four major signals, we first need to trap and dissect it. It’s a similar concept to colliding particles in the Large Hadron Collider in order to understand the inner workings of subatomic matter. The more we can dissect, the more tools we’ll have at our disposal to create a preventative measure against Google’s updates. TRAP is the acronym for the four major signals: Technical, Relevancy, Authority, and Popularity. From now on, I’ll refer to the four major signals as TRAP, as it covers all the components we’ve discussed.

Characteristics of TRAP

Each component of TRAP has a specific contributing role for the set.

Technical – It plays a qualifying role in TRAP. To even be considered for a visit from Google’s crawler, your site needs to load within a reasonable time. Remember, ranking starts with indexation, whether it’s on-page or off-page.

Relevancy – It plays a relational role in TRAP. Google values content that caters to users’ search intent and its relational value.

Authority – It plays an amplifying role in TRAP. The authority signal amplifies a site’s overall standing with Google, often called site authority, which translates into faster and higher rankings.

Popularity – It plays a validating role in TRAP. Once pages and links are built, Google will attempt to validate a site’s off-page activity by looking at popularity metrics, which are directly associated with human activity levels.
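One practical consequence of the framework is that any SEO activity can be cataloged under one of the four signals. As an illustration only (the activity-to-signal mapping below is my own guess at examples, not the author's canonical list), that catalog can be as simple as a lookup table:

```python
# Hypothetical mapping of common SEO activities to the TRAP signal they feed.
TRAP_CATALOG = {
    "Technical": ["https migration", "page speed optimization", "crawlability fixes"],
    "Relevancy": ["semantic keyword mapping", "search-intent content"],
    "Authority": ["links from trusted referencing sites"],
    "Popularity": ["link popularity", "social popularity", "search popularity"],
}


def classify(activity):
    """Return the TRAP signal an activity contributes to, or None if unknown."""
    for signal, activities in TRAP_CATALOG.items():
        if activity in activities:
            return signal
    return None
```

Any new tactic a guru proposes either lands in one of the four buckets or, if it feeds no signal at all, is probably not worth the effort.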

Using TRAP Framework to Shield Against Google Updates

One of the major benefits of using TRAP is that we now have a conceptual model to describe everything related to SERP dynamics. We no longer have to describe cause-and-effect relationships with complex diagrams or a long checklist. From a strategist’s perspective, it allows an efficient way to interpret ranking patterns and create a preventative measure against future algorithm updates. The term ‘holistic SEO’ no longer needs to be associated with fluff words like ‘synergy’ and ‘natural.’ Holistic SEO means using TRAP to fulfill SEO requirements.

One of the major issues with Google’s algorithm updates is that no one can predict what the changes will look like, as the algorithm is manufactured and maintained by a corporation. Its motive is simple: Google wants to maximize profit for its shareholders and keep marketers from figuring out its algorithm with a high degree of accuracy. This, however, doesn’t mean we can’t prepare ourselves better than any other SEO marketer on the planet.

By recognizing that a healthy SEO campaign needs to contain all the elements of TRAP, we now have a way to monitor and look for deficiencies in any SEO campaign. Here is an analogy. Think of owning a car with a set of 4 wheels. You can technically drive a car with fewer than all four wheels intact; you can even make the car move with only one wheel. This, however, can have bad consequences, as the car can come to a halt, which is the equivalent of losing all rankings and possibly even being deindexed. But if you maintain the full set of four wheels, you can ride out any bumps headed your way with relative ease.

One of the best ways to survive a Google update is to make sure TRAP is in good standing. Here are the steps I’ve used for years to shield my clients from major Google updates.

1) Create a checklist of TRAP components and do your best to keep up with activities that satisfy each component within a defined period. Not every page needs to satisfy all TRAP components, but the site as a whole needs to satisfy the TRAP checklist.

2) Monitor ranking progress and use the TRAP checklist to augment or dial back signals.

3) Whenever there is news about a major update, wait for it to finish rolling out, then identify how Google’s TRAP requirements have shifted. A Google algorithm update usually results in a shift in the technical, relevancy, authority, or popularity signal.

4) Recognize the shift in Google’s algorithm and make up for deficiencies using the TRAP checklist.
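Step 1 above boils down to a periodic audit: for each TRAP signal, has the site seen a satisfying activity within the defined period? A minimal sketch of that audit, assuming you log the date of each signal's most recent activity (the logging mechanism is my own assumption, not described in the article):

```python
from datetime import date, timedelta

SIGNALS = ("Technical", "Relevancy", "Authority", "Popularity")


def trap_deficiencies(last_activity, today, period_days=30):
    """Return the TRAP signals with no recorded activity within the period.

    `last_activity` maps each signal name to the date of its most recent
    satisfying activity (missing or None means no activity at all).
    """
    cutoff = today - timedelta(days=period_days)
    return [
        signal
        for signal in SIGNALS
        if last_activity.get(signal) is None or last_activity[signal] < cutoff
    ]
```

Running the audit on a schedule surfaces the neglected signals, which is exactly where steps 2 and 4 tell you to focus after an update.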

trap-trend
After many years of implementing the TRAP framework in my SEO business, here are some results.

1) One of my clients is part of a franchise group with more than 2,000 franchisees. After years of applying the TRAP framework, the client became the number one producer in the organization. His secret weapon? Consistent organic traffic from SEO. All the franchisees in the group look up to him, and he brags about how well my SEO generates traffic. As a result, I have a continuous supply of new clients.

2) My marketing became easier as I was able to develop methodologies not currently available in the marketplace. I became a strategic partner to numerous niche agencies.

3) I was able to drastically increase SEO fulfillment capacity and efficiency while lowering expenses by leveraging the TRAP framework. I no longer waste countless hours on unnecessary activities, since I can classify each SEO activity under the TRAP framework.

If you are interested in receiving in-depth training, you can visit http://www.relevancystacking.com/. On the homepage, you can scroll down and read the course objectives and member reviews.

If you are interested in becoming a member of my non-spammy discussion group, go to https://www.facebook.com/groups/SEOSignalsLab/. We have many great discussions going and I always add my insights gained from 20 years of SEO experience and 30 years of business and marketing experience.

To get the latest updates on SEO experiments, please fill out the form at http://www.SEOSignalsLab.com. I promise I will not bombard your email with affiliate offers or sell your email to a North Korean spam camp.