GSA SER Tutorial: How to Rank a Site

I love to experiment. I love to find things out. Reading stuff online is one thing and believing it is another, but experimenting yourself is a different level entirely.

A while ago, I finally decided to do some experiments with automated tools. I wanted to see if I could rank a site using automated link building tools, even though that is totally against Google's guidelines. But what is wrong with trying when you are not hurting someone else's website?

This should bring some insight into the never-ending debate over whether black hat methods work. Even if this experiment succeeds, it won't prove that black hat methods work in general, since it is just a tiny drop of water in the ocean that is the SEO world.

So what domain will I use?

A fresh new domain or an expired domain with some backlinks pointing to it? Neither of them!

I am a big fan of expired web 2.0 blogs, and for this case study I will be using an expired Tumblr account with some backlinks. The Tumblr blog is not indexed in Google, and I don't really know when it expired. But with some content on it and a few links pointing to it, indexing should happen soon.

September 13, 2017

With the help of Dipesh, I selected an expired Tumblr blog. A quick backlink check with ahrefs.com showed no backlinks at all, but Majestic was already showing some backlinks pointing at it.

September 14, 2017

I wrote a quick piece of content, around 800 words, and published it on the blog. Instead of a regular blog post, I used the content as a page.

Dipesh helped me customize the appearance of the blog and added a few more posts. He even scheduled some posts so that they would be published automatically at the planned times.

i.e. username.tumblr.com/my_page_with_content_to_rank (with an exact match keyword in the URL).
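For illustration, here is a tiny sketch of how a slug like that could be built from a keyword. The keyword and helper name below are hypothetical, not the actual ones from this case study:

```python
import re

def exact_match_slug(keyword: str) -> str:
    """Turn a keyword phrase into a URL-safe slug for a page path."""
    slug = keyword.lower().strip()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)   # drop punctuation
    slug = re.sub(r"[\s-]+", "_", slug)        # collapse spaces/hyphens
    return slug

# Hypothetical keyword, not the one used in this experiment:
print("username.tumblr.com/" + exact_match_slug("best hiking boots"))
# -> username.tumblr.com/best_hiking_boots
```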

September 16, 2017

I was coming back to GSA SER after almost 1.5 years, so I had difficulties using it. I spent some time going through the tutorial videos.

September 17, 2017

Now it was time to start building backlinks to the URL. If you ask me why I hadn't built any backlinks so far, I don't have an answer :P. There was no particular reason.

I loaded up GSA with content generated from Kontent Machine and selected SEREngines only, because I wanted to see on how many unique web 2.0 sites I could get the content submitted and verified. So, during the early hours of the morning, I started my first campaign in GSA SER.

Campaign Setup 1:

I started this campaign selecting only the SEREngines web 2.0 sites, using:

  • SEREngines
  • GSA Captcha breaker
  • 10 private proxies
  • 3 custom domain emails

Result: Only 5 unique web 2.0 accounts verified.
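Side note: GSA tests proxies itself, but I like to confirm the private proxies are alive before a run. Here is a minimal sketch of such a check; the proxy addresses and test URL are placeholders, not my actual setup:

```python
import requests

# Placeholder proxies -- swap in your own (http://user:pass@ip:port also works).
PROXIES = [
    "203.0.113.10:8080",
    "203.0.113.11:8080",
]

def proxy_alive(proxy: str, timeout: int = 10) -> bool:
    """Return True if the proxy can fetch a simple test page."""
    proxy_url = f"http://{proxy}"
    try:
        resp = requests.get(
            "https://httpbin.org/ip",
            proxies={"http": proxy_url, "https": proxy_url},
            timeout=timeout,
        )
        return resp.ok
    except requests.RequestException:
        return False

for p in PROXIES:
    print(p, "OK" if proxy_alive(p) else "DEAD")
```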

Campaign Setup 2:

I modified the same campaign to use:

  • SEREngines
  • 2Captcha (primary captcha service)
  • GSA Captcha breaker (secondary captcha service)
  • 5 custom domain emails

Result: 14 unique web 2.0 accounts verified.

Still, I was missing something. Out of the 36 web 2.0 sites, only 14 got verified; that is not even 50% (about 39%, in fact).

Then I started hustling to find out what I was missing, and within 2 hours I had figured it out.

At about 10 pm, I found out that the main URL of the Tumblr blog was already indexed, but not the URL that I was trying to rank.

username.tumblr.com  was indexed.
username.tumblr.com/my_page_with_content_to_rank  was not indexed yet.

Read more: What is a naked URL?

The main URL got indexed because it was an old URL that had existed in the past, while the second URL was completely new. I had been a regular user of instantlinkindexer (an indexing tool that no longer works as of 2020) for a few years, until I stopped using automated tools. It was time to bring it back into use, so I submitted the URL to the indexing tool.
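If you would rather script the indexing check than eyeball it, a rough sketch using a site: query is below. Treat it as an occasional spot-check only: Google rate-limits and may captcha or block automated queries, so a negative result means "unknown", not "deindexed". The URL here is the placeholder from above:

```python
import urllib.parse
import requests

def looks_indexed(url: str) -> bool:
    """Crude heuristic: run a site: query and look for Google's
    no-results message. Google may block automated queries, so
    False here means 'unknown', not proof the page is deindexed."""
    query = urllib.parse.quote(f"site:{url}")
    resp = requests.get(
        f"https://www.google.com/search?q={query}",
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
    return "did not match any documents" not in resp.text

print(looks_indexed("username.tumblr.com/my_page_with_content_to_rank"))
```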

I loaded my keywords into serplab.co.uk, an awesome free rank tracker. The paid version has many more options to play with.

September 20, 2017

The URL was indexed already. To be honest, I can't tell whether it was the backlinks or the indexing tool that was responsible. For a while, I thought I shouldn't have built any links until the URL was indexed, because now I couldn't figure out the actual cause.

Now it was time: I needed to test SEREngines to its limit. Before that, I had to figure out how to get the maximum success ratio with it. The next run used:

  • 6 emails (2 mail.ru + 2 inbox.eu + 2 custom domain email)
  • 2captcha
  • 10 private proxies

This time I got 24 unique verified web 2.0 accounts. That was what I had been trying to figure out.
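The arithmetic behind that comparison is simple enough to script. The verified counts below are the actual results of the three runs so far, each out of the 36 SEREngines platforms:

```python
# Verified web 2.0 accounts per run, out of 36 SEREngines platforms.
PLATFORMS = 36
RUNS = {
    "Run 1: GSA CB only, 3 emails":    5,
    "Run 2: 2Captcha + CB, 5 emails":  14,
    "Run 3: 2Captcha, 6 mixed emails": 24,
}

for setup, verified in RUNS.items():
    rate = verified / PLATFORMS * 100
    print(f"{setup:34s} {verified:2d}/{PLATFORMS} ({rate:.0f}%)")
```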

Nothing on the serplab.co.uk interface!

September 21, 2017

I set up a different campaign with the following:

  • 15 email accounts: 5 mail.ru + 5 inbox.eu + 5 custom domain emails
  • 10 private proxies
  • 2Captcha
  • GSA captcha breaker

Impatient for the campaign results, I decided to go hard and fast: build as many links as I could and see the success rate with SEREngines.

So with 1 email account I was trying to register on 36 different sites, and with a total of 15 email accounts that would be 15 * 36 = 540 accounts (if the success rate was 100%). Of course, a 100% success rate is practically impossible, for many reasons.

BOOM! Within 2 hours, I had 687 verified URLs. Of course, I did my calculation: 687 is more than the 540-account maximum, so I knew I had screwed up the setup somewhere.

I had set GSA to make multiple posts (2-4) per web 2.0 account. When I stopped the campaign, I realized that every one of those posts had a link pointing to the Tumblr URL, which looks very unnatural. I should have changed that setting so that only one link from each web 2.0 site pointed to the Tumblr blog.
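To catch that kind of footprint earlier next time, the verified list can be audited in a few lines: group the verified URLs by domain and flag any web 2.0 property carrying more than one post linking to the money page. A sketch under the assumption that verified.txt holds one URL per line (GSA's real export format may differ):

```python
from collections import Counter
from urllib.parse import urlparse

# Assumption: verified.txt holds one verified URL per line.
with open("verified.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

posts_per_domain = Counter(urlparse(u).netloc for u in urls)

for domain, count in posts_per_domain.most_common():
    if count > 1:
        print(f"{domain}: {count} posts linking to the target (footprint risk)")
```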

Since I had stopped the campaign, no more link building was taking place in GSA. I drip-fed the verified links to the indexing service.
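I don't recall instantlinkindexer's exact API, so take this as a generic drip-feed sketch rather than that tool's real interface: read the verified links, submit a small batch, wait, repeat. The endpoint, key parameter, and timings are all hypothetical:

```python
import time
import requests

API_ENDPOINT = "https://indexer.example.com/api/submit"  # hypothetical endpoint
API_KEY = "YOUR_KEY_HERE"                                # hypothetical key
BATCH_SIZE = 25          # links per batch
DELAY_SECONDS = 3600     # one batch per hour

with open("verified.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for i in range(0, len(urls), BATCH_SIZE):
    batch = urls[i:i + BATCH_SIZE]
    requests.post(
        API_ENDPOINT,
        data={"key": API_KEY, "urls": "\n".join(batch)},
        timeout=30,
    )
    time.sleep(DELAY_SECONDS)
```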

September 22, 2017:

When I logged into my serplab.co.uk account, I noticed this. Wow! Just Wow!

Note: I noticed that the tracker was set to google.com.np, so all the keyword positions shown are for .com.np.

September 23, 2017:

I set up another campaign and started running it immediately. The 15 email accounts I had used earlier could be reused, so I tested whether they still worked, and yes, they did. This time I selected only the Article directory and Wiki platforms in GSA.

The purpose of this campaign was to maintain anchor text diversity, because the earlier campaign had used almost identical anchor texts across the different posts within a single web 2.0 site.

My anchor texts this time were just generic phrases and naked URLs. I let the campaign run for as long as it could find sites to post to.
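For the curious, that kind of anchor mix boils down to a weighted random pick between a handful of generic phrases and the naked URL, with no exact-match anchors at all. The weights below are illustrative, not something I tested:

```python
import random

TARGET = "https://username.tumblr.com/my_page_with_content_to_rank"

# Illustrative weights: generic anchors plus the naked URL,
# and deliberately no exact-match anchors in this campaign.
ANCHORS = [
    ("click here", 0.25),
    ("read more",  0.25),
    ("this site",  0.20),
    (TARGET,       0.30),  # naked URL as its own anchor
]

def pick_anchor() -> str:
    texts, weights = zip(*ANCHORS)
    return random.choices(texts, weights=weights, k=1)[0]

print([pick_anchor() for _ in range(5)])
```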

September 24, 2017:

Things started getting interesting. But I still couldn't believe I would rank this easily, with GSA just throwing some backlinks at the page.

This could just be the calm before the storm, or I could actually be ranking; who knows? We have to watch the results until uncle Google decides what happens to the Tumblr blog.

September 25, 2017

My second campaign in GSA is still running, and a total of 227 URLs have been verified. But the "no target to post" message has already appeared, which means GSA is running out of sites to post to.

Things are getting scary here, or is it just the Google dance?

Campaign Stopped

I stopped the campaign on the 27th and left the Tumblr blog alone for good, though I kept checking the keyword positions every day. The rankings started dropping further and further, but still stayed within page 10.

Then something hit me. I started a second campaign on serplab.co.uk, and this time I added only 3 of my main keywords and set the search engine to google.com.

The next day I noticed that the positions of 2 of the keywords were really impressive! I wish I could have updated the blog every 7 days or so, but I had very little time for that.

Another web 2.0 campaign (SEREngines) was started on Oct 12, and it is still running, since I decided to take it slow this time.

October 16, 2017:

The campaign started on Oct 12 is still running.

Below is the screenshot of the keywords for google.com.np

Google.com is still doing great!

October 27, 2017:

Change of Plan: I decided to track only 3 of my main keywords, both for convenience and because it makes it much easier to compare those keywords between the two versions of Google search.

There are some positive changes in both versions of Google search. It seems like the 2nd campaign is working!

Google.com

[Screenshot: keyword positions update, 27 Oct 2017]


The 2nd and the 3rd keywords have jumped back to their best positions!

Google.com.np

[Screenshot: keyword positions update, 27 Oct 2017]


The 1st and the 3rd keywords have now jumped back to their best positions of all time.

Today, I created a 2nd tier for the last campaign, which includes all the platforms that support contextual links. In addition, I am building a big list of auto-approve (AA) blog comments using Scrapebox. I haven't yet decided what I am going to do with the AA list, but I am pretty sure I will use it for this case study. Do check out my tutorial on how to use Scrapebox for SEO, which I wrote a few months back.
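Since Scrapebox harvests are just plain text files of URLs, trimming an AA list down to one URL per domain before using it takes only a few lines (the file names here are placeholders):

```python
from urllib.parse import urlparse

# Assumption: aa_list.txt is a Scrapebox export, one URL per line.
seen_domains = set()
kept = []

with open("aa_list.txt") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        domain = urlparse(url).netloc
        if domain not in seen_domains:
            seen_domains.add(domain)
            kept.append(url)

with open("aa_list_deduped.txt", "w") as out:
    out.write("\n".join(kept))
```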

I am not encouraging anyone to use automated tools for SEO. This is my own personal preference, since I love to experiment with SEO rather than just run with whatever articles I read.

The best way to rank a website is still to follow the Google guidelines and do things the proper way.
