Saturday, 1 April 2017

40+ Incredible SEO strategies to improve Google search rankings

There are almost 200 search engine optimization factors that determine whether your blog post ranks high on Google or gets lost among the millions of other web pages, never reaching the first page of results. Writing your content with passion and having nobody read it is sad and painful, but applying the following search-ranking strategies can quickly push your blog toward the top of Google's first page.

As I said, there are about 200 ranking factors, but you only need to focus on the strategies below (because you can't focus on them all); they will take your blog to the next level and bring it traffic and recognition on the internet. Google's search algorithm has evolved over nearly 20 years and has become far more accurate and user-friendly. Organic search traffic is the only way your blog will build lasting income and authority on Google; depending on paid services only brings visitors for a limited period of time.

SEO is the most important factor deciding whether your blog is noticed at the top of the search results page or gets lost among the others.

SEO is divided into two parts:

1. On-page SEO
2. Off-page SEO

Search engine optimization (the inside story)

Besides these, a lot depends on user activity and experience when people visit your website, and on the number of shares your posts get on social media giants like Facebook, Google Plus and Twitter. Search engine optimization techniques emerged on the web around 1997, when some webmasters started using black-hat strategies (keyword stuffing, fake visitors, and fake backlinks and inbound links from low-quality websites) to reach high page rankings on Google.

Webmasters used every technique to rank quickly, stuffing their pages with keywords.

In 1998, Larry Page and his associates brought PageRank to the web so that only authoritative, relevant content would rank and visitors could find content suited to what they were searching for. PageRank is a score that a website builds slowly by providing high-quality, accurate content and earning backlinks from other authority sites. PageRank is displayed on a scale of 0 to 10; the higher the rank of a page, the more likely the website is to rank well on Google.

After PageRank was incorporated into the web, webmasters who wanted to trick Google search started to buy, sell and distribute backlinks for money.

In 2005, a new technique to fight these webmasters came into effect when the "nofollow" attribute was introduced to web development. Matt Cutts, head of Google's webspam team, said that links marked with the "nofollow" attribute would no longer be treated as backlinks and would be meant for informational purposes only, while ordinary links (so-called "dofollow" links) would continue to pass authority to other websites.

In 2011, search became more user-friendly and trustworthy when the Panda update came into action and started penalizing sites that contained content copied from other websites; that is the reason many of the top-traffic websites got erased from Google. Thus, providing unique content became another ranking factor.

Again in 2012, with the Penguin update, search became more advanced: short, low-quality content got penalized and many websites disappeared, while people providing high-quality content got the advantage and the chance to rank higher for their keywords. Thus, lengthy, informative content of 2,500+ words became a factor.

In 2013, the Hummingbird update came into action, helping publishers who used planned and targeted content on their websites.

Websites with a stronger content strategy, providing detailed information backed not just by facts but also by figures, proof and examples, benefited most. Thus, long-tail keywords of 4+ words that describe a topic in more depth gained an advantage from the Google algorithm.

Blogging is not for earning; it's about helping others with the knowledge that you have. - Syed Faizan Ali

My blog's 3rd post:

Finally I am posting my third lengthy, high-quality post, this one on improving Google search rankings; I have done a lot of research writing it, so maybe you will like it. My 1st and 2nd posts were:

1. 10+ Accurate content strategies to increase blog traffic fast (in 6 months)
2. 7 Strategies to kick-ass followers, upvotes and increased traffic views on Quora

The first blog post got me about 200 visitors and the 2nd one got 116, coming from search engines, communities, forums, and social media, including Twitter, where I have about 3,600+ followers, and Google Plus. Keep in mind this is a new blog, started barely a week ago, so traffic will be small; 200 visitors so far is great, as many new blogs do not get even 20 views in a month if not written properly. At first, traffic will look like this, and it only increases after 3-6 months, once one of your blog posts ranks on Google for a specific keyword. If your posts are high quality, with unique content and an average length of 2,500+ words, the results may come faster. Let's see what happens; I am testing whether this blog will change my tomorrow or destroy it. I am simply following in the footsteps of a webmaster who received millions of visitors within a year using these strategies.

This post describes all the important SEO factors that must be applied to a professional website so that it can improve its appearance in Google search results and bring lots of targeted traffic.

Step-by-step strategies to improve Google search results, authority and PageRank

Here I will provide a complete list of on-site and off-site factors, along with a guide on how to fix each one. If you check and apply all these strategies, your website will be fully search-friendly and ready to appear in the results.

Improve rankings #1: Define your meta title and description including targeted keywords

The meta title and meta description are extremely important: they tell Google which keyword you want to rank for and what you have made this website about. The meta title is limited to about 70 characters, and the meta description to about 160 characters describing what your website is about and which keywords it focuses on in Google search. The meta title is also displayed in the browser's title bar when somebody opens your website, and it is shown when somebody bookmarks your site.
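As a sketch (the title and description text here are made up for illustration), both tags live inside the page's `<head>`:

```html
<head>
  <!-- Illustrative title; keep it under ~70 characters -->
  <title>40+ SEO Strategies to Improve Google Search Rankings</title>
  <!-- Illustrative description; keep it under ~160 characters -->
  <meta name="description"
        content="A step-by-step list of on-page and off-page SEO factors, with fixes, to improve your blog's Google rankings.">
</head>
```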

#2: Maintain Keyword usage cloud and avoid keyword stuffing


Crawlers and spider bots index your website and identify the type of information you really want to provide to people. Many websites do not use their keywords in the domain name, only in the title and description, which shows clearly that it is not necessary to have keywords in every part of a web page. Bots crawl each page and measure how many times, and at what density, a keyword has been used. But there is also a spam limit called keyword stuffing, which webmasters once exploited to rank pages higher: using a keyword, say, 100+ times on a page can appear as spam and signal to Google that you have stuffed your article just to rank higher.
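If you want to sanity-check your own keyword density before publishing, a tiny script like the following can help. This is a rough sketch, counting only whole-word matches, not the exact formula any search engine uses:

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Return how often `keyword` appears as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return round(100.0 * count / len(words), 2)

# "seo" appears 2 times out of 8 words
print(keyword_density("seo tips and more seo tricks for blogs", "seo"))  # 25.0
```

A density of a few percent is normal prose; a double-digit figure is a hint you may be stuffing.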

#3: Accurate use of heading tags like h1 and h2 is of utmost importance

Header tags in HTML should be used efficiently and properly. Only one h1 tag should be used per page, with several h2 subheadings below it. The h1 tag defines the purpose of your post and should be similar to its title, while the h2 tags divide the content so the reader has less difficulty understanding the whole post. Header tags help the spider bots identify what the post is about and rank it accordingly. You can also apply different colors and styles to your subheadings so they look nice to the user.
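For example, a post's heading skeleton might look like this (section names are illustrative):

```html
<h1>40+ Incredible SEO Strategies</h1>  <!-- one h1, matching the post title -->

<h2>Define your meta title and description</h2>
<p>Section content...</p>

<h2>Avoid keyword stuffing</h2>
<p>Section content...</p>
```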

#4: Have a sitemap according to your website contents and size

A sitemap is very important: it lets search algorithms and bots identify the exact details of the content uploaded to your web server.

Sitemaps come in different types for different kinds of content, like text, images, videos and documents. The sitemap should be created and uploaded to a directory on your web server. A sitemap file should be under 10,000 kilobytes and can be compressed using gzip to save bandwidth. A sitemap does several jobs, like submitting the list of blog posts you have published and providing more accurate information about your images, text and videos.
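A minimal sitemap in the standard XML format, with a made-up post URL, looks roughly like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/2017/04/seo-strategies.html</loc>
    <lastmod>2017-04-01</lastmod>
  </url>
</urlset>
```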

#5: Use custom SEO-friendly URLs instead of automatic underscore URLs

Using underscores in your URLs makes your posts less SEO-friendly and harder for crawlers to read. Use hyphens (-) instead: Google does not reliably treat underscores as word separators, but hyphens act as spaces between the words of a title. Many platforms generate default addresses, for example something like example.com/seo_strategies_2017 or example.com/?p=123.

Addresses like those are not SEO-friendly and affect your search rankings. You just need to change the address to a specific, readable URL, for example example.com/seo-strategies-to-improve-rankings.

Note: You should only change a URL when the page is not ranking well; if the page already ranks well in the search engines, there is no need to change it.

#6: Correct or redirect broken links


Broken links can badly affect the overall rank of a web page on Google. They are not user-friendly and hurt an otherwise SEO-friendly site. To soften the damage, enable a custom 404 page that automatically guides lost visitors back to your content. You can check your website's broken links with a free broken-link checker or the W3C Link Checker; these tools find both internal and external broken links.

#7: Must use a robots.txt file within your website directory

Proper use of the robots.txt file is required here, as wrong use of this file can prevent search engines from crawling and indexing your website.

A robots.txt file tells crawlers not to crawl some portions of the website, like the disclaimer, contact-us and privacy-policy pages, since indexing these may create duplicate-content issues on your blog. Having a robots.txt file is considered a must nowadays as an SEO factor. Spider bots visiting a site first go through this file and then crawl the allowed pages.
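As a sketch, a blog's robots.txt might look like this (the paths are examples; Blogger blogs commonly disallow /search to avoid duplicate-content issues):

```text
User-agent: *
Disallow: /search
Disallow: /p/contact-us.html
Sitemap: https://example.com/sitemap.xml
```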

#8: Keep an external file for storing inline CSS styles

Inline CSS styles do not create much of a problem by themselves; they help give your blog a unique design and style and a good user experience. The only problem they create is that they increase the page size and the loading time, which is why you should move them to an external CSS file. In Blogger, you add the styles under Theme > Customise > Advanced > Add CSS. In WordPress, you upload the stylesheet to the web server directory and remove the inline styles from your template HTML.

#9: Organizing images is necessary before publishing on blog post(use image expires tag)

The Google algorithm cannot see images, so it relies on alt attributes and on the file names you saved on your computer. An Expires header for images is also important: it lets the browser cache the image, so the next time a visitor comes to your website the same image does not need to be downloaded again and the page loads faster.
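In the markup itself, descriptive naming simply means something like `<img src="seo-factors-chart.png" alt="Chart of SEO ranking factors">`. On an Apache server, for instance, the Expires part can be set from .htaccess (a sketch, assuming the mod_expires module is available on your host):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Let browsers cache images for a month before re-requesting them
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```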

#10: Use the latest HTML version in your template (with a DOCTYPE declaration)

The latest HTML is faster and compatible with almost all search engines and browsers. Browsers are frequently updated to read the latest HTML, CSS and JavaScript and to drop support for older technology, so your template should stay current to maximize speed and compatibility. Declaring the DOCTYPE is essential in your template: with this declaration, any browser or bot can find out which version of (X)HTML you are using. It helps pages load and render properly across different types of devices.
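The modern declaration is short; a minimal valid HTML5 page looks like this:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Example page</title>
  </head>
  <body>
    <p>Content goes here.</p>
  </body>
</html>
```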

#11: Clean up the JavaScript in your template and avoid errors

JavaScript errors can hamper the working of your website and display broken or incomplete widgets to users. These errors destroy user engagement and perceived quality, and the more defects your scripts have, the poorer your rankings can become.

The methods to fix these errors are:

First of all, open your template and find the JavaScript errors that your checker tool reports. JavaScript has advanced a lot, so read the documentation and understand what a script does before using it or copy-pasting it from other sites. JS plugins and third-party code may also not work with the current HTML you are using, so you may need to load or define dependencies such as jQuery. Even a single missing character or symbol can create a problem.

#12: Social media engagement is necessary to rank your blog

Connecting your website to one or more social networks is of utmost importance in order to rank your blog for a keyword. Having a profile on each social network with your website URL embedded in it can bring you backlinks for free. Focus intensely on strong social signals: when your blog starts getting traffic from these networks, Google will notice and rank you higher. User engagement and shares from social media are noticed by the bots.

#13: Your HTML template must fit the average page load size (use compression)

A commonly cited target for the HTML of a page is about 35 KB. If your page's HTML weighs less than that, you will likely see a good increase in user response: growth in email lists, social followers and ad revenue. This size does not include external CSS and JavaScript files, or the images, slideshows and videos embedded in your pages and hosted on other servers.

#14: Reduce page loading speed 

Page loading speed is an important SEO factor for both rankings and user experience. It is often claimed that large businesses lose a large share of their online revenue when their servers slow down for even a minute. For large, traffic-rich websites, cloud-based hosting is often needed to keep the experience fast. Gzip compression is one of the best techniques: it can compress HTML content by around 80% and make a site noticeably faster. The average site takes about 5 seconds to load, and if your website takes longer than that to appear, it is very unlikely to rank well in the search engines.

To fix these problems you can:

*Use Gzip compression
*Minify CSS and JavaScript and move them to external files
*Use CSS layouts
*Lower the number of HTTP requests
*Use an HTTP caching strategy
*Reduce redirects
*Reduce the number of plugins used
*Optimize the images
*Leverage browser caching and minify JS scripts
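The Gzip step above, for example, can be enabled on an Apache server from .htaccess (a sketch, assuming your host has the mod_deflate module installed):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript text/xml
</IfModule>
```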

#15: Lower the number of objects on a web page


Each object on a page requires its own HTTP request, and slow or failed requests hinder site speed and the user experience.
The objects that drive up HTTP requests are:

*HTML pages and files
*CSS files
*Scripts
*Flash files

Have you noticed that sometimes your page text loads immediately but the images and videos take time to appear? That delay happens because each image and video needs its own HTTP request.

How to fix it:

*Reduce the number of objects per page
*Use text instead of images where possible
*Use CSS sprites
*Host files on different fast servers and reference their URLs on your site
*Combine different external files into a single one
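A CSS sprite combines many small icons into one image, so they all arrive in a single HTTP request; here is a sketch (the file name and pixel offsets are made up):

```css
/* sprite.png is assumed to hold 32x32 icons laid out side by side */
.icon {
  background-image: url("sprite.png");
  width: 32px;
  height: 32px;
  display: inline-block;
}
.icon-twitter  { background-position: 0 0; }
.icon-facebook { background-position: -32px 0; }
```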

#16: Make sure your server provides pre-loaded pages over cache

Having a caching mechanism enabled on your server saves pages once they are loaded, so they do not need to be generated again. The forward and back buttons in your browser rely on exactly this behaviour. Caching saves the repeated loading and execution of JavaScript and PHP scripts over time.
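On Apache, for example, a simple caching policy for static files can be expressed with a Cache-Control header (a sketch, assuming the mod_headers module is available):

```apache
<IfModule mod_headers.c>
  # Allow browsers and proxies to reuse static assets for a week
  <FilesMatch "\.(css|js|png|jpg)$">
    Header set Cache-Control "max-age=604800, public"
  </FilesMatch>
</IfModule>
```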

#17: Reduce or avoid URL redirects

Broken links should not be fixed with chains of redirects; replace them completely with a new URL serving the same purpose. Unnecessary or misconfigured redirects, like loops between www and non-www or between HTTP and HTTPS, can confuse the bots, create duplicate-content issues, slow down page loading, hamper the user experience, and in the worst case get your content treated as spam.

#18: Using frames in a web page results in a bad user experience

Frames are an old technology that splits the screen into separate areas of content; when you click in one section, a new HTML document loads inside it, which can prevent bots from fully identifying your content. Framesets also hinder printing and the back/forward buttons, often loading incompletely. Spider bots cannot index all of the framed content, and frames are not mobile-friendly either.

#19: Don't use outdated technologies on your website (Flash, nested tables)

Flash is best described as an old technology that was used to show rich multimedia content on the web. It was heavy, so newer technology emerged to keep the web fast. Nowadays Flash does not work on handheld devices and cannot be properly indexed by the crawlers, and updated browsers have been removing the code that decodes and displays Flash.

Nested tables are a different issue: they are simply HTML tables embedded within other tables, but using them can slow down page loading and cause incomplete rendering in the browser.

#20: Minify the JavaScript and CSS in your HTML

Minifying JavaScript and CSS files improves page speed and lowers the size of your HTML template. First move both kinds of code to separate external files, then run them through a minification tool like YUI Compressor and save the compressed output back in the same locations.

#21: Don't provide any personal information in plain text on website(email obfuscation)

Publishing emails and phone numbers in plain text on your contact or about page invites spam and gives malicious webmasters a starting point; one small mistake can even cost you your website. Another reason not to publish a plain-text email is that harvesting tools like Email Hunter and Atomic Email Hunter scan websites, collect every address shared on a page, and use them for spam emails and product-selling advertisements. If you do want to share contact details, use an image, replace the @ and dot (.) with words, or hide parts of the address with CSS; users will still get your message.
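One simple flavour of obfuscation is encoding the @ and the dot as HTML character entities, which renders normally for humans but trips up naive scrapers (the address below is made up):

```html
<!-- Renders as name@example.com in the browser -->
<p>Contact: name&#64;example&#46;com</p>
```

Determined harvesters can still decode entities, so treat this as a speed bump rather than real protection.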

#22: Using HTTPS over HTTP, a secure communication protocol, is considered a ranking factor

HTTPS is considered more secure and private than HTTP because it adds an extra authentication and encryption layer between the server and the user. HTTPS was mainly used where personal information was shared, like credit card or Aadhaar card details, but it is now considered an important ranking factor for all sites, since even personal websites handle personal information like emails and social media IDs for Facebook, Google, Twitter and others.

#23: Do not offer any malicious or spam content for download on your website

Website developers should always keep in mind that Google knows everything; nobody sitting at home on a personal laptop is going to trick the algorithm. Google strictly bans sites that offer malicious or spammy content for download.

#24: Make sure your server identity and directory listing are hidden and kept private

When the server signature is turned on, it usually reveals all the software and versions you are running, which amounts to a detailed map of potential vulnerabilities. Directory browsing is also unsafe: anyone can see all your index files and learn about your software.

#25: Block libwww-perl access on your server

Many botnet scripts search the web for vulnerabilities in your software; some are simply malicious and will attack your server database and take away user information and emails. Apache servers often allow this access by default, but you can block it by adding a few lines below the "RewriteEngine on" line in your .htaccess file.
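On Apache, the classic .htaccess block looks like this (assuming mod_rewrite is enabled, which it usually is on shared hosting):

```apache
RewriteEngine on
# Return 403 Forbidden to clients identifying themselves as libwww-perl
RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC]
RewriteRule .* - [F,L]
```

Note that this only filters on the User-Agent string, which an attacker can spoof; it blocks lazy scripts, not determined ones.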

#26: Your HTML should be responsive on different handsets, using the media query strategy

As you may know, since around 2015 the number of mobile users has been greater than the number of desktop users. Your template should be responsive across devices of different sizes; otherwise your advertisements will not appear properly and you will automatically lose audience clicks. The media query technique (the @media rule), if used in your template, ensures that your website fits the size and display of the screen and that all widgets and ads display properly.
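For instance, a couple of @media rules (the class names here are illustrative) can hide a wide sidebar and let the post body fill small screens:

```css
/* Applies only when the viewport is 600px wide or narrower */
@media (max-width: 600px) {
  .sidebar   { display: none; }
  .post-body { width: 100%; font-size: 16px; }
}
```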

#27: Make sure tabs are user friendly in mobile template

Several pages, like contact us, popular posts, archives, about us and various converters, are not part of your main template and may not always be SEO-friendly on mobile, so check them once before publishing a post.

#28: The new HTML5 microdata strategy should be enabled


This advanced SEO strategy is better known as structured data. Microdata lets search engines collect information from your blog post, understand the overall content of your site, and show rich snippets, with title and description, in the search results. Crawlers pick up various properties from the post, such as the name, headline, URL, description, article body and image snippet URL, and present them in various places in the search results.
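A sketch of microdata markup for a blog post, using the schema.org BlogPosting type (all the values are placeholders):

```html
<article itemscope itemtype="https://schema.org/BlogPosting">
  <h1 itemprop="headline">40+ SEO Strategies</h1>
  <img itemprop="image" src="cover.png" alt="Post cover image">
  <span itemprop="author">Author Name</span>
  <div itemprop="articleBody">
    <p>Post content goes here.</p>
  </div>
</article>
```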

#29: Enable an SPF (Sender Policy Framework) policy for your domain

Sender Policy Framework is an email policy that verifies sender addresses, making sure a message really comes from a verified and authorized sender for your domain. When your website starts getting popular, people will try to spoof emails from your domain, and an SPF policy guards against that. The policy lives in your DNS: you publish it as a TXT record on your domain and choose how receiving servers should detect and handle mail that fails the check.
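A typical SPF record is a single DNS TXT entry; for a domain that sends its mail through Google's servers, it might look like this (the domain is a placeholder):

```text
example.com.  IN  TXT  "v=spf1 include:_spf.google.com ~all"
```

The `~all` at the end tells receivers to soft-fail mail from any server not covered by the record.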

#30: Use "nofollow" and "dofollow" links correctly on your web pages

Here, the "nofollow" attribute means that a link you add to your blog will not serve as a backlink to the other website and is used only for informative purposes. A "dofollow" link (in practice just an ordinary link, since there is no literal dofollow attribute) passes value and PageRank to the linked website. Google takes this seriously: too many "dofollow" links sold from a single page can get your website flagged as spam, while having many "nofollow" links, as Wikipedia does, causes no harm.
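In markup, the difference is a single attribute (the URLs below are placeholders):

```html
<!-- Informational link: passes no authority -->
<a href="https://example.com/some-page" rel="nofollow">reference</a>

<!-- Ordinary link: followed by default, passes authority -->
<a href="https://example.com/great-article">recommended read</a>
```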

#31: Know the use of canonical tags

This tag is mainly used to select one basic URL to represent your website. For example, your site may be reachable at its non-www, www, HTTP and HTTPS addresses, while you primarily want the secure HTTPS address to be indexed.
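The tag goes in the `<head>` of every variant of the page, pointing at the one address you prefer (the URL is a placeholder):

```html
<link rel="canonical" href="https://example.com/seo-strategies.html">
```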

Success usually comes to those who are too busy to be looking for it. - Henry David Thoreau

Note: Those were the technical SEO techniques you must follow; now, to compete for your keywords, you must focus on the following strategies. All professional websites have the techniques above applied, and you should too, to make your website 100/100 friendly on mobile and desktop. Applying them will make your website more SEO-friendly, but on their own they will not bring you millions of visitors.

#32: Focus on your targeted keywords 

For higher rankings, you must pick a topic to write about and to rank for in the Google search results. Your primary keyword should appear in your domain name, meta title, meta description and header, be used a couple of times in the body of the post, and once near the end. Nobody achieves success in blogging without focusing their content on a particular topic.

#33: Write lengthy, informative blog posts of 2,500+ words

Nowadays the competition has grown to more than 150 million people writing and publishing blogs. Without 2,500+ word posts, you are unlikely to be noticed in the search results. Thanks to Google's Panda and Penguin updates, low-quality, short 200-word posts have vanished from the search engines, while high-quality publishers like Wikipedia and Neil Patel get the most traffic and fame in the long run.

#34: Try to create natural links from higher page rank and authority websites

Links from high-PageRank websites are possible only if you write awesome guest posts or buy "dofollow" links from them. Matt Cutts has said that the only way to earn natural backlinks is to write lengthy, excellent content so that people reference you on their websites. For me, the practical ways to get links from high-quality websites are guest blogging and broken-link outreach: find a broken link on their site and email them suggesting they point it to your equivalent content instead.

#35: Start increasing your feed readers and email collection 

Feed readers and loyal email subscribers are the first part of building traffic. They are the readers who will support you in the long run, the army of your blog who will share and retweet your posts across social media.

#36: Try to be active on social forums and networks 

When running a brand-new website, generating views and readers is hard and tough. Early on, the main way to get traffic is to join different communities, forums and groups where you can share your content. For the first three months, your blog is in no way going to get thousands of visitors from search engines.

#37: Submit your website to all the search engines available worldwide

There are many search engines with smaller audiences than Google where you should also submit your website and earn listings, such as Yahoo, Bing, MSN and others, plus services like Alexa.

#38: Having a nice website structure and navigation 

Making your website easy to navigate is also a major SEO factor: previous/next post links, archives, a horizontal navigation bar for pages 1, 2, 3 and beyond, popular posts and most-commented posts.

#39: Make your images public and easy to share

Adding a Pin It button to your images and making them highly shareable with social buttons is recommended by web experts. Images can rank in Google Images and act as a great source of backlinks to your website, and videos are another great source of natural backlinks.

#40: Interview your competitors or famous bloggers on your website

Getting interviews with popular bloggers and software developers, uploading the videos and detailed write-ups to your YouTube channel and blog, and asking the interviewees to share them on their own sites can take your blog's traffic to thousands of visitors overnight.


Comment if you like our post and encourage us