Category Archives: SEO Training Institute

HTTP Status Codes SEO: A Definitive Guide to HTTP Status Codes


CRB Tech Reviews focuses on HTTP status codes in this blog. Let us try to understand this concept in detail.

Definition of HTTP Status Codes:

HyperText Transfer Protocol (HTTP) response status codes are returned whenever search engines or site visitors send a request to a web server. These three-digit codes indicate the status of the HTTP request.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

HTTP status codes are three-digit numbers returned by servers that indicate the status of a web component.

Understand that the first digit of every three-digit status code is one of five numbers, 1 through 5. From the 100s through the 500s, status codes fall into the following categories:

  • 100s – Informational. The request has been accepted and is being processed.

  • 200s – Success. The request was accepted and processed successfully.

  • 300s – Redirection. The request has been accepted, but an additional step is necessary to complete it.

  • 400s – Client-side error. The request was sent by the client, but the page is invalid or cannot be served.

  • 500s – Server-side error. The request raised by the client was valid, but the server failed to complete it.
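These classes can be summed up in a few lines of Python (a minimal sketch; the class names follow the list above):

```python
# Map the first digit of an HTTP status code to its class.
# A minimal sketch; the labels mirror the five categories above.

STATUS_CLASSES = {
    1: "Informational",
    2: "Success",
    3: "Redirection",
    4: "Client Error",
    5: "Server Error",
}

def status_class(code: int) -> str:
    """Return the class of a three-digit HTTP status code."""
    if not 100 <= code <= 599:
        raise ValueError(f"not a valid HTTP status code: {code}")
    return STATUS_CLASSES[code // 100]

print(status_class(301))  # Redirection
print(status_class(404))  # Client Error
print(status_class(503))  # Server Error
```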

See More:Heading Tags SEO :Tips of How To Optimize H1 to H6 Tags

Although many HTTP status codes exist, not all of them are vital from the SEO point of view.

A few top Tips:

  • It is vital to have custom 404 pages with helpful navigation options for visitors who request pages that return a 404 response code.

  • Use 301 redirects rather than 302 redirects when redirecting URLs on a website, to ensure that link juice (ranking power) is passed between the redirecting pages.

  • Pages that return 404 (File Not Found) for extended periods and that have significant inbound links should be 301 redirected to another relevant page.

See More:SEO Site Structure:How to Create SEO friendly Website Structure

Important HTTP Status Codes from an SEO Perspective:

  1. 404 Status Code:

The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. This should occur whenever the server cannot locate a matching page for a request. Occasionally, webmasters will display a textual 404 error while the response code is actually a 200. This tells search engine crawlers that the page rendered successfully, and the page will commonly be incorrectly indexed.
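A rough way to catch such "soft 404s" is to compare the status code against the body text. A heuristic sketch, with made-up error phrases that you would tune for your own error template:

```python
# Heuristic check for "soft 404s": pages that return HTTP 200 but whose
# body is really an error page. The phrases below are illustrative
# assumptions; adjust them to match your own site's error template.

ERROR_PHRASES = ("page not found", "404 error", "does not exist")

def is_soft_404(status_code: int, body: str) -> bool:
    """Flag responses that claim success but look like an error page."""
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)

print(is_soft_404(200, "<h1>Oops, page not found!</h1>"))  # True
print(is_soft_404(404, "<h1>Not found</h1>"))              # False
```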

See More:URL Optimization:5 Best Tips for URL Optimization-Crb Tech

  2. 503 Status Code:

The server is currently unable to handle the request because of temporary overloading or server maintenance. The 503 should be used whenever there is a temporary outage (for instance, if the server needs to go down for a brief period for maintenance). This assures the engines that they should come back soon, because the page/site is only down for a short time.
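As an illustration, a maintenance response can carry a `Retry-After` header telling crawlers when to return. A minimal sketch (the 600-second window and the helper function are hypothetical, not any specific framework's API):

```python
# Sketch of a maintenance response: a 503 with a Retry-After header,
# which tells crawlers the outage is temporary and when to come back.
# The 600-second window is an arbitrary example value.

def maintenance_response(retry_after_seconds: int = 600):
    """Return (status_line, headers, body) for a temporary-outage reply."""
    status = "503 Service Unavailable"
    headers = {
        "Retry-After": str(retry_after_seconds),  # seconds until retry
        "Content-Type": "text/html; charset=utf-8",
    }
    body = "<h1>Down for maintenance, back shortly.</h1>"
    return status, headers, body

status, headers, body = maintenance_response()
print(status)                  # 503 Service Unavailable
print(headers["Retry-After"])  # 600
```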

  3. 302 Found Status Code:

The server is currently responding to the request with a page from a different location, but the requester should keep using the original location for future requests. This approach is not recommended: it is not an effective way to tell search engine bots that a page or website has moved. Using a 302 will cause search engine crawlers to treat the redirect as a temporary one and not pass along the link juice (ranking power) that 301 redirects carry.

See More:Mobile SEO : Mobile Optimization for Website -Crb Tech

  4. 410 Status Code:

The requested resource is no longer available on the server and no forwarding address is known. This condition is expected to be permanent. Clients with link-editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no way to determine, whether the condition is permanent, the status code 404 (Not Found) should be used instead of 410 (Gone). This response is cacheable unless indicated otherwise.

This was an insight into the world of HTTP status codes.


SEO Site Structure: How to Create an SEO-friendly Website Structure



CRB Tech Reviews elaborates on the concept of an SEO-friendly site structure in this blog. This is one area that most resources and articles ignore, despite it being an important aspect of SEO strategy.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

As of now, your site could be efficient and streamlined, with pages that are sensibly ordered and well placed, like an organized filing system. On the other hand, your site could have grown organically, without much initial thought to its structure, and has now turned into a scattered and hard-to-navigate mess of pages. Either way, if you give some careful thought to your structure as you grow, it can become a basic part of your SEO success.

5 steps to create SEO friendly website structure:

1. Plan Site Hierarchy:

The most basic hierarchy structure is to have the most essential search-ranking pages at the top, and to funnel down to the less vital pages. There will be pages that are an exception to this, and such pages are unavoidable so that navigating around your site stays clear and straightforward, but this is the general guideline.

See More:Mobile SEO : Mobile Optimization for Website

It is important to design your site hierarchy before constructing your site, so that your pages don't turn into a scattered mess. If you are unsure where your site will end up in the future, then try to follow this general structure as you go.

2. URL and Site Structure Should Match:

You should also ensure that your URL structure matches your hierarchy. For example, if you are selling a cute pink summer dress on your apparel site, your URL may look something like this:

The URL is clearly separated into Category → Sub-Category → Product. This is simple for the crawlers to read and simple for your customers to navigate.
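The hierarchy-matching idea can be sketched in code. A minimal example, with a simple hyphenation rule and hypothetical category and product names:

```python
# Building a URL path that mirrors a Category -> Sub-Category -> Product
# hierarchy. The category and product names are hypothetical examples.
import re

def slugify(text: str) -> str:
    """Lowercase, strip punctuation, and hyphenate a path segment."""
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return text.strip("-")

def product_url(category: str, subcategory: str, product: str) -> str:
    """Join slugified segments into a hierarchy-matching path."""
    return "/" + "/".join(slugify(s) for s in (category, subcategory, product))

print(product_url("Dresses", "Summer Dresses", "Pink Summer Dress"))
# /dresses/summer-dresses/pink-summer-dress
```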

See More:Title Tag SEO: How to Optimize Title Tag

3. Website Depth Should Be Low:

Another key component of your structure is to have a shallow site. This means each page on your site can be reached within a few clicks of your landing page. For good SEO, you should aim to make your site 3 clicks deep or less. This makes your site easy to crawl, and the easier the crawlability, the more easily Google can locate your content and rank it.

The other reason for a shallow site is a good user experience. If a user is having trouble finding the information they need, they can easily get frustrated and move on to some other site. And, as we know, a high bounce rate implies bad SEO.
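The 3-click rule can be checked programmatically with a breadth-first search over the site's link graph. A sketch, using a made-up toy site:

```python
# Measuring click depth with a breadth-first search over a toy site graph.
# Keys are pages, values are the pages they link to; the structure below
# is a made-up example, not a real site.
from collections import deque

SITE = {
    "home": ["category-a", "category-b"],
    "category-a": ["product-1", "product-2"],
    "category-b": ["product-3"],
    "product-1": [], "product-2": [], "product-3": [],
}

def click_depths(graph, start="home"):
    """Return the minimum number of clicks from `start` to every page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

depths = click_depths(SITE)
print(max(depths.values()))  # 2 -> every page is within the 3-click rule
```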

4. Make Use of Internal Linking:

Although making a low-depth site is a good technique, it can be difficult, particularly for a huge site with many pages. Grouping your pages into a limited number of categories can be impossible, or simply doesn't make sense. This is where internal linking can help.

An internal link is a connection between two pages on the same site, and such links have various advantages for SEO. The first is that they give a useful way to explore your site, which reduces the number of clicks needed to navigate to any page. Considering that crawlers love a low-depth site, Google loves it too. This, again, makes a better experience for our visitors.

The other primary advantage is that it gives a chance to use keyword-rich anchor text.

5. Associate SEO Weight with Site Navigation:

The site navigation is frequently seen only as a tool to move around a site, but in truth it can be used as an instrument to give SEO weight to your preferred pages.

See More:Heading Tags SEO :Tips of How To Optimize H1 to H6 Tags

What we have to remember about navigation links is that they are in fact internal links, as they link to other pages on the same site. Our internal links function the same way external links do: the more links that point towards a specific page, the more significance search engines will give that page. If a particular page is in the navigation, this implies that EVERY page on the site has an internal link pointing towards it. This tells search engines that this is a critical page on your site, and a page that ought to rank.

Your site as a whole will have a certain amount of domain authority. If you have an individual page in your navigation, then a greater portion of the site's authority, or SEO weight, will be given to this page. Therefore, the strategy you should take is to put the pages you want to rank in your navigation.

This is within balance, of course. You still need your navigation to be effective in its primary use, which is giving a simple and clear way to move through your site.


URL Optimization: 5 Best Tips for URL Optimization



CRB Tech Reviews would like to cover the most fundamental aspects of URL optimization in SEO: the structure of URLs and the domain name. Here we define the 5 best tips for URL optimization. One thing we would like to make clear is that not every one of these techniques is essential or critical to implement on every page you create. In other words, these are practices that prove to be great if followed; if not, there is no real loss. URL structure is not the ultimate thing in on-page optimization. One funda every SEO person needs to remember is: the easier we make things for the search engines, the better it is for us.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

1. Remember, the easier to read, the better:

It is obvious that the easier a URL is to read, the better it is for the search engines. Easy accessibility is always important in SEO, even more so these days, when search engines can use engagement and user data signals as a lever to find out what people engage with versus what they don't. Not every bit of the URL has to be clean and perfect, but it should be easily grasped by those who view it.

This is, admittedly, a subjective topic.

2. Usage of Keywords in URLs:

It is assumed that you have done your keyword research before writing the content on the site. Using keywords in the URL is a great idea for several reasons.

In the first place, keywords in the URL show the people who see your URL on social media, in an email, or while hovering over a link that they are getting what they want and expect, as shown in the MetaFilter example below (note how hovering over the link shows the URL in the bottom left-hand corner).

Second, URLs get copied and pasted routinely, and when no anchor text is used in a link, the URL itself serves as the anchor text (which is still a strong input for rankings).

See More:Mobile SEO : Mobile Optimization for Website

Third and last, keywords in the URL are displayed in search results, which makes the URL a primary element viewers consider while choosing which result to click.

3. Multiple URLs with the Same Content:

Duplicate content isn't generally a search engine penalty (at least, not until you begin duplicating at scale), but it can split ranking signals in a way that hurts your search traffic potential. If Page A has some amount of ranking ability and its copy, Page A2, has a comparable amount, then by canonicalizing them, Page A gets a better chance to rank and earn visits.

Thus, if there are two URLs with similar content, canonicalize them by using a 301 redirect or a rel=canonical tag. Content optimization and uniqueness are the preventive measures that avoid this in the first place.
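For the rel=canonical route, each duplicate page's head gets a link element pointing at the original. A small sketch (the URL is a placeholder):

```python
# Emitting a rel="canonical" tag so duplicate URLs consolidate their
# ranking signals onto one page. The canonical URL is a hypothetical example.
from html import escape

def canonical_tag(canonical_url: str) -> str:
    """Build the <link> element to place in each duplicate page's <head>."""
    return f'<link rel="canonical" href="{escape(canonical_url, quote=True)}" />'

print(canonical_tag("https://example.com/page-a"))
# <link rel="canonical" href="https://example.com/page-a" />
```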

4. Filter Out Dynamic URL Parameters Whenever Possible:

Static URLs and dynamic URLs are important concepts in SEO and need to be taken into consideration.

Some dynamic URL parameters are used for tracking links (like those embedded by well-known social sharing applications, e.g. Buffer). As a rule, these don't cause a huge issue, but they do make for fairly unattractive and awkwardly long URLs. Use your own judgment as to whether a tracking parameter's advantages outweigh the negatives.

See More:Title Tag SEO: How to Optimize Title Tag

If you can avoid using URL parameters, do so. If you have more than two URL parameters, it's probably worth making a serious investment in rewriting them as static, meaningful URLs.

5. Shorter vs. Longer:

Shorter URLs are, as a rule, preferable. You don't have to take this to the extreme, and if your URL is already under 50-60 characters, don't worry about it at all. But if you have URLs pushing 100+ characters, there is likely a chance to rewrite them and gain value.


How to Do Competitor Analysis in SEO-Crb Tech Reviews



CRB Tech Reviews emphasizes the benefits of a competitive analysis workflow in SEO.

Whenever you consider the SEO process, one of the important activities of the initial phase is carrying out competitive analysis. By doing this, one should be able to identify the SEO related targets, and get inputs to plan the strategy ahead.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

The analysis will differ based on the type of industry or domain and the scope of the entire SEO process. This isn't a simple process at all, and many things need to be taken into consideration.

Why Chalk Out A Workflow Plan?

In order to simplify this competitive analysis process, one can take the help of a workflow plan. When such a workflow is present before you, it becomes easier to replicate, document and control the flow. In simple terms, it is just like a flowchart. Various activities, like filtering out the SEO competitors, identifying the keywords to target, their difficulty level and so on, can be highlighted in the workflow. This gives you a clear picture with which to plan an effective strategy.

See More:7 Best Content Marketing Techniques works in 2016

This can be considered to be one of the vital primary phases in the SEO process. You can get samples of workflow on the Internet.

Getting to know the four distinct analysis phases of SEO workflow:

  1. Identify and extract the potential competitors from an SEO perspective:

This first stage is particularly useful when you're beginning the SEO process for a new customer/client or an industry you are not at all familiar with, and you have to start from scratch to identify all of the potentially relevant competitors. As it is said, you may or may not know your friends, but you need to know your enemy!

See More:Heading Tags SEO :Tips of How To Optimize H1 to H6 Tags

It's essential to note that these are not necessarily limited to organizations or sites that offer the same kind of content, products, or services that you do; they can be any site that competes with you in the search results for your target keywords.

  2. Validate the identified SEO competitors:

Once you have the list of potential competitors accumulated from various relevant sources, it's time to validate them, by analyzing and filtering down to those which are truly, actively ranking, and to what degree, for the same keywords that you're targeting.

Furthermore, at this stage you'll also expand your list of potential target keywords by performing keyword research. This should use sources beyond the ones you had already identified from your competitors and your current organic search data, looking for keywords for which neither your competitors nor you are ranking yet, which may provide new opportunities.

  3. Do a comparison with your SEO competitors:

Now that you have your SEO competitors and potential target keywords, you can collect, compile and compare your site against your competitors, using all of the available information to choose and prioritize those keywords. This will likely include keyword relevance, current rankings, search volume, ranked pages and domains, content optimization, link popularity and page result characteristics, among others.
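At its simplest, the keyword comparison is set arithmetic. A sketch with invented keyword lists:

```python
# Comparing your target keywords against a competitor's ranking keywords
# with plain set operations. Both keyword lists are invented examples.

our_keywords = {"seo training", "digital marketing course", "on page seo"}
competitor_keywords = {"seo training", "seo tools", "on page seo"}

shared = our_keywords & competitor_keywords   # contested keywords
gaps = competitor_keywords - our_keywords     # they rank, we don't (opportunities)
unique = our_keywords - competitor_keywords   # our uncontested terms

print(sorted(shared))  # ['on page seo', 'seo training']
print(sorted(gaps))    # ['seo tools']
```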

  4. Choose the target keywords:

This is the last phase in the workflow. It is finally time to analyze the previously gathered data for your own site and your competitors, using the predetermined criteria to choose the best keywords to target for your particular situation in the short, mid and long term of your SEO process: those with the highest relevance, search volume, and profitability. The best starting point is in rankings where you are competitive from a popularity and content perspective.

See More:Top Image Optimization Techniques for SEO

There are many data sources and tools available which can assist you in the implementation of the process.


How to Optimize Page Speed-Tips for Optimizing Page Speed




Page speed can be defined as the time taken by a specific page of a website to fully display its contents, or the time taken by the browser to receive the first byte from the server. A faster page speed is always better. A website that takes longer to load is visited less, and slow pages sometimes result in an increase in the website's bounce rate.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

However, the page load time of a website can be decreased through page speed optimization. By carrying out various optimization activities, the page load time of any website can gradually be reduced.

Listed below are some of the ways due to which the page load time can be reduced thus influencing page speed.


Enable Compression:

When a user visits your site, a call is made to your server to deliver the requested files.

The larger these files are, the longer it will take for them to reach the browser and appear on the screen.

See More:7 Best Content Marketing Techniques in 2016

Gzip compresses your pages and style sheets before sending them over to the browser. This drastically reduces transfer time, since the files are much smaller.
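To see the effect, here is a small sketch using Python's standard `gzip` module on a synthetic, repetitive HTML payload (the markup is made up for illustration):

```python
# How much Gzip shrinks a typical (repetitive) HTML payload.
# The page body here is synthetic filler to make the point.
import gzip

html = ("<div class='product'><span>Example product listing</span></div>\n" * 200).encode()
compressed = gzip.compress(html)

print(len(html), "bytes uncompressed")
print(len(compressed), "bytes gzipped")
assert len(compressed) < len(html)  # repetitive markup compresses very well
```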


Minify Your Code:

By optimizing your code (including removing spaces, commas, and other unnecessary characters), you can significantly improve your page speed. Also remove code comments, formatting, and unused code. Google suggests using YUI Compressor for both CSS and JavaScript. By removing such unnecessary characters, page speed is enhanced and page load time is reduced.
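The idea behind minification can be sketched with a toy CSS minifier (real projects should use a dedicated tool such as the YUI Compressor mentioned above; this only illustrates the principle):

```python
# A toy CSS minifier: strips comments and collapses whitespace.
# This is a sketch of the idea, not a production minifier.
import re

def minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()

css = """
/* main heading */
h1 {
    color: #333;
    margin: 0;
}
"""
print(minify_css(css))  # h1{color:#333;margin:0;}
```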

See More:How To Do Keyword Research-In Depth Guide of Keyword Research.


Reduce Redirects:

Every time a page redirects to another page, your visitor faces additional load time while the HTTP request-response cycle completes. For instance, if your mobile redirect pattern chains through two extra redirects before reaching the final page, each extra hop makes the page load slower, and the visitor may simply leave and move on to another website, adding to your bounce rate. Reducing redirects therefore improves page speed.
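A redirect chain can be audited offline if you have the old-URL to new-URL mapping. A sketch with a hypothetical mapping:

```python
# Counting the hops in a redirect chain. The mapping of old URL -> new URL
# below is a made-up example of the kind of chain that slows visitors down.

REDIRECTS = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",
}

def redirect_hops(url: str, redirects: dict, limit: int = 10) -> int:
    """Follow a chain of redirects and return how many hops it takes."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > limit:
            raise RuntimeError("redirect loop or chain too long")
        seen.add(url)
    return hops

print(redirect_hops("/old-page", REDIRECTS))   # 2 hops -> collapse to 1
print(redirect_hops("/final-page", REDIRECTS)) # 0 hops, nothing to fix
```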


Leverage Browser Caching:

Browsers cache a considerable amount of data (style sheets, images, JavaScript files, and more) so that when a visitor returns to your site, the browser doesn't have to reload the entire page. Use a tool like YSlow to check whether you already have an expiration date set for your cache, then set your "Expires" header for however long you want that data to be cached. Unless your site design changes frequently, a year is a reasonable time period. Google has more information about caching here.
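As a sketch, the headers for a one-year cache lifetime could be built like this (the one-year value mirrors the suggestion above; adjust it for assets that change more often):

```python
# Building far-future caching headers for static assets. The one-year
# lifetime is an example value following the article's suggestion.
import time
from email.utils import formatdate

def caching_headers(max_age_seconds: int = 365 * 24 * 3600) -> dict:
    """Return Cache-Control and Expires headers for a static asset."""
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}",
        # HTTP dates must be in RFC 1123 format, which usegmt=True gives us.
        "Expires": formatdate(time.time() + max_age_seconds, usegmt=True),
    }

headers = caching_headers()
print(headers["Cache-Control"])  # public, max-age=31536000
```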

See More:How to Optimize Sitemap.


Improve Server Response Time:

Your server response time is affected by the amount of traffic you receive, the resources each page uses, the software your server runs, and the hosting solution you use. To improve it, look for performance bottlenecks like slow database queries, slow routing, or a lack of adequate memory, and fix them. The ideal server response time is under 200 ms. Learn more about improving your time to first byte.


Use a Content Distribution Network:

Content distribution networks (CDNs), also called content delivery networks, are networks of servers that are used to distribute the load of delivering content. Essentially, copies of your site are stored at multiple, geographically diverse data centers so that users have faster and more reliable access to your site, improving their sessions on it.

See More:Heading Tags SEO :Tips of How To Optimize H1 to H6 Tags


Use CSS Sprites:

Use CSS sprites to create a template for images that you use often on your site, like buttons and icons. CSS sprites combine your images into one large image that loads all at once (which means fewer HTTP requests) and then display only the sections that you need to show. This saves load time by not making users wait for multiple images to load.


How to do Interlinking in SEO – Definitive guide of interlinking

A Guide Illustrating How to Do Interlinking

CRB Tech Reviews would like to illustrate how to do effective interlinking through the medium of this blog.

First and foremost, let us understand what internal links are.

See More:7 Best Content Marketing Techniques in 2016

In simple terms, they are nothing but hyperlinks that navigate within the same domain they are present on. Basically, it is a source-and-target kind of relationship. In plain terms, it can be stated like this:

A link that points or navigates to a different page on the same website. Here is an example:

<a href="" title="Keyword Text">Keyword Text</a>

How are these useful for on page SEO?

Such links serve three purposes:

  1. Website navigation

  2. Helping develop the information hierarchy of a particular website.

  3. Helping spread ranking power around the website.

See More:How to Optimize Sitemap.

Interlinking for SEO:

In this section, we will see how to develop internal link based SEO-friendly website architecture.

If we consider a single page, search engines use the content on it as a criterion to rank pages in their indices. Therefore, deploying a proper content strategy is important. They also look for a proper pathway to navigate all the pages of a particular website. Many sites make the mistake of hiding their primary navigation, denying search engines access. This prevents their pages from being listed in the indices.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

If a web crawler cannot access pages, then to the search engine they simply do not exist, no matter the content or keywords present on them. The standard structure for a website should look like a pyramid, where the top vertex is the home page of your site and the hierarchy flows down from it.

Then, how can we achieve this for our website? Think!

The answer is internal links, or link building. This is the most fundamental format of a link, and it is readily understandable to the search engines. The spiders know that they should add this link to the engine's link graph of the web, use it to calculate query-independent variables (like MozRank), and follow it to index the contents of the referenced page. Let us see an illustration to understand the concept better.

Here, the "a" tag marks the beginning of the link. This tag can contain text, images, etc., which provide a clickable area on the page that induces visitors to navigate to the other page. The referral location of a hyperlink tells the browser and the search engines where the link navigates. In this illustration, the page referenced is about custom belts made by a man named Jon Wye, so the link uses the anchor text "Jon Wye's Custom Designed Belts." The </a> tag closes the link, so that elements later on in the page won't have the link attribute applied to them.
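Separating internal links from external ones is straightforward with the standard library. A sketch, with hypothetical sample markup:

```python
# Extracting anchors from a page and splitting them into internal links
# (same domain or relative) and external ones. The sample HTML is made up.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def split_links(html: str, own_domain: str):
    """Return (internal, external) link lists found in the HTML."""
    parser = LinkCollector()
    parser.feed(html)
    internal, external = [], []
    for href in parser.links:
        host = urlparse(href).netloc
        # Relative URLs have no host and are internal by definition.
        (internal if host in ("", own_domain) else external).append(href)
    return internal, external

html = '<a href="/belts">Belts</a> <a href="https://other.com/x">Other</a>'
print(split_links(html, "example.com"))
# (['/belts'], ['https://other.com/x'])
```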

See More:How To Do Keyword Research-In Depth Guide of Keyword Research.

Still, sometimes pages are not accessible, and hence they cannot be indexed. Here is why:

  1. Links accessible only through local search boxes:

Web crawlers won't attempt to perform searches to discover content, and consequently it is estimated that a huge number of pages are hidden behind completely inaccessible internal search boxes.

  2. Links in Flash or other plug-ins:

Any links embedded inside Flash, Java applets, and other plug-ins are normally invisible to search engines.

  3. Links on pages having thousands of links:

Search engines are programmed to access about 150 links per page on average, and they might stop crawling further pages linked from the original page. This limit is not rigid, and vital pages may carry more than 200 links, but it is recommended to keep links limited to around 150. Otherwise, there is a risk of many pages going unvisited, which in turn would affect the ranking and indexing of your site.

These are some of the important factors that are covered here.

For blogs on topics like heading tags, digital marketing, off page SEO etc. please visit CRB Tech reviews.


How to Optimize “Robots.txt”



CRB Tech Reviews would like to guide you about robots.txt through this blog. As the name suggests, it is nothing but a text file which webmasters often create to instruct search engine robots and crawlers, like Googlebot, how to crawl and index pages on their website.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

To get a sense of it, consider robots.txt a tour guide for crawlers and bots. It takes the non-human visitors to the important regions of the site where the content is, and shows them what is and is not to be indexed. All of this is done with the help of a few lines in a text file. Having a well-constructed robots guide can increase the speed at which the site is indexed, cutting the time robots spend going through lines of code to locate the content users are searching for in the SERPs.

The robots protocol, called the Robots Exclusion Protocol or REP, is a collection of web standards that control web robot behavior and search engine indexing. It comprises the following:

  • The original REP from 1994, extended in 1997, defining crawler directives for robots.txt. Some search engines support extensions like URI patterns (wildcards).

  • Its extension from 1996 defining indexer directives (REP tags) for use in the meta robots element, also known as the "robots meta tag." Search engines additionally support REP tags in an X-Robots-Tag; webmasters can apply REP tags in the HTTP header of non-HTML assets like PDF documents or images.

  • The microformat rel-nofollow from 2005, defining how search engines should handle links where the A element's REL attribute contains the value "nofollow."

Important Standards or Rules:

  • Meta robots with the parameters "noindex, follow" should be deployed as a method to restrict crawling or indexation.

  • Only a single "Disallow" line is permitted for each URL.

  • Subdomains of a root domain use separate robots.txt files.

  • The filename is case sensitive: "robots.txt" is the proper form, not "Robots.TXT."

  • Spaces are not a valid way to separate query parameters; e.g. "/category/ /product page" would not be honored by robots.txt.
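Rules like these can be sanity-checked with Python's standard-library robots.txt parser before deployment. A sketch with a hypothetical robots.txt file:

```python
# Checking a robots.txt policy with Python's standard-library parser.
# The rules below are a hypothetical example file.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "/blog/post.html"))       # True
```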

See More:7 Best Content Marketing Techniques in 2016

SEO Best Practices:

Blocking a Domain Page:

There are a few methods which help to block search engines from getting access to a particular domain.

Block with robots.txt:

This instructs the search engine not to crawl the given URL; however, the engine may keep the URL in its index and display it in search results.

Block With Nofollowing Links:

A poor method and not recommended. Even this way, search engines can still find pages through browser toolbars, links from other pages, analytics, etc.

URLs blocked due to robots.txt errors:

Google was unable to crawl the URL because of a robots.txt restriction. This can happen for various reasons. For example, your robots.txt file may block Googlebot entirely; it may restrict access to the directory in which the URL is found; or it may block access to the URL specifically. Often, this is not an error: you may have deliberately set up robots.txt to keep crawlers away from this URL, in which case there is nothing to fix; Google will continue to respect robots.txt for this file.

For other topics related to SEO, like on-page and off-page optimization, heading tags, blocked links, etc., please visit CRB Tech Reviews.

General Information:

The robots.txt file is public. Know that a robots.txt document is a freely accessible file: anybody can see which areas of a server the webmaster has blocked from the engines. This means that if an SEO has private user data that they don't want publicly searchable, they should use a more secure approach, such as password protection, to keep visitors from viewing any confidential pages they don't want indexed.


How To Fix URL Canonicalization Error In SEO



When one talks about SEO, various factors need to be taken into consideration, relating to both on-page and off-page optimization. But if you do not do on-page SEO well, the effort put into your off-page optimization goes to waste, and the final result falls short of expectations.

See More: Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

CRB Tech would like to explain the concept of Canonicalization to you and its optimum use through this blog. By reading this, you would be able to carry out Canonicalization of a website in an optimum manner.

Let us first understand what URL canonicalization is.

The concept can be a bit tough to grasp. So, we would try to explain it in easy language and terms.

Suppose that a website has two URL’s. They are as follows:


Both pages serve the same content, yet neither redirects to the other. This leads to a duplicate content issue with Google, and your rankings may suffer a penalty. Google’s Panda algorithm is a filter introduced to stop sites with poor or duplicate content from ranking in the top results.

Let us look at one more example, where two URLs on a website end up resolving to the same page.

Now, if both of these URLs show the same content, it is again a problem.

Such problems tend to get ignored easily. When this happens, a search engine like Google is confused about which URL to add to its index. If two pages are found to have the same content, they are treated as copies of one another, which can result in a ranking penalty.

Such an issue therefore needs to be resolved. Your server settings should be such that whether the user accesses your website with or without www, the site opens on a single version. This is how a canonical error is resolved. But if you wish to display similar content on two different pages, then the rel=”canonical” tag becomes necessary. This tag lets a search engine know which page is the original and which is the copy, helping you avoid duplicate content.
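The usual fix is a server-side 301 redirect onto the preferred hostname. The sketch below only illustrates the normalization step of that idea, rewriting any request URL onto one preferred host; `www.example.com` is a hypothetical choice of canonical host, not a recommendation:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_host(url, canonical_host="www.example.com"):
    """Rewrite the hostname so www and non-www requests map to one version."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, canonical_host,
                       parts.path, parts.query, parts.fragment))

print(normalize_host("https://example.com/page?x=1"))
# https://www.example.com/page?x=1
```

On a real server the same mapping would be done in the web server configuration (e.g. a rewrite rule issuing a 301), so the duplicate hostname never reaches the index at all.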

Canonicalization Application and Canonical Tag Optimization:

As mentioned above, just a rel=”canonical” tag is enough to apply canonicalization.

Consider the two URLs above, and suppose that the second of the two has the same content as the first. Applying the canonical tag on the duplicate then becomes a must; it marks the first URL as the canonical version of the index.php page.

The syntax for its application is:

<link rel="canonical" href="">
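To verify that a page actually carries the tag where you expect it, the canonical URL can be pulled out with the standard library's `html.parser`. The markup fed in below is a hypothetical example page, not a real URL:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Hypothetical page markup for illustration.
page = '<head><link rel="canonical" href="https://example.com/product"></head>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://example.com/product
```

A small audit script built on this idea can crawl your own pages and flag any that are missing the tag or that point at an unexpected URL.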

HTTP Header Tag Canonicalization:

The tag above works for HTML content. For non-HTML content (such as PDF files), HTTP header canonicalization needs to be used instead. Here is a sample:

> HTTP/1.1 200 OK

> Content-Type: application/pdf

> Link: <>; rel="canonical"

> Content-Length: 785710
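The Link header above can be checked programmatically. This sketch extracts the canonical target from a header value with a regular expression; the PDF URL in the example is hypothetical:

```python
import re

def canonical_from_link_header(value):
    """Extract the canonical URL from an HTTP Link header value.

    A header such as  Link: <https://example.com/doc.pdf>; rel="canonical"
    carries the target URL between the angle brackets.
    """
    match = re.search(r'<([^>]+)>\s*;\s*rel="canonical"', value)
    return match.group(1) if match else None

# Hypothetical header value for illustration.
header = '<https://example.com/whitepaper.pdf>; rel="canonical"'
print(canonical_from_link_header(header))  # https://example.com/whitepaper.pdf
```

In practice you would read the header from a HEAD request's response and compare the extracted URL against the version you intend to be canonical.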

Read more on this at Google’s official Webmaster blog.

Correct Use of Canonicalization:

  1. Never Use It Outside the Head Tag:

Search engine bots or spiders will completely disregard canonical tags placed outside the <head> area of the page. So, to apply a valid canonical tag, you have to include it between <head></head>.

  2. Multiple Canonical Tags a Strict No:

Using more than one canonical tag is pointless. Search engines will disregard both tags, and you will face strange SEO behavior and issues. Multiple canonical tag URLs are sometimes caused by plugin glitches, so you may need to watch out for that.

  3. Do Not Use It for PageRank Sculpting:

PageRank is no longer a public metric for a site, but it is still considered by the crawlers. If you are planning to use canonical tags for PageRank sculpting in hopes of better rankings, be aware that it will do your site more harm than good.


Heading Tags SEO: Tips on How To Optimize H1 to H6 Tags



Heading tags form an important part of SEO. Today, we will see how to use these tags in an effective manner and in the right way. This would form an important part of the search engine optimization tips.

Heading tags are vital for formatting and structuring the content of any web document. By making use of heading tags, we organize the content present on the web page. It is recommended that you use heading tags for headings only, and not merely for making text large or bold.

What Do You Mean By Heading Tags?

Heading tags <h1> to <h6> are used to define headings at various levels in the web document.

Amongst these, the <h1></h1> tag is the first in the hierarchy of headings and renders the heading with the largest font. The <h6></h6> tag, on the other hand, sits at the lowest level of the hierarchy.

Heading tags can be defined as follows in your HTML code:

<h1>text</h1>

<h2>text</h2>

<h3>text</h3>

<h4>text</h4>

<h5>text</h5>

<h6>text</h6>
The h1 tag is supposed to be used for the main heading; for sub-headings, h2, h3 and so on can be used. The h1 heading is the boldest and the largest, while h6 is the smallest in size.

Do not be under the impression that heading tags are only used for formatting web pages, or that they merely divide the content on a page for search engines. Rather, they are one of the on-page techniques that help build user-friendly websites. Search engines scan these headings when indexing a page's structure; in short, they matter for search engine rankings. So now you see the importance of heading tag optimization.

Making Use of Heading Tags for SEO:

Apart from structuring the page, there are a few more uses of these tags that you should be acquainted with. Here they are:

  1. One Page Websites:

If you want immaculate on-page SEO for your blog posts, then heading tag optimization is necessary. A common mistake beginners make is overusing the h2 and h3 tags, or using h3 tags where h2 tags belong.

This affects your on-page SEO strategy, as heading structure is one of the important on-page factors.

  2. Hierarchy:

The hierarchy of headings needs to be maintained in a proper, pre-defined manner on your web page. The h1 tag comes first, followed by h2, h3, h4 and so on down to h6. This is vital for better SEO of your page and site.

  3. Purpose of Using Them:

The main aim of using heading tags is to give the user a better feel of the web page, with content sorted under headings. Readers can get an idea of the page just by scanning the sub-headings. The h1 tag should say everything about your page in one line; it is like the content summary of your page.

  4. Heading Tags with Keywords:

Keywords form the core of SEO, be it on-page or off-page. The first step is to find keywords through research. The second step is to sort them into focus keyword, primary keyword and so on. Using the focus keyword in the h1 tag is recommended, and it should also appear in the title.

Each web page should have at least one h1 tag that acts like a newspaper headline. Sub-headings should be used to sort the other content on the page.

  5. H1 Tag in HTML5:

In older versions of HTML, only a single h1 tag per web document was allowed. But in HTML5, more than one h1 tag can be defined as the design requires.
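The hierarchy rule described above (a heading should never skip a level going down, e.g. h1 straight to h3) can be checked mechanically. This sketch parses hypothetical markup with the standard library and flags skipped levels:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Records heading levels in document order so the hierarchy can be checked."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Matches h1 .. h6 (and ignores tags like "hr").
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def hierarchy_ok(levels):
    """A heading may only go one level deeper than the previous one."""
    return all(b <= a + 1 for a, b in zip(levels, levels[1:]))

# Hypothetical markup for illustration.
audit = HeadingAudit()
audit.feed("<h1>Title</h1><h2>Section</h2><h3>Sub</h3><h2>Next</h2>")
print(audit.levels)               # [1, 2, 3, 2]
print(hierarchy_ok(audit.levels)) # True
```

A sequence like `[1, 3]` would fail the check, signaling that an h2 level was skipped.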

Hope that you have understood the use of various heading tags for designing your web page.

See More: Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.


7 Best Content Marketing Techniques That Work in 2016



Usually, SEO organizations concentrate all their efforts on content strategy, totally ignoring competitors’ sites and pages. This is actually a surprising fact. By competitors, we mean organic search competitors.

In fact, tools like SEMrush are available in the market for comparing your own site against other sites in your space. By doing this, one can understand why a competitor’s site draws more traffic than one’s own.

Back-linking is considered one of the popular strategies. However, one should understand that it only serves the purpose for a short time. If you want a lasting solution, better content is the only option. Your content should be many times richer and better than that of the rival site; when you achieve that, links follow almost as a guarantee.

Now, let’s move on to the actual ways to rank content better:

  1. Content Types/Kind of Content:

When you speak about content marketing techniques, the most important thing is the type of content used on your site. Content can be presented in many forms: audio, animations, video, infographics and more. Any of these can be used; the key is to choose wisely, as stand-out, easily digestible content grabs more attention. Infographics and micrographics are widely used these days and have become immensely popular.

Also Read: Research & Data For Better Content Marketing

  2. Simplified Writing:

Avoid complex words and phrases in your content; keep the vocabulary as simple as possible. The aim here is not to win a writing award, but to keep visitors engaged and make sure they understand what is written. If they do not understand what you have written, they will simply close your page and move on to the next site.

  3. Load Timing:

Slow load times simply kill your content. A number of factors decide the loading time of your page; you need to understand them and work on them constantly. The idea is to improve your site’s loading time and make it faster. No user likes slow-loading pages; people crave speed in this Internet era. The value of your page content drops to zero if it takes too long to load.

  4. Infographics:

A large page benefits from a number of large, aesthetic, attractive images; when they carry text, they are known as infographics. Wide infographics are better avoided, as they cannot be shared on many sites. Tall, visually attractive infographics attract, engage and convert a large number of visitors. Visual methods are powerful and appealing.

  5. Content Length:

The longer, the better. It is advisable to write descriptive, thorough articles on your topics. Bullets can be used to distinguish between points, and sub-headings are powerful for dividing the content properly. Properly divided content is easier to read, and easier on the eye than one long paragraph. Tags like strong and italics can be used to capture the user’s attention.

  6. Page Title:

It is agreed that keywords and their usage are important, but having a page title that is worth a click is a master strategy. Often, the published page title differs from the title of the article; this is done to optimize for search. Never try to catch search users with irrelevant titles: it may work initially, but you will lose credibility with your visitors in the long run.

  7. Visuals:

It is found that, on average, the articles ranked first contain around nine images. Therefore, making use of diagrams, charts, images and other visual media is essential. They should be compelling and shareable.

These are a few powerful content marketing techniques in SEO to rank your content high in the search engine.
