
HTTP Status Codes and SEO: A Definitive Guide to HTTP Status Codes


CRB Tech Reviews focuses on HTTP status codes in this blog. Let us try to understand this concept in detail.

Definition of HTTP Status Codes:

HyperText Transfer Protocol (HTTP) status codes are returned whenever search engines or site visitors send a request to a web server. These three-digit codes indicate the status of the HTTP response.


In short, HTTP status codes are three-digit numbers returned by servers to indicate the status of a requested web resource.

Understand that the first digit of every three-digit status code is a number from 1 through 5. From the 100s through the 500s, status codes fall into the following classes:

  • 500s – Server-side error. The request raised by the client was valid, but the server failed to complete it.

  • 400s – Client-side error. The request was sent by the client, but the requested page is invalid or cannot be found.

  • 300s – Redirection. The request has been accepted, but an additional step is necessary to complete it.

  • 200s – Success. The request was accepted and processed successfully.

  • 100s – Informational. The request has been accepted and is still being processed.
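As a quick sketch, the first-digit rule above can be expressed in a few lines of Python (the class names simply follow the list above):

```python
STATUS_CLASSES = {
    1: "Informational",
    2: "Success",
    3: "Redirection",
    4: "Client-side error",
    5: "Server-side error",
}

def status_class(code):
    """Classify a three-digit HTTP status code by its first digit."""
    if not 100 <= code <= 599:
        raise ValueError(f"not a recognized HTTP status code: {code}")
    return STATUS_CLASSES[code // 100]
```

For example, `status_class(301)` falls in the Redirection class, while `status_class(503)` is a server-side error.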


Although many HTTP status codes exist, not all of them are vital from an SEO point of view.

A Few Top Tips:

  • It is vital to have customized 404 pages, with recommended navigation options, for when site visitors request pages that return a 404 response code.

  • Use 301 redirects rather than 302 redirects when redirecting URLs on a website, to ensure that link juice (ranking power) is passed between the redirecting web pages.

  • Web pages that return 404 (File Not Found) for extended periods of time and that have valuable inbound links ought to be 301 redirected to another relevant page.


Important HTTP Status Codes from SEO Perspective:

  1. 404 Status Code:

The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. This should happen whenever the server cannot locate a page matching the request. Occasionally, webmasters will display a "404" error message while the response code is actually 200. This tells search engine crawlers that the page rendered successfully, and the page will commonly get mistakenly indexed.


  2. 503 Status Code:

The server is currently unable to handle the request because of temporary overloading or server maintenance. The 503 should be used whenever there is a temporary outage (for instance, if the server has to go down for a brief period of maintenance). It ensures that the engines know to come back soon, because the page or site is only down for a short span.
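A hedged sketch of the 503 case in Python: during planned maintenance, the response should carry a Retry-After header so crawlers know when to return (the one-hour window is an arbitrary example):

```python
def maintenance_response(retry_after_seconds=3600):
    """Status code and headers for a planned-maintenance outage.

    The 503 says "temporarily unavailable"; Retry-After tells crawlers
    when to come back, so the outage is not mistaken for a removal.
    """
    headers = {
        "Retry-After": str(retry_after_seconds),
        "Cache-Control": "no-store",
    }
    return 503, headers
```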

  3. 302 Found Status Code:

The server is currently responding to the request with a page from a different location, but the requester should keep using the original location for future requests. This approach is not recommended; it is not an effective way to tell search engine bots that a page or site has moved. Using a 302 will cause search engine crawlers to treat the redirect as temporary and not pass the link juice (ranking power) that a 301 redirect does.


  4. 410 Status Code:

The requested resource is no longer available on the server, and no forwarding address is known. This condition is expected to be permanent. Clients with link-editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no way to determine, whether the condition is permanent, the status code 404 (Not Found) ought to be used instead of 410 (Gone). This response is cacheable unless indicated otherwise.

This was an insight into the world of HTTP status codes.


SEO Site Structure:How to Create SEO friendly Website Structure



CRB Tech Reviews elaborates on the concept of an SEO-friendly site structure in this blog. This is one area that most resources and articles ignore, despite it being an important aspect of SEO strategy.


As of now, your site may be productive and streamlined, with pages that are sensibly ordered and well placed, like an organized filing system. On the other hand, your site could have grown organically, without much initial thought to its structure, and has now turned into a scattered and elusive chaos of pages. In any case, if you give some careful thought to your structure as you grow, it can become a basic part of your SEO success.

5 steps to create SEO friendly website structure:

1. Plan Site Hierarchy:

The most basic hierarchy is to have the most essential search-ranking pages at the top, and to channel down to the less vital pages. There will be pages that are exceptions to this, and some such pages will be unavoidable so that navigating around your site stays clear and straightforward, but this is the general guideline.


It is important to design your site hierarchy before building your site, so that your pages don't turn into a scattered chaos. If you are uncertain where your site will end up later on, then try to follow this general structure as you go.

2. URL and Site Structure Should Match:

You should also ensure that your URL structure matches your hierarchy. For instance, if you are selling a cute pink summer dress on your clothing site, your URL may look like this:

The URL is clearly separated into Category → Sub-Category → Product. This is simple for crawlers to read and simple for your visitors to navigate.
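To make the pattern concrete, here is a small Python sketch that builds a hierarchy-matching URL path; the category and product names are hypothetical stand-ins for the dress example:

```python
import re

def slugify(text):
    """Lowercase a label, replace non-alphanumeric runs with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def product_url(category, subcategory, product):
    """Build a URL path matching the Category -> Sub-Category -> Product hierarchy."""
    return "/" + "/".join(slugify(p) for p in (category, subcategory, product))

# product_url("Dresses", "Summer Dresses", "Pink Summer Dress")
# -> "/dresses/summer-dresses/pink-summer-dress"
```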


3. Website Depth Should Be Low:

Another key component of your structure is a shallow site. This means each page on your site can be reached within a few clicks of your home page. For good SEO, you should aim to make your site 3 clicks deep or less. This makes your site easier to crawl, and the easier the crawl, the more easily Google can locate your content and rank it.

The other reason for a shallow site is a good user experience. If a visitor has trouble finding the information they need, they can easily get frustrated and move on to some other site. And, as we know, a high bounce rate implies bad SEO.
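The 3-click guideline can be checked programmatically. A sketch, assuming the site's internal links are available as a simple mapping (the pages below are hypothetical):

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search over the internal-link graph.

    `links` maps each page to the pages it links to. The result maps
    each reachable page to its click depth from the home page.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: every page within the recommended 3 clicks.
site = {
    "/": ["/dresses", "/shoes"],
    "/dresses": ["/dresses/summer-dresses"],
    "/dresses/summer-dresses": ["/dresses/summer-dresses/pink"],
}
```

Pages missing from the result are unreachable by crawling links, which is exactly the problem a good structure avoids.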

4. Make Use of Internal Linking:

Although making a low-depth site is a good technique, it can be troublesome, particularly for a huge site with many pages. Grouping your pages into a constrained number of categories can be impossible, or simply doesn't make sense. This is where internal linking can help.

An internal link is a connection between two pages on the same site, and internal links have various advantages for SEO. The first is that they give a useful way to move through your site, which decreases the number of clicks needed to navigate to any page. Given that crawlers love a low-depth site, Google loves this. It also, once more, makes a better experience for your visitors.

The other primary advantage is that it gives a chance to use keyword-rich anchor text.

5. Associate SEO Weight with Site Navigation:

Site navigation is frequently seen just as a tool to move around a site, but in truth it can be used as an instrument to give SEO weight to your preferred pages.


What we have to remember about navigation links is that they are indeed internal links, since they link to other pages on the same site. Internal links function in much the same way as external links: the more links that point towards a specific page, the more significance search engines will give that page. If a particular page is in the navigation, this implies EVERY page on the site has an internal link pointing towards it. This tells search engines that this is a critical page on your site, and a page that ought to rank.

Your site as a whole has a certain amount of domain authority. If an individual page is in your navigation, then a greater portion of the site's authority, or SEO weight, will be given to that page. Therefore, the strategy you ought to take is to put the pages you want to rank in your navigation.

This, within balance, of course. You still need your navigation to be effective in its original use, which is giving a straightforward and clear way to move through your site.
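The same link-graph view shows why navigation links matter: a page in the navigation receives an internal link from every page on the site. A small sketch with hypothetical pages:

```python
from collections import Counter

def inbound_link_counts(links):
    """Count internal links pointing at each page.

    `links` maps each page to the pages it links out to; pages with
    more inbound links are the ones search engines treat as important.
    """
    counts = Counter()
    for targets in links.values():
        counts.update(targets)
    return counts

# A navigation link to /sale appears on every page, so /sale receives
# an inbound link from each of them. (Hypothetical pages.)
site = {
    "/": ["/sale", "/about"],
    "/about": ["/sale"],
    "/sale": ["/"],
}
```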


How to Do Competitor Analysis in SEO-Crb Tech Reviews


CRB Tech Reviews emphasizes the benefits of a competitive analysis workflow in SEO.

Whenever you consider the SEO process, one of the important activities of the initial phase is carrying out competitive analysis. By doing this, you should be able to identify SEO targets and get inputs to plan the strategy ahead.


The analysis will differ based on the type of industry or domain and the scope of the entire SEO process. This isn't a simple process at all, and many things need to be taken into consideration.

Why Chalk Out A Workflow Plan?

In order to simplify this competitive analysis process, you can take the help of a workflow plan. When such a workflow is present before you, it becomes easier to replicate, document and control the process. In simple terms, it is just like a flowchart. Various activities, like filtering out the SEO competitors, identifying the keywords to target and their difficulty level, and so on, can be highlighted in the workflow. This gives you a clear picture for planning an effective strategy.


This can be considered one of the vital early phases of the SEO process. You can find sample workflows on the Internet.

Getting to know the four distinct analysis phases of SEO workflow:

  1. Identify and extract the potential competitors from an SEO perspective:

This first stage is particularly useful when you're beginning the SEO process for a new client or an industry you are not at all familiar with, and you have to start from zero to identify all of the potentially important competitors. As the saying goes, you may or may not know your friends, but you need to know your enemy!


It's essential to note that these are not necessarily limited to organizations or sites that offer the same kind of content, products or services that you do; a competitor can be any site that competes with you in the search results for your target keywords.

  2. Validate the identified SEO competitors:

Once you have the list of potential competitors accumulated from various relevant sources, it's time to validate them, by analyzing and filtering down to those which are actually ranking, and to what degree, for the same keywords that you're targeting.

Furthermore, at this stage you'll also expand your list of potential target keywords by performing keyword research. This should use sources beyond the ones you have already identified from your competitors and your current organic search data, looking for keywords for which neither you nor your competitors are ranking yet, which may provide new opportunities.

  3. Do a comparison with your SEO competitors:

Now that you have your SEO competitors and potential target keywords, you can collect and compare your site's data against your competitors', using all of the available information to select and prioritize those keywords. This will likely incorporate keyword relevance, current rankings, search volume, ranked pages and domains, content optimization, link popularity and page result characteristics, among others.
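A minimal sketch of one piece of this comparison in Python, assuming the keyword lists have already been collected (the keywords here are made up for illustration):

```python
def keyword_gap(our_keywords, competitor_keywords):
    """Split competitor keywords into overlap and gap sets.

    The gap (keywords competitors rank for but we do not) is a natural
    source of new targets; the overlap shows where we compete head-on.
    """
    ours, theirs = set(our_keywords), set(competitor_keywords)
    return {"overlap": ours & theirs, "gap": theirs - ours}

# Hypothetical keyword lists for illustration.
result = keyword_gap(
    ["summer dress", "pink dress"],
    ["summer dress", "maxi dress", "cheap dress"],
)
```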

  4. Choose the target keywords:

This is the last phase in the workflow. It is finally time to analyze the data accumulated for your own site and your competitors, using the predetermined criteria to choose the best keywords to target for your own situation in the short, mid and long term during the SEO process: those with the highest relevance, search volume and profitability. The best starting point is in rankings where you are already competitive from a popularity and content perspective.


There are many data sources and tools available which can assist you in the implementation of the process.


How to Optimize Page Speed-Tips for Optimizing Page Speed




Page speed can be defined as the time taken by a specific page of a website to fully display its contents, or the time taken by the browser to receive the first byte from the server. A faster page speed is always better: a website that takes longer to load is visited less, and slow pages can result in an increase in the site's bounce rate.


However, the page load time of a website can be decreased through page speed optimization. By carrying out various optimization activities, the page load time of any website can gradually be reduced.

Listed below are some of the ways in which page load time can be reduced, thus improving page speed.


Enable Compression:

When a visitor reaches your site, a call is made to your server to deliver the requested files.

The larger these files are, the longer it takes for them to reach the browser and appear on the screen.


Gzip compresses your pages and stylesheets before sending them over to the browser. This drastically reduces transfer time, since the files are much smaller.
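A quick way to see the effect, using Python's standard gzip module on a stand-in stylesheet (the CSS content is invented for illustration):

```python
import gzip

# A stylesheet-sized blob of repetitive text, standing in for a real
# CSS file (hypothetical content).
css = (".button { color: #fff; background: #3b82f6; }\n" * 200).encode()

compressed = gzip.compress(css)
ratio = len(compressed) / len(css)
# Repetitive text compresses extremely well, so the bytes actually
# transferred are a small fraction of the original size.
```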


Minify Your Code:

By optimizing your code (including removing spaces, commas and other unnecessary characters), you can significantly improve your page speed. Also remove code comments, formatting and unused code. Google suggests using YUI Compressor for both CSS and JavaScript. By removing unnecessary characters, file size shrinks and page load time is reduced.
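As a toy illustration of what minification does, here is a deliberately naive Python sketch; a real minifier such as YUI Compressor handles many edge cases this one ignores:

```python
import re

def minify_css(css):
    """A naive CSS minifier, for illustration only: strips comments
    and collapses whitespace. Do not use on production stylesheets."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop comments
    css = re.sub(r"\s+", " ", css)                     # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)       # tighten punctuation
    return css.strip()

sample = """
/* button styles */
.button {
    color: #fff ;
    margin: 0 ;
}
"""
```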



Reduce Redirects:

Every time a page redirects to another page, your visitor waits through an additional HTTP request-response cycle. For instance, if your mobile redirect pattern chains through two extra redirects before reaching the final page, each one makes your page load slower; the visitor may leave and move on to another website, adding to your bounce rate. Reducing redirects therefore improves page speed.


Leverage Browser Caching:

Browsers cache a considerable amount of data (stylesheets, images, JavaScript files and more) so that when a visitor returns to your site, the browser doesn't need to reload the entire page. Use a tool like YSlow to check whether you already have an expiration date set for your cache. Then set your "Expires" header for however long you want that data to be cached. Unless your site design changes often, a year is a sensible time period. Google has more information about caching.
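A sketch of building such an "Expires" header in Python; in practice this is usually set in server configuration rather than application code:

```python
from email.utils import formatdate
import time

ONE_YEAR = 365 * 24 * 3600  # the "sensible time period" above

def expires_header(now=None):
    """Build an HTTP 'Expires' header value one year out, in the
    RFC-style date format browsers expect."""
    now = time.time() if now is None else now
    return formatdate(now + ONE_YEAR, usegmt=True)
```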



Improve Server Response Time:

Your server response time is influenced by the amount of traffic you get, the resources each page uses, the software your server runs, and the hosting solution you use. To improve your server response time, look for performance bottlenecks like slow database queries, slow routing, or a lack of adequate memory, and fix them. The ideal server response time is under 200ms. Learn more about improving your time to first byte.


Use a Content Delivery Network:

Content distribution networks (CDNs), also called content delivery networks, are networks of servers that are used to distribute the load of delivering content. Essentially, copies of your site are stored at multiple, geographically diverse data centers, so that visitors have faster and more reliable access to your site.



Use CSS Sprites:

Use CSS sprites for images that you use often on your site, like buttons and icons. CSS sprites combine your images into one large image that loads all at once (which means fewer HTTP requests) and then display only the areas that you need to show. This means you save load time by not making users wait for numerous images to load.


How to Do Interlinking in SEO – A Definitive Guide to Interlinking

A Guide Illustrating How to Do Interlinking

CRB Tech Reviews would like to illustrate how to do effective interlinking through this blog.

First and foremost, let us understand: what are internal links?


In simple terms, they are hyperlinks that point to the same domain as the one on which they appear. Basically, it is a source-and-target kind of relationship. In plain words, it can be stated like this:

A link that points or navigates to a different page on the same website. Here is an example of the same:

<a href="" title="Keyword Text">Keyword Text</a>

How are these useful for on page SEO?

Such links can serve three purposes:

  1. Website navigation

  2. Help in developing the information hierarchy of a particular website.

  3. Help to spread ranking power around the website.


Interlinking for SEO:

In this section, we will see how to develop an internal-link-based, SEO-friendly website architecture.

If we consider a single page, search engines use the content on it as a criterion for ranking pages in their indices. Therefore, deploying a proper content strategy is important. They also look for a proper pathway to navigate to all the pages of a website. Many sites make the mistake of hiding their primary navigation, denying search engines access. This prevents their pages from being listed in the indices.


If a web crawler cannot access pages, then for the search engine they simply do not exist, regardless of the content and keywords on them. The standard structure for a website should look similar to a pyramid, with the home page of your site at the top vertex and the hierarchy flowing down.

Then, how can we achieve this for our website? Think!

The answer is internal links, or link building. The standard anchor element is the most fundamental format of a link, and it is prominently understandable to search engines. The spiders know that they ought to add this link to the engine's link graph of the web, use it to calculate query-independent variables (like MozRank), and follow it to index the contents of the referenced page. Let us see an illustration to understand the concept better.
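The original illustration was an image and did not survive in this copy; based on the description below, it presumably showed a standard anchor element along these lines (the URL here is a placeholder, not the original one):

```html
<a href="https://www.example.com/custom-belts" title="Jon Wye's Custom Designed Belts">Jon Wye's Custom Designed Belts</a>
```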

Here, the "a" tag marks the beginning of the link. The anchor can contain text, images and so on, which provide a clickable area on the page and induce visitors to navigate to the other page. The referral location of the hyperlink tells the browser, and the search engines, where the link leads. In this illustration, the page pointed to is about custom belts made by a man named Jon Wye, so the link uses the anchor text "Jon Wye's Custom Designed Belts." The </a> tag closes the link, so that elements later in the page won't have the link attributes applied to them.


Still, sometimes pages are not accessible and hence cannot be indexed. Here is why:

  1. Links accessible only through local search boxes:

Web crawlers won't attempt to perform searches to discover content, and consequently it is estimated that a huge number of pages are hidden behind totally inaccessible internal search boxes.

  2. Links in Flash or other plug-ins:

Any links embedded inside Flash, Java applets and other plug-ins are normally invisible to search engines.

  3. Links on pages having hundreds of links:

Search engines are programmed to access about 150 links per page on average, and they may stop crawling additional pages linked from the original page. This limit is not rigid, and vital pages may have 200 or more of their links followed, but it is recommended that links be kept to around 150. Otherwise, there is a risk of many pages remaining unvisited, which in turn affects the ranking and indexing of your site.
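A small sketch of checking a page against that guideline with Python's standard html.parser:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links on a page, to check it stays near the
    ~150-link guideline mentioned above."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href are navigable links.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Hypothetical page fragment: two real links, one anchor without href.
page = '<p><a href="/a">A</a> <a href="/b">B</a> <a name="x">no href</a></p>'
```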

These are some of the important factors that are covered here.

For blogs on topics like heading tags, digital marketing, off-page SEO etc., please visit CRB Tech Reviews.


How to Optimize Sitemap



CRB Tech found that in the early days of SEO, many firms in this domain did pretty well by just submitting your website to a number of search engines. Although that method was never the ideal path to SEO nirvana, present-day SEO gives us opportunities to shape our content the way we want, in all forms, shapes and sizes, and have it indexed by search engines, depending on the skills and abilities one has.


In the crawling phase of the SEO process, attention usually goes to what we withhold from search engines through robots.txt and meta tag implementation. But equal emphasis needs to be given to the URLs and the kind of content we submit to the search engines.


What kind of revelation did the introduction of sitemaps bring about?

Earlier, an HTML sitemap (covering the higher-level pages) was created and linked from the footers of all the pages on the site. This way, the search engines were served a bunch of site URLs from any single page of your site.

What was the advantage of using XML sitemaps over the HTML ones then?

XML, or Extensible Markup Language, became the chosen means of data consumption for the search engines. With this technique, the administrator of the site has the power to submit to search engines the data on exactly those pages that are to be crawled, along with the priority or hierarchy of the site's content. Information on when pages were last updated can also be provided. So many benefits!


Let Us Now Learn to Construct a Standard XML Sitemap:

This is the syntax of a basic XML sitemap URL entry:
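The entry itself did not survive in this copy; under the sitemaps.org protocol, a basic sitemap with one URL entry looks like this (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page.html</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>` and `<priority>` are the optional hints discussed above.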


Many XML sitemap generator tools are available. A few of them come at no cost but have a crawl cap on the number of site URLs, which makes them of little use on larger sites. Good sitemap tools come at a price but are worth it. A popular tool is "Sitemap Writer Pro", which costs about 25 dollars.

If you do use other tools, pick one that permits you to monitor the crawl of URLs and lets you effortlessly remove any duplicated URLs, dynamic parameters, excluded URLs, and so forth. Remember, you only need to include the pages on the site that you want a web crawler to index and rank.
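A sketch of the core of such a tool in Python, using only the standard library; the deduplication mirrors the cleanup step described above (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap document from (loc, lastmod) pairs.

    Duplicate URLs are dropped via dict keys, keeping the first
    lastmod seen for each URL.
    """
    unique = {}
    for loc, lastmod in urls:
        unique.setdefault(loc, lastmod)
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in unique.items():
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("https://www.example.com/", "2016-01-01"),
    ("https://www.example.com/", "2016-01-01"),  # duplicate, dropped
])
```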


Easy method to upload and submit the sitemap

Once the basic sitemap is developed, it needs to be uploaded to the site. The sitemap should live at the root, with a consistent naming standard like /sitemap.xml.

After you are done with that, go to Google Webmaster Tools and submit your sitemap there. The engines could find it on your website by themselves, but the better approach is to submit it directly, which gives Google and Bing this information and the ability to report indexing problems back to you.

How to debug sitemap errors?

You've given your URLs to the top search engines in their favored XML markup, but how are they indexing the content? Are they having any issues? The great advantage of giving this data straight to Webmaster Tools accounts is that you can survey what content you might be withholding from search engines accidentally.

Google has done a better job of sitemap issue transparency compared with Bing, which offers a much smaller amount of information for review.

In this instance, we've submitted an XML sitemap and got an error that URLs in the sitemap are also blocked in the robots.txt file.

It's critical to pay attention to this kind of error and warning information. The engines may not even be able to read the XML sitemap. What's more, we can also gather data on what critical URLs we are accidentally withholding from crawls in the robots.txt file.


Keeping an eye on the sitemap is necessary in an SEO process. It provides you with the number of URLs submitted, the number indexed in Google, and the last time the file was updated.

Here is an illustration of a sitemap for mobile pages:

Have you got mobile pages on your site? Allow the search engines to learn about the URLs serving mobile users.
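The illustration did not survive in this copy; Google's mobile sitemap annotation (intended for feature-phone pages) looks roughly like this, with the URL as a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>https://m.example.com/page.html</loc>
    <mobile:mobile/>
  </url>
</urlset>
```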


With as much effort as goes into the development of extraordinary content, particularly these days, taking the additional time to ensure you've done everything possible to guarantee full indexation is basic to recovering the worth of that effort.


How to Optimize “Robots.txt”



CRB Tech Reviews would like to guide you about robots.txt through this blog. As the name suggests, it is a text file which webmasters create to instruct search engine robots and crawlers, like Googlebot, on how to crawl and index pages on their website.


To comprehend it, consider robots.txt a tour guide for crawlers and bots. It takes the non-human visitors to the notable regions of the site where the content is, and shows them what is and isn't important to index. All of this is done with the help of a few lines in a text file format. Having a well-prepared robot guide can increase the speed at which the site is indexed, cutting the time robots spend going through lines of code to locate the content the users are searching for in the SERPs.

The robots protocol, called the Robots Exclusion Protocol or REP, is a collection of web standards that regulate web robot behavior and search engine indexing. It comprises the following:

  • The original REP from 1994, extended in 1997, defining crawler directives for robots.txt. Some search engines support extensions like URI patterns (wildcards).

  • Its extension from 1996, defining indexer directives (REP tags) for use in the meta robots element, otherwise known as the "robots meta tag." Meanwhile, search engines support additional REP tags via the X-Robots-Tag HTTP header. Webmasters can apply REP tags in the HTTP header of non-HTML assets like PDF documents or images.

  • The microformat rel-nofollow from 2005, defining how search engines ought to handle links where the anchor element's REL attribute contains the value "nofollow."

Important Standards or Rules:

  • Meta robots with the parameters "noindex, follow" should be deployed as a method to restrict indexation while still letting crawlers follow the links on the page.

  • Only a single "Disallow" line is permitted for each URL.

  • Subdomains of a root domain use their own separate robots.txt files.

  • The filename is case sensitive: "robots.txt" is the proper form, not "Robots.TXT."

  • Spaces are not an acceptable way to separate query parameters; e.g. "/category/ /product page" would not be honored by robots.txt.
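Python's standard library ships a parser for this file format, which makes the rules easy to test. A sketch with a hypothetical robots.txt, parsed in place instead of fetched from a server:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch answers the same question a well-behaved crawler asks
# before requesting a URL.
blocked = parser.can_fetch("*", "https://www.example.com/private/data.html")
allowed = parser.can_fetch("*", "https://www.example.com/public.html")
```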


SEO Best Practices:

Blocking a Domain Page:

There are a few methods to block search engines from accessing a particular page on a domain.

Block with robots.txt:

This instructs the search engine not to crawl the given URL. The engine may, however, still keep the URL in its index and display it in search results, for example when other pages link to it.

Block With Nofollowing Links:

A poor method, and not recommended. Even with nofollowed links, search engines can still find pages through browser toolbars, links from other pages, analytics and so on.

URLs blocked due to robots.txt errors:

Google was not able to crawl the URL because of a robots.txt restriction. This can happen for various reasons. For example, your robots.txt file might deny Googlebot entirely; it might restrict access to the directory in which this URL is found; or it might block access to the URL specifically. Frequently, this is not an error: you may have deliberately set up robots.txt to keep crawlers away from this URL. If that is the situation, there is nothing to fix; the engines will keep on respecting robots.txt for this file.

For other topics related to SEO, like on-page optimization and off-page optimization, heading tags, blocked links etc., please visit CRB Tech Reviews.

General Information:

The robots.txt file is public. Know that a robots.txt document is a freely accessible file: anybody can see which areas of a server the webmaster has blocked the engines from. This implies that if an SEO has private user data that they don't want publicly searchable, they ought to use a more secure methodology, such as password protection, to keep viewers from viewing any confidential pages they don't want indexed.


How To Fix URL Canonicalization Error In SEO



When one talks about SEO, various factors need to be taken into consideration, both on-page and off-page. But if you do not do on-page SEO in an optimum manner, then the effort put into off-page optimization goes to waste, and the expected results never materialize.

See More:Top 5 Digital Marketing Training Institutes in Pune & Their Reviews.

CRB Tech would like to explain the concept of Canonicalization to you and its optimum use through this blog. By reading this, you would be able to carry out Canonicalization of a website in an optimum manner.

Let us first understand: what is URL Canonicalization?

The concept can be a bit tough to grasp, so we will try to explain it in simple terms.

Suppose that a website has two URLs. They are as follows:


These pages have the same content, yet neither redirects to the other. This leads Google to flag a duplicate content issue, and the site may be penalized. Google’s Panda algorithm is a filter introduced to keep sites with poor or duplicate content from ranking in the top results.

Let us look at one more example, where two URLs on a website resolve to the same page.

Now in this case, if both pages show the same content, it is again a problem.

Such problems tend to be ignored easily, and the duplicate content can draw penalties. When this happens, a search engine like Google cannot tell which URL to add to its index. If two pages are found to have the same content, they are treated as copies of one another, which can result in a penalty.

Such an issue therefore needs to be resolved. Configure your server so that whether a user accesses your website with or without www, the site always resolves to a single version. This is how a canonical error is fixed. If, however, you wish to display similar content on two different pages, then the rel=”canonical” tag becomes necessary. This tag lets a search engine know which page is the original and which is the copy, and thus helps avoid content duplication.
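As an illustrative sketch, on an Apache server the two versions can be consolidated with a 301 redirect in .htaccess (the domain is hypothetical, and this assumes mod_rewrite is enabled):

```
RewriteEngine On
# Send every non-www request to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so ranking signals consolidate on the www version.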

Canonicalization Application and Canonical Tag Optimization:

As mentioned above, just a rel=”canonical” tag is enough to apply canonicalization.

Consider the two URLs above, and suppose that the second has the same content as the first. Applying the canonical tag therefore becomes a must: it tells search engines which of the two URLs is the canonical version of the index.php page.

The syntax for its application:

<link rel="canonical" href="">
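For illustration, using hypothetical URLs, the duplicate index.php page would point to the preferred URL like this:

```html
<!-- In the <head> of https://www.example.com/index.php (URLs are hypothetical) -->
<link rel="canonical" href="https://www.example.com/">
```

Search engines then consolidate signals from both URLs onto the canonical one.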

HTTP Header Tag Canonicalization:

The above tag can also be used for HTML content. For non-HTML content, such as a PDF, HTTP header canonicalization needs to be used instead. Here is a sample:

> HTTP/1.1 200 OK

> Content-Type: application/pdf

> Link: <>; rel="canonical"

> Content-Length: 785710

Read more on this at Google’s official Webmaster blog.

Correct Use of Canonicalization:

  1. Never use it outside the <head> tag:

Search engine bots or spiders will simply disregard canonical tags placed outside the <head> area of the page, so for the tag to be honored you must include it between <head> and </head>.

  2. Multiple canonical tags are a strict no:

Using more than one canonical tag is pointless. Web crawlers will disregard both tags, and you will face strange SEO behavior and issues. Multiple canonical URLs are sometimes caused by plugin glitches, so you may need to watch out for that.

  3. Do not use it for PageRank sculpting:

PageRank is no longer a public metric for a site, but it is still considered by the web crawlers. If you are planning to use canonical tags for PageRank sculpting in the hope of better rankings, let us make it clear that it will do more harm to your site than good.


Searching The Experience for Optimization

Over the last few years, search engines such as Google, Bing, and even Apple have been upgrading their algorithms and machine learning processes to account for the end user’s experience. But since their algorithms are built upon the work of automated crawling bots (pieces of software that automatically scour the internet), it has always been difficult for them to truly simulate the actions of a flesh-and-blood user. Nor is it feasible for them to build an algorithm on the anecdotal feedback of an army of individual users submitting their findings.


Instead, the search engines have started to write logic that reflects, to their best estimation, what a user’s experience on a website should be. Some of the criteria they now measure are site speed, mobile optimization, site structure, content, and dozens of other signals that give the algorithm an idea of whether or not search engine users are getting what they expect from a website.

So, what does this mean for companies, marketers, and website owners when it comes to their SEO?

Basically what I, and dozens of other SEO industry experts, have been writing about for years has now come to fruition. We’ve exited the era of search engine optimization (SEO), and have now entered the new age of search experience optimization (also… SEO).

And this is great news for anyone that performs digital marketing correctly. It means that “gaming” the system has become less and less viable, and that groups who rely on black hat techniques are seeing their efforts become less effective.

So, how should websites be optimized for the search engines now that user experience plays such a big role?

Ask questions, provide answers.

Previously, marketers used to obsess over ideas like keyword density, meta descriptions, and link profiles. They had everything down to percentages and numbers, and it all made sense when placed into an Excel sheet. But how on earth was a website built from data in an Excel sheet supposed to appeal to a human being?

That’s the problem the search engines set out to fix, and you need to accommodate the changes they’ve made. Our SEO training institute is always there to help you build your career in this field.


Taking SEO Work To The Next Level

Getting visitors or traffic has become central to any company that goes online. Every marketer today has a clearly defined place for Search Engine Optimization (SEO) in their marketing strategy. Therefore, it’s important to understand the changing character of SEO and keep evolving with it.

Principally, marketers have to spend time understanding customer behavior when it comes to search: what users are naturally looking for or typing. This behavioral research forms the toolkit for optimization. Hence, every industry will have to focus on its own category and tap the market accordingly.


Going forward, SEO for mobile search will be prominent, as most people look for local businesses, i.e., doctors, home rentals, beauty products, and nearby restaurants, using mobile search. This applies to companies that market at the national as well as the international level, as these fundamentals are key to any business.

If we carefully observe what Google has been doing, we will realize that over the last few years it has been pushing brands and companies to avoid certain techniques, such as thin content, keyword stuffing, and bad links. In other words, it is asking them not to build a bad experience for users who are searching for good content and good companies.

Keeping our ear to the ground, we have to adapt when Google changes. Google is pushing brands to develop good content and a good user experience by creating mobile-friendly websites that people find helpful and useful. It has smart, innovative technology to figure out whether your website is really useful and whether people are sharing your content on social media, and it tracks a variety of signals to determine how it should rank your company.

SEO is a complex marketing discipline, and it can be a challenge to perform high-level, high-quality SEO work on a daily basis. When you’re striving to get the best results for your client, it can be easy to cut corners here and there to meet (or even beat) deadlines.

While this approach may work sometimes, eventually it can create more work down the line. By being thorough, you can anticipate and prevent future difficulties that hinder improvement and create problems for you and the client. Our SEO training course is always there to help you build your career in this field.
