If you are new to SEO and not familiar with the terms used by marketers and bloggers, this is the right place to start.
This article explains the important SEO terms you should understand as a marketer or as a newcomer to the web and SEO industry.
Knowing these terms will help you communicate with your online marketing team.
SEO Terms you should know
1. SEO / Search Engine Optimization
SEO stands for “Search Engine Optimization.” It is the process of getting traffic from the free, organic, editorial, or natural listings on search engines. All major search engines such as Google, Yahoo, and Bing have such results, where web pages and other content, such as videos or local listings, are shown and ranked based on their relevance to users.
SEO is essential for any website that wants to improve its performance in search engines such as Google, Bing, and Yahoo.
In short, it is about getting more traffic to your site.
SEO affects the visibility of your site in search engines: the higher and more frequently a site appears in the search results, the more traffic it will get from search engines.
2. Google PageRank
PageRank is a link analysis algorithm used by Google to rank websites in its search engine results. According to Google,
“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.”
Every website is given a Google PageRank score between 0 and 10 on an exponential scale.
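To make the idea concrete, here is a minimal sketch of the PageRank calculation as simple power iteration. The link graph, damping factor, and iteration count below are illustrative assumptions for teaching purposes, not Google's actual data or implementation.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # Every page keeps a small base score...
        new_rank = {p: (1 - damping) / n for p in pages}
        # ...and each page shares its current score among the pages it links to.
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

# A toy link graph: pages that receive more links end up with higher scores.
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # "c" receives the most links
```

Notice that the score of a page depends not just on how many links it receives, but on the scores of the pages linking to it, which matches Google's "number and quality of links" description above.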
3. Googlebot and how Googlebot works
Googlebot is Google’s web crawling bot (a computer program, also called a “spider”). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.
Some important points about Googlebot:
- It builds a list of links as it crawls the Web.
- It recrawls popular, frequently changing web pages to keep the index current.
- It can only follow HREF links and SRC links.
- It consumes an enormous amount of bandwidth.
- Some pages take longer to find, so a page may be crawled once a month instead of daily.
- It must be set up/programmed to function properly.
4. Google Indexing
Googlebot crawls your website and saves the web pages in Google's database; this process is called indexing.
According to Google, Google index is like an index in a library. It lists information about all the books the library has available. But, instead of books, the Google index lists all the web pages that Google knows about. When Google visits your site, it detects new and updated pages and updates the Google index.
However, it is not guaranteed that Google will index every page. To see the indexed pages of your website in Google search, try the query “site:yoursiteurl.com”. This will display the URLs currently indexed by Google.
5. Robots.txt
A website owner can use the robots.txt file to provide instructions to web robots (like Googlebot). This is called the Robots Exclusion Protocol.
robots.txt is a plain text file with specific directives, placed in the root folder of your site. It tells search bots about the structure of the website: you can block a specific bot entirely, or restrict access to specific pages or folders of the site.
Before a bot visits a page on your website, it first checks http://www.mydomain.com/robots.txt, where it might find:
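For example, the simplest robots.txt, which blocks all robots from the entire site, uses the standard Robots Exclusion Protocol directives (the folder name below is a placeholder):

```
User-agent: *
Disallow: /

# Or, to block only one folder for every robot:
# User-agent: *
# Disallow: /private-folder/
```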
- The “User-agent: *” means this section applies to all robots.
- The “Disallow: /” tells the robot that it should not visit any pages on the site.
There are two important considerations when using /robots.txt:
- Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention to it.
- The /robots.txt file is public. Anyone can see which sections of your server you don't want robots to visit.
6. Backlinks
Suppose you write an article about the latest Google algorithm update. You add a link to Google's official page for this information to support your blog post. This link is called a backlink to Google. The same thing happens when another website adds a link to your website: you get a backlink.
Getting just any backlink is not enough, though; only a high-quality backlink will help your SEO. As explained by Matt Cutts, creating backlinks using the directory method no longer works.
Similarly, if a spammy website points to you, it hurts your SEO. Therefore, it is recommended to keep your site's backlinks clean.
Having more high-quality backlinks tells search engines that your web page is valuable to their users, and they boost your ranking in search engine results.
Recommended: Link Building: The Definitive Guide
7. Keywords in SEO
A keyword is a word or phrase that defines the content of your web page. To learn more about keywords, please check this article.
8. Keyword stuffing
If you place a keyword on your page many times and without relevance, Google calls it keyword stuffing. It is done to manipulate a site's ranking in search engines and is considered a black hat technique. Google suggests that you strictly avoid keyword stuffing. Here is an example of keyword stuffing, from Google:
Repeating the same words or phrases so often that it sounds unnatural. For example:
We sell custom cigar humidors. Our custom cigar humidors are handmade. If you’re thinking of buying a custom cigar humidor, please contact our custom cigar humidor specialists at [email protected]
9. Keyword density
Keyword density is the percentage of words on a web page that are a particular keyword, that is, the ratio of keyword occurrences to the total number of words on the page. If this value is unnaturally high, the page may be penalized.
Formula for Keyword Density = (Nkr / Tkn ) * 100
Nkr is how many times you repeated a specific keyword, and Tkn is the total number of words in the analyzed text.
Many SEO experts consider the optimum keyword density to be 1 to 3 percent.
Please check this video to know the ideal keyword density of a page.
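The formula above can be sketched in a few lines of code. This simple version handles single-word keywords only; the sample text is made up for illustration.

```python
def keyword_density(text, keyword):
    """Keyword density = (Nkr / Tkn) * 100, for a single-word keyword."""
    words = text.lower().split()
    nkr = words.count(keyword.lower())  # Nkr: how often the keyword appears
    tkn = len(words)                    # Tkn: total words in the text
    return (nkr / tkn) * 100

sample = "seo tips and seo tools for better seo"
print(keyword_density(sample, "seo"))  # 3 of 8 words -> 37.5
```

A density of 37.5% like this toy example would be far above the 1–3 percent range mentioned above and would look like keyword stuffing.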
10. Canonical tag
The rel=canonical element, often called the “canonical link,” tells the search engine the preferred version of a web page. It helps webmasters avoid duplicate content issues. You can learn more about the canonical tag.
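In HTML, the canonical link is declared in the page's head section; the URL below is a placeholder:

```html
<!-- Placed in the <head> of each duplicate or variant page,
     pointing to the preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```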
11. Duplicate Content
Content that appears on more than one page or URL is called duplicate content. According to Google,
In some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience when a visitor sees substantially the same content repeated within a set of search results.
12. Meta Description tag
A meta description is a short summary of your web page's content. The recommended length for a meta description is roughly 150–160 characters. Meta descriptions are generally used on search engine result pages (SERPs) to display preview snippets for a given page. Here is an example:
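In HTML, the meta description goes in the page's head section; the content below is an illustrative placeholder:

```html
<meta name="description" content="A short, accurate summary of this page,
kept to roughly 150-160 characters so it is not cut off in search results.">
```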
Google will sometimes use the meta description of a page in search results snippets if Google thinks it gives users a more accurate description than would be possible purely from the on-page content. Check the Google Guideline for Meta Description.
13. Link Building?
Link building is a method of getting backlinks for your website from other websites. It is one of the top SEO techniques, but it can harm your website if done the wrong way.
In the early days of SEO, webmasters used to collect as many backlinks as possible, even from spam websites. This used to help bring their websites to the top. That is history now; it no longer works.
Search engines now prefer quality over quantity. So while creating a link building strategy, focus on a few high-quality links rather than many low-quality ones. I would recommend starting with Moz's guide to link building.
14. 301 redirect
A 301 redirect is a permanent redirect from one URL to another. If you want to change the URL of a page as it appears in search engine results, Google recommends a 301 redirect.
This ensures that users and search engines land on the correct page. It also passes the old page's ranking signals to the new page.
Also, check this video, for detailed insights.
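As an illustration, here is how a 301 redirect is commonly configured on an Apache server in an .htaccess file (the paths and domain are placeholders; other servers such as nginx have equivalent directives):

```apache
# .htaccess: permanently redirect the old URL to the new one
Redirect 301 /old-page.html https://www.example.com/new-page.html
```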
15. Black Hat SEO
Earlier, when SEO was in its early days, it was easy to rank for any keyword. People used to manipulate Google into thinking their pages were more relevant than others. Here are some of the most popular black hat methods:
- Keyword stuffing
- Website cloaking
- Link farming
16. White Hat SEO
These are the SEO techniques that focus on users or the audience, rather than on search engines like Google.
17. Cloaking
Cloaking is a practice where you show one thing to the user but an entirely different thing to the search engine. As per Google, you should strictly avoid cloaking.
Check this video for details.
18. Bounce rate
In Google’s words, “Bounce rate is single-page sessions divided by all sessions, or the percentage of all sessions on your site in which users viewed only a single page and triggered only a single request to the Analytics server.” Read more about Bounce Rate here.
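Google's formula above can be sketched as a one-line calculation (the session counts below are made-up numbers):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate = single-page sessions / all sessions, as a percentage."""
    return single_page_sessions / total_sessions * 100

# e.g. 40 of 160 sessions viewed only one page
print(bounce_rate(40, 160))  # 25.0
```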
These are just a few SEO terms out of a long list. I will try to add more, but if you have suggestions, please let me know in the comments.