
SEO Glossary

301 Redirect

A 301 redirect is a permanent server-side redirect. Besides being useful for handling canonical issues, it signals a permanent address change for a web page; on Apache servers it is commonly configured through an .htaccess file.

Whenever a page's web address changes, a 301 redirect should be used to point the former address to the new one. This ensures that people who bookmarked or linked to the old address still reach the new address, and that search engine indexes can be updated. The new address is provided in the response's Location header.

An example exchange would be:

Client request:
GET /index.php HTTP/1.1
Host: www.redirect.com

Server response:
HTTP/1.1 301 Moved Permanently
Location: http://www.redirect.com/index.asp

RFC 2616 states that clients with link-editing capabilities ought, where possible, to automatically re-link references to the Request-URI to the new reference returned by the server. Unless otherwise indicated, the response is cacheable.

Alt Text
Also called the alt attribute (or, loosely, the alt tag), ALT text (alternative text) is an image's description in a website's HTML. Unlike humans, search engines read only an image's ALT text, not the image itself. Whenever possible, ALT text should be added to images.
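A minimal sketch (the file name and description below are hypothetical) of how ALT text is added through an img tag's alt attribute:

<!-- the alt value is read by search engine spiders and screen readers -->
<img src="golden-retriever.jpg" alt="A golden retriever catching a frisbee in a park">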

The ALT text is not usually displayed to the user unless the image is undeliverable or the browser does not display graphics. It is vital because search engines cannot otherwise tell one image from another.

ALT text is one of the few places where a 'spider' legitimately receives different content than the human user, and this is acceptable only because the user can also access the ALT text, and only when it is used properly as an accurate description of the associated image. Screen readers and browsers used by visually impaired individuals depend on ALT text to make the content of images accessible.

Whenever the user cannot access the image, the ALT text makes sure that no functionality or information is lost.

Anchor Text
Anchor text (also called link text, link label, or link title) is a hyperlink's clickable text. In SEO, best practice is for anchor text to be relevant to the page it points to rather than generic. Search engines use anchor text to help determine the subject matter of the linked-to document.
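A brief hypothetical illustration (the URL and wording are made up) of descriptive versus generic anchor text:

<!-- descriptive anchor text tells search engines what the destination page is about -->
<a href="https://www.example.com/seo-guide">beginner's guide to SEO</a>
<!-- generic anchor text gives them almost nothing to work with -->
<a href="https://www.example.com/seo-guide">click here</a>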

On most websites, anchor text is underlined and dark blue by default, turning purple once the linked page has been visited. It remains purple until the browser history is cleared, after which it returns to its original blue color.

Anchor text helps search engines 'understand' what the destination page is about, and it tells a person what to expect if they click through. Search engines also use it to gauge how relevant the referring site is to the landing page's content; the two often share keywords.

Algorithm
An algorithm, in scientific and mathematical terms, is a self-contained, step-by-step set of operations to be performed. In short, an algorithm is a set of instructions, a how-to guide.

Algorithms perform data processing, automated reasoning, and calculation. In SEO, an algorithm is the technology a search engine uses to deliver the results for a query.

Search engines use various algorithms in tandem to deliver keyword-targeted search ads or a web page of search results. The Internet runs on many algorithms, often without users realizing it. Google Search is one example: its algorithm changes constantly in order to bring users the best possible results.

There are also routing algorithms, such as heuristic, adaptive, and fuzzy routing, each of which helps make network communication possible. Without routing there would be no Internet, as routing is one of its core technologies.

Backlink
Also known as an incoming link or inlink, a backlink in SEO is any link pointing to a website or page from another site or page. Backlinks are vital in determining a site's importance or popularity, and some search engines treat sites with more backlinks as more important in search results pages.

Before search engines became prevalent, backlinks were the main means of navigating between web pages. Nowadays their importance lies in SEO, where the number of backlinks serves as an indicator of a website's importance or popularity.

Outside SEO, a web page's backlinks may be of semantic, cultural, or personal interest, since they indicate who is linking to that page. Algorithm changes have put a special focus on topical relevance: a backlink may come from a source with highly valuable metrics yet still be unrelated to the searcher's interest or query.

One example would be a link from a food blog to a site selling shoes and bags. While such a link may seem valuable, the consumer may not find it relevant.

Blog
Short for 'weblog,' a blog is a website that presents content in a chronological series, though the content itself need not be time-sensitive. Many blogs use a content management system, such as WordPress, which lets the blogger create content without writing arcane code.

Published blog content can include videos, photos, event descriptions, commentary on company or industry topics, food, and more. Every post is a new page that search engines can see, and therefore another opportunity to be found online. Ideally, a blog should be kept within one's own domain.

Blogs can now be linked and accessed through social media platforms such as Twitter, Instagram, and Facebook, which helps integrate them into social media streams. The growth of blogs in the 1990s coincided with the arrival of web-publishing tools that allowed non-technical users to post content.

Bounce Rate
Although sometimes confused with exit rate, bounce rate is a different metric. It is the percentage of visitors who enter a website and then leave without viewing any other page. Various factors can lead to a high bounce rate.
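As a purely illustrative example, if 1,000 visitors land on a page during a month and 400 of them leave without viewing a second page, that page's bounce rate is 400 / 1,000 = 40%.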

Users may leave from the entrance page because of usability or site design issues. Alternatively, they may leave after viewing a single page because they found what they were looking for on that page and have no need to visit another page for information.

Common reasons for a high bounce rate include a single-page website, poor site design, incorrect analytics implementation, and user behavior. Because both the site's design and the way Google Analytics is implemented affect the reported bounce rate, improving it requires specific, customized changes to the website and its setup.

Such changes include analyzing the relevant data and evaluating and adjusting factors that contribute to the bounce rate, such as navigation and site layout.

Breadcrumbs
The term 'breadcrumb' is not only used in cooking. In computing, breadcrumbs are graphical control elements that present a site's navigation path in a horizontal bar above the primary content, helping users understand where they are on the site and how to get back to higher-level areas.

There are two kinds of breadcrumbs: attribute and location. Attribute breadcrumbs give information on the current page’s category. Location breadcrumbs, which are static, indicate the page’s location in the hierarchy of the website.
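A minimal HTML sketch of a location breadcrumb (the page names and paths are hypothetical):

<!-- a location breadcrumb placed above the main content -->
<nav>
  <a href="/">Home</a> &gt;
  <a href="/guides/">Guides</a> &gt;
  SEO Basics
</nav>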

In terms of usability, location breadcrumbs are often not appropriate for content-rich websites where a single category cannot fully describe a given piece of content. Tags may be more applicable there, but breadcrumbs can still let the user retrace their steps and see how they arrived at the current page.

The term is derived from the ‘breadcrumbs’ left behind by Hansel and Gretel. Another term for breadcrumb that may be used is ‘cookie crumb.’

Canonical URL
The canonical URL is the definitive address at which a piece of content can be found. Specifying a page's 'canonical' version also helps webmasters prevent duplicate-content issues: when the same content is reachable at more than one address, the canonical URL tells search engines which address is the preferred place to find it.

From an SEO standpoint, the canonical link tag behaves much like a 301 redirect: it essentially 'tells' search engines that multiple pages should be treated as one, without actually redirecting users to a new URL.

Duplicate content occurs when the same or similar content can be accessed from several URLs. The canonical URL is also useful for resolving www versus non-www duplication, which happens when two URLs are identical except for the 'www' at the beginning; the problem can be resolved by using the canonical tag properly.
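A minimal sketch (the domain and path are hypothetical): the preferred address is declared with a canonical link tag in the page's head section, and search engines that honor the tag treat that address as the preferred version.

<!-- placed in the <head> of every duplicate version, including the non-www one -->
<link rel="canonical" href="https://www.example.com/blue-widgets/">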

Content Marketing
Content marketing is marketing that entails creating, publishing, and sharing media content in order to gain and retain customers. The information can be delivered in many formats, such as video, news, e-books, white papers, case studies, infographics, photos, question-and-answer articles, and how-to guides.

Its purpose is to attract and retain customers by consistently creating and curating valuable content with the intent of changing or enhancing consumer behavior. Content marketing is an ongoing process that should be integrated into the overall marketing strategy, and it focuses on owning media rather than renting it.

Excellent content marketing begins with good content, because quality content is itself part of the marketing. Channels for promoting that content include social media marketing, SEO, PR, PPC, inbound marketing, and content strategy.

Content marketing today competes with the thousands of marketing messages people encounter every day. Excellent content marketing makes a person stop, read, think, and behave differently.

Duplicate Content
Material on the Internet is flagged as 'duplicate content' when the same written or posted content appears in more than one place. The duplication can take several forms: content on one website may be exactly the same as on another (as if the original post had been 'photocopied' onto another site), a substantial amount of content may look the same on two different sites, or the same sentences may even appear more than once within a single website.

Search engines penalize duplicate content. If it is detected, they may decline to display the websites where the material appears when someone searches for the keywords found in that content.

If you need to quote something written on another website, the best way to avoid being flagged for duplicate content is to provide a backlink to the site where the content was originally posted.

Fresh Content
Material counts as fresh content when it provides new information or discusses novel ideas for your target audience. As long as the topic of your post is timely, up to date, or only recently released, it is considered fresh content.

To improve a website's chances of ranking higher, it is almost a requirement to come up with new ideas and release fresh content regularly. Fresh content captures the interest of people searching for updates or new information, making it more likely that they will visit your page and read what you have written. Posting fresh content also prompts search engine spiders to crawl your page more often, which helps it get surfaced to people searching for your keywords and ultimately improves your website traffic.

Googlebot
Want to learn how Google discovers new content and updates on any webpage? It does so with Googlebot.

Googlebot is Google's 'spider.' Its purpose is to crawl the web to find new information and update the Google index. Googlebot decides which sites to crawl with the help of Google's algorithms, starting from the URLs gathered during previous crawls. As it visits those URLs, it detects new links and pages to add to the list of URLs in the Google index.

Understanding how Googlebot works is important for webmasters because it lets them act in the website's best interest. Whether they want to block Googlebot from accessing certain pages or help their site get crawled more easily, they can do so using a robots.txt file or sitemaps.

Heading Tags
As the name implies, heading tags distinguish the title and subheadings from the body text of your material. They make it easier for readers to see what the following section is about and to decide whether what they are looking for can be found there.

Heading tags are ranked in a hierarchy, with H1 carrying the highest priority and H6 the least. The order in which they are used also matters; skipping levels (such as jumping from H1 straight to H4) can create structural problems on your page.
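A brief hypothetical sketch of a heading hierarchy that does not skip levels:

<!-- one H1 for the page title, then H2 sections, then H3 subsections -->
<h1>Complete Guide to On-Page SEO</h1>
<h2>Title Tags</h2>
<h2>Heading Tags</h2>
<h3>Choosing keywords for headings</h3>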

Heading tags are also read by spiders to match your content to searches for the keywords you are targeting. Make sure, therefore, that your heading tags contain those keywords; this makes it easier for search engines to refer your page to readers.

HTML
Short for Hypertext Markup Language, HTML is a computer language used primarily for creating web pages. An HTML element is written as tags enclosed in angle brackets. In most cases these tags come in pairs, marking the start and end of a particular element on your page, although some HTML tags do not need a closing pair.
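A brief hypothetical sketch showing a paired tag and a tag that needs no closing pair:

<!-- the p element is written as an opening and a closing tag -->
<p>This paragraph has a start tag and an end tag.</p>
<!-- the img element stands alone and needs no closing tag -->
<img src="logo.png" alt="Company logo">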

Web browsers do not display the HTML tags themselves when a page loads, but HTML is what makes it possible to present the page the way the webmaster intends. Using the proper tags in the right places allows web pages to be customized freely.

HTML is also useful for increasing traffic to your website. Certain elements, such as the title tag and meta description, are the usual places where target keywords are placed, which helps search engine spiders find your site more easily.

Index
The search engine index is where the information collected by search engines is stored. As spiders crawl web pages across the Internet, they collect data such as targeted keywords, images, and other relevant information and place it in the index. When someone searches for a keyword or piece of information, the search engine consults the index so it can quickly return the most relevant results. Without an index, a search engine would face the time-consuming task of going through every website on the fly to find what the person needs.

A properly indexed website simply means your content has a better chance of being found when somebody searches for the keywords you are targeting, which in turn leads to higher traffic.

Indexed Pages
As the name suggests, indexed pages are web pages that have been included in a search engine's index. An indexed page has a higher chance of being found when somebody searches for the content or keywords you are targeting, which contributes to higher traffic, better search rankings, and more revenue for the site owner.

To get your pages indexed, pay attention to their content. If spiders find that a page contains duplicate content, they may deliberately leave it out and not surface it as a search result. It also helps to check whether you are linking to another website that carries the same content as one of your own pages, since this can flag your page as plagiarized and keep it from being recommended by the search engine.

Internal Link
An internal link is any hyperlink on a webpage whose destination is within the same domain or website. Internal links are primarily used to send readers directly to another page on your site that contains the information you mention in a post, sparing them the effort of searching for that page themselves.
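A minimal hypothetical sketch (the paths and domain are made up) contrasting an internal link with an external one:

<!-- internal link: the destination is on the same domain -->
<a href="/blog/keyword-research-basics">our guide to keyword research</a>
<!-- external link: the destination is on another domain -->
<a href="https://www.example.org/study">an outside study</a>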

As the definition implies, internal links further promote your own website to the reader. Because your site contains additional information they might need, it becomes easier for them to follow your discussion, especially when each page focuses on a single topic that also needs to be mentioned in other posts. And because traffic stays on your site, search engines are more likely to regard it as 'reliable,' which in turn contributes to better rankings.

Image Alt Attributes
Web pages commonly include images to enhance the overall appeal of a post. Unfortunately, there are times when these images cannot be viewed, for example because of a slow Internet connection. Image alt attributes help readers still picture what the image shows.

An image alt attribute is the alternative text description that stands in for the image if it cannot be viewed for some reason.

Image alt attributes matter for two reasons. First, they give readers an idea of what the image is even when they cannot view it at all. Second, they allow the webmaster to attach targeted keywords to the image, making it easier for readers to reach your website when they search for the content on the page where the image appears.

Keyword
Essential to all of SEO, a keyword is a term included in your content that people are likely to type when searching for the kind of information you have written.

A keyword is sometimes called a 'money word,' because when it matches exactly what your target audience types while researching a topic, the search engine will point them to websites containing that keyword.

To succeed in raising your website's search engine rank, it is advisable to consult people who can conduct keyword research for your site. They evaluate which keywords are most often used when people search for related content. If a keyword is distinctive or attracts a significant number of searches, you will want to create content that mentions it, but be sure to use it sparingly.

Keyword Cannibalization
Keyword cannibalization is an internal information-architecture problem in which multiple subpages within one site target the same specific keyword or phrase. It is one of the site-construction problems that is difficult to diagnose, primarily because few people (even webmasters themselves) are aware of it. It commonly happens when a site owner optimizes several pages for the same keyword in an effort to make the site rank better.

Keyword cannibalization should not be taken lightly; it currently causes problems for many webmasters and SEO practitioners. Because crawlers have to choose the single page on your site that best fits a given search, the other competing pages become virtually invisible in search results. The effect forces pages within your own site to compete with each other for positions, which is counterproductive.

Keyword Density
Keyword density is the number of times a word or phrase appears on a webpage, expressed as a percentage of the page's overall word count. Search engines use this value as one factor in determining which pages are relevant for a given search. As one of the most basic parameters affecting a page's potential to reach the top of the results for particular keywords, it is something every webmaster should consider. Other factors have risen in importance, but keyword density remains a component of SEO.

The most basic formula for calculating the density of a particular keyword is KW% = (Nkr / Tkn) x 100, where Nkr is the number of times the keyword is repeated and Tkn is the total number of words in the analyzed text. Some variants of the formula multiply Nkr by the number of words in the key phrase, which is why a longer keyword can yield a higher density per repetition.
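As a purely illustrative example, a 500-word article that repeats a single-word keyword 10 times has a keyword density of (10 / 500) x 100 = 2%.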

Keyword Research
Keyword research is the part of SEO concerned with determining which search terms people actually enter into search engines. It matters because knowing the terms people really use gives webmasters an idea of which keywords and keyword phrases to capitalize on. Once a viable keyword (also called a 'niche') is discovered, it is expanded to find more related keywords.

There are two approaches to keyword research: brainstorming and keyword research tools. Brainstorming is the manual approach, done by thinking of terms relevant to the page's content, even if some of them do not directly state the page's subject. Research tools make the process much easier; many kinds of keyword research software and services are available online, some of them free. The gathered data is then used to decide which keywords will best accomplish the SEO goals at hand.

Keyword Usage
Keyword usage is the way keywords are used within a specific page. All keyword-related parameters, from density to placement, are considered elements of usage. It is an important consideration for anyone involved in SEO: the right usage significantly boosts a page's visibility in searches for those keywords, while the wrong usage has the opposite effect. That is why virtually every SEO expert emphasizes proper keyword usage.

Several elements make up keyword usage. Choosing which words or phrases will serve as your primary keyword and, when applicable, your secondary keywords is the first. Keyword density is another, since crawlers use density to determine what a page's content is about. The placement of keywords within the page is an element of usage as well.

Landing Page
A landing page is the web page a visitor arrives at after clicking a result for a specific keyword search or an online ad. Ideally, it expands on the idea behind the search that was performed or the ad that was displayed. Because of its crucial role in online marketing and SEO, much effort goes into creating landing pages that effectively engage visitors and lead them to a desired outcome, such as making a purchase.

Landing pages fall into two types. A reference landing page mainly presents information relevant to the visitor, in the form of text, pictures, videos, or links. A transactional landing page aims to persuade the visitor to complete a specific action (the 'transaction'); on such pages the visitor is usually asked to fill out a form, and the information they share is used by the site owner to complete the transaction.

Latent Semantic Index
Latent semantic indexing (LSI) is a retrieval method that identifies patterns in the relationships between the terms and concepts contained in a specific text or page. It uses singular value decomposition, a mathematical technique also applied in fields such as statistics and signal processing, and it draws on correspondence analysis, a multivariate statistical technique. The principle behind LSI is that words used in similar contexts tend to have similar meanings.

The concept was introduced in the 1980s and stuck largely because of its remarkable ability to semantically correlate terms within a body of text. Used as a means of categorizing documents, it takes account of both synonymy (different words with the same meaning) and polysemy (a single word with multiple meanings). Because information retrieval systems such as search engines use LSI, it is an important consideration for webmasters.

Link Juice
Link juice is an informal term used by SEO practitioners for the value passed to your pages by the links pointing at them, which in turn affects site traffic and search rankings. Coined by SEO consultant Greg Boser, it essentially reflects how many people link to your pages: the more links to a page, especially from sites considered 'authority sites,' the stronger the link juice, and with stronger juice your visibility and rankings increase.

There are many ways to get link juice. You can generate it yourself by submitting links to social networking sites; when people start recommending your page, your online profile expands. Another way is to ask sites with content similar to yours to link to your site from their pages. The best way to earn link juice, however, is to build a site with interesting, helpful content and a solid internal linking structure. Good content generates interest, and more interest means more link juice.

Long Tail Keywords
Long tail keywords are simply long phrases used as keywords. They are generally entered by users looking for a specific product, service, or piece of information; logically, the longer the keyword, the more specific the search becomes. Capitalizing on the right long tail keywords lets a webpage reach its target audience faster and more effectively, and visitors who click on a result for a long tail keyword are generally closer to the point of purchase.

Identifying the right long tail keywords has become very important precisely because of their specificity. Since searches on longer phrases are more specific, a page that surfaces for such a search is very likely to earn visits, sometimes almost by default. SEO experts now put real emphasis on these keywords, especially given how difficult it is to climb the rankings with short keywords (there is simply too much competition).

Meta Description Tag
The meta description tag supplies the text shown just below a page's title in search results. It is a great way to give potential visitors a 'preview' of what the page actually contains. Most meta descriptions work keywords in to aid search visibility, but their most important role is to provide an accurate yet unique description of the page. The ultimate goal of this tag is to convince a searcher to click the link after reading the description.

The length of a meta description may vary depending on the site owner's preference, but it should ideally be kept to around 160 characters or less so it displays in full. An effective meta description is concise yet convincing.
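A minimal sketch (the wording is hypothetical), placed in the page's head section and kept well under 160 characters:

<meta name="description" content="A plain-language glossary of common SEO terms, from 301 redirects to meta tags, with short practical explanations.">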

Meta Keywords Tag
The meta keywords tag is one of the meta tags inserted into a web page's HTML. It gives information about the page but does not appear visually on the page itself; unlike regular keywords included in the article content, meta keywords work 'behind the scenes.'

The tag contains keywords related to the content of the page. In the past this helped search engines catalogue and index the HTML document more thoroughly, which is why it was important to include only keywords relevant to the page's content.
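A minimal sketch (the keywords are hypothetical), also placed in the page's head section:

<meta name="keywords" content="seo glossary, 301 redirect, alt text, anchor text">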

Major search engines such as Google and Yahoo, however, have stopped supporting the meta keywords tag, so it has become largely useless for ranking purposes.
