A 301 redirect permanently forwards visitors and search engines from one URL to another. It is different from the other redirect commands out there in that a 301 is treated as permanent once a webmaster creates it. Search engine optimizers use it to send users to a different webpage seamlessly, often without the users noticing. Search engines have nothing against this technique as long as the redirection is set up within the site’s own servers.
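As a minimal sketch of how a server answers with a 301, here is a Python standard-library example; the paths and the `REDIRECTS` mapping are hypothetical:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from retired URLs to their permanent new homes.
REDIRECTS = {"/old-page": "/new-page"}

def resolve_redirect(path):
    """Return the (status, location) pair a request for `path` should get."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        status, location = resolve_redirect(self.path)
        self.send_response(status)
        if location:
            # The Location header tells the client (and crawler) where the page moved.
            self.send_header("Location", location)
        self.end_headers()

# To serve it: HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```

In practice the same effect is usually achieved through the web server’s own configuration rather than application code, but the mechanics are the same: a 301 status plus a Location header.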
Above the fold simply refers to that area of a webpage which visitors immediately see when they are redirected to a site. Above the fold consists of the website’s banner, name, primary links, and main titles. Below the fold, on the other hand, involves the areas which the users have to scroll down in order to see. It consists of the site’s main content, the secondary links, and even the copyright notices.
First, a link is considered absolute if it is complete: the URL contains the protocol, the domain, and the specific webpage or file to which the visitor will be taken. Relative links, on the other hand, use incomplete URL details. Most of the time these URLs contain only the filename or path, which can make it harder for search engines to crawl the website reliably. This is why most people choose to use absolute links.
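The difference is easy to see with Python’s `urllib.parse.urljoin`, which resolves a link the way a browser or crawler would; the URLs below are hypothetical:

```python
from urllib.parse import urljoin

# Hypothetical page on which the links appear.
base = "https://example.com/articles/seo-basics.html"

# A relative link only makes sense in the context of the page hosting it:
relative = urljoin(base, "glossary.html")
print(relative)  # -> https://example.com/articles/glossary.html

# An absolute link carries protocol, domain, and path, so it resolves anywhere:
absolute = urljoin(base, "https://example.com/articles/glossary.html")
print(absolute)  # -> https://example.com/articles/glossary.html
```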
Most search engines seem to prioritize older websites over new ones, and they give even more weight to old websites that make a point of displaying fresh content every day. Link age, on the other hand, refers to how long a certain link or page has been in existence. According to experts, the age of a link or page matters far less than regular updates, which further underlines the importance of refreshing a site with new content each day. Still, there is a relationship between the age of a page and the age of a website, which is why most SEO experts advise sticking to a single website, letting it grow old, and working it toward the top of the search results.
Whenever a webmaster places a hyperlink within a sentence or paragraph, the text that hosts the link is referred to as the anchor text. SEO experts use anchor text that is a keyword in itself. This way, search engines can easily tell that the site values the keywords users search for, all the more reason for that webpage to be placed atop the search results. Some writers, though, populate an entire website with the same anchor text, which search engines consider a form of keyword stuffing.
Google’s advertising program in which advertisers pay for click-throughs. Those who host these ads on their own websites are paid a commission by Google for each click-through.
A site constructed with the primary goal of generating revenue through AdWords. This is a method that has proved to be less than desirable in terms of its success rate and profitability in recent years, but with Google’s change in ad designs it may come back.
A routine or program used by search engines to find out which pages exist in regard to a specific search query.
A description of an image or graphic that is usually not visible to the end user unless there is some trouble downloading the image. This text is incredibly important because it helps a search engine differentiate one image from another; it is one place where the data gathered by a search engine differs from what a user takes in. Those who are blind or otherwise unable to view images rely on alt text to know what an image depicts.
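Because missing alt text hurts both accessibility and crawlability, a simple audit is worthwhile. Here is a sketch using Python’s standard-library HTML parser; the sample page and filenames are hypothetical:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags that lack a descriptive alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An absent or empty alt attribute is flagged for review.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

page = """
<img src="logo.png" alt="Company logo">
<img src="banner.jpg">
"""
auditor = AltTextAuditor()
auditor.feed(page)
print(auditor.missing)  # -> ['banner.jpg']
```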
A service that streamlines and delivers traffic data pertaining to websites to their owners. Such information is vital in creating keyword-optimized content and other tools for a website owner’s use.
The visible portion of text of a link that an end user can see. Search engines utilize anchor text to determine whether or not the links contained on a site are appropriate for the subject matter. The relevant parties in this evaluation are the referring site, the landing page and the outbound links from the landing page.
API stands for Application Programming Interface. It is a set of rules and routines that programmers follow when building or analyzing software. Computer operating systems, for instance, each have their own API, which software creators follow so that their software will be compatible with the OS. In terms of SEO, APIs are designed for websites so that webmasters have their own programmatic ways to drive traffic to their respective sites.
A website that pushes a political or commercial message while putting on an image of being a completely objective party, such as those who are watchdogs of an industry. Such a position can help to establish credibility and rapport with end users very quickly. Often used to improve customer relations and improve branding.
The trust that Google places in a particular website’s authority and relevancy insofar as its page rank and dependability are concerned. Sites that have authority typically have highly desirable positions in search rankings, and have some of the best link profiles that are out there.
Authority sites are websites that are authoritative on one topic or field. For example, the number one online marketplace is an authority site for online commerce, and the most visited news site is an authority site for news and current affairs. A hub site, on the other hand, contains links that fall under one particular theme or genre. Most hub sites have high page ranks, all the more reason for webmasters to host this type of site when they plan to earn revenue online.
Business to business communications and interactions in a wide range of capacities.
Business to consumer communications and interactions in a wide range of capacities.
Black hat SEO refers to the unfair tricks programmers and coders use to cheat their sites to the top of the search results. These may involve link farming, in which a website hosts a large number of illegitimate, often illicitly placed links in order to inflate page rank. Another technique is keyword stuffing, in which writers and webmasters cram articles with keywords so that more people find the site the moment they search for those terms. Search engines are firmly against these techniques, so it is important to follow ethical SEO practices to keep a website from being penalized or, worse, banned for good.
A website that displays information in a way that presents newer information first, and older information second. Many blogs these days use CMS (Content Management System) technology in order to make creating these posts as easy as possible. This makes it easier for the blogger to focus more on content, and less on coding. See Also: CMS.
Blog comment spam is one of the most common forms of spam. While leaving spam comments in blogs is not illegal, it is widely considered unethical. These comments are left by people who want to increase their own websites’ view counts, and they may include links that point to obscure sites selling obscure items. To keep spam out of a blog post’s comment section, the webmaster should install a good spam filter to block unwanted comments.
If a webmaster wants his or her website to be viewed by lots of people, he or she can place a link to the site on a social bookmarking site with a considerably high page rank. Naturally, more people will visit the site as long as the bookmark can be viewed by anyone. The more people find the site’s content interesting, the higher they are likely to rank it on the social bookmarking site, which typically has link-popularity features built in. This is why webmasters are always encouraged to post quality content, since it is up to online users around the world to vote their websites to the top of the bookmarking lists.
A program created by search engines that scours the Internet digesting and cataloguing information from the various websites out there. They use this information to create additions to search indexes. Much of this information is used to prevent or at least discourage plagiarism and other issues.
The percentage of visitors who look at only one page of a site and then leave: the number of single-page visits divided by the total number of visits.
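The arithmetic is simple enough to sketch directly; the visit counts below are hypothetical:

```python
def bounce_rate(single_page_visits, total_visits):
    """Bounce rate = single-page visits / total visits, as a percentage."""
    if total_visits == 0:
        return 0.0
    return 100.0 * single_page_visits / total_visits

# Hypothetical month: 1,000 visits, 300 of which left after one page.
print(bounce_rate(300, 1000))  # -> 30.0
```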
Site navigation cues that are visible in a horizontal bar above the main content, which aids a user in navigating the site, and provides a link back to the home page.
There is a difference between “www.domain.com” and “domain.com/.” Major search engines crawl through a website and look for links contained in its code. If links point to the same website with URLs written in two different ways, the crawler can become confused about which form actually represents the site. This is why SEO experts advise webmasters to stick to one way of writing a URL. The more consistently written URLs the search engines find, the better the site’s page rank will fare. In short, using a canonical URL is a surefire way for a search engine to become familiar with a website.
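As a sketch, canonicalization can be thought of as collapsing all the variants of a URL into one agreed form. The specific policy below (lowercase host, strip “www.”, always end the bare domain with “/”) is just one hypothetical convention:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse URL variants into one canonical form (a simplified policy)."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    path = path or "/"          # ensure the bare domain gets a trailing slash
    return urlunsplit((scheme, netloc, path, query, ""))

# All of these variants collapse to the same canonical form:
for variant in ("http://www.domain.com", "http://Domain.com/", "http://domain.com"):
    print(canonicalize(variant))  # each prints: http://domain.com/
```

Real sites usually declare the preferred form instead of computing it, but picking one form and using it everywhere is the point.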
Instances in which an affiliate will go through and click the advertisements on their websites. There are a number of different sophisticated ways this can be done, and this lowers the confidence of companies that pay for this type of banner or CPC advertising.
Cloaking is a method in which coders use advanced programming to fool Internet users and search engines. Two versions of a website are produced: one for search engines to view, the other for regular Internet users. Webmasters sometimes cloak their websites to raise their rankings without sacrificing loyal readership in the process. Another form of cloaking involves hiding text full of relevant keywords on a page by rendering it in the same color as the background.
Most search engines adopt the process of clustering to give Internet users the utmost ease and comfort when looking up a certain keyword. Through this process, a search engine groups results that contain several links from a single website. If, for example, a user types “cheap shoes” into the search box, the engine will “cluster” into one group the two or more matching links that come from the same website, and do the same for every other website in the results.
Altering the content of a website after it has achieved high rank and authority within the search community.
The items on a web page that are meant to be a draw and satisfy the users that visit the page. It does not include marketing materials, navigation bars, various branding methods, or boilerplate.
Conceptual links are links that search engines analyze in a much deeper manner. Instead of only analyzing the websites to which these links lead, search engines using conceptual link analysis focus more on the concept of the word itself. For example, if the anchor text of a link is “cheap shoes,” and the words around the link are highly related to that anchor text, the search engine will remember how often those words appear whenever the keyword is searched. Think of how easily a website could be found by millions of users around the world if conceptual links were used by all search engines.
Benefits of a CMS include allowing a huge number of workers to contribute to a site without a centralized publishing bottleneck. All they need is a username and password, and they can easily edit selected webpages within the site. It is therefore ideal for a company to choose a content management system that lets editors make changes as fast as possible. Another feature to look for in a CMS is support for RSS feeds; without it, the site’s visitor count can drop dramatically.
Co-citation refers to the process in which a website contains two or more links that point to different websites. For example, a hub page consists of three links: the first points to a website for shoes, the second to a site for T-shirts, and the third to one for hats. These three websites are, in a way, co-cited on one page, which means that the minute a search engine crawls through the hub page it will conclude that the three linked websites are related to each other. A certain danger lies in co-citation, though: if a website that caters to cellular phones is co-cited within a group of links that point to birth control products, the search engine is likely to classify the phone site together with the birth control sites.
A type of marketing in which one creates high quality content for the purpose of linking back to the home page. This content is typically generated by professionals to ensure that it is of a high-caliber, because otherwise it can be considered spam.
A piece of marketing that is directly tied to the content. There are a number of different ways in which these advertisements can be tied to the content in question.
The conversion of traffic into sales; for many websites this is the ultimate goal. Clicking on an ad or purchasing a product are both examples of conversion.
The number of users to ultimately convert (for the action desired) while visiting the site.
The amount of money spent on a given keyword through a Pay Per Click Advertising campaign.
A metric of statistical analysis that is used to estimate the average cost of a click within Pay Per Click advertisements.
Another term for the search engine bot that combs the Internet looking for new information. It does so by following the linking structure.
Search engines are programmed to crawl through websites in order to analyze each and every page. After analysis, a page rank is assigned to each site; the more links that point to a site, the higher the rank assigned. A site is in danger, however, if it contains broken links, which are links that point to an expired or non-existent page. Whenever a search engine crawls through a site and finds a broken link, its crawl process is interrupted. It is like hitting a dead end.
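A crawler’s first step is pulling the links out of a page, then checking that each target exists. Here is a sketch using Python’s standard-library HTML parser; to keep it self-contained, the “site” is simulated with a hypothetical set of known pages rather than real HTTP requests:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, the way a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical site: the pages that actually exist, and one page's HTML.
known_pages = {"/", "/about", "/contact"}
page_html = '<a href="/about">About</a> <a href="/old-news">News</a>'

extractor = LinkExtractor()
extractor.feed(page_html)

# Any extracted link with no matching page is a broken link (a dead end).
broken = [link for link in extractor.links if link not in known_pages]
print(broken)  # -> ['/old-news']
```

A real checker would issue HTTP requests for each link and treat 404 responses as broken, but the extract-then-verify loop is the same.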
Most webmasters have one goal in mind: a high page rank so their website appears high in the search results. This involves many processes, one of which is building inbound, outbound, and deep links; deep links are links that point to a website’s inner pages rather than its home or main page. Some webmasters are content to submit only their homepage URL to various link bait sites. However, there is a good chance that search engine crawlers will become suspicious if only the homepage is ever linked, in which case the deep link ratio is considered too low.
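The deep link ratio itself is just the share of inbound links that point past the home page. A sketch, with a hypothetical list of link targets:

```python
def deep_link_ratio(inbound_links):
    """Percentage of inbound links pointing somewhere other than the home page.

    `inbound_links` is a hypothetical list of target paths on the site.
    """
    if not inbound_links:
        return 0.0
    deep = [path for path in inbound_links if path not in ("/", "/index.html")]
    return 100.0 * len(deep) / len(inbound_links)

# Hypothetical inbound link profile: three homepage links, two deep links.
links = ["/", "/", "/guides/shoes", "/blog/post-1", "/"]
print(deep_link_ratio(links))  # -> 40.0
```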
A doorway page, as the name implies, serves as a “doorway” to a site: a user cannot reach the website’s homepage without passing through it. Sites that use doorway pages are often considered to cater to search engines rather than to the Internet users themselves. Moreover, forcing users through a page stuffed with keywords and widgets makes doorway pages look like unnecessary adware. Of course, there is no stopping search engine optimizers from creating doorway pages for their sites.
A website that hosts links and descriptions of other websites. DMOZ.org is an example of an online directory.
There are cases in which a paragraph is copy-pasted elsewhere within the same site directory, but the most common cases involve one website copying content from another. Both sites are then at risk of being penalized by the major search engines. If a webmaster finds a duplicate or two, he should contact the owner of the offending website as soon as possible. If no response is given, the website owner can either report the other site to the search engines or completely rewrite the original posts that were duplicated.
A website that deals nearly exclusively with retail sales.
To help ensure the overall quality of websites, major search engines rely on what SEO experts call the editorial link. An editorial link is one that search engines consider valuable and authentic, in contrast to the many inauthentic link sources out there such as link farms and illicit hub pages. Editorial links show how much stricter search engines have become when monitoring websites to assign page ranks.
If Person A searches a keyword on a search engine and Person B searches the same keyword on the same engine but on a different computer, there is a good chance their result lists will differ. This is the result of the search engine phenomenon called everflux, proof that the Internet is an unpredictable place that needs to be maintained and analyzed by a webmaster. Search engine optimizers should take note of how active everflux is on a given search engine to ensure that a website can hold its place in the search results.
Within SEO, evergreen refers to content that is “ageless” or “timeless.” A prime example would be coverage of popular celebrities’ continued run-ins with the law, celebrities such as Mel Gibson, Charlie Sheen, or Lindsay Lohan. People will continue to search for and consume such content for years to come, and it will keep trickling in traffic for the content piece.
The extensible markup language, also known as XML, is the language behind most web feeds. One XML-based format is RSS, which for most webmasters is the most widely used type of feed. Users can subscribe to a site’s news feed to be notified whenever the site publishes new content, and they can choose which feeds to follow to avoid unnecessary clutter in their inboxes.
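Because RSS is plain XML, a feed reader can be sketched with Python’s standard library alone; the feed below is a minimal, hypothetical RSS 2.0 document:

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS 2.0 feed such as a site might publish.
rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
# Each <item> is one piece of new content a subscriber gets notified about.
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # -> ['First post', 'Second post']
```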
A page or site with a large number of outgoing links to completely unrelated web pages and very little unique content. Among the best examples are link farms, which are intended only to lure in bots rather than human users; upon being discovered, they are sandboxed by the search engines.
A web page design in which there are two or more documents that appear on the same screen, each possessing its own frame. These are actually counter-productive for SEO because bots have a high probability of not being able to navigate frames properly. Many users also dislike the unusual interface and visual style.
A web page designed to attract traffic from a search engine and then subsequently funnel that traffic toward another page. This is similar to cloaking, and if it is too obvious then a search engine could end up penalizing the site.
Lightweight applications used on web pages to provide specific functions for the user. These can include various measurement applications or IP address displays.
A concerted effort on the part of webmasters, or of those conducting searches, to alter search results. A Google bomb can also aim to make a phrase trend that otherwise never would. Google bombs often involve many backlinks that share the same anchor text, driving pages of content for one very particular search term.
A blatant effort to knock a website’s search rankings down, this is typically done by submitting the site in question to link farms. This is a negative form of reputation management that is practiced by some of the more nefarious web masters out there.
A change in the rank or authority given to a website due to a recent algorithm update. This is typically experienced by web masters as slipping in the SERPs for their particular targeted keywords.
The amount of trust or authority bestowed upon a website by Google, which subsequently adds more relevance to their outbound links.
An older term for the traffic a website receives, now largely supplanted by more specific terms such as “page views” or “impressions.” Since a single page view can register multiple hits as the files on the page are requested, the metric is not very dependable.
A web page that enjoys a good deal of trust and authority from Google, and in turn it links out to related pages.
This is the framework or building blocks for all web pages, and it allows search engines to determine the content and overall layout of a webpage. This is the language which all search engines utilize, so it is imperative that a website adhere to it.
The instance where a user visits a webpage at least one time.
When a search engine adds a web page to their database.
Pages of a website that have been indexed by a search engine.
Inbound links from a related page that are sources of authority or page rank. See Also: Back Link or Inbound Link.
The text a user enters into a search engine to return their desired results.
The overuse of the same phrases on more pages of a website than is really necessary. This makes it very difficult for users or search engines to figure out which page is the most relevant.
The number of times a keyword appears on a page relative to the total number of words, usually expressed as a percentage. If a keyword is used too many times, a website can end up being penalized.
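The calculation is straightforward to sketch; the sample text is hypothetical and deliberately over-stuffed:

```python
def keyword_density(text, keyword):
    """Occurrences of the keyword as a percentage of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical over-optimized sentence: 3 of its 9 words are the keyword.
sample = "cheap shoes are great because cheap shoes are cheap"
print(round(keyword_density(sample, "cheap"), 1))  # -> 33.3
```

A density this high is exactly the kind of signal the keyword stuffing entry below describes.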
The act of researching which keywords would be best to target for a given topic and its ensuing search engine results pages, or SERPs.
Very high keyword density. Since Google’s Panda update this technique is considered very outdated and can result in search penalties.
A page that a user arrives at after selecting a link from SERP, a text link, or banner or other advertisement.
Latent Semantic Indexing is a welcome addition to search engines in response to the growing number of websites and to the oftentimes complex language people use in their everyday lives. With LSI, the search engines make it a point to match content not just with the word that is typed in on the search bar, but also with the words surrounding the keyword. For example, searching “light” will allow the search engine to look at the other words written around the term in a certain webpage. Through latent semantic indexing, Internet users will have an easier time getting what they want whenever they look for a certain term using a search engine.
An element within a web page that is clicked on to travel to another website, or in some cases to another page on the same site.
A web page or post designed to cause people to link to it, and in doing so draw on some link juice.
Actively pursuing more incoming links for a website.
The act of preventing your site’s link juice from going to another page, which could have adverse consequences if the page in question is in the bad graces of Google. There are a number of different ways of doing this, and some web masters are very proactive about such preventative measures.
Web masters or Internet users who are the most frequent targets of links. They can include forum participants, information hubs, bloggers, and other content creators and curators.
A place where web masters can come together in order to offer to link to one another. Sites such as these which are actually useful are typically edited and maintained by a human operator.
A group of sites that are all linking to one another.
Shared Authority and Page Rank passed from one site to another via linking. Typically the link juice passed needs to not come from a site with far higher page rank than the one being linked to.
An outbound link that contains authority and is not restrained by any sort of Link Condom. This sort of link often necessitates that whatever is being linked to is as relevant as possible.
Two sites that link to one another. If the links are reciprocal they will not usually be seen as meaningful by search engines.
Unwanted links like those posted within blog comments or spammy forum posts.
The portion of a link that a user sees within the text they are reading. Search engines use this to determine the relevancy of the referring site and then link the content to the landing page. In the best case scenario all parties would have the same or similar keywords. See Also: Anchor Text
Keywords that are highly specific (such as geographical in nature), and as a result they are not targeted nearly as much by web masters. A very large percentage of keywords searched for are long tail, making them a very lucrative target for some.
A web page that combines single-purpose software and other small applications, or links to such programs. Mashups are very easy to create and are often highly popular with users; many tool collection pages are effectively mashups.
A meta description is a tag in which webmasters or coders place a short description of a website: an overview of the site’s content within a limited number of words. Most SEO experts advise that this tag properly introduce the website, since many Internet users read it before deciding whether to click the link presented to them. The description should also contain the right keywords so that search engine crawlers can easily match the site to relevant queries.
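A crawler pulls the description out of the page head much like this sketch does with Python’s standard-library HTML parser; the page snippet is hypothetical:

```python
from html.parser import HTMLParser

class MetaDescriptionReader(HTMLParser):
    """Extracts the content of <meta name="description">, as a crawler might."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content")

page = '<head><meta name="description" content="A short overview of the site."></head>'
reader = MetaDescriptionReader()
reader.feed(page)
print(reader.description)  # -> A short overview of the site.
```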
A unit of measurement that serves as output by many analytics programs.
Websites that are designed from the word go as a place to advertise. This does not have to always be a bad thing, but without the right direction it certainly can be.
A website that is identical but held at a different URL. Search Engines find these sites easily and can discount one or both of them. Not a wise approach to increasing traffic.
The ability to generate revenue from a website. Affiliate marketing is one of the most popular ways to monetize a website.
Search engine results that are not paid for or represented by the company being linked to at all. See Also: SERP
A command found in the head section of a web page or within the code of a link, that tells bots to not follow any links on a specific page. This is a method that could be considered a link condom.
The no-follow attribute is a value added to a link to tell search engines not to follow it or pass authority through it. Since the major search engines crawl the Internet analyzing millions and millions of websites, webmasters may want certain links excluded from that process. Because the feature has proved useful, most blog providers now apply it to links in the comments section of every post; considering the millions of blog comments out there, the average Internet user would not want to be bothered with unnecessary search results drawn from comment sections around the world.
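As a sketch, an audit can separate links that pass authority from those marked with `rel="nofollow"`; the comment HTML and URLs below are hypothetical:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Separates links that pass authority from those marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            href = attrs.get("href")
            if not href:
                return
            rel = attrs.get("rel") or ""
            # rel is a space-separated token list; "nofollow" may be one token.
            if "nofollow" in rel.split():
                self.nofollowed.append(href)
            else:
                self.followed.append(href)

comments = ('<a href="https://example.com/article">good link</a>'
            '<a href="https://spam.example/pills" rel="nofollow">comment link</a>')
audit = NofollowAudit()
audit.feed(comments)
print(audit.nofollowed)  # -> ['https://spam.example/pills']
```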
This is when one site links to another without receiving a link in return. These sorts of links are given more credence and authority by search engines, making them more valuable.
An outbound link is a link found on a website that, when clicked, takes the user away from that website. From the site’s perspective, it is the opposite of an inbound link, which brings a user to the site. In search engine optimization, links play a huge role in a site’s page rank: the more inbound links that point to a website, the higher its page rank tends to be, which is why most webmasters work to collect inbound links pointing to their own websites.
One of the central focuses of search engine optimization is PageRank, an algorithm built on the link analysis capability of major search engines. It is used to rank websites by importance. Search engines deal with millions upon millions of websites, which is why PageRank is relied on to order sites in the search results. To do so, the number of inbound links (links pointing to the site from elsewhere) is counted: the more inbound links a website has, the higher its PageRank gets.
There are many ways for a website to be displayed high in the search results. One of them is paid inclusion, in which webmasters pay a fee to major search engines to have their sites listed prominently. Depending on the search engine’s policies, paid-inclusion sites are treated in one of two ways: displayed as advertisements, or included among the regular results. Either way, many webmasters resort to paid inclusion to have their site viewed by millions of people.
Penalty plays a role in the way a certain website is displayed in the search engines results page (SERP). Black hat techniques in search engine optimization oftentimes involve unethical spamming methods such as link farming and keyword stuffing, all of which are considered to be grounds for penalty. Search engines would resort to using algorithms or manual methods as a way to prevent these sites from ranking in search results. In order to remove the penalty, the webmasters of the penalized website will then be required to fix the reason for the penalty. Once done, the search engines will then remove the penalty.
Search engine optimization experts tend to avoid poison words when writing articles and essays for a website. Websites are not the only things with reputations; certain words are associated with low-quality content and can drag a site down in the search results. Search engines look for these words in the title, the meta description, and even the URL of a site, and will lower the site’s PageRank accordingly. Lists of poison words are sometimes released by search engines, giving webmasters reason to think twice before including such terms on their sites.
A portal is a website that acts as an entryway to the rest of the Internet. Most people know it as a “home page,” the first site a web browser loads when it opens. The World Wide Web has plenty of portal sites, most offering features such as email, news, community groups, and more. Internet users can choose their own portal by changing the home page setting in their web browsers.
A lot like pay per click, except that the publisher only receives money if there is a conversion that takes place. Conversions can vary from signing up for an email newsletter to a purchase of a product or service. The more difficult the conversion the more expensive the payout for that conversion.
An advertising setup in which advertisers pay search engines whenever a user clicks on their ad and visits their website.
This is a sales term used by some SEOs to imply they can do the impossible, such as top ten search rankings in no time. Since ALL SEOs have to play by the same rules, the rules of search algorithms, proprietary methods can sometimes be considered questionable.
A quality link is a link that is trusted by search engines. Most of the time, these links are considered legitimate and are used by search engines as a basis for increasing a website's PageRank. A link is considered to be of good quality if it comes from a trusted website and if it is hard to obtain from the source itself. A link is also a quality link if it has existed on a trusted source for a long time.
A pair of links is considered reciprocal in nature if it is shared directly or indirectly between two websites. This method of link trading is often looked down upon by major search engines. If the search engines notice that a certain link is merely the result of a trade between two bloggers or websites, it will be disregarded in favor of links of higher quality. If a website contains a huge number of reciprocal links yet only a few quality links, it will receive a lower PageRank from the search engines.
Referrer simply means a webpage that directs a user to another webpage. If the user clicks a link on Page A and lands on Page B, then Page A is considered the referrer. Webmasters can view all referrers of a website via web analytics programs as part of the search engine optimization process. SEO experts make it a point to analyze which referrer is the most active in directing users to a certain site.
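In practice, the browser reports the referrer in an HTTP request header. A minimal sketch of what such a request might look like is shown below; the page names and domain are hypothetical, and note that the header name is spelled "Referer" (with one "r"), a misspelling preserved in the HTTP specification:

```text
GET /page-b.html HTTP/1.1
Host: www.example.com
Referer: https://www.example.com/page-a.html
```

Web analytics programs aggregate this header across all requests to report which pages send the most traffic.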
A long-tailed keyword phrase that contains the name of a city or specific geographic area. For example: Car Insurance Houston Texas.
Re-inclusion happens after a certain website has been penalized or banned by search engines. Most of the time, the penalty is caused by various black hat methods employed by the site. For the webmaster to make the site searchable again, he or she must submit a re-inclusion request. The search engines will then evaluate whether the penalized website has adhered to ethical SEO techniques.
A relative link is a link that takes the form of only the document's filename or path within the same directory or server. Relative links serve as a shortcut to a certain file or webpage without having to include the entire URL in <a href> tags. This is different from an absolute link, which more often than not takes the form of a full URL that directs users to another webpage.
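To illustrate the difference, here is a small HTML sketch using a hypothetical domain and filenames; both links point at the same document when the page lives at the site root of example.com:

```html
<!-- Absolute link: protocol, domain, and path are all spelled out -->
<a href="https://www.example.com/articles/seo-tips.html">SEO Tips</a>

<!-- Relative link: resolved against the location of the current page -->
<a href="articles/seo-tips.html">SEO Tips</a>
```

Because the relative form depends on where the linking page sits, moving that page to another directory can silently break the link, which is one reason many people prefer absolute links.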
Relevancy refers to the degree to which Internet users find search results beneficial for their queries. The supporting data is gathered through surveys and user feedback. This is why some search engines favor specific kinds of search results in order to increase relevancy. For example, organic search results, which consist of non-paid websites rather than advertisements, are preferred by search engines because they tend to produce higher relevancy.
Reputation management involves making sure that a person, brand, or website maintains a positive image in search results, typically by promoting favorable content so that it outranks negative mentions.
A file in the root directory of a website that is used to restrict or prohibit the movements of bots on that website.
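A minimal sketch of such a file is shown below; the directory names and sitemap URL are hypothetical. Each `User-agent` line names which bots the following rules apply to, and `Disallow` lists paths they are asked not to crawl:

```text
# robots.txt — placed at the root, e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /private/
Disallow: /tmp/

# Optional pointer to the site's XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not technically prevent access to the listed paths.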
Copying the content from a website to another source, often performed by automatic bots.
Pages that are implemented in order to cause search engines to deliver inappropriate or poor results for a given search term.
The practice of promoting a website for the sake of receiving more online traffic or sales; generally, the purpose is to increase the overall exposure of the site. This is done through a wide variety of methods such as link building, content marketing, and much more.
SEO copywriting is a set of techniques that involve writing text for websites in order to increase search engine visibility and reader count. Formally known as search engine optimization copywriting, this process considers the way a writer optimizes each written work to accommodate the various requirements of the search engines that crawl each website. In this process, PageRank increases depending on the number of quality keywords placed within the text. SEO copywriters also take into consideration the title and the anchor texts present within the written copy.
SERP stands for "search engine results page." When a person types a keyword into a search engine, the next page, which displays links to the available websites, is referred to as the SERP. Most of the time, the websites that appear at the top of the results page are those that are either the most visited or have the highest PageRank. In some cases, the results page displays a short description alongside each link presented to the Internet user.
A site map is a page on the web that allows search engines to navigate a certain website more easily. Major search engines utilize so-called crawlers that explore each website to analyze whether or not it adheres to the rules of search engine optimization. Oftentimes the site map is referred to as the skeleton of a website, to which the search engines have access. Large websites often include only the most important and most visited webpages in order to make it easier for search engines to navigate the site map.
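Alongside the human-readable site map page, many sites also publish a machine-readable XML sitemap for crawlers. A minimal sketch following the sitemaps.org format is shown below; the domain, pages, and date are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/popular-page.html</loc>
  </url>
</urlset>
```

The file is typically placed at the site root and referenced from robots.txt so that crawlers can discover every listed page even if internal linking is sparse.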
A term used to refer to marketing one's website or product through sites like Facebook, Twitter, Pinterest, LinkedIn, and so on. Successful techniques include contests and giveaways that engage readers and encourage them to come back for more.
The act of nefariously accusing the competition of being a spammer, and then attacking the brand or company they represent.
An anonymous account or profile used to hide the true identity of the user while they spam or do other black hat activities pertaining to SEO via Social Media.
A type of Social Media in which users bookmark pages or other resources for other users to see. Digg and Reddit are two examples of this.
Spamdexing, a portmanteau of the words "spam" and "indexing," involves a set of techniques that forces search engines to index online information for increased visibility of a website. Most search engine optimization experts consider this an unethical way to manipulate search engines into displaying a website on the SERP. Some of the most common methods of spamdexing are keyword stuffing, hidden text, and strategically placed doorway pages.
A page within a website that contains nothing but advertisements, typically seen as a “last ditch” effort on the part of a web master to generate revenue.
A loop of links that are automatically generated in order to keep a spider program trapped. They are often used to keep such programs from scraping or email address gleaning.
A splash page is the page that online users first see the moment they visit the main link of a site. Most of the time, a splash page is referred to as a welcome page. Splash pages usually consist of eye-catching pictures or animated text in order to make a good impression on first-time visitors. These pages, however, hinder search engine crawlers, as splash pages can prevent a site from being indexed properly.
A spam blog that usually contains very little information that a human being would ever care about. This is typically created by a program or scraped from somewhere else.
A page on a website in which the content does not change or have dynamic properties. These are great for spiders.
Pages with a very low page rank that are still relevant to queries made to search engines.
A plain HTML link that does not have any sort of scripting or graphic enhancement. See Also: Inlink or Back Link
The amount of time a user spends on a web page before they move on to something else.
The amount of trust bestowed upon a webpage by a search engine, typically determined by the number and quality of links given to it.
Simply put, the World Wide Web was created by Tim Berners-Lee, and it laid the foundation for how computers find resources on intranets and the Internet.
The content seen on many social media websites, wikis, and blogs. A number of these resources are created and subsequently monetized purely for profit.
A group of websites that only link to one another, but not to anyone else. These sorts of setups do not typically enjoy a very good page rank.
Techniques that are condoned and encouraged by Google and other search engines. These are strategies that are thought to be a whole lot more effective in the long term, because they involve generating high quality content for the end user.