Google Penguin is a codename for a Google algorithm update that was first announced on April 24, 2012. The update is aimed at decreasing the search engine rankings of websites that violate Google's Webmaster Guidelines by using black-hat SEO techniques to artificially inflate a page's ranking through manipulation of the number of links pointing to it. Such tactics are commonly described as link schemes. According to Google's John Mueller, Google has announced all updates to the Penguin filter to the public.
By Google's estimates, Penguin affects approximately 3.1% of search queries in English, about 3% of queries in languages like German, Chinese, and Arabic, and an even greater percentage of queries in "highly spammed" languages. On May 25, 2012, Google unveiled another Penguin update, called Penguin 1.1. This update, according to Matt Cutts, former head of webspam at Google, was supposed to affect less than one-tenth of a percent of English searches. The guiding principle for the update was to penalize websites that were using manipulative techniques to achieve high rankings. Before Penguin, sites commonly used manipulative link building techniques to rank highly and attract traffic; once Penguin was rolled out, content became key, so that sites with great content would be recognized while those with thin or spammy content would be penalized and receive no ranking benefits. The purpose, per Google, was to catch excessive spammers. Allegedly, few websites lost search rankings on Google for specific keywords during the Panda and Penguin rollouts. Google specifically mentions that doorway pages, which are built only to attract search engine traffic, are against its webmaster guidelines.
Penguin 3 was released October 5, 2012 and affected 0.3% of queries. Penguin 4 (also known as Penguin 2.0) was released on May 22, 2013 and affected 2.3% of queries. Penguin 5 (also known as Penguin 2.1) was released on October 4, 2013, affected around 1% of queries, and was at that time the most recent of the Google Penguin algorithm updates.
Google may have released Penguin 3.0 on October 18, 2014.
On October 21, 2014, Google's Pierre Far confirmed that Penguin 3.0 was an algorithm "refresh", with no new signals added.
On April 7, 2015, Google's John Mueller said in a Google+ hangout that both Penguin and Panda "currently are not updating the data regularly" and that updates must be pushed out manually. This confirmed that the algorithm was not being updated continuously, contrary to what had been believed earlier in the year.
The strategic goal that Panda, Penguin, and the page layout update share is to display higher quality websites at the top of Google's search results. However, sites that were downranked as the result of these updates have different sets of characteristics. The main target of Google Penguin is spamdexing (including link bombing).
In a Google+ Hangout on April 15, 2016, John Mueller said "I am pretty sure when we start rolling out [Penguin] we will have a message to kind of post but at the moment I don't have anything specific to kind of announce."
On September 23, 2016, Google announced that Google Penguin was now part of the core algorithm, meaning that it updates in real time; hence there will no longer be announcements by Google relating to future refreshes. Real time also means that websites are evaluated, and rankings affected, in real time. In previous years, webmasters instead had to wait for the roll-out of the next update to recover from a Penguin penalty. Google Penguin 4.0 is also more granular than previous updates, since it may affect a website on a per-URL basis rather than always affecting the whole website. Finally, Penguin 4.0 differs from previous Penguin versions in that it does not demote a website when it finds bad links. Instead, it discounts those links, meaning it ignores them so that they no longer count toward the website's ranking. As a result, there is less need to use the disavow file. Google uses both the algorithm and human reviewers to identify links that are unnatural (artificial), manipulative, or deceptive, and includes these in its Manual Actions report for websites.
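For reference, the disavow file accepted by Google Search Console is a plain-text list: lines beginning with # are comments, domain: entries disavow every link from an entire domain, and bare URLs disavow individual pages. The domains and URL below are hypothetical examples, not real sites:

```text
# Links from pages we asked to be removed but got no response
domain:spammy-directory.example
domain:paid-links.example
http://blog.example/comment-spam-page.html
```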
Two days after the Penguin update was released, Google prepared a feedback form designed for two categories of users: those who want to report web spam that still ranks highly after the search algorithm change, and those who think that their site was unfairly hit by the update. Google also has a reconsideration form available through Google Webmaster Tools.
In January 2015, Google's John Mueller said that a Penguin penalty can be removed by simply building good links. The usual process is to remove bad links manually or by using Google's Disavow tool, and then to file a reconsideration request. Mueller elaborated by saying that the algorithm looks at the percentage of good links versus bad links, so building more good links may tip the algorithm in your favor, which would lead to recovery.
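Mueller's description suggests a simple ratio test. The sketch below illustrates the idea in Python; the 0.8 threshold is purely hypothetical, since Google has never published one, and the function is an illustration of the ratio concept rather than Google's actual logic:

```python
def link_profile_tips_positive(good_links, bad_links, threshold=0.8):
    """Illustrative only: is the share of good links at or above a (hypothetical) threshold?"""
    total = good_links + bad_links
    if total == 0:
        return True  # no inbound links at all: nothing to penalize
    return good_links / total >= threshold

# Building more good links raises the ratio and can flip the outcome.
print(link_profile_tips_positive(good_links=40, bad_links=60))   # False: ratio 0.40
print(link_profile_tips_positive(good_links=400, bad_links=60))  # True: ratio ~0.87
```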
The anchor text, link label, link text, or link title is the visible, clickable text in a hyperlink. The words contained in the anchor text can determine the ranking that the page will receive from search engines. Since 1998, some web browsers have added the ability to show a tooltip for a hyperlink before it is selected. Not all links have anchor texts because it may be obvious where the link will lead due to the context in which it is used. Anchor texts normally remain below 60 characters. Different browsers will display anchor texts differently. Usually, web search engines analyze anchor text from hyperlinks on web pages. Other services apply the basic principles of anchor text analysis as well. For instance, academic search engines may use citation context to classify academic articles, and anchor text from documents linked in mind maps may be used too.
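As a concrete sketch of the kind of anchor-text extraction search engines perform, here is a small Python example using only the standard library; the URL and markup are made up for illustration:

```python
from html.parser import HTMLParser

class AnchorTextExtractor(HTMLParser):
    """Collects (href, anchor text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self._in_link = False
        self._href = None
        self._text = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._in_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_link:
            self.anchors.append((self._href, "".join(self._text).strip()))
            self._in_link = False

parser = AnchorTextExtractor()
parser.feed('<p>See the <a href="https://example.com/penguin">Penguin update notes</a>.</p>')
print(parser.anchors)  # [('https://example.com/penguin', 'Penguin update notes')]
```

A crawler would feed each fetched page through such a parser and index the collected anchor texts against the target URLs.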
Arindam Chaudhuri

Arindam Chaudhuri is an Indian author and the director of the IIPM Think Tank at the Indian Institute of Planning and Management. He has produced movies that have won three National Film Awards.

Article directory
An article directory is a website with collections of articles written about different subjects. Sometimes article directories are referred to as content farms, which are websites created to produce mass content, where some are based on churnalism.
An article directory may accept new articles from any contributor, but may require that a new article is unique (not published elsewhere) and not spun (see article spinning). A typical article is around 400-500 words, and tools such as a WYSIWYG editor for writing and submitting an article may be provided.
An author box may be provided for personal information about an author, including a link to the author's website.
Tags or categories may be used to organize articles and to help with search engines, since tags or categories act as keywords that identify the topics covered in the article. Many directories pay authors for their participation. Some directories review articles before they are published, and there may be a waiting period of several days before a new article appears. This helps to eliminate low-quality submissions, including duplicate articles, spam, and spun articles.

Fruition
Fruition is a full-service digital marketing agency based in Denver, Colorado, that provides web design and development and Internet marketing services to multiple international corporations. The company is also known for its Google Penalty Checker tool.

Google Hummingbird
Hummingbird is the codename given to a significant algorithm change in Google Search in 2013. Its name was derived from the speed and accuracy of the hummingbird. The change was announced on September 26, 2013, having already been in use for a month. "Hummingbird" places greater emphasis on natural language queries, considering context and meaning over individual keywords. It also looks deeper at content on individual pages of a website, with improved ability to lead users directly to the most appropriate page rather than just a website's homepage.
The upgrade marked the most significant change to Google search in years, with more "human" search interactions and a much heavier focus on conversation and meaning. Thus, web developers and writers were encouraged to optimize their sites with natural writing rather than forced keywords, and make effective use of technical web development for on-site navigation.

Google Panda
Google Panda is a major change to Google's search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of "low-quality sites" or "thin sites", in particular "content farms", and return higher-quality sites near the top of the search results.
CNET reported a surge in the rankings of news websites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. This change reportedly affected the rankings of almost 12 percent of all search results. Soon after the Panda rollout, many websites, including Google's webmaster forum, became filled with complaints of scrapers/copyright infringers getting better rankings than sites with original content. At one point, Google publicly asked for data points to help detect scrapers better. In 2016, Matt Cutts, Google's head of webspam at the time of the Panda update, commented that "with Panda, Google took a big enough revenue hit via some partners that Google actually needed to disclose Panda as a material impact on an earnings call. But I believe it was the right decision to launch Panda, both for the long-term trust of our users and for a better ecosystem for publishers."

Google's Panda received several updates after the original rollout in February 2011, and its effect went global in April 2011. To help affected publishers, Google provided an advisory on its blog, giving some direction for self-evaluation of a website's quality. Google has published a list of 23 bullet points on its blog answering the question "What counts as a high-quality site?", which is supposed to help webmasters "step into Google's mindset".

The name "Panda" comes from Google engineer Navneet Panda, who developed the technology that made it possible for Google to create and implement the algorithm.

Google Pigeon
Google Pigeon is the code name given to one of Google's local search algorithm updates, released on July 24, 2014. The update is aimed at increasing the ranking of local listings in search results.

The changes also affect the search results shown in Google Maps along with the regular Google search results.

As of the initial release date, the update applied to US English results and was intended to be rolled out shortly in other languages and locations. The update provides results based on the user's location and the listings available in the local directory.

Google Search
Google Search, also referred to as Google Web Search or simply Google, is a web search engine developed by Google LLC. It is the most used search engine on the World Wide Web across all platforms, with 92.74% market share as of October 2018, handling more than 3.5 billion searches each day.

The order of search results returned by Google is based, in part, on a priority rank system called "PageRank". Google Search also provides many different options for customized search, using symbols to include, exclude, specify, or require certain search behavior, and offers specialized interactive experiences, such as flight status and package tracking, weather forecasts, currency, unit and time conversions, word definitions, and more.
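A few of the widely documented query operators illustrate the symbol-based customization mentioned above; the exact set Google supports changes over time, so these are examples rather than a complete reference:

```text
"search engine optimization"   exact-phrase match
penguin -bird                  exclude results containing "bird"
site:example.com penguin       restrict results to a single site
seo OR sem                     match either term
```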
The main purpose of Google Search is to hunt for text in publicly accessible documents offered by web servers, as opposed to other data, such as images or data contained in databases. It was originally developed by Larry Page and Sergey Brin in 1997. In June 2011, Google introduced "Google Voice Search" to search for spoken, rather than typed, words. In May 2012, Google introduced a Knowledge Graph semantic search feature in the U.S.
Analysis of the frequency of search terms may indicate economic, social, and health trends. Data about the frequency of use of search terms on Google can be openly queried via Google Trends and has been shown to correlate with flu outbreaks and unemployment levels, providing the information faster than traditional reporting methods and surveys. As of mid-2016, Google's search engine had begun to rely on deep neural networks.

Competitors of Google include Baidu and Soso.com in China; Naver.com and Daum.net in South Korea; Yandex in Russia; Seznam.cz in the Czech Republic; Yahoo in Japan, Taiwan, and the US; as well as Bing and DuckDuckGo. Some smaller search engines offer facilities not available with Google, e.g. not storing any private or tracking information.
In the United States, as of July 2018, Microsoft Sites handled 24.2 percent of all search queries. During the same period, Oath (formerly known as Yahoo) had a search market share of 11.5 percent. Market leader Google generated 63.2 percent of all core search queries.

Google bomb
The terms Google bomb and Googlewashing refer to the practice of causing a website to rank highly in web search engine results for irrelevant, unrelated or off-topic search terms by linking heavily. In contrast, search engine optimization (SEO) is the practice of improving the search engine listings of web pages for relevant search terms.
Google-bombing is done for either business, political, or comedic purposes (or some combination thereof). Google's search-rank algorithm ranks pages higher for a particular search phrase if enough other pages linking to it use similar anchor text. By January 2007, however, Google had tweaked its search algorithm to counter popular Google bombs such as "miserable failure" leading to George W. Bush and Michael Moore; now, search results list pages about the Google bomb itself. Since no later than 21 June 2015, the first result in a Google search for "miserable failure" has been the Wikipedia article defining Google bomb. Used both as a verb and a noun, "Google bombing" was introduced to the New Oxford American Dictionary in May 2005.

Google bombing is related to spamdexing, the practice of deliberately modifying HTML to increase the chance of a website being placed close to the beginning of search engine results, or to influence the category to which the page is assigned in a misleading or dishonest manner.

The term Googlewashing was coined by Andrew Orlowski in 2003 to describe the use of media manipulation to change the perception of a term, or to push competition out of search engine results pages (SERPs).

Google penalty
A Google penalty is the negative impact on a website's search rankings based on updates to Google's search algorithms or a manual review. The penalty can be a by-product of an algorithm update or an intentional penalization for various black-hat SEO techniques.

Link building
In the field of search engine optimization (SEO), link building describes actions aimed at increasing the number and quality of inbound links to a webpage with the goal of increasing the search engine rankings of that page or website. Briefly, link building is the process of establishing relevant hyperlinks (usually called links) to a website from external sites. Link building can increase the number of high-quality links pointing to a website, in turn increasing the likelihood of the website ranking highly in search engine results. Link building is also a proven marketing tactic for increasing brand awareness.

Outline of Google
The following outline is provided as an overview of and topical guide to Google:
Google – American multinational technology company specializing in Internet-related services and products that include online advertising technologies, search, cloud computing, software, and hardware.

PageRank
PageRank (PR) is an algorithm used by Google Search to rank web pages in its search engine results. PageRank was named after Larry Page, one of the founders of Google. PageRank is a way of measuring the importance of website pages. According to Google, "PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites." PageRank is no longer the only algorithm used by Google to order search results, but it is the first algorithm the company used, and it is the best known.
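The counting-and-quality idea can be made concrete with the classic power-iteration formulation of PageRank. The following is a minimal Python sketch over a toy three-page graph; the page names are invented for illustration, and real PageRank operates at web scale with many refinements beyond this:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively estimate PageRank for a small link graph.

    links maps each page to the list of pages it links to.
    Returns a dict of page -> rank; ranks sum to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a baseline (1 - damping) / n share.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: "home" is linked to by both other pages, so it ranks highest.
graph = {"home": ["about"], "about": ["home", "blog"], "blog": ["home"]}
ranks = pagerank(graph)
```

Running this, the most-linked page ("home") ends up with the largest rank, matching the intuition that pages receiving more links from important pages are themselves more important.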
Search engine optimization

Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine's unpaid results, often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher ranked on the search results page) and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. SEO may target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services, while the former is more focused on national or international searches.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying its HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search. In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.

Spamdexing
In digital marketing and online advertising, spamdexing (also known as search engine spam, search engine poisoning, black-hat search engine optimization (SEO), search spam, or web spam) is the deliberate manipulation of search engine indexes. It involves a number of methods, such as link building and repeating unrelated phrases, to manipulate the relevance or prominence of resources indexed, in a manner inconsistent with the purpose of the indexing system.

Spamdexing could be considered a part of search engine optimization, though many search engine optimization methods improve the quality and appearance of the content of web sites and serve content useful to many users. Search engines use a variety of algorithms to determine relevancy ranking. Some of these include determining whether the search term appears in the body text or URL of a web page. Many search engines check for instances of spamdexing and will remove suspect pages from their indexes. Also, search-engine operators can quickly block the results listing from entire websites that use spamdexing, perhaps alerted by user complaints of false matches. The rise of spamdexing in the mid-1990s made the leading search engines of the time less useful. Using unethical methods to make websites rank higher in search engine results than they otherwise would is commonly referred to in the SEO industry as "black-hat SEO". These methods are more focused on breaking the search-engine-promotion rules and guidelines. In addition, the perpetrators run the risk of their websites being severely penalized by the Google Panda and Google Penguin search-results ranking algorithms.

Common spamdexing techniques can be classified into two broad classes: content spam (or term spam) and link spam.

Timeline of Google Search
Google Search, offered by Google, is the most widely used search engine on the World Wide Web as of 2014, with over three billion searches a day. This page covers key events in the history of Google's search service.
For a history of Google the company, including all of Google's products, acquisitions, and corporate changes, see the history of Google page.

Timeline of web search engines
This page provides a full timeline of web search engines, starting from the Archie search engine in 1990. It is complementary to the history of web search engines page, which provides more qualitative detail on the history.

Web content
Web content is the textual, visual, or aural content that is encountered as part of the user experience on websites. It may include—among other things—text, images, sounds, videos, and animations.
In Information Architecture for the World Wide Web, Lou Rosenfeld and Peter Morville write, "We define content broadly as 'the stuff in your Web site.' This may include documents, data, applications, e-services, images, audio and video files, personal Web pages, archived e-mail messages, and more. And we include future stuff as well as present stuff."