Google Search Console

Google Search Console (previously Google Webmaster Tools) is a free web service from Google that allows webmasters to check the indexing status of their websites and optimize their visibility.

On May 20, 2015, Google rebranded Google Webmaster Tools as Google Search Console.[1] In January 2018, Google introduced a new version of the Search Console, with a refreshed user interface and improvements.

Google Search Console
Type of site: Webmaster tools
Owner: Google
Website: search.google.com/search-console
Commercial: No

Features

It has tools that let webmasters:

  • Submit and check a sitemap, and find any errors in it.
  • Check and set the crawl rate, and view statistics about when Googlebot accesses a particular site.
  • Write and test a robots.txt file, and discover pages that are accidentally blocked by it.
  • List internal and external pages that link to the site.
  • Get a list of links that Googlebot had difficulty crawling, including the error that Googlebot received when accessing the URLs in question.
  • See what keyword searches on Google led to the site being listed in the SERPs, and the click through rates of such listings. (Previously named 'Search Queries'; rebranded May 20, 2015 to 'Search Analytics' with extended filter possibilities for devices, search types and date periods).[2]
  • Set a preferred domain (e.g. prefer example.com over www.example.com or vice versa), which determines how the site URL is displayed in SERPs.
  • Highlight to Google Search elements of structured data which are used to enrich search hit entries (released in December 2012 as Google Data Highlighter).[3]
  • Receive notifications from Google for manual penalties.[4][5]
  • Provide access to an API to add, change and delete listings and list crawl errors.[6]
  • View the Rich Cards section, added for a better mobile user experience.[7]
  • Check for any security issues with the website (e.g. a hacked site or malware attacks).
  • Add or remove owners and associates of the web property.
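
The robots.txt check described above can be sketched with Python's standard-library parser; the rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch(useragent, url) mirrors the check a well-behaved crawler performs.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This is the same kind of test the Search Console robots.txt tools perform: given a crawler's product token and a URL, decide whether the rules block the fetch.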

Features of Search Analytics reports

  • Accurate data
    • Search Analytics reports deliver more accurate data than the older Search Queries report.
    • The reports are up to date and provide the latest information possible.
  • Individual page count
    • Search Analytics reports count all links to the same page within one search result as a single impression.
    • Separate reports are available to track the device type and search type.
  • More accurate image click count
    • Search Analytics reports count a click only when a user clicks an expanded image in an image search result that leads to your page. The previous Search Queries report counted every click on an image, expanded or not, in both web and image search.
  • Data consolidated by full domain
    • Search Analytics reports assign all clicks, impressions, and other search data to a single, complete host name.
    • Subdomains are regarded as separate entities by Search Console and need to be added separately.
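
The single-impression and CTR behaviour described above can be illustrated with a small sketch (the query data below is invented for illustration; real Search Analytics figures come from Google's own logs):

```python
# Toy search-result data: each entry is (query, page, clicked).
# Several result links may point to the same page within one search
# result; Search Analytics counts them as a single impression.
results = [
    ("search console", "https://example.com/a", True),
    ("search console", "https://example.com/a", False),  # duplicate link, same page
    ("search console", "https://example.com/b", False),
    ("webmaster tools", "https://example.com/a", False),
]

impressions, clicks = {}, {}
seen = set()
for query, page, clicked in results:
    if (query, page) not in seen:          # de-duplicate per query/page pair
        seen.add((query, page))
        impressions[page] = impressions.get(page, 0) + 1
    if clicked:
        clicks[page] = clicks.get(page, 0) + 1

# Click-through rate per page: clicks divided by de-duplicated impressions.
ctr = {page: clicks.get(page, 0) / n for page, n in impressions.items()}
print(ctr)  # page /a: 1 click over 2 impressions -> 0.5
```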

Criticism and controversy

The list of inbound links in Google Webmaster Tools is generally much larger than the list of inbound links that can be discovered using a link:example.com search query on Google itself, and Google is tight-lipped about the discrepancy. The list in Google Webmaster Tools includes nofollow links, which do not convey search engine optimization authority to the linked site; the links returned by a link:example.com query, on the other hand, are deemed by Google to be "important" links, a designation that has proved controversial. Google Webmaster Tools, like the Google index, seems to routinely ignore link spam. Once a manual penalty has been removed, Google Webmaster Tools will still display the penalty for another 1–3 days.[8] Since the Google Search Console rebrand, analyses have shown that Google Search Console produces data points that do not reconcile with Google Analytics or ranking data, particularly within the local search market.

References

  1. ^ "Announcing Google Search Console - the new Webmaster Tools". Retrieved 2015-05-21.
  2. ^ "The 7 most important metrics in Google Search Console". 11 May 2016. Retrieved 10 December 2016.
  3. ^ Boudreaux, Ryan (2013-06-18). "How to use Google Data Highlighter, part 1". TechRepublic. Retrieved 2015-09-04.
  4. ^ DeMers, Jayson. "3 Steps to Take When You Suspect an Algorithmic Penalty From Google". searchenginejournal.com. Retrieved 7 March 2014.
  5. ^ Cutts, Matt. "View manual webspam actions in Webmaster Tools". Google. Retrieved 7 March 2014.
  6. ^ "Webmaster Tools API | Google Developers". Google Developers. Retrieved 2015-06-02.
  7. ^ "Introducing rich cards". Retrieved 2016-07-14.
  8. ^ Jansen, Derek. "Manual Spam Action Revoked – But It's Still Listed in Webmaster Tools". PP. Retrieved 31 March 2015. Google typically takes 24-72 hours to remove the message within the "Manual Actions" section of Google Webmaster Tools.

Conductor (company)

Conductor was founded in 2006 as a marketing services company and launched its SaaS (software as a service) platform, Conductor Searchlight, in 2010. The company's primary product, Conductor Searchlight, is a cloud-based content intelligence platform that also provides recommended actions for optimizing digital marketing metrics and increasing revenue. On March 6, 2018, it was announced that WeWork would acquire Conductor.

Google Analytics

Google Analytics is a web analytics service offered by Google that tracks and reports website traffic, currently as a platform inside the Google Marketing Platform brand. Google launched the service in November 2005 after acquiring developer Urchin. Google Analytics is the most widely used web analytics service on the web. It provides an SDK, known as Google Analytics for Mobile Apps, that allows usage data to be gathered from iOS and Android apps.

Google Search

Google Search, also referred to as Google Web Search or simply Google, is a web search engine developed by Google LLC. It is the most used search engine on the World Wide Web across all platforms, with 92.74% market share as of October 2018, handling more than 3.5 billion searches each day. The order of search results returned by Google is based, in part, on a priority rank system called "PageRank". Google Search also provides many different options for customized search, using symbols to include, exclude, specify or require certain search behavior, and offers specialized interactive experiences, such as flight status and package tracking, weather forecasts, currency, unit and time conversions, word definitions, and more.
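
The PageRank system mentioned above can be illustrated with a minimal power-iteration sketch over a toy three-page link graph (illustrative only; Google's production ranking combines PageRank with many other signals):

```python
# Toy link graph: each page maps to the pages it links to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
d = 0.85                      # damping factor from the original PageRank paper
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):           # iterate until the scores stabilize
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        # Each page distributes its rank evenly across its outgoing links.
        for q in outs:
            new[q] += d * rank[p] / len(outs)
    rank = new

print(max(rank, key=rank.get))  # "c" — it receives links from both other pages
```

Pages that receive links from many (or highly ranked) pages accumulate more rank, which is the intuition behind treating inbound links as endorsements.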

The main purpose of Google Search is to hunt for text in publicly accessible documents offered by web servers, as opposed to other data, such as images or data contained in databases. It was originally developed by Larry Page and Sergey Brin in 1997. In June 2011, Google introduced "Google Voice Search" to search for spoken, rather than typed, words. In May 2012, Google introduced a Knowledge Graph semantic search feature in the U.S.

Analysis of the frequency of search terms may indicate economic, social and health trends. Data about the frequency of use of search terms on Google can be openly queried via Google Trends and has been shown to correlate with flu outbreaks and unemployment levels, providing the information faster than traditional reporting methods and surveys. As of mid-2016, Google's search engine had begun to rely on deep neural networks. Competitors of Google include Baidu and Soso.com in China; Naver.com and Daum.net in South Korea; Yandex in Russia; Seznam.cz in the Czech Republic; Yahoo in Japan, Taiwan and the US; as well as Bing and DuckDuckGo. Some smaller search engines offer facilities not available with Google, e.g. not storing any private or tracking information.

Within the US, as of July 2018, Microsoft Sites handled 24.2 percent of all search queries in the United States. During the same period of time, Oath (formerly known as Yahoo) had a search market share of 11.5 percent. Market leader Google generated 63.2 percent of all core search queries in the United States.

Googlebot

Googlebot is the web crawler software used by Google, which collects documents from the web to build a searchable index for the Google Search engine. The name actually refers to two different types of web crawler: a desktop crawler (to simulate desktop users) and a mobile crawler (to simulate mobile users). A website will probably be crawled by both Googlebot Desktop and Googlebot Mobile; the subtype of Googlebot can be identified by looking at the user agent string in the request. However, both crawler types obey the same product token (user agent token) in robots.txt, so a developer cannot selectively target either Googlebot Mobile or Googlebot Desktop using robots.txt.

If a webmaster wishes to restrict the information on their site available to Googlebot, or another well-behaved spider, they can do so with the appropriate directives in a robots.txt file, or by adding a robots meta tag to the web page. Googlebot requests to web servers are identifiable by a user-agent string containing "Googlebot" and a host address containing "googlebot.com". Googlebot currently follows HREF links and SRC links, and there is increasing evidence that it can execute JavaScript and parse content generated by Ajax calls as well, although opinions vary on how advanced its JavaScript processing actually is. Currently, Googlebot uses a web rendering service (WRS) that is based on Chrome 41 (M41). Googlebot discovers pages by harvesting all the links on every page it finds and following them to other web pages. New web pages must be linked to from other known pages on the web, or manually submitted by the webmaster, in order to be crawled and indexed.
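
Identifying Googlebot from the user-agent string, as described above, can be sketched as follows (the user-agent strings follow the general format Google has documented, but the exact version numbers here are illustrative):

```python
# Example user-agent strings; version numbers are illustrative.
DESKTOP_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
              "+http://www.google.com/bot.html)")
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def classify(user_agent: str) -> str:
    """Classify a request's user agent as Googlebot desktop, mobile, or other."""
    if "Googlebot" not in user_agent:
        return "not googlebot"
    # The mobile crawler's user agent includes a mobile browser signature.
    return "googlebot mobile" if "Mobile" in user_agent else "googlebot desktop"

print(classify(DESKTOP_UA))  # googlebot desktop
print(classify(MOBILE_UA))   # googlebot mobile
```

Note that user-agent strings can be spoofed; a server that must be certain it is talking to Googlebot would also verify the requesting host via reverse DNS.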

A problem that webmasters have often noted with Googlebot is that it takes up an enormous amount of bandwidth, which can cause websites to exceed their bandwidth limit and be taken down temporarily. This is especially troublesome for mirror sites that host many gigabytes of data; Google provides the Search Console, which allows website owners to throttle the crawl rate. How often Googlebot will crawl a site depends on its crawl budget, an estimation of how often the website is updated. A site's crawl budget is determined by how many incoming links it has and how frequently the site is updated. Internally, Googlebot's development team (the crawling and indexing team) uses several defined terms to describe what "crawl budget" stands for.

Hreflang

The rel="alternate" hreflang="x" link attribute is an HTML link element described in RFC 5988. Hreflang specifies the language and optional geographic restrictions for a document. Hreflang is interpreted by search engines and can be used by webmasters to clarify the lingual and geographical targeting of a website.
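
Generating the hreflang link elements described above is straightforward; the URLs and language variants below are hypothetical:

```python
# Map each language (or language-region) code to the URL of that variant.
variants = {
    "en": "https://example.com/en/",
    "de": "https://example.com/de/",
    "en-GB": "https://example.com/uk/",   # language plus region targeting
    "x-default": "https://example.com/",  # fallback for unmatched users
}

# Each variant page should carry the full set of link elements,
# including a self-referencing one.
tags = [
    f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
    for lang, url in variants.items()
]
print("\n".join(tags))
```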

List of Google April Fools' Day jokes

Google frequently inserts jokes and hoaxes into its products on April Fools' Day, which takes place on April 1.

List of Google products

The following is a list of products and services provided by Google.

Search engine marketing

Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) primarily through paid advertising. SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages to enhance pay per click (PPC) listings.

Search engine optimization

Search engine optimization (SEO) is the process of increasing the visibility of a website or a web page to users of a web search engine. The term excludes the purchase of paid placement, referring only to the improvement of unpaid results (known as "natural" or "organic" results).

SEO is performed because a website will receive more visitors from a search engine the higher the website ranks in the search engine results page (SERP). These visitors can then be converted into customers. SEO may target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business' online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services. The former instead is more focused on national or international searches.

As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by the targeted audience. Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.

Sitemaps

The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs on the site. This allows search engines to crawl the site more efficiently and to find URLs that may be isolated from the rest of the site's content. The Sitemaps protocol is a URL inclusion protocol and complements robots.txt, a URL exclusion protocol.
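
A minimal Sitemap XML file of the kind described above can be produced with Python's standard library (the URLs, dates, and priorities here are illustrative):

```python
import xml.etree.ElementTree as ET

# The namespace required by the Sitemaps protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for loc, lastmod, changefreq, priority in [
    ("https://example.com/", "2024-01-15", "daily", "1.0"),
    ("https://example.com/about", "2023-11-02", "monthly", "0.5"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc            # required
    ET.SubElement(url, "lastmod").text = lastmod    # optional metadata
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting file would typically be saved as sitemap.xml at the site root and submitted through Search Console's sitemap report.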

Yahoo! Site Explorer

Yahoo! Site Explorer (YSE) was a Yahoo! service which allowed users to view information on websites in Yahoo!'s search index. The service was closed on November 21, 2011 and merged with Bing Webmaster Tools, a tool similar to Google Search Console (previously Google Webmaster Tools). In particular, it was useful for finding information on backlinks pointing to a given webpage or domain, because YSE offered full, timely backlink reports for any site. After merging with Bing Webmaster Tools, the service only offers full backlink reports for sites owned by the webmaster; reports for sites not owned by the webmaster are limited to 1,000 links. Webmasters who added a special authentication code to their websites were also allowed to:

  • See extra information on their sites
  • Submit Sitemaps
  • Submit up to 20 URL removal requests for their domains to Yahoo!
  • Rewrite dynamic URLs from their site by either removing a dynamic parameter or by using a default value for a parameter
  • Submit feeds for Yahoo! Search Monkey
  • View errors Yahoo! encountered while crawling their web site


This page is based on a Wikipedia article written by authors (here).
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.