World Wide Web Consortium

The World Wide Web Consortium (W3C) is the main international standards organization for the World Wide Web (abbreviated WWW or W3).

Founded and currently led by Tim Berners-Lee, the consortium is made up of member organizations that maintain full-time staff working together to develop standards for the World Wide Web. As of 19 November 2018, it has 476 members.[2][3]

The W3C also engages in education and outreach, develops software and serves as an open forum for discussion about the Web.

World Wide Web Consortium
Motto: Leading the Web to Its Full Potential
Formation: 1 October 1994
Type: Standards organization
Purpose: Developing protocols and guidelines that ensure long-term growth for the Web
Headquarters: Cambridge, Massachusetts, United States
Coordinates: 42°21′43.4″N 71°05′27.0″W
Membership: 476 member organizations[2]
Director: Tim Berners-Lee


The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the European Commission and the Defense Advanced Research Projects Agency (DARPA), which had pioneered the ARPANET, one of the predecessors to the Internet.[3] It was located in Technology Square until 2004, when it moved, with CSAIL, to the Stata Center.[4]

The organization tries to foster compatibility and agreement among industry members in the adoption of new standards defined by the W3C. Incompatible versions of HTML are offered by different vendors, causing inconsistency in how web pages are displayed. The consortium tries to get all those vendors to implement a set of core principles and components which are chosen by the consortium.

It was originally intended that CERN host the European branch of W3C; however, CERN wished to focus on particle physics, not information technology. In April 1995, the French Institute for Research in Computer Science and Automation (INRIA) became the European host of W3C, with Keio University Research Institute at SFC (KRIS) becoming the Asian host in September 1996.[5] Starting in 1997, W3C created regional offices around the world. As of September 2009, it had eighteen World Offices covering Australia, the Benelux countries (Netherlands, Luxembourg, and Belgium), Brazil, China, Finland, Germany, Austria, Greece, Hong Kong, Hungary, India, Israel, Italy, South Korea, Morocco, South Africa, Spain, Sweden, and, as of 2016, the United Kingdom and Ireland.[6]

In October 2012, W3C convened a community of major web players and publishers to establish WebPlatform and WebPlatform Docs, a MediaWiki wiki that seeks to document open web standards.

In January 2013, Beihang University became the Chinese host.

Specification maturation

Sometimes, when a specification becomes too large, it is split into independent modules which can mature at their own pace. Subsequent editions of a module or specification are known as levels and are denoted by the first integer in the title (e.g. CSS3 = Level 3). Subsequent revisions on each level are denoted by an integer following a decimal point (e.g. CSS2.1 = Revision 1).

The W3C standard formation process is defined within the W3C process document, outlining four maturity levels through which each new standard or recommendation must progress.[7]

Working draft (WD)

After enough content has been gathered from 'editor drafts' and discussion, it may be published as a working draft (WD) for review by the community. A WD document is the first form of a standard that is publicly available. Commentary by virtually anyone is accepted, though no promises are made with regard to action on any particular element commented upon.[7]

At this stage, the standard document may have significant differences from its final form. As such, anyone who implements WD standards should be ready to significantly modify their implementations as the standard matures.[7]

Candidate recommendation (CR)

A candidate recommendation is a version of a standard that is more mature than the WD. At this point, the group responsible for the standard is satisfied that the standard meets its goal. The purpose of the CR is to elicit aid from the development community as to how implementable the standard is.[7]

The standard document may change further, but at this point, significant features are mostly decided. The design of those features can still change due to feedback from implementors.[7]

Proposed recommendation (PR)

A proposed recommendation is the version of a standard that has passed the prior two levels. The users of the standard provide input. At this stage, the document is submitted to the W3C Advisory Committee for final approval.[7]

While this step is important, it rarely causes any significant changes to a standard as it passes to the next phase.[7]

W3C recommendation (REC)

This is the most mature stage of development. At this point, the standard has undergone extensive review and testing, under both theoretical and practical conditions. The standard is now endorsed by the W3C, indicating its readiness for deployment to the public, and encouraging more widespread support among implementors and authors.[7]

Recommendations can sometimes be implemented incorrectly, partially, or not at all, but many standards define two or more levels of conformance that developers must follow if they wish to label their product as W3C-compliant.[7]

Later revisions

A recommendation may be updated or extended by separately-published, non-technical errata or editor drafts until sufficient substantial edits accumulate for producing a new edition or level of the recommendation. Additionally, the W3C publishes various kinds of informative notes which are to be used as references.[7]


Unlike the ISOC and other international standards bodies, the W3C does not have a certification program. The W3C has decided, for now, that it is not suitable to start such a program, owing to the risk of creating more drawbacks for the community than benefits.[7]


The Consortium is jointly administered by the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL, located in Stata Center) in the United States, the European Research Consortium for Informatics and Mathematics (ERCIM) (in Sophia Antipolis, France), Keio University (in Japan) and Beihang University (in China).[8][9] The W3C also has World Offices in eighteen regions around the world.[10] The W3C Offices work with their regional web communities to promote W3C technologies in local languages, broaden the W3C's geographical base and encourage international participation in W3C Activities.

The W3C has a staff team of 70–80 worldwide as of 2015.[11] W3C is run by a management team which allocates resources and designs strategy, led by CEO Jeffrey Jaffe (as of March 2010), former CTO of Novell. It also includes an advisory board which offers support in strategy and legal matters and helps resolve conflicts.[12][13] The majority of standardization work is done by external experts in the W3C's various working groups.


The Consortium is governed by its membership. The list of members is available to the public.[2] Members include businesses, nonprofit organizations, universities, governmental entities, and individuals.[14]

Membership requirements are transparent except for one: an application for membership must be reviewed and approved by the W3C. Many guidelines and requirements are stated in detail, but there is no final guideline about the process or standards by which membership might be finally approved or denied.[15]

The cost of membership is given on a sliding scale, depending on the character of the organization applying and the country in which it is located.[16] Countries are categorized by the World Bank's most recent grouping by GNI ("Gross National Income") per capita.[17]


In 2012 and 2013, the W3C started considering adding DRM-specific Encrypted Media Extensions (EME) to HTML5, which was criticised as being against the openness, interoperability, and vendor neutrality that distinguished websites built using only W3C standards from those requiring proprietary plug-ins like Flash.[18][19][20][21][22]

On September 18, 2017, the W3C published the EME specification as a Recommendation, leading to the Electronic Frontier Foundation's resignation from W3C.[23][24]




  1. ^ "W3C Invites Chinese Web Developers, Industry, Academia to Assume Greater Role in Global Web Innovation". 20 January 2013. Retrieved 30 November 2013.
  2. ^ a b c "World Wide Web Consortium – current Members". World Wide Web Consortium. 29 March 2012. Retrieved 19 November 2018.
  3. ^ a b W3C (September 2009). "World Wide Web Consortium (W3C) About the Consortium". Retrieved 19 November 2018.
  4. ^ Michael Blanding, "The Past and Future of Kendall Square", MIT Technology Review August 18, 2015 [1]
  5. ^ "Press Release: Keio University joins MIT and INRIA in hosting W3C". Retrieved 13 July 2017.
  6. ^ Jacobs, Ian (June 2009). "W3C Offices". Retrieved 14 September 2009.
  7. ^ a b c d e f g h i j k "World Wide Web Consortium | Development Process". 12 April 2005. Retrieved 3 April 2012.
  8. ^ "W3C Contact". 31 October 2006. Retrieved 3 April 2012.
  9. ^ "Facts About W3C". W3C. Retrieved 7 November 2015.
  10. ^ "List of Offices". World Wide Web Consortium. Retrieved 19 November 2018.
  11. ^ "W3C people list". Retrieved 3 April 2012.
  12. ^ "W3C pulls former Novell CTO for CEO spot". 8 March 2010. Retrieved 3 April 2012.
  13. ^ "The World Wide Web Consortium: Building a Better Internet". Mays Digital. Archived from the original on 18 August 2016. Retrieved 7 November 2015.
  14. ^ W3C (2010). "Membership FAQ – W3C". Retrieved 7 August 2010.
  15. ^ Jacobs, Ian (2008). "Join W3C". Retrieved 14 September 2008.
  16. ^ W3C Membership Fee Calculator
  17. ^ "World Bank Country Classification". Retrieved 3 July 2010.
  18. ^ Cory Doctorow (12 March 2013). "What I wish Tim Berners-Lee understood about DRM". Technology blog. Archived from the original on 6 April 2013. Retrieved 20 March 2013.
  19. ^ Glyn Moody (13 February 2013). "BBC Attacks the Open Web, GNU/Linux in Danger". Open Enterprise blog. Archived from the original on 6 April 2013. Retrieved 20 March 2013.
  20. ^ Scott Gilbertson (12 February 2013). "DRM for the Web? Say It Ain't So". Webmonkey. Condé Nast. Archived from the original on 6 April 2013. Retrieved 21 March 2013.
  21. ^ "Tell W3C: We don't want the Hollyweb". Defective by Design. Free Software Foundation. March 2013. Archived from the original on 6 April 2013. Retrieved 25 March 2013.
  22. ^ Danny O'Brien (October 2013). "Lowering Your Standards: DRM and the Future of the W3C". Electronic Frontier Foundation. Retrieved 3 October 2013.
  23. ^ Peter Bright (18 September 2017). "HTML5 DRM finally makes it as an official W3C Recommendation". Ars Technica. Retrieved 18 September 2017.
  24. ^ Cory Doctorow (18 September 2017). "An open letter to the W3C Director, CEO, team and membership". Blog at Electronic Frontier Foundation. Retrieved 18 September 2017.
  25. ^ Groth, Paul; Moreau, Luc (30 April 2013). "PROV-Overview: An Overview of the PROV Family of Documents". World Wide Web Consortium. Retrieved 8 April 2016.


Agora (web browser)

Agora was an email-based World Wide Web browser, built as a proof of concept to help people use the full Internet. It was designed for non-graphic terminals and for users without full Internet access, such as those in developing countries or without a permanent Internet connection. Similar to W3Gate, Agora was a server application designed to fetch HTML documents through e-mail rather than HTTP.

Amaya (web editor)

Amaya (formerly Amaya World) is a discontinued free and open source WYSIWYG web authoring tool with browsing abilities.

It was created by a structured editor project at INRIA, a French national research institution, and later adopted by the World Wide Web Consortium (W3C) as their testbed for web standards, a role it took over from the Arena web browser. Since the last release in January 2012, INRIA and the W3C have stopped supporting the project and active development has ceased. Amaya has relatively low system requirements, even in comparison with other web browsers from the era of its active development, so it has been considered a "lightweight" browser.

Arena (web browser)

The Arena browser (also known as the Arena WWW Browser) was one of the first web browsers for Unix. Originally created by Dave Raggett in 1993, the browser continued its development at CERN and the World Wide Web Consortium (W3C) and subsequently by Yggdrasil Computing. As a testbed browser, Arena was used in testing the implementation for HTML version 3.0, Cascading Style Sheets (CSS), Portable Network Graphics (PNG), and libwww. Arena was widely used and popular at the beginning of the World Wide Web.

Arena, which predated Netscape Navigator and Microsoft's Internet Explorer, featured a number of innovations used later in commercial products. It was the first browser to support background images, tables, text flow around images, and inline mathematical expressions. The Arena browser served as the W3C's testbed browser from 1994 to 1996, when it was succeeded by the Amaya project.

Computer Graphics Metafile

Computer Graphics Metafile (CGM) is a free and open international standard file format for 2D vector graphics, raster graphics, and text, and is defined by ISO/IEC 8632.

Document Object Model

The Document Object Model (DOM) is a cross-platform and language-independent application programming interface that treats an HTML or XML document as a tree structure wherein each node is an object representing a part of the document. The DOM represents a document with a logical tree. Each branch of the tree ends in a node, and each node contains objects. DOM methods allow programmatic access to the tree; with them one can change the structure, style or content of a document. Nodes can have event handlers attached to them; once an event is triggered, the event handlers are executed.

The principal standardization of the DOM was handled by the World Wide Web Consortium, which last developed a recommendation in 2004. WHATWG took over development of the standard, publishing it as a living document. The W3C now publishes stable snapshots of the WHATWG standard.
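The tree operations described above can be sketched with Python's standard-library DOM binding, `xml.dom.minidom`. The DOM is language-independent, so the same node operations exist in browser JavaScript and other bindings; the document below is a made-up example.

```python
# Minimal DOM sketch: parse a document into a tree, navigate it, and
# change its structure programmatically.
from xml.dom.minidom import parseString

doc = parseString("<page><title>Hello</title></page>")

# Navigate the tree: the document element is the root node.
root = doc.documentElement
title = root.getElementsByTagName("title")[0]
print(title.firstChild.data)  # -> Hello

# Modify the structure: create a new node and append it to the tree.
para = doc.createElement("p")
para.appendChild(doc.createTextNode("New content"))
root.appendChild(para)
print(root.toxml())  # -> <page><title>Hello</title><p>New content</p></page>
```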


Hypertext Markup Language (HTML) is the standard markup language for creating web pages and web applications. With Cascading Style Sheets (CSS) and JavaScript, it forms a triad of cornerstone technologies for the World Wide Web. Web browsers receive HTML documents from a web server or from local storage and render the documents into multimedia web pages. HTML describes the structure of a web page semantically and originally included cues for the appearance of the document.

HTML elements are the building blocks of HTML pages. With HTML constructs, images and other objects such as interactive forms may be embedded into the rendered page. HTML provides a means to create structured documents by denoting structural semantics for text such as headings, paragraphs, lists, links, quotes and other items. HTML elements are delineated by tags, written using angle brackets. Tags such as `<img />` and `<input />` directly introduce content into the page. Other tags such as `<p>` surround and provide information about document text and may include other tags as sub-elements. Browsers do not display the HTML tags, but use them to interpret the content of the page.
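The distinction between tags and the text they delineate can be illustrated with Python's standard-library `html.parser`, which reports the two separately, much as a browser uses tags to interpret content without displaying them. The HTML snippet is a made-up example.

```python
# Parse a small HTML fragment and collect tags and text separately.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags, self.text = [], []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)          # structural markup

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())  # renderable content

p = TagCollector()
p.feed("<h1>Heading</h1><p>A paragraph with a <a href='#'>link</a>.</p>")
print(p.tags)  # -> ['h1', 'p', 'a']
print(p.text)  # -> ['Heading', 'A paragraph with a', 'link', '.']
```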

HTML can embed programs written in a scripting language such as JavaScript, which affects the behavior and content of web pages. Inclusion of CSS defines the look and layout of content. The World Wide Web Consortium (W3C), maintainer of both the HTML and the CSS standards, has encouraged the use of CSS over explicit presentational HTML since 1997.

Indexed Database API

The Indexed Database API (commonly referred to as IndexedDB) is a JavaScript application programming interface (API) provided by web browsers for managing a NoSQL database of JSON objects. It is a standard maintained by the World Wide Web Consortium (W3C).

As an alternative to the Web storage standard, IndexedDB can provide more storage capacity. Web storage has fixed limits per website, but IndexedDB limits are "usually quite large, if they exist at all". Use cases for IndexedDB include caching web application data for offline availability. Some browser modules, such as devtools or extensions, may also use it for storage.


IndieAuth is a standard decentralized authentication protocol that uses OAuth 2.0 and enables services to verify the identity of a user represented by a URL, as well as to obtain an access token that can be used to access resources under the control of the user.

IndieAuth was developed in the IndieWeb community and published as a W3C Note by the Social Web Working Group; despite having several interoperable implementations, the group lacked the time needed to formally progress it to a W3C Recommendation.


InkML is an XML-based markup language to describe "ink" data input with an electronic pen or stylus. The recommended specification was published by the World Wide Web Consortium (W3C) in September 2011.

It is part of the W3C Multimodal Interaction Activity initiative.


libwww (Library World Wide Web) is a modular client-side web API for Unix and Windows. It is also the name of the reference implementation of the libwww API.

It has been used for applications of varying sizes, including web browsers, editors, Internet bots, and batch tools. Pluggable modules provided with libwww add support for HTTP/1.1 with caching, pipelining, POST, Digest Authentication, and deflate.

The purpose of libwww is to serve as a testbed for protocol experiments so that software developers do not have to "reinvent the wheel." libcurl is considered to be a modern replacement for libwww.

Line Mode Browser

The Line Mode Browser (also known as LMB, WWWLib, or just www) is the second web browser ever created.

The browser was the first demonstrated to be portable to several different operating systems.

Operated from a simple command-line interface, it could be widely used on many computers and computer terminals throughout the Internet.

The browser was developed starting in 1990, and then supported by the World Wide Web Consortium (W3C) as an example and test application for the libwww library.

Micropub (protocol)

Micropub (MP) is a W3C Recommendation that describes a client–server protocol based on HTTP to create, update, and delete posts (e.g. social media) on servers using web or native app clients. Micropub was originally developed in the IndieWebCamp community, contributed to W3C, and published as a W3C working draft on 28 January 2016; it became a W3C Recommendation on 23 May 2017.

Micropub uses OAuth 2.0 Bearer Tokens for authentication and accepts traditional form posts as well as JSON posts. Posted data uses a vocabulary derived from Microformats. Micropub is mostly used to create "posts" similar to Tweets or microblog posts, like those posted to Twitter, but the protocol supports a variety of other content types, such as Bookmarks, Favorites, Reposts, Events, RSVPs, and Checkins. Micropub is currently supported on a variety of IndieWeb-compatible websites.
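A Micropub create request is an ordinary form-encoded HTTP POST carrying an OAuth 2.0 Bearer token, which can be sketched with Python's standard library. The endpoint URL and token below are hypothetical placeholders, and the request is only constructed, not sent.

```python
# Sketch of a form-encoded Micropub create request (not sent anywhere).
from urllib.parse import urlencode
from urllib.request import Request

body = urlencode({
    "h": "entry",                       # Microformats-derived vocabulary: an h-entry
    "content": "Hello from Micropub",
})
req = Request(
    "https://example.com/micropub",     # hypothetical Micropub endpoint
    data=body.encode(),
    headers={
        "Authorization": "Bearer XXXXXX",  # placeholder access token
        "Content-Type": "application/x-www-form-urlencoded",
    },
    method="POST",
)
print(req.get_method(), req.full_url)   # -> POST https://example.com/micropub
```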


Shapes Constraint Language (SHACL) is a World Wide Web Consortium (W3C) specification for validating graph-based data against a set of conditions. Among others, SHACL includes features to express conditions that constrain the number of values that a property may have, the type of such values, numeric ranges, string matching patterns, and logical combinations of such constraints. SHACL also includes an extension mechanism to express more complex conditions in languages such as SPARQL.

A SHACL validation engine takes as input a data graph and a graph containing shapes declarations, and produces a validation report that can be consumed by tools. All these graphs can be represented in any Resource Description Framework (RDF) serialization format, including JSON-LD and Turtle. The adoption of SHACL may influence the future of linked data.

Scalable Vector Graphics

Scalable Vector Graphics (SVG) is an XML-based vector image format for two-dimensional graphics with support for interactivity and animation. The SVG specification is an open standard developed by the World Wide Web Consortium (W3C) since 1999.

SVG images and their behaviors are defined in XML text files. This means that they can be searched, indexed, scripted, and compressed. As XML files, SVG images can be created and edited with any text editor, as well as with drawing software.
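Because SVG images are plain XML text, they can be generated and inspected with any XML tooling; a minimal sketch using Python's standard library, with a made-up one-circle image:

```python
# Parse a tiny SVG document and read a shape's attributes.
import xml.etree.ElementTree as ET

svg = """<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <circle cx="50" cy="50" r="40" fill="red"/>
</svg>"""

root = ET.fromstring(svg)
# Elements live in the W3C SVG namespace, so lookups are namespace-qualified.
circle = root.find("{http://www.w3.org/2000/svg}circle")
print(circle.get("r"), circle.get("fill"))  # -> 40 red
```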

All major modern web browsers—including Mozilla Firefox, Internet Explorer, Google Chrome, Opera, Safari, and Microsoft Edge—have SVG rendering support.

W3C Markup Validation Service

The Markup Validation Service is a validator by the World Wide Web Consortium (W3C) that allows Internet users to check HTML and XHTML documents for well-formed markup. Markup validation is an important step towards ensuring the technical quality of web pages. However, it is not a complete measure of web standards conformance. Though W3C validation is important for browser compatibility and site usability, its effect on search engine optimization has not been established.

Web Ontology Language

The Web Ontology Language (OWL) is a family of knowledge representation languages for authoring ontologies. Ontologies are a formal way to describe taxonomies and classification networks, essentially defining the structure of knowledge for various domains: the nouns representing classes of objects and the verbs representing relations between the objects. Ontologies resemble class hierarchies in object-oriented programming, but there are several critical differences. Class hierarchies are meant to represent structures used in source code that evolve fairly slowly (typically monthly revisions), whereas ontologies are meant to represent information on the Internet and are expected to be evolving almost constantly. Similarly, ontologies are typically far more flexible, as they are meant to represent information on the Internet coming from all sorts of heterogeneous data sources. Class hierarchies, on the other hand, are meant to be fairly static and rely on far less diverse and more structured sources of data such as corporate databases.

The OWL languages are characterized by formal semantics. They are built upon the World Wide Web Consortium's (W3C) XML standard for objects called the Resource Description Framework (RDF). OWL and RDF have attracted significant academic, medical and commercial interest.

In October 2007, a new W3C working group was started to extend OWL with several new features as proposed in the OWL 1.1 member submission. W3C announced the new version of OWL on 27 October 2009. This new version, called OWL 2, soon found its way into semantic editors such as Protégé and semantic reasoners such as Pellet, RacerPro, FaCT++ and HermiT.

The OWL family contains many species, serializations, syntaxes and specifications with similar names. OWL and OWL2 are used to refer to the 2004 and 2009 specifications, respectively. Full species names will be used, including specification version (for example, OWL2 EL). When referring more generally, OWL Family will be used.

Web storage

Web storage, sometimes known as DOM storage (Document Object Model storage), provides web application software methods and protocols used for storing data in a web browser. Web storage supports persistent data storage, similar to cookies but with a greatly enhanced capacity and no information stored in the HTTP request header. There are two main web storage types: local storage and session storage, behaving similarly to persistent cookies and session cookies respectively.

All major browsers support Web storage, which is standardized by the World Wide Web Consortium (W3C).


eXtensible HyperText Markup Language (XHTML) is part of the family of XML markup languages. It mirrors or extends versions of the widely used HyperText Markup Language (HTML), the language in which Web pages are formulated.

While HTML, prior to HTML5, was defined as an application of Standard Generalized Markup Language (SGML), a flexible markup language framework, XHTML is an application of XML, a more restrictive subset of SGML. XHTML documents are well-formed and may therefore be parsed using standard XML parsers, unlike HTML, which requires a lenient HTML-specific parser.

XHTML 1.0 became a World Wide Web Consortium (W3C) recommendation on January 26, 2000. XHTML 1.1 became a W3C recommendation on May 31, 2001. The standard known as XHTML5 is being developed as an XML adaptation of the HTML5 specification.
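The practical difference can be sketched with a strict standard XML parser: well-formed XHTML parses cleanly, while typical lenient HTML (here, a made-up fragment with an unclosed tag) makes the same parser fail.

```python
# Well-formed XHTML parses with a strict XML parser; sloppy HTML does not.
import xml.etree.ElementTree as ET

xhtml = "<html><body><p>Well-formed.</p></body></html>"
loose_html = "<html><body><p>Unclosed paragraph</body></html>"

tree = ET.fromstring(xhtml)       # parses cleanly
print(tree.find("body/p").text)   # -> Well-formed.

try:
    ET.fromstring(loose_html)     # strict parser rejects the unclosed <p>
except ET.ParseError as exc:
    print("ParseError:", exc)
```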

XML Base

XML Base is a World Wide Web Consortium recommended facility for defining base URIs, for resolving relative URIs, in parts of XML documents.

The XML Base recommendation was adopted on 27 June 2001.
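The resolution of a relative URI against a base URI follows the standard RFC 3986 rules, which can be sketched with Python's `urllib.parse`; the base URI below is a made-up example, as if an element carried an `xml:base` attribute.

```python
# Resolve relative URIs against a declared base URI.
from urllib.parse import urljoin

# As if an element declared xml:base="https://example.com/docs/"
base = "https://example.com/docs/"
print(urljoin(base, "chapter1.xml"))     # -> https://example.com/docs/chapter1.xml
print(urljoin(base, "../images/a.png"))  # -> https://example.com/images/a.png
```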


This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.