Web standards

Web standards are the formal, non-proprietary standards and other technical specifications that define and describe aspects of the World Wide Web. In recent years, the term has been more frequently associated with the trend of endorsing a set of standardized best practices for building web sites, and a philosophy of web design and development that includes those methods.[1]

Overview

Web standards include many interdependent standards and specifications, some of which govern aspects of the Internet, not just the World Wide Web. Even when not web-focused, such standards directly or indirectly affect the development and administration of web sites and web services. Considerations include the interoperability, accessibility and usability of web pages and web sites.

Web standards, in the narrow sense, consist of the following:

  • Recommendations published by the World Wide Web Consortium (W3C)[2]
  • "Living Standard" specifications published by the Web Hypertext Application Technology Working Group (WHATWG)

More broadly, the following technologies may be referred to as "web standards" as well:

  • Request for Comments (RFC) documents published by the Internet Engineering Task Force (IETF)[6]
  • Standards published by the International Organization for Standardization (ISO)[5]
  • Standards published by Ecma International (formerly ECMA)[4]
  • The Unicode Standard and Unicode Technical Reports (UTRs) published by the Unicode Consortium[7]
  • Name and number registries maintained by the Internet Assigned Numbers Authority (IANA)[8]

Web standards are evolving specifications of web technologies.[9] Web standards are developed by standards organizations—groups of interested and often competing parties chartered with the task of standardization—not technologies developed and declared to be a standard by a single individual or company. It is crucial to distinguish specifications that are still under development from those that have already reached final development status (in the case of W3C specifications, the highest maturity level).

The web standards movement

The earliest visible manifestation of the web standards movement was the Web Standards Project, launched in August 1998 as a grassroots coalition fighting for improved web standards support in browsers.[10]

The web standards movement supports concepts of standards-based web design, including the separation of document structure from a web page or application's appearance and behavior; an emphasis on semantically structured content that validates (that is, contains no errors of structural composition) when tested against validation software maintained by the World Wide Web Consortium; and progressive enhancement, a layered approach to web page and application creation that enables all people and devices to access the content and functionality of a page, regardless of personal physical ability (accessibility), connection speed, and browser capability.
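
As an illustration of this layered approach, the sketch below (not taken from the source; the element IDs and endpoint are hypothetical) feature-detects a capability before adding scripted behaviour on top of a baseline HTML form, so the content stays accessible when the script cannot run:

    // Layered enhancement: the plain HTML form keeps working even if this
    // script never runs; "#search" and "#results" are hypothetical IDs.
    function enhanceSearchForm(): void {
      const form = document.querySelector<HTMLFormElement>("#search");
      if (form === null || typeof window.fetch !== "function") {
        return; // no form, or no fetch support: fall back to normal submission
      }
      form.addEventListener("submit", async (event) => {
        event.preventDefault();
        const response = await fetch(form.action, {
          method: "POST",
          body: new FormData(form),
        });
        // Update only the results region, leaving the rest of the page intact.
        const results = document.querySelector("#results");
        if (results !== null) {
          results.textContent = await response.text();
        }
      });
    }

    document.addEventListener("DOMContentLoaded", enhanceSearchForm);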

Prior to the web standards movement, many web page developers used invalid, incorrect HTML syntax such as "table layouts" and "spacer" GIF images to create web pages — an approach often referred to as "tag soup". Such pages sought to look the same in all browsers of a certain age (such as Microsoft Internet Explorer 4 and Netscape Navigator 4), but were often inaccessible to people with disabilities. Tag soup pages also displayed or operated incorrectly in older browsers, and required code forks such as JavaScript for Netscape Navigator and JScript for Internet Explorer that added to the cost and complexity of development. The extra code required, and the lack of a cacheable page layout language, made web sites "heavy" in terms of bandwidth, as did the frequent use of images of text. These bandwidth requirements were burdensome to users in developing countries, rural areas, and wherever fast Internet connections were unavailable.

The Web Standards movement pioneered by Glenn Davis, George Olsen, Jeffrey Zeldman, Steven Champeon, Todd Fahrner, Eric A. Meyer, Tantek Çelik, Dori Smith, Tim Bray, Jeffrey Veen, and other members of the Web Standards Project replaced bandwidth-heavy tag soup with light, semantic markup and progressive enhancement, with the goal of making web content "accessible to all".[11]

The Web Standards movement declared that HTML, CSS, and JavaScript were more than simply interesting technologies. "They are a way of creating Web pages that will facilitate the twin goals of sophisticated and appropriate presentation and widespread accessibility."[11] The group succeeded in persuading Netscape, Microsoft, and other browser makers to support these standards in their browsers. It then set about promoting these standards to designers, who were still using tag soup, Adobe Flash, and other proprietary technologies to create web pages.

Common usage

When a web site or web page is described as complying with web standards, it usually means that the site or page has valid HTML, CSS and JavaScript. The HTML should also meet accessibility and semantic guidelines. Full standards compliance also covers proper settings for character encoding, valid RSS or Atom news feeds, valid RDF, valid metadata, valid XML, valid object embedding, valid script embedding, browser- and resolution-independent code, and proper server settings.

When web standards are discussed, the following publications are typically seen as foundational:

  • Recommendations for markup languages, such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), and Scalable Vector Graphics (SVG) from W3C.
  • Recommendations for stylesheets, especially Cascading Style Sheets (CSS), from W3C.
  • Standards for ECMAScript, more commonly known as JavaScript, from Ecma International.
  • Recommendations for Document Object Models (DOM), from W3C.
  • Properly formed names and addresses for the page and all other resources referenced from it (URIs), based upon RFC 2396, from IETF.[12]
  • Proper use of HTTP and MIME to deliver the page, return data from it and to request other resources referenced in it, based on RFC 2616, from IETF.[13]

Web accessibility is normally based upon the Web Content Accessibility Guidelines[14] published by the W3C's Web Accessibility Initiative.

Work in the W3C toward the Semantic Web is currently focused on publications related to the Resource Description Framework (RDF), Gleaning Resource Descriptions from Dialects of Languages (GRDDL) and the Web Ontology Language (OWL).

Standards publications and bodies

A W3C Recommendation is a specification or set of guidelines that, after extensive consensus-building, has received the endorsement of W3C Members and the Director.

An IETF Internet Standard is characterized by a high degree of technical maturity and by a generally held belief that the specified protocol or service provides significant benefit to the Internet community. A specification that reaches the status of Standard is assigned a number in the IETF STD series while retaining its original IETF RFC number.

Non-standard and vendor-proprietary pressures

HTML 5 contains numerous "willful violations" of other specifications, in order to accommodate limitations of existing platforms.[15]

References

  1. ^ "Mission - Web Standards Project". WaSP. Retrieved 2009-01-19.
  2. ^ "W3C Technical Reports and Publications". W3C. Retrieved 2009-01-19.
  3. ^ a b c Allsopp, John (2009-12-09). Developing with Web Standards. Berkeley: New Riders. p. 11. ISBN 978-0-321-70271-5.
  4. ^ "Ecma formal publications". Ecma. Retrieved 2009-01-19.
  5. ^ "Search for World Wide Web in ISO standards". ISO. Retrieved 2009-01-19.
  6. ^ "IETF RFC page". IETF. Retrieved 2009-01-19.
  7. ^ "Unicode Technical Reports". Unicode Consortium. Retrieved 2009-01-19.
  8. ^ "IANA home page". IANA. Retrieved 2009-01-19.
  9. ^ Leslie Sikos (2011). Web standards - Mastering HTML5, CSS3, and XML. Apress. ISBN 978-1-4302-4041-9.
  10. ^ Sliwa, Carol (1998-08-17). "Browser standards targeted". Computerworld. 32 (33). p. 76. ISSN 0010-4841.
  11. ^ a b "Web Standards Mission". Archive.webstandards.org. Retrieved 2014-02-26.
  12. ^ Berners-Lee, Tim; Fielding, Roy T.; Masinter, Larry (1998). Uniform Resource Identifiers (URI): Generic Syntax. IETF. doi:10.17487/RFC2396. RFC 2396. Retrieved 2009-10-27.
  13. ^ Fielding, Roy T.; Gettys, James; Mogul, Jeffrey C.; Nielsen, Henrik Frystyk; Masinter, Larry; Leach, Paul J.; Berners-Lee, Tim (1999). Hypertext Transfer Protocol -- HTTP/1.1. IETF. doi:10.17487/RFC2616. RFC 2616. Retrieved 2009-10-27.
  14. ^ "Web Content Accessibility Guidelines 1.0, W3C Recommendation 5-May-1999". W3C. 1999. Retrieved 2009-02-18.
  15. ^ "HTML 5 - A vocabulary and associated APIs for HTML and XHTML - Compliance with other specifications". Retrieved 2017-06-29.

Acid2

Acid2 is a test page published and promoted by the Web Standards Project to expose web page rendering flaws in web browsers and other applications that render HTML. Named after the acid test for gold, it was developed in the spirit of Acid1, a relatively narrow test of compliance with the Cascading Style Sheets 1.0 (CSS1) standard, and was released on 13 April 2005. As with Acid1, an application passes the test if the way it displays the test page matches a reference image.

Acid2 tests aspects of HTML markup, CSS 2.1 styling, PNG images, and data URIs. The Acid2 test page will be displayed correctly in any application that follows the World Wide Web Consortium and Internet Engineering Task Force specifications for these technologies. These specifications are known as web standards because they describe how technologies used on the web are expected to function.

Acid2 was designed with Microsoft Internet Explorer particularly in mind. The creators of Acid2 were dismayed that Internet Explorer did not follow web standards. It was prone to display web pages differently from other browsers, causing web developers to spend time tweaking their web pages. Acid2 challenged Microsoft to make Internet Explorer comply with web standards.

On 31 October 2005, Safari 2.0.2 became the first browser to pass Acid2; Opera, Konqueror, Firefox, and others followed. With the release of Internet Explorer 8 on 19 March 2009, the latest versions of all major desktop web browsers passed the test, until Internet Explorer 10 was released, which fails it. Its successor, Microsoft Edge, is able to render it correctly as of Windows 10 version 1607. Acid2 was followed by Acid3.

The test now fails in browsers that comply with the current CSS standards for margin collapsing.

Acid3

The Acid3 test is a web test page from the Web Standards Project that checks a web browser's compliance with elements of various web standards, particularly the Document Object Model (DOM) and JavaScript.

If the test is successful, the Acid3 page displays a gradually increasing fraction counter below a series of colored rectangles. The number of subtests passed determines the percentage shown on screen. This percentage does not represent an actual degree of conformance, because the test does not keep track of how many subtests were actually started (100 is assumed). In addition, the browser must render the page exactly as the reference page is rendered in the same browser. Like the text of the Acid2 test, the text of the Acid3 reference rendering is not a bitmap, in order to allow for certain differences in font rendering.

Acid3 was in development from April 2007, and released on 3 March 2008. The main developer was Ian Hickson, a Google employee who also wrote the Acid2 test. Acid2 focused primarily on Cascading Style Sheets (CSS), but this third Acid test also focuses on technologies used on highly interactive websites characteristic of Web 2.0, such as ECMAScript and DOM Level 2. A few subtests also concern Scalable Vector Graphics (SVG), Extensible Markup Language (XML), and data URIs. It includes several elements from the CSS2 recommendation that were later removed in CSS2.1, but reintroduced in World Wide Web Consortium (W3C) CSS3 working drafts that have not made it to candidate recommendations yet.

By April 2017, the updated specifications had diverged from the test such that the latest versions of Google Chrome and Mozilla Firefox no longer pass the test as written. Hickson acknowledges that some aspects of the test were controversial and has written that the test "no longer reflects the consensus of the Web standards it purports to test, especially when it comes to issues affecting mobile browsers".

Amaya (web editor)

Amaya (formerly Amaya World) is a discontinued free and open source WYSIWYG web authoring tool with browsing abilities.

It was created by a structured editor project at INRIA, a French national research institution, and later adopted by the World Wide Web Consortium (W3C) as their testbed for web standards, a role it took over from the Arena web browser. Since the last release in January 2012, INRIA and the W3C have stopped supporting the project and active development has ceased.

Amaya has relatively low system requirements, even in comparison with other web browsers from the era of its active development, so it has been considered a "lightweight" browser.

Ampersand

The ampersand is the logogram &, representing the conjunction "and". It originated as a ligature of the letters et—Latin for "and".

BlueGriffon

BlueGriffon is a WYSIWYG content editor for the World Wide Web. It is based on the discontinued Nvu editor, which in turn is based on the Composer component of the Mozilla Application Suite. Powered by Gecko, the rendering engine of Firefox, it can edit web pages in conformance with web standards. It runs on Microsoft Windows, macOS and Linux.

BlueGriffon complies with the W3C's web standards. It can create and edit pages in accordance with HTML 4, XHTML 1.1, HTML 5 and XHTML 5. It supports CSS 2.1 and all parts of CSS 3 already implemented by Gecko. BlueGriffon also includes SVG-edit, an XUL-based editor for SVG that was originally distributed as a Firefox add-on and was adapted for BlueGriffon.

A version without the CSS Stylesheet editor is free to download and is available on Microsoft Windows, macOS and Linux.

Many enhancements are available via add-ons. Most add-ons such as 'Project Manager', 'CSS Stylesheet editor', 'MathML Editor', 'Word Count' and 'FullScreen view/edit' must be paid for, while only two ('FireFTP' and 'Dictionaries') are free to download.

Browser game

A browser game is a video game that is played via the World Wide Web using a web browser. Browser games can be run using standard web technologies or browser plug-ins. The creation of such games usually involves use of standard web technologies as a frontend and other technologies to provide a backend. Browser games include all video game genres and can be single-player or multiplayer. Browser games are also portable and can be played on multiple different devices, web browsers, and operating systems.

Browser games come in many genres and themes that appeal to both regular and casual players. Multiple browser games have developed beyond the online platform to become large titles or franchises sold physically in stores, in online marketplaces like Steam or XBLA, or in decentralized distribution platforms such as itch.io. Some notable titles are Transformice, Alien Hominid, Bejeweled, Bloons, Club Penguin, Cookie Clicker, and Meat Boy.

Gnip

Gnip, Inc. was a social media API aggregation company. Headquartered in Boulder, Colorado, it provided data from dozens of social media websites via a single API. Calling itself the "Grand Central Station for the social web", Gnip was among the first social media API aggregation services.

Gnip was known as an early influencer in building the real-time web. The company was also instrumental in defining relevant web standards: Gnip's co-founder Eric Marcoullier actively advocated for the adoption of open web standards, and helped define the new Activity Streams format for web data.

Subsequent to a 2010 data licensing agreement with Twitter Inc, Twitter purchased Gnip in April 2014.

HTML5 audio

HTML5 Audio is a subject of the HTML5 specification, incorporating audio input, playback, and synthesis, as well as speech to text, in the browser.
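
A minimal playback sketch using standard HTML5 audio features (the Audio constructor and play()); the file name is hypothetical:

    // Play a short clip with the HTML5 audio API; "chime.mp3" is a hypothetical file.
    const clip = new Audio("chime.mp3");
    clip.volume = 0.5;

    // Most browsers require a user gesture before audio may start.
    document.addEventListener("click", () => {
      clip.play().catch((err) => console.warn("Playback was blocked:", err));
    });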

Tasman (layout engine)

Tasman is a discontinued browser engine developed by Microsoft for inclusion in the Macintosh version of Internet Explorer 5. Tasman was an attempt to improve support for web standards, as defined by the World Wide Web Consortium. At the time of its release, Tasman was seen as the layout engine with the best support for web standards such as HTML and CSS. Internet Explorer for Mac is no longer supported, but newer versions of Tasman are incorporated in some other Microsoft products.

Tantek Çelik led the software team that developed the Tasman engine. Tasman was later used as the layout engine for MSN for Mac OS X and Office 2004 for Mac.

Trident (software)

Trident (also known as MSHTML) is a proprietary browser engine for the Microsoft Windows version of Internet Explorer, developed by Microsoft.

It was first introduced with the release of Internet Explorer version 4.0 in October 1997; it has been steadily upgraded and remains in use today. For versions 7 and 8 of Internet Explorer, Microsoft made significant changes to the Trident layout engine to improve compliance with web standards and add support for new technologies.

In the Microsoft Edge browser, Trident is superseded by its fork, EdgeHTML.

W3C Geolocation API

The W3C Geolocation API is an effort by the World Wide Web Consortium (W3C) to standardize an interface for retrieving the geographical location of a client-side device. It defines a set of ECMAScript-compliant objects that, executing in the client application, provide the device's location by consulting Location Information Servers, which are transparent to the application programming interface (API). The most common sources of location information are IP address, Wi-Fi and Bluetooth MAC address, radio-frequency identification (RFID), Wi-Fi connection location, device Global Positioning System (GPS), and GSM/CDMA cell IDs. The location is returned with a given accuracy depending on the best location information source available.
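
The following sketch shows a typical use of the API; the options and logging are illustrative only:

    // Ask the browser for the device position via the W3C Geolocation API.
    // Accuracy depends on whichever source (GPS, Wi-Fi, IP, ...) the platform used.
    if ("geolocation" in navigator) {
      navigator.geolocation.getCurrentPosition(
        (position: GeolocationPosition) => {
          const { latitude, longitude, accuracy } = position.coords;
          console.log(`Latitude ${latitude}, longitude ${longitude} (±${accuracy} m)`);
        },
        (error: GeolocationPositionError) => {
          console.warn("Position unavailable or permission denied:", error.message);
        },
        { enableHighAccuracy: true, timeout: 10_000 }
      );
    }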

W3C Markup Validation Service

The Markup Validation Service is a validator by the World Wide Web Consortium (W3C) that allows Internet users to check HTML and XHTML documents for well-formed markup. Markup validation is an important step towards ensuring the technical quality of web pages. However, it is not a complete measure of web standards conformance. Though W3C validation is important for browser compatibility and site usability, it has not been confirmed what effect it has on search engine optimization.

WebAssembly

WebAssembly (often shortened to Wasm) is a set of standards that define a portable binary code format, a corresponding assembly-like text language, and interfaces for the environment-specific programs into which a module may be embedded and with which it may interact. It was initially developed to improve the performance of JavaScript applications and to be used inside web browsers, but it is not constrained to them and can be embedded in other environments as well.

Wasm does not replace JavaScript. Tools such as the Emscripten compiler can compile C++ (and other input languages) into a binary file that runs in the same sandbox as regular script code, though as of 2019 it does not yet have full DOM access. Emscripten provides bindings for several commonly used environment interfaces such as WebGL.
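
A minimal embedding sketch from the JavaScript side, assuming a hypothetical module file and exported function:

    // Fetch, compile and instantiate a Wasm module; "module.wasm" and its
    // exported function "add" are hypothetical.
    async function runWasm(): Promise<void> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("module.wasm"),
        {} // imports the module expects, e.g. environment functions
      );
      const add = instance.exports.add as (a: number, b: number) => number;
      console.log(add(2, 3)); // 5, assuming the module exports such a function
    }

    runWasm();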

The World Wide Web Consortium (W3C) maintains the standard with contributions from Mozilla, Microsoft, Google, and Apple.

WebGL

WebGL (Web Graphics Library) is a JavaScript API for rendering interactive 2D and 3D graphics within any compatible web browser without the use of plug-ins. WebGL is fully integrated with other web standards, allowing GPU-accelerated usage of physics and image processing and effects as part of the web page canvas. WebGL elements can be mixed with other HTML elements and composited with other parts of the page or page background. WebGL programs consist of control code written in JavaScript and shader code that is written in OpenGL ES Shading Language (GLSL ES), a language similar to C or C++, and is executed on a computer's graphics processing unit (GPU).
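
A minimal sketch of the JavaScript control-code side, assuming a canvas element with the id "scene"; it only obtains a context and clears it to a solid colour:

    // Obtain a WebGL rendering context and clear the canvas on the GPU.
    const canvas = document.querySelector<HTMLCanvasElement>("#scene");
    const gl = canvas?.getContext("webgl");
    if (gl) {
      gl.clearColor(0.1, 0.2, 0.3, 1.0); // RGBA components in the range 0..1
      gl.clear(gl.COLOR_BUFFER_BIT);
    } else {
      console.warn("WebGL is not supported in this browser.");
    }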

WebGL is designed and maintained by the non-profit Khronos Group.

WebRTC

WebRTC (Web Real-Time Communication) is a free, open-source project that provides web browsers and mobile applications with real-time communication (RTC) via simple application programming interfaces (APIs). It allows audio and video communication to work inside web pages by allowing direct peer-to-peer communication, eliminating the need to install plugins or download native apps. Supported by Apple, Google, Microsoft, Mozilla, and Opera, WebRTC is being standardized through the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF).

Its mission is to "enable rich, high-quality RTC applications to be developed for the browser, mobile platforms, and IoT devices, and allow them all to communicate via a common set of protocols". The reference implementation is released as free software under the terms of a BSD license. OpenWebRTC provides another free implementation based on the multimedia framework GStreamer. JavaScript inventor Brendan Eich called it a "new front in the long war for an open and unencumbered web".
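
The sketch below shows the shape of the browser API; the STUN server address is a placeholder, and the signalling step (exchanging the offer, answer and ICE candidates between peers) is application-defined and omitted:

    // Capture local media and prepare a peer connection with the WebRTC APIs.
    async function startCall(): Promise<RTCPeerConnection> {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
      const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.example.org" }] });
      for (const track of stream.getTracks()) {
        pc.addTrack(track, stream); // send local audio/video to the remote peer
      }
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      // The offer would now be sent to the remote peer over a signalling channel.
      return pc;
    }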

WebUSB

WebUSB is a proposed JavaScript application programming interface (API) standard for securely providing access to USB devices from web pages. It was published by the Web Platform Incubator Community Group. As of September 2018, it was still in Editor's Draft status, and the only web browser to support it was Google Chrome.

WebUSB was enabled by default in Chrome 61 on 5 September 2017, after which privacy and security concerns were raised.
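
A sketch of how a page might request a device under this proposal; the vendor ID is a placeholder, and the cast reflects that WebUSB typings are not part of the default TypeScript DOM library:

    // Request and open a USB device; requestDevice must be triggered by a user gesture.
    const usb = (navigator as any).usb;

    async function connectUsbDevice(): Promise<void> {
      if (!usb) {
        console.warn("WebUSB is not available in this browser.");
        return;
      }
      const device = await usb.requestDevice({ filters: [{ vendorId: 0x1234 }] }); // placeholder vendor ID
      await device.open();
      console.log(`Opened ${device.productName ?? "USB device"}`);
    }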

Web Standards Project

The Web Standards Project (WaSP) was a group of professional web developers dedicated to disseminating and encouraging the use of the web standards recommended by the World Wide Web Consortium, along with other groups and standards bodies.

Founded in 1998, the Web Standards Project campaigned for standards that reduced the cost and complexity of development while increasing the accessibility and long-term viability of any document published on the Web. WaSP worked with browser companies, authoring tool makers, and peers to encourage them to use these standards, since they "are carefully designed to deliver the greatest benefits to the greatest number of web users". The group disbanded in 2013.

Web worker

A web worker, as defined by the World Wide Web Consortium (W3C) and the Web Hypertext Application Technology Working Group (WHATWG), is a JavaScript script executed from an HTML page that runs in the background, independently of user-interface scripts that may also have been executed from the same HTML page. Web workers are often able to utilize multi-core CPUs more effectively.

The W3C and WHATWG envision web workers as long-running scripts that are not interrupted by user-interface scripts (scripts that respond to clicks or other user interactions). Keeping such workers from being interrupted by user activities should allow web pages to remain responsive at the same time as they are running long tasks in the background.

The simplest use of workers is for performing a computationally expensive task without interrupting the user interface.
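
A minimal sketch of that pattern; "worker.js" is a hypothetical script holding the worker's code:

    // Run an expensive computation off the main thread with a web worker.
    const worker = new Worker("worker.js");

    worker.onmessage = (event: MessageEvent) => {
      console.log("Result from worker:", event.data);
    };

    worker.postMessage({ numbers: [3, 1, 4, 1, 5, 9] }); // data is structured-cloned to the worker

    // Inside worker.js, the script would respond with something like:
    //   self.onmessage = (e) => self.postMessage(e.data.numbers.reduce((a, b) => a + b, 0));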

The W3C and the WHATWG are currently in the process of developing a definition for an application programming interface (API) for web workers.

XMLHttpRequest

XMLHttpRequest (XHR) is an API in the form of an object whose methods transfer data between a web browser and a web server. The object is provided by the browser's JavaScript environment. In particular, retrieval of data from XHR for the purpose of continually modifying a loaded web page is the underlying concept of Ajax design. Despite the name, XHR can be used with protocols other than HTTP, and data can be in the form of not only XML but also JSON, HTML or plain text.

WHATWG maintains an XHR standard as a living document. Ongoing work at the W3C to create a stable specification is based on snapshots of the WHATWG standard.
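
A sketch of a typical asynchronous request; the endpoint is hypothetical:

    // Issue an asynchronous GET request with XMLHttpRequest.
    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/items", true); // true = asynchronous
    xhr.responseType = "json";           // the payload need not be XML

    xhr.onload = () => {
      if (xhr.status >= 200 && xhr.status < 300) {
        // Typical Ajax pattern: update part of the loaded page with the response.
        console.log("Received:", xhr.response);
      } else {
        console.warn("Request failed with status", xhr.status);
      }
    };
    xhr.onerror = () => console.warn("Network error");
    xhr.send();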


This page is based on a Wikipedia article written by contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.