The frame rate for motion picture film cameras was typically 24 frames per second (fps), with each frame flashed multiple times during projection to prevent flicker. Analog television and video employed interlacing, in which only half of the image (known as a video field) was recorded and refreshed at a time, but at twice the rate allowed for progressive video of the same bandwidth, resulting in smoother playback; progressive video, by contrast, works more like celluloid film. The field rate of analog television and video systems was typically 50 or 60 fields per second. The use of frame rates higher than 24 fps for feature motion pictures, and higher than 30 fps for other applications, is an emerging trend in the 21st century.
In early cinema history, there was no standard frame rate. Thomas Edison's early films were shot at 40 fps, while the Lumière Brothers used 16 fps. The variation was partly due to the use of a hand crank rather than a motor: inconsistent cranking of the film through the camera produced variable frame rates. After the introduction of synchronized sound recording, 24 fps became the industry standard frame rate for the capture and projection of motion pictures. It was chosen because it was the lowest frame rate that produced adequate sound quality; since film stock was expensive, using the lowest workable frame rate consumed the least film.
A few film formats have experimented with frame rates higher than the 24 fps standard. The original 3-strip Cinerama features of the 1950s ran at 26 fps. The first two Todd-AO 70mm features, Oklahoma! (1955) and Around the World in 80 Days (1956) were shot and projected at 30 fps. Douglas Trumbull's 70mm Showscan film format operated at 60 fps.
The IMAX HD (high definition in this case meaning high definition film stock, as 70mm IMAX is the highest resolution motion picture image in the world) film Momentum, presented at Seville Expo '92, was shot and projected at 48 fps. IMAX HD has also been used in film-based theme park attractions, including Disney's Soarin' Over California.
Digital Cinema Initiatives has published a document outlining recommended practice for high frame rate digital cinema. This document outlines the frame rates and resolutions that can be used in high frame rate digital theatrical presentations with currently available equipment.
Peter Jackson's The Hobbit film series, beginning with The Hobbit: An Unexpected Journey in December 2012, used a shooting and projection frame rate of 48 frames per second, becoming the first feature film with a wide release to do so. Its 2013 sequel, The Hobbit: The Desolation of Smaug, and 2014 sequel, The Hobbit: The Battle of the Five Armies, followed suit. All three films also have versions converted and projected at 24 fps.
In 2016, Ang Lee released Billy Lynn's Long Halftime Walk. Unlike The Hobbit trilogy, which used 48 frames per second, the picture shot and projected selected scenes at 120 frames per second, five times the 24 frames per second standard used in Hollywood.
RocketJump's Video Game High School was the first web series to use HFR, and the first content shot and edited in a mixed frame rate. The series, which follows the lives of high school students in a world where gamers are revered as pro-athletes, adopted HFR in its second season, using the standard 24 frames per second for real world interactions, and 48 frames per second for "in-game" action sequences. Although the content is available on YouTube and Netflix, it can only be viewed in mixed frame rate using a special player on RocketJump's website.
Some media players are capable of showing HFR content, and almost all computers and smart devices can handle the format as well. In recent years, some televisions have gained the ability to take normal 24 fps video and "convert" it to HFR content by interpolating the motion of the picture, effectively creating new computer-generated frames between each pair of key frames and running them at a higher refresh rate. Some computer programs can do this as well, typically with higher precision and better quality, as PC computing power has grown.
Motion interpolation may cause artifacts when the computer "guesses" an intermediate frame incorrectly.
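A minimal sketch of the interpolation idea, assuming simple linear cross-fading between two key frames; real televisions use motion-compensated interpolation, which estimates per-pixel motion vectors rather than blending, and the function name here is illustrative only.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Generate n_new intermediate frames between two key frames
    by linear cross-fading (alpha blending)."""
    frames = []
    for i in range(1, n_new + 1):
        alpha = i / (n_new + 1)
        blended = (1 - alpha) * frame_a + alpha * frame_b
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Doubling 24 fps to 48 fps needs one new frame between each pair:
a = np.zeros((4, 4), dtype=np.uint8)        # dark key frame
b = np.full((4, 4), 200, dtype=np.uint8)    # bright key frame
mid = interpolate_frames(a, b, 1)[0]
print(mid[0, 0])  # 100: halfway between 0 and 200
```

Blending like this smooths motion but cannot reconstruct occluded detail, which is one source of the artifacts described above.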
Billy Lynn's Long Halftime Walk is a 2016 war drama film directed by Ang Lee and written by Jean-Christophe Castelli, based on the 2012 eponymous novel by Ben Fountain. The film stars Joe Alwyn, Kristen Stewart, Garrett Hedlund, Vin Diesel, Steve Martin, and Chris Tucker. Principal photography began in early April 2015 in Georgia. The film is a co-production between the United States, United Kingdom, and China. The film had its world premiere at the 54th New York Film Festival on October 14, 2016, and was theatrically released in the United States on November 11, 2016, by TriStar Pictures. It had high production costs associated with being the first feature film shot at an extra-high frame rate of 120 frames per second, further complicated by the 3D format and 4K resolution. It received mixed reviews from critics and was a box-office bomb, grossing just $30 million worldwide against its $40 million budget.

Canon EOS 5DS
The Canon EOS 5DS and EOS 5DS R (known as the EOS 5Ds and EOS 5Ds R in Japan) are two closely related digital SLR cameras announced by Canon on February 6, 2015. Both are professional full-frame cameras with 50.6-megapixel sensors, the highest of any full-frame camera at the time of announcement. The only difference between the two models is that the sensor of the "R" version includes an optical filter that cancels out the effects of a standard optical low-pass filter. This distinction is roughly similar to that between Nikon's now-replaced D800 and D800E (with the E having a self-cancelling filter). Canon stated that the 5DS and 5DS R would not replace the older EOS 5D Mark III; instead, both occupy new positions in Canon's DSLR camera lineup.
At the time of announcement, estimated prices were US$3,699.00 for the EOS 5DS and US$3,899.00 for the EOS 5DS R, with availability through authorized Canon retailers announced for June 2015. Despite the record-high pixel count and the associated storage and processing demands, these cameras do not shoot 4K video or high-frame-rate 1080p video. Full-size demosaicked JPEG files from these cameras occupy approximately 20 megabytes and exceed 8K resolution.

Doppler echocardiography
Doppler echocardiography is a procedure that uses Doppler ultrasonography to examine the heart. An echocardiogram uses high frequency sound waves to create an image of the heart while the use of Doppler technology allows determination of the speed and direction of blood flow by utilizing the Doppler effect.
An echocardiogram can, within certain limits, produce accurate assessment of the direction of blood flow and the velocity of blood and cardiac tissue at any arbitrary point using the Doppler effect. One of the limitations is that the ultrasound beam should be as parallel to the blood flow as possible. Velocity measurements allow assessment of cardiac valve areas and function, any abnormal communications between the left and right side of the heart, any leaking of blood through the valves (valvular regurgitation), calculation of the cardiac output, and calculation of the E/A ratio (a measure of diastolic dysfunction). Contrast-enhanced ultrasound, using gas-filled microbubble contrast media, can be used to improve velocity or other flow-related medical measurements.
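The relationship between the measured frequency shift and the flow velocity can be sketched with the standard Doppler equation; the function name and example numbers below are illustrative, not part of any clinical system.

```python
import math

def doppler_velocity(f_shift_hz, f_transmit_hz, angle_deg=0.0, c=1540.0):
    """Estimate flow velocity (m/s) from a measured Doppler shift.

    Standard Doppler equation: v = c * df / (2 * f0 * cos(theta)),
    where c is the speed of sound in soft tissue (~1540 m/s) and theta
    is the angle between the ultrasound beam and the flow direction."""
    return (c * f_shift_hz) / (2.0 * f_transmit_hz * math.cos(math.radians(angle_deg)))

# A 2 kHz shift on a 3.85 MHz transducer, beam parallel to flow:
print(round(doppler_velocity(2000, 3.85e6), 3))      # 0.4
# The same shift at a 60-degree beam-to-flow angle implies a true
# velocity twice as high (cos 60 deg = 0.5):
print(round(doppler_velocity(2000, 3.85e6, 60), 3))  # 0.8
```

The cosine term makes explicit why the beam should be kept as parallel to the flow as possible: as the angle approaches 90 degrees the correction factor blows up and small angle errors dominate the estimate.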
An advantage of Doppler echocardiography is that it can be used to measure blood flow within the heart without invasive procedures such as cardiac catheterization.
In addition, with slightly different filter/gain settings, the method can measure tissue velocities by tissue Doppler echocardiography. The combination of flow and tissue velocities can be used for estimating left ventricular filling pressure, although only under certain conditions. Although "Doppler" has become synonymous with "velocity measurement" in medical imaging, in many cases it is not the frequency shift (Doppler shift) of the received signal that is measured, but the phase shift (when the received signal arrives). The calculated result, however, ends up identical.
This procedure is frequently used to examine children's hearts for heart disease because there is no age or size requirement.

Encounter (video game)
Encounter (also known as Encounter!) is a first-person shoot 'em up game originally released in 1983 for the Atari 8-bit family and Commodore 64, programmed by Paul Woakes for Novagen Software. It was published by Novagen in the UK and Europe and by Synapse in North America. The gameplay is similar to that of Atari's 1980 arcade game Battlezone, but with scaled sprites instead of wireframe 3D graphics. Encounter is notable for moving large, screen-filling objects at a high frame rate. Woakes later developed Mercenary. Versions for the Amiga and Atari ST computers followed much later, in 1991.

Frame rate
Frame rate (expressed in frames per second or fps) is the frequency (rate) at which consecutive images, called frames, appear on a display. The term applies equally to film and video cameras, computer graphics, and motion capture systems. Frame rate may also be called the frame frequency and be expressed in hertz.

HFR
HFR or Hfr may refer to:
Hedge fund replication
High frame rate
Hfr cell, or Hfr strain, a bacterium with a conjugative plasmid

High-motion
High motion is the characteristic of video or film footage displayed at a sufficiently high frame rate (or field rate) that moving images do not blur or strobe even when tracked closely by the eye. The most common forms of high motion are NTSC and PAL video (i.e., "normal television") at their native display rates. Movie film (at the standard 24 frame/s) does not portray high motion even when shown on television monitors.

Interface Region Imaging Spectrograph
The Interface Region Imaging Spectrograph (IRIS), also called Explorer 94, is a NASA solar observation satellite. The mission was funded through the Small Explorer program to investigate the physical conditions of the solar limb, particularly the chromosphere of the Sun. The spacecraft consists of a satellite bus and spectrometer built by the Lockheed Martin Solar and Astrophysics Laboratory (LMSAL), and a telescope provided by the Smithsonian Astrophysical Observatory. IRIS is operated by LMSAL and NASA's Ames Research Center.
The satellite's instrument is a high-frame-rate ultraviolet imaging spectrometer, providing one image per second at 0.3 arcsecond angular resolution and sub-ångström spectral resolution.
NASA announced on 19 June 2009 that IRIS was selected from six Small Explorer mission candidates for further study, along with the Gravity and Extreme Magnetism (GEMS) space observatory. The spacecraft arrived at Vandenberg Air Force Base, California, on 16 April 2013 and was successfully launched on 27 June 2013 by a Pegasus-XL rocket.

Johnny Lee (computer scientist)
Johnny Chung Lee (born 1979) is a computer engineer famous for his inventions related to the Wii Remote. He works in human-computer interaction.
He earned a B.S. degree in computer engineering at the University of Virginia in 2001.
Lee completed his Ph.D. at Carnegie Mellon University's Human-Computer Interaction Institute.
In 2008, Lee posted video demos and sample code on his website taking advantage of the high-resolution (1024×768 pixels), high-frame-rate (100 Hz) IR camera built into the controller of the Wii video game console, the Wii Remote, for low-cost multipoint interactive whiteboards and for head tracking for desktop VR displays. This was the subject of his presentation at the prestigious TED conference that year, where he demonstrated several such applications. The WiimoteProject forum has become the discussion, support, and sharing site for Lee's Wii Remote projects and other newer developments.
He was named one of the world's top 35 innovators under 35 (TR35) in 2008. Afterwards, Lee was hired by Microsoft to work on its Kinect project, and he was later hired by Google to work on its Project Tango. Lee's other projects include an interactive whiteboard, 3D head tracking, finger tracking, and a DIY telepresence robot. YouTube videos created for Lee's projects have received over 10 million views, with the Wii Remote head tracking project being the most highly rated video on YouTube of all time for more than a week in January 2008. He also demonstrated several of these applications at events such as TED, and has been featured several times on popular websites such as Slashdot, Gizmodo, Hacked Gadgets, Popular Science, Wired Blogs, and Engadget. Various magazines, newspapers, and television programs have featured interviews with Lee as well. Lee has also made invited appearances at events such as Maker Faire.
Electronic Arts had initially stated that Lee's Wii Remote head tracking technology would appear as an Easter egg in the game Boom Blox, but later announced that the feature had been removed. While Lee was a core member of Microsoft's Kinect development team, he approached Adafruit with the idea of a driver development contest and personally financed it.

List of films with high frame rates
This is a list of films with high frame rates. Only films with a native (without motion interpolation) shooting and projection frame rate of 48 fps or higher, for all or some of their scenes, are included. This is at least double the 24 frames per second (fps) standard used in Hollywood. Several of these films also have versions converted and projected at 24 fps.

Location View
Location View was an interactive website developed by Tokyo-based company LocationView Co. offering registered users a street-level view of selected cities in Japan. It featured 360-degree horizontal and 180-degree vertical panning, zoom, and virtual mobility in which the user could control speed. Unlike other street view services, such as MapJack or Google Maps' Street View, Location View had a high frame rate and seamless transitions between frames, enabling continuous, lifelike motion and surroundings that were animated rather than static. The site was introduced on 14 May 2007 in Japan. At launch, several major cities in Japan, such as Tokyo, Yokohama, Kawasaki, Nagoya, Kyoto, Osaka, and Himeji, were included. It was later expanded to include the suburbs of a number of other Japanese cities. The service was closed down on 27 April 2009.

Nikon D2H
The Nikon D2H is a professional-grade digital single-lens reflex camera introduced by Nikon Corporation on July 22, 2003. It uses Nikon's own JFET-LBCAST sensor with a 4.1-megapixel resolution, and is optimised for sports and action shooting that require a high frame rate. In 2005, the D2H was replaced by the D2Hs, which added new features derived from the 12-megapixel D2X digital SLR. The D2Hs was discontinued after the introduction of the D300 and D3 models.
Like most early Nikon digital SLR cameras, it uses a "DX format" sensor, which applies a crop factor of approximately 1.5× compared to 35 mm film.

P2 (storage media)
P2 (P2 is a short form for "Professional Plug-In") is a professional digital recording solid-state memory storage media format introduced by Panasonic in 2004, and especially tailored to electronic news-gathering (ENG) applications. It features tapeless (non-linear) recording of DV, DVCPRO, DVCPRO25, DVCPRO50, DVCPRO-HD, or AVC-Intra streams on a solid-state flash memory. The P2 card is essentially a RAID of Secure Digital (SD) memory cards with an LSI controller tightly packaged in a die-cast PC Card (formerly PCMCIA) enclosure, so data transfer rate increases as memory capacity increases. The system includes cameras, decks as drop-in replacements for videocassette recorders (VCR), and a special 5.25-inch computer drive for random-access integration with non-linear editing systems (NLE). The cards can also be used directly where a PC card (PCMCIA) slot is available, as in most older notebook computers, as a normal hard disk drive, although a custom software driver must first be loaded.
As of early 2010, P2 cards are available in capacities of 4, 8, 16, 32 and 64 GB. At introduction, P2 cards offered low recording capacity compared to competing, video tape-based formats (a miniDV tape holds roughly 13 GB of data, and an S-size HDCAM tape holds 50 GB). To solve this, camcorders and decks using P2 media employ multiple card slots, with the ability to span the recording over all slots. Cards are recorded in sequence, and when a card is full, it can be swapped out while another card is recording. This limits recording time only by power supply and the available number of cards. If a card is partially full, the deck will record only until it is full. Unlike video tape, old video cannot be recorded over accidentally; old footage must be manually deleted.
P2 cards are of a ruggedized PCMCIA type with the fastest transfer speeds currently available through this format. The card also contains a processor that organizes and safeguards the files, and the case is developed and crafted to "military" specifications (according to Panasonic), making P2 cards tough and reliable.
The first pieces of equipment released by Panasonic which use the P2 format included the AJ-SPX800 (a 2/3" broadcast camcorder for ENG and EFP applications), the studio recorder AJ-SPD850, the AJ-PCD10 offload device (basically, a five-slot PC card reader with USB interface designed to fit a 5-1/4" IT systems bay), and the memory cards themselves – AJ-P2C004 (4 GB) and AJ-P2C002 (2 GB). Panasonic currently ships a wide range of camcorders that support the P2 format, including the professional AG-HVX200 HD handheld camcorder and the high-end, broadcast-professional shoulder-mount AG-HPX500, AJ-HPX2000, and AJ-HPX3000 camcorders. Panasonic has also announced the P2-based AG-HPX170 handheld HD tapeless camcorder; the HPX170 is very similar to the HVX200 and the HVX200A, the main difference being the lack of a video tape drive on the 170. The latest products to feature P2 technology are the recently launched, well-received AJ-HPX2700 and AJ-HPX3700 "VariCam" high-end cameras.
On April 15, 2012, Panasonic introduced the "MicroP2" system, an entirely different format based on SDHC/SDXC conforming to UHS-II (Ultra-High Speed) bus mode. Most (but not all) current P2 products can use MicroP2 (UHS-II) and SDHC/SDXC (UHS-I/UHS-II) cards through a MicroP2 card adapter, with some requiring a firmware update. Some P2 products (such as the AG-HPX500E camera) are not able to use MicroP2 at all.
On February 27, 2014, Panasonic announced a new generation of P2 media, the expressP2 card, designed to accommodate high-frame-rate 1080 HD AVC-ULTRA recording (above 60 fps) as well as 4K capture.

Peter Anderson (cinematographer)
Peter Anderson is a cinematographer, visual effects supervisor, and leading expert on a number of specialized imaging technologies, many of which he helped to develop, including modern 3-D, motion control, large format, high frame rate, and high dynamic range.
Anderson was the staff director of photography at Walt Disney Animation Studios, and he has led visual-effects facilities at Walt Disney Studios and Universal Studios.
Anderson was instrumental in the creation of the theme-park attractions King Kong: 360 3-D, T2 3-D: Battle Across Time, and Captain EO. He has also supervised IMAX 3-D productions, including Cirque du Soleil: Journey of Man and Wild Ocean, and contributed visual effects to an array of projects, including the features U2 3D and Tron and the original incarnations of the television series Battlestar Galactica and Cosmos. In 2014, Anderson received the Academy of Motion Picture Arts and Sciences' prestigious Gordon E. Sawyer Award, a special Oscar awarded to "an individual in the motion picture industry whose technological contributions have brought credit to the industry."

Slow motion
Slow motion (commonly abbreviated as slo-mo or slow-mo) is an effect in film-making whereby time appears to be slowed down. It was invented by the Austrian priest August Musger in the early 20th century.
Typically this style is achieved when each film frame is captured at a rate much faster than it will be played back. When replayed at normal speed, time appears to be moving more slowly. A term for creating slow motion film is overcranking which refers to hand cranking an early camera at a faster rate than normal (i.e. faster than 24 frames per second). Slow motion can also be achieved by playing normally recorded footage at a slower speed. This technique is more often applied to video subjected to instant replay than to film. A third technique that is becoming common using current computer software post-processing (with programs like Twixtor) is to fabricate digitally interpolated frames to smoothly transition between the frames that were actually shot. Motion can be slowed further by combining techniques, interpolating between overcranked frames. The traditional method for achieving super-slow motion is through high-speed photography, a more sophisticated technique that uses specialized equipment to record fast phenomena, usually for scientific applications.
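The overcranking arithmetic above reduces to a simple ratio; the function name and example rates below are illustrative.

```python
def slowdown_factor(capture_fps, playback_fps=24.0):
    """How many times slower motion appears when footage captured at
    capture_fps is played back at playback_fps (overcranking when > 1,
    undercranking / fast motion when < 1)."""
    return capture_fps / playback_fps

# Showscan-style 60 fps footage projected at the 24 fps standard
# appears 2.5x slower than real time:
print(slowdown_factor(60))    # 2.5
# 120 fps capture plays back 5x slower:
print(slowdown_factor(120))   # 5.0
# Cranking slower than normal (undercranking) yields fast motion:
print(slowdown_factor(12))    # 0.5
```

The same ratio explains why digitally interpolated frames can slow motion further: interpolation raises the effective capture rate without changing the playback rate.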
Slow motion is ubiquitous in modern filmmaking. It is used by a diverse range of directors to achieve diverse effects. Some classic subjects of slow-motion include:
Athletic activities of all kinds, to demonstrate skill and style.
To recapture a key moment in an athletic game, typically shown as a replay.
Natural phenomena, such as a drop of water hitting a glass.
Slow motion can also be used for artistic effect, to create a romantic or suspenseful aura or to stress a moment in time. Vsevolod Pudovkin, for instance, used slow motion in a suicide scene in The Deserter, in which a man jumping into a river seems sucked down by the slowly splashing waves. Another example is Face/Off, in which John Woo used the same technique in the movements of a flock of flying pigeons. The Matrix applied the effect to action scenes with distinct success through the use of multiple cameras, as well as by mixing slow motion with live action in other scenes. Japanese director Akira Kurosawa was a pioneer of this technique in his 1954 movie Seven Samurai. American director Sam Peckinpah was another classic lover of slow motion. The technique is especially associated with explosion effect shots and underwater footage.
The opposite of slow motion is fast motion. Cinematographers refer to fast motion as undercranking, since it was originally achieved by cranking a hand-cranked camera slower than normal. It is often used for comic or stylistic effect. Extreme fast motion is known as time-lapse photography; a frame of, say, a growing plant is taken every few hours, and when the frames are played back at normal speed, the plant appears to grow before the viewer's eyes.
The concept of slow motion may have existed before the invention of the motion picture: the Japanese theatrical form Noh employs very slow movements.

The Hobbit (film series)
The Hobbit is a film series consisting of three high fantasy adventure films directed by Peter Jackson. They are based on the 1937 novel The Hobbit by J. R. R. Tolkien, with large portions of the trilogy inspired by the appendices to The Return of the King, which expand on the story told in The Hobbit, as well as new material and characters written especially for the films. Together they act as a prequel to Jackson's The Lord of the Rings film trilogy. The films are subtitled An Unexpected Journey (2012), The Desolation of Smaug (2013), and The Battle of the Five Armies (2014).
The screenplay was written by Fran Walsh, Philippa Boyens, Jackson, and Guillermo del Toro, who was originally chosen to direct before his departure from the project. The films take place in the fictional world of Middle-earth sixty years before the beginning of The Lord of the Rings, and follow hobbit Bilbo Baggins (Martin Freeman), who is convinced by the wizard Gandalf the Grey (Ian McKellen) to accompany thirteen dwarves, led by Thorin Oakenshield (Richard Armitage), on a quest to reclaim the Lonely Mountain from the dragon Smaug (voiced by Benedict Cumberbatch). The films also expand upon certain elements from the novel and other source material, such as Gandalf's investigation at Dol Guldur, and the pursuit of Azog and Bolg, who seek vengeance against Thorin and his kindred.
The films feature an ensemble cast that also includes James Nesbitt, Ken Stott, Evangeline Lilly, Lee Pace and Luke Evans, with several actors reprising their roles from The Lord of the Rings, including Cate Blanchett, Orlando Bloom, Ian Holm, Christopher Lee, Hugo Weaving, Elijah Wood, and Andy Serkis. The films also feature Manu Bennett, Sylvester McCoy, Stephen Fry, Mikael Persbrandt, Barry Humphries, and Lawrence Makoare. Also returning for production, among others, were illustrators John Howe and Alan Lee, art director Dan Hennah, cinematographer Andrew Lesnie, and composer Howard Shore, while props were again crafted by Weta Workshop, with visual effects managed by Weta Digital.
The first film in the series premiered at the Embassy Theatre in Wellington, New Zealand on 28 November 2012. One hundred thousand people lined the red carpet on Courtenay Place, and the entire event was broadcast live on television in New Zealand and streamed over the Internet. The second film of the series premiered at the Dolby Theatre in Los Angeles, California on 2 December 2013. The third and final film premiered at the Odeon Leicester Square in London on 1 December 2014.
The series was one of the highest-grossing film series of all time, and earned more money than The Lord of the Rings trilogy. Although critically considered to be inferior to The Lord of the Rings, it was nominated for various awards and won several, though not as many as its predecessor series.

Ultra HD Forum
Ultra HD Forum is an organization whose goal is to help solve the real-world hurdles in deploying Ultra HD video and thus to help promote UHD deployment. The Ultra HD Forum helps navigate among the standards related to high dynamic range (HDR), high frame rate (HFR), next generation audio (NGA), and wide color gamut (WCG). The Ultra HD Forum is an industry organisation complementary to the UHD Alliance (which maintains consumer-facing logos), covering different aspects of the UHD ecosystem.

WhitestormJS
whs.js is a framework for 3D web apps built with the Three.js technology. It implements a core with a component system and plugin support for fast development of 3D scenes with physics. The engine's physics support is implemented by a custom Physi.js library. The framework provides extended component control and a high frame rate, and uses the Web Workers technology for multithreading.
Whitestorm.js is made available under the MIT license.