Join the discussion via video-dev.org in #hlsjs (our Slack channel)
hls.js is a JavaScript library that implements an HTTP Live Streaming client. It relies on HTML5 video and MediaSource Extensions for playback.
It works by transmuxing MPEG-2 Transport Stream and AAC/MP3 streams into ISO BMFF (MP4) fragments. Transmuxing is performed asynchronously using a Web Worker when available in the browser. hls.js also supports HLS + fmp4, as announced during WWDC2016.
hls.js works directly on top of a standard HTML <video> element.
hls.js is written in ECMAScript6 (*.js) and TypeScript (*.ts) (a strongly typed superset of ES6), and transpiled to ECMAScript5 using the TypeScript compiler.
Modules written in TS and plain JS/ES6 can be interdependent and imported/required by each other.
Webpack is used to build the distro bundle and serve the local development environment.
- API and usage docs, with code examples
- Auto-Generated Docs (Latest Release)
- Auto-Generated Docs (Master)
Note: you can access the docs for a particular version via "https://github.com/video-dev/hls.js/blob/deployments/README.md"
https://hls-js.netlify.com/demo
https://hls-js-dev.netlify.com/demo
Find the commit on https://github.com/video-dev/hls.js/blob/deployments/README.md.
hls.js is only compatible with browsers supporting MediaSource extensions (MSE) API with 'video/MP4' mime-type inputs.
Hls.js is supported on:
- Chrome for Android 39+
- Chrome for Desktop 39+
- Firefox for Android 41+
- Firefox for Desktop 42+
- IE11+ for Windows 8.1+
- Edge for Windows 10+
- Opera for Desktop
- Vivaldi for Desktop
- Safari for Mac 8+ (beta)
- Safari for ipadOS 13+
Please note: iOS Safari on iPhone does not support the MediaSource API. The same limitation applies to all other browsers on iOS and to apps using UIWebView and WKWebView, since they all rely on the same WebKit engine.
Safari browsers (iOS, iPadOS, and macOS) have built-in HLS support: an HLS manifest URL can be set directly as the <video> element's source. See the Getting Started example below to run the appropriate feature detection and choose between hls.js and the built-in HLS support.
When a platform has neither MediaSource nor native HLS support, the browser cannot play HLS.
Keep in mind that if the intention is to support HLS on multiple platforms, beyond those compatible with hls.js, the HLS streams need to strictly follow the specifications of RFC8216, especially if apps, smart TVs, and set-top boxes are to be supported.
Find a support matrix of the MediaSource API here: https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<!-- Or if you want a more recent alpha version -->
<!-- <script src="https://cdn.jsdelivr.net/npm/hls.js@alpha"></script> -->
<video id="video"></video>
<script>
var video = document.getElementById('video');
var videoSrc = 'https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8';
if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource(videoSrc);
  hls.attachMedia(video);
}
// hls.js is not supported on platforms that do not have Media Source
// Extensions (MSE) enabled.
//
// When the browser has built-in HLS support (check using `canPlayType`),
// we can provide an HLS manifest (i.e. .m3u8 URL) directly to the video
// element through the `src` property. This is using the built-in support
// of the plain video element, without using hls.js.
//
// Note: it would be more normal to wait on the 'canplay' event below; however,
// on Safari (where you are most likely to find built-in HLS support) the
// video.src URL must be on the user-driven white-list before a 'canplay'
// event will be emitted; the last video event that can be reliably
// listened-for when the URL is not on the white-list is 'loadedmetadata'.
else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = videoSrc;
}
</script>
Note that the example code above checks for hls.js support first and then falls back to checking whether the browser natively supports HLS. To check for native browser support first and then fall back to hls.js, swap these conditionals.
See this comment to understand some of the tradeoffs.
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<!-- Or if you want a more recent alpha version -->
<!-- <script src="https://cdn.jsdelivr.net/npm/hls.js@alpha"></script> -->
<video id="video"></video>
<script>
var video = document.getElementById('video');
var videoSrc = 'https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8';
//
// First check for native browser HLS support
//
if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = videoSrc;
  //
  // If no native HLS support, check if hls.js is supported
  //
} else if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource(videoSrc);
  hls.attachMedia(video);
}
</script>
Video is controlled through the HTML <video> element's HTMLVideoElement methods, events, and optional UI controls (<video controls>).
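For example, a minimal sketch (reusing the hls and video variables from the Getting Started example above; event and payload names as documented in the hls.js API docs) that starts playback once the manifest has been parsed:

// Wait until hls.js has parsed the manifest, then start playback.
// Autoplay may still be blocked by the browser's autoplay policy.
hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
  console.log('manifest loaded, found ' + data.levels.length + ' quality level(s)');
  video.play();
});

// The media element itself still emits the usual HTMLVideoElement events.
video.addEventListener('ended', function () {
  console.log('playback finished');
});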
The following players integrate hls.js for HLS playback:
- JW Player
- Akamai Adaptive Media Player (AMP)
- Clappr
- Flowplayer through flowplayer-hlsjs
- MediaElement.js
- Videojs through Videojs-hlsjs
- Videojs through videojs-hls.js. hls.js is integrated as a SourceHandler -- new feature in Video.js 5.
- Videojs through videojs-contrib-hls.js. Production ready plug-in with full fallback compatibility built-in.
- Fluid Player
- OpenPlayerJS, as part of the OpenPlayer project
- CDNBye, a p2p engine for hls.js powered by WebRTC Datachannel.
Chrome/Firefox extensions made by gramk, playing hls from the address bar and m3u8 links:
- Chrome native-hls
- Firefox native-hls
No external JS libs are needed. Prepackaged builds are included with each release.
Hls.js can be installed as a dependency using npm:
npm install hls.js
or for the version from master (canary)
npm install hls.js@canary
Either directly include dist/hls.js or dist/hls.min.js
Or type
npm install --save hls.js
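If installed from npm and used with a bundler, the library can also be imported as a module; a minimal sketch mirroring the script-tag example above:

import Hls from 'hls.js';

var video = document.getElementById('video');
var videoSrc = 'https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8';

if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource(videoSrc);
  hls.attachMedia(video);
}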
All HLS resources must be delivered with CORS headers permitting GET requests.
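If the stream additionally requires cookies or other credentials, requests can be customized through the xhrSetup config hook; a hedged sketch (the withCredentials flag is only needed for credentialed CORS setups, and the server must then also send Access-Control-Allow-Credentials):

var hls = new Hls({
  // Called before each playlist, fragment and key request is sent.
  xhrSetup: function (xhr, url) {
    xhr.withCredentials = true; // send cookies along with CORS requests
  },
});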
- VoD & Live playlists
- DVR support on Live playlists
- fragmented MP4 container (beta)
- MPEG-2 TS container
- ITU-T Rec. H.264 and ISO/IEC 14496-10 Elementary Stream
- ISO/IEC 13818-7 ADTS AAC Elementary Stream
- ISO/IEC 11172-3 / ISO/IEC 13818-3 (MPEG-1/2 Audio Layer III) Elementary Stream
- Packetized metadata (ID3v2.3.0) Elementary Stream
- AAC container (audio only streams)
- MPEG Audio container (MPEG-1/2 Audio Layer III audio only streams)
- Timed Metadata for HTTP Live Streaming (in ID3 format, carried in MPEG-2 TS)
- AES-128 decryption
- SAMPLE-AES decryption (only supported if using MPEG-2 TS container)
- Encrypted media extensions (EME) support for DRM (digital rights management)
- Widevine CDM (beta/experimental) (see Shaka-package test-stream in demo)
- CEA-608/708 captions
- WebVTT subtitles
- Alternate Audio Track Rendition (Master Playlist with Alternative Audio) for VoD and Live playlists
- Adaptive streaming
  - Manual & Auto Quality Switching
    - 3 Quality Switching modes are available (controllable through API means; see the sketch after this list)
      - Instant switching (immediate quality switch at current video position)
      - Smooth switching (quality switch for next loaded fragment)
      - Bandwidth conservative switching (quality switch change for next loaded fragment, without flushing the buffer)
    - In Auto-Quality mode, emergency switch down in case bandwidth is suddenly dropping to minimize buffering.
- Accurate Seeking on VoD & Live (not limited to fragment or keyframe boundary)
- Ability to seek in buffer and back buffer without redownloading segments
- Built-in Analytics
- All internal events can be monitored (Network Events, Video Events)
- Playback session metrics are also exposed
- Resilience to errors
- Retry mechanism embedded in the library
- Recovery actions can be triggered to fix fatal media or network errors
- Redundant/Failover Playlists
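As referenced in the quality-switching items above, here is a small sketch of how the three switching modes map onto the hls.js API (level properties as documented in the API docs; hls is an existing instance):

// List the parsed quality levels (heights come from the multivariant playlist).
console.log(hls.levels.map(function (level) { return level.height + 'p'; }));

// Pick one of the following depending on the desired switching behaviour:

// Instant switching: flushes the buffer and switches at the current position.
hls.currentLevel = 2;

// Smooth switching: applied from the next loaded fragment onwards.
hls.nextLevel = 2;

// Bandwidth-conservative switching: next loaded fragment, without flushing the buffer.
hls.loadLevel = 2;

// Setting -1 hands control back to automatic (ABR-driven) quality selection.
hls.currentLevel = -1;
console.log('auto level enabled:', hls.autoLevelEnabled);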
For details on the HLS format and these tags' meanings, see https://tools.ietf.org/html/draft-pantos-hls-rfc8216bis-08
- #EXT-X-STREAM-INF:<attribute-list>
  <URI>
- #EXT-X-MEDIA:<attribute-list>
- #EXT-X-SESSION-DATA:<attribute-list>
The following properties are added to their respective variants' attribute list but are not implemented in their selection and playback: VIDEO-RANGE and HDCP-LEVEL (See #2489).
- #EXTM3U
- #EXT-X-VERSION=<n>
- #EXTINF:<duration>,[<title>]
- #EXT-X-ENDLIST
- #EXT-X-MEDIA-SEQUENCE=<n>
- #EXT-X-TARGETDURATION=<n>
- #EXT-X-DISCONTINUITY
- #EXT-X-DISCONTINUITY-SEQUENCE=<n>
- #EXT-X-BYTERANGE=<n>[@<o>]
- #EXT-X-MAP:<attribute-list>
- #EXT-X-KEY:<attribute-list> (METHOD=SAMPLE-AES is only supported with MPEG-2 TS segments)
- #EXT-X-PROGRAM-DATE-TIME:<attribute-list>
- #EXT-X-START:TIME-OFFSET=<n>
- #EXT-X-SERVER-CONTROL:<attribute-list>
- #EXT-X-PART-INF:PART-TARGET=<n>
- #EXT-X-PART:<attribute-list>
- #EXT-X-PRELOAD-HINT:<attribute-list>
- #EXT-X-SKIP:<attribute-list>
- #EXT-X-RENDITION-REPORT:<attribute-list>
The following tags are added to their respective fragment's attribute list but are not implemented in streaming and playback.
- #EXT-X-DATERANGE:<attribute-list> (Not added to metadata TextTracks. See #2218)
- #EXT-X-BITRATE (Not used in ABR controller)
- #EXT-X-GAP (Not implemented. See #2940)
For a complete list of issues, see "Top priorities" in the Release Planning and Backlog project tab. Codec support is dependent on the runtime environment (for example, not all browsers on the same OS support HEVC).
- Multiple #EXT-X-MAP tag support #2279
- CMAF CC support #2623
- Emsg Inband Timed Metadata for FMP4 (ID3 within Emsgv1) in "metadata" TextTracks #2360
- #EXT-X-DATERANGE in "metadata" TextTracks #2218
- #EXT-X-GAP filling #2940
- #EXT-X-I-FRAME-STREAM-INF I-frame Media Playlist files
- SAMPLE-AES with fmp4, aac, mp3, vtt... segments (MPEG-2 TS only)
- Advanced variant selection based on runtime media capabilities (See issues labeled media-capabilities)
- MP3 elementary stream audio in IE and Edge (<=18) on Windows 10 (See #1641 and Microsoft answers forum)
You can safely require this library in Node and absolutely nothing will happen. A dummy object is exported so that requiring the library does not throw an error. Hls.js is not instantiable in Node.js. See #1841 for more details.
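A minimal sketch of guarding instantiation in code that may also run server-side (the typeof window check is an assumption about the host environment, not part of the library):

// Requiring hls.js in Node returns a dummy object and does not throw.
const Hls = require('hls.js');

// Only attempt to use the instance API in a browser context with MSE support.
if (typeof window !== 'undefined' && Hls.isSupported()) {
  const hls = new Hls();
}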
hls.js is released under the Apache 2.0 License.
Pull requests are welcome. Here is a quick guide on how to start.
- First, check out the repository and install the required dependencies:
git clone https://github.com/video-dev/hls.js.git
# setup dev environment
cd hls.js
npm install
# runs dev-server for demo page (recompiles on file-watch, but doesn't write to actual dist fs artifacts)
npm run dev
# lint
npm run lint
- Use EditorConfig or at least stay consistent with the file formats defined in the .editorconfig file.
- Develop in a topic branch, not master.
- Prettier should run automatically in the pre-commit hook, but if it doesn't, run npm run prettier.
After cloning or pulling from the repository, make sure your local node_modules are up-to-date with the package deps:
npm install
Build all flavors (suitable for prod-mode/CI):
npm install
npm run build
Only debug-mode artifacts:
npm run build:debug
Build and watch (for customized dev setups where you'll want to host through another server than webpack's - for example in a sub-module/project):
npm run build:watch
Only specific flavor (known configs are: debug, dist, light, light-dist, demo):
npm run build -- --env dist # replace "dist" by other configuration name, see above ^
Note: The "demo" config is always built.
NOTE: hls.light.*.js dist files do not include EME, subtitles, or alternate-audio support. In addition, the following types are not available in the light build:
AudioStreamController
AudioTrackController
CuesInterface
EMEController
SubtitleStreamController
SubtitleTrackController
TimelineController
Run linter:
npm run lint
Run linter with auto-fix mode:
npm run lint:fix
Run linter with errors only (no warnings)
npm run lint:quiet
Run all tests at once:
npm test
Run unit tests:
npm run test:unit
Run unit tests in watch mode:
npm run test:unit:watch
Run functional (integration) tests:
npm run test:func
Click here for details.