Ensure that the done event is triggered even if the segment does not contain audio/video data #224

Open · wants to merge 5 commits into master
1 change: 1 addition & 0 deletions .gitignore
@@ -8,3 +8,4 @@
dist
dist-test
npm-debug.log
samples
4 changes: 2 additions & 2 deletions debug/index.html
@@ -356,7 +356,7 @@ <h3>footer</h3>
prepareSourceBuffer(combined, outputType, function () {
console.log('appending...');
window.vjsBuffer.appendBuffer(bytes);
video.play();
// video.play();

Contributor: Were these meant to be commented out?

Contributor (Author): Oops. Autoplay on an airplane with sound turned on. I can revert this, though I think it's a bit annoying to keep the autoplay.

Member: just video.volume = 0, no? :D

Contributor: All of our automated tests have to be run with video.muted = true; to avoid autoplay restrictions breaking our tests. YMMV.

});
}
});
@@ -379,7 +379,7 @@ <h3>footer</h3>
if ($('#working-active').checked) {
prepareSourceBuffer(function() {
window.vjsBuffer.appendBuffer(bytes);
video.play();
// video.play();
});
}

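Following up on the autoplay discussion above, here is a minimal sketch of the reviewers' suggestion. It is an illustration only, not part of this PR; it reuses the names from debug/index.html (prepareSourceBuffer, combined, outputType, bytes, window.vjsBuffer, video) and simply mutes the element so play() is kept without tripping browser autoplay policies:

```js
// Illustration only, not part of this PR's diff: mute before play() so
// autoplay policies allow playback during debugging and automated tests.
video.muted = true;

prepareSourceBuffer(combined, outputType, function () {
  console.log('appending...');
  window.vjsBuffer.appendBuffer(bytes);

  var playPromise = video.play();
  if (playPromise && typeof playPromise.catch === 'function') {
    // play() returns a Promise in modern browsers; log a rejection rather
    // than letting a blocked autoplay surface as an unhandled error.
    playPromise.catch(function (err) {
      console.log('autoplay prevented:', err);
    });
  }
});
```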
147 changes: 91 additions & 56 deletions lib/mp4/transmuxer.js
@@ -712,10 +712,6 @@ VideoSegmentStream.prototype = new Stream();
* in the source; false to adjust the first segment to start at media timeline start.
*/
CoalesceStream = function(options, metadataStream) {
// Number of Tracks per output segment
// If greater than 1, we combine multiple
// tracks into a single segment
this.numberOfTracks = 0;
this.metadataStream = metadataStream;

options = options || {};
@@ -730,8 +726,12 @@ CoalesceStream = function(options, metadataStream) {
this.keepOriginalTimestamps = options.keepOriginalTimestamps;
}

this.tracks = {
video: null,
audio: null
};

this.pendingTracks = [];
this.videoTrack = null;
this.pendingBoxes = [];
this.pendingCaptions = [];
this.pendingMetadata = [];
@@ -760,12 +760,34 @@ CoalesceStream = function(options, metadataStream) {
this.pendingBytes += output.boxes.byteLength;

if (output.track.type === 'video') {
this.videoTrack = output.track;
this.tracks.video.track = output.track;
}
if (output.track.type === 'audio') {
this.audioTrack = output.track;
this.tracks.audio.track = output.track;
}
};

this.reset = function(flush) {
if (this.tracks.video) {
this.tracks.video.track = null;
if (flush) {
this.tracks.video.flushed = false;
}
}

if (this.tracks.audio) {
this.tracks.audio.track = null;
if (flush) {
this.tracks.audio.flushed = false;
}
}

this.pendingTracks.length = 0;
this.pendingBoxes.length = 0;
this.pendingCaptions.length = 0;
this.pendingBytes = 0;
this.pendingMetadata.length = 0;
};
};

CoalesceStream.prototype = new Stream();
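As a reading aid for the bookkeeping introduced above (an illustration, not part of the diff): each registered output type gets an entry holding the most recent track object and a flushed flag, and reset() clears the per-segment state, only clearing the flushed flags when asked to. The helper name below is hypothetical:

```js
// Simplified sketch of the per-track state this PR introduces (illustration only).
// The Transmuxer hunks further down register one entry per pipeline, e.g.:
//   pipeline.coalesceStream.tracks.video = { track: null, flushed: false };
//   pipeline.coalesceStream.tracks.audio = { track: null, flushed: false };

// reset() drops per-segment data; reset(true) additionally clears the flushed
// flags so the next segment waits on every registered track again.
function resetSketch(coalesce, clearFlushed) {
  ['video', 'audio'].forEach(function (type) {
    var entry = coalesce.tracks[type];
    if (entry) {
      entry.track = null;
      if (clearFlushed) {
        entry.flushed = false;
      }
    }
  });

  coalesce.pendingTracks.length = 0;
  coalesce.pendingBoxes.length = 0;
  coalesce.pendingCaptions.length = 0;
  coalesce.pendingMetadata.length = 0;
  coalesce.pendingBytes = 0;
}
```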
@@ -784,54 +806,64 @@ CoalesceStream.prototype.flush = function(flushSource) {
timelineStartPts = 0,
i;

if (this.pendingTracks.length < this.numberOfTracks) {
if (flushSource !== 'VideoSegmentStream' &&
flushSource !== 'AudioSegmentStream') {
// Return because we haven't received a flush from a data-generating
// portion of the segment (meaning that we have only recieved meta-data
// or captions.)
return;
} else if (this.remuxTracks) {
// Return until we have enough tracks from the pipeline to remux (if we
// are remuxing audio and video into a single MP4)
return;
} else if (this.pendingTracks.length === 0) {
// In the case where we receive a flush without any data having been
// received we consider it an emitted track for the purposes of coalescing
// `done` events.
// We do this for the case where there is an audio and video track in the
// segment but no audio data. (seen in several playlists with alternate
// audio tracks and no audio present in the main TS segments.)
this.emittedTracks++;

if (this.emittedTracks >= this.numberOfTracks) {
this.trigger('done');
this.emittedTracks = 0;
}
return;
}
if (flushSource !== 'VideoSegmentStream' && flushSource !== 'AudioSegmentStream') {
// Return because we haven't received a flush from a data-generating
// portion of the segment (meaning that we have only received metadata
// or captions.)
return;
}

if (flushSource === 'VideoSegmentStream') {
this.tracks.video.flushed = true;
}

if (this.videoTrack) {
timelineStartPts = this.videoTrack.timelineStartInfo.pts;
if (flushSource === 'AudioSegmentStream') {
this.tracks.audio.flushed = true;
}

var waitingOn = {
video: this.tracks.video && !this.tracks.video.flushed,
audio: this.tracks.audio && !this.tracks.audio.flushed
};

var done = !(waitingOn.video || waitingOn.audio);

if (this.remuxTracks && !done) {
// Return until we have enough tracks from the pipeline to remux (if we
// are remuxing audio and video into a single MP4)
return;
}

if (this.pendingTracks.length === 0 && done) {
// In the case where we receive a flush without any data having been
// received, we can still coalesce the `done` event: every registered
// track has flushed even though this one produced no bytes.
// We do this for the case where there is an audio and video track in the
// segment but no audio data. (seen in several playlists with alternate
// audio tracks and no audio present in the main TS segments.)
this.reset(true);
this.trigger('done');
return;
}

if (this.tracks.video && this.tracks.video.track) {
timelineStartPts = this.tracks.video.track.timelineStartInfo.pts;
VIDEO_PROPERTIES.forEach(function(prop) {
event.info[prop] = this.videoTrack[prop];
event.info[prop] = this.tracks.video.track[prop];
}, this);
} else if (this.audioTrack) {
timelineStartPts = this.audioTrack.timelineStartInfo.pts;
} else if (this.tracks.audio && this.tracks.audio.track) {
timelineStartPts = this.tracks.audio.track.timelineStartInfo.pts;
AUDIO_PROPERTIES.forEach(function(prop) {
event.info[prop] = this.audioTrack[prop];
event.info[prop] = this.tracks.audio.track[prop];
}, this);
}

if (this.pendingTracks.length === 1) {
event.type = this.pendingTracks[0].type;
} else {
if (this.remuxTracks && this.tracks.video && this.tracks.audio) {
event.type = 'combined';
} else {
event.type = this.pendingTracks[0].type;
}

this.emittedTracks += this.pendingTracks.length;

initSegment = mp4.initSegment(this.pendingTracks);

// Create a new typed array to hold the init segment
@@ -889,21 +921,15 @@ CoalesceStream.prototype.flush = function(flushSource) {
// it for the first
event.metadata.dispatchType = this.metadataStream.dispatchType;

// Reset stream state
this.pendingTracks.length = 0;
this.videoTrack = null;
this.pendingBoxes.length = 0;
this.pendingCaptions.length = 0;
this.pendingBytes = 0;
this.pendingMetadata.length = 0;
this.reset();

// Emit the built segment
this.trigger('data', event);

// Only emit `done` if all tracks have been flushed and emitted
if (this.emittedTracks >= this.numberOfTracks) {
if (done) {
this.reset(true);
this.trigger('done');
this.emittedTracks = 0;
}
};
/**
@@ -962,7 +988,10 @@ Transmuxer = function(options) {
type: 'audio'
};
// hook up the audio segment stream to the first track with aac data
pipeline.coalesceStream.numberOfTracks++;
pipeline.coalesceStream.tracks.audio = {
track: null,
flushed: false
};
pipeline.audioSegmentStream = new AudioSegmentStream(audioTrack, options);
// Set up the final part of the audio pipeline
pipeline.adtsStream
@@ -1039,7 +1068,10 @@ Transmuxer = function(options) {

// hook up the video segment stream to the first track with h264 data
if (videoTrack && !pipeline.videoSegmentStream) {
pipeline.coalesceStream.numberOfTracks++;
pipeline.coalesceStream.tracks.video = {
track: null,
flushed: false
};
pipeline.videoSegmentStream = new VideoSegmentStream(videoTrack, options);

pipeline.videoSegmentStream.on('timelineStartInfo', function(timelineStartInfo) {
@@ -1073,7 +1105,10 @@ Transmuxer = function(options) {

if (audioTrack && !pipeline.audioSegmentStream) {
// hook up the audio segment stream to the first track with aac data
pipeline.coalesceStream.numberOfTracks++;
pipeline.coalesceStream.tracks.audio = {
track: null,
flushed: false
};
pipeline.audioSegmentStream = new AudioSegmentStream(audioTrack, options);

// Set up the final part of the audio pipeline
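Finally, a condensed sketch of the flush() flow the transmuxer.js diff implements (illustration only, simplified from the real module; the helper name is hypothetical): a flush from a data-generating stream marks that track as flushed, and 'done' is emitted once every registered track has flushed, even when the segment carried no data for one of them:

```js
// Condensed illustration of the coalescing logic above (not the actual module;
// remuxing and the 'data' event payload are omitted for brevity).
function flushSketch(coalesce, flushSource) {
  // Ignore flushes that do not come from a data-generating stream
  // (e.g. captions or metadata only).
  if (flushSource !== 'VideoSegmentStream' && flushSource !== 'AudioSegmentStream') {
    return;
  }

  if (flushSource === 'VideoSegmentStream' && coalesce.tracks.video) {
    coalesce.tracks.video.flushed = true;
  }
  if (flushSource === 'AudioSegmentStream' && coalesce.tracks.audio) {
    coalesce.tracks.audio.flushed = true;
  }

  var waitingOn = {
    video: coalesce.tracks.video && !coalesce.tracks.video.flushed,
    audio: coalesce.tracks.audio && !coalesce.tracks.audio.flushed
  };
  var done = !(waitingOn.video || waitingOn.audio);

  if (coalesce.pendingTracks.length === 0) {
    // No audio/video data arrived for this flush; `done` still fires once
    // every registered track has flushed, which is the point of this PR.
    if (done) {
      coalesce.trigger('done');
    }
    return;
  }

  // ...build and emit the 'data' event from the pending boxes here...

  if (done) {
    coalesce.trigger('done');
  }
}
```

Compared with the old numberOfTracks / emittedTracks counters, this makes 'done' depend on which streams have flushed rather than on how many data-carrying flushes happened to arrive.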