Added an option to analyse a small sample of frames to create a single colour palette for the whole gif #10
Open
tsuko wants to merge 5 commits into Chman:master
Conversation
added 2 commits
August 27, 2017 17:33
…l adhere to by analyzing a sample of frames (set as every n-th frame in the Inspector) and producing a NeuQuant palette out of those combined frames.
Owner
Whoa, not sure how I missed this PR. That's a very nice addition, thanks for taking the time to implement it. I think the PR is missing a file though? See the inline comment.
Chman
reviewed
Dec 11, 2017
```csharp
m_Width = serializedObject.FindProperty("m_Width");
m_Height = serializedObject.FindProperty("m_Height");
m_FramePerSecond = serializedObject.FindProperty("m_FramePerSecond");
m_FramesPerColorSample = serializedObject.FindProperty("m_FramesPerColorSample");
```
Owner
Recorder doesn't have a serialized m_FramesPerColorSample field. Did you miss a file in your commit?
Author
You're quite right! I hadn't copied my changes to Recorder.cs across. It should be good to go now. Thanks, @Chman.
roguesleipnir added a commit to roguesleipnir/Unity-Moments that referenced this pull request on Aug 19, 2025
I extended Moments for my own project, and I thought other people might find these changes useful, as I get more stable colours and substantially faster encoding time now.
What Moments does now:
The existing version of Moments creates a new colour palette for every frame in a gif. NeuQuant is computationally expensive, so this makes encoding slow and, more importantly, it can cause colours to "flicker" from frame to frame, because NeuQuant quantises the colours slightly differently whenever the camera moves or the colours on screen change.
What my additions do:
My fork gives Moments the ability to sample several frames from a gif, analyze those frames together with NeuQuant, and create a single color palette that every frame will map to, without further NeuQuant calls.
I've added a "Frames Per Color Sample" field to the inspector, which lets you sample every n-th frame of a recording for colour-mapping purposes. If "Frames Per Color Sample" is set to zero, Moments reverts to its current default behaviour of creating a brand-new colour palette for every frame.
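The sampling scheme described above can be sketched outside Unity. The Python below is purely illustrative (the project itself is C#): `build_global_palette` and `map_frame_to_palette` are hypothetical names, and a simple frequency count stands in for the actual NeuQuant quantiser.

```python
from collections import Counter

def build_global_palette(frames, frames_per_sample, palette_size=256):
    """Pool pixels from every n-th frame and build one palette for the whole gif.

    `frames` is a list of frames, each a list of (r, g, b) tuples.
    A real implementation would feed the pooled pixels to NeuQuant;
    here the most common colours act as a stand-in quantiser.
    """
    if frames_per_sample <= 0:
        return None  # caller falls back to a fresh palette per frame
    pooled = []
    for i, frame in enumerate(frames):
        if i % frames_per_sample == 0:  # sample every n-th frame
            pooled.extend(frame)
    return [colour for colour, _ in Counter(pooled).most_common(palette_size)]

def map_frame_to_palette(frame, palette):
    """Map each pixel to its nearest palette entry (no further quantiser calls)."""
    def nearest(px):
        return min(palette, key=lambda c: sum((a - b) ** 2 for a, b in zip(px, c)))
    return [nearest(px) for px in frame]
```

The key property is that the expensive quantisation runs once over the pooled sample; every frame afterwards is a cheap nearest-colour lookup against the same fixed palette, which is why the colours stay stable across frames.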
The results:
By setting "Frames Per Color Sample" to 6, for instance, I halved the encoding time for my 5.5 second gifs on Windows and I get rock-solid colours for the duration of my gif, without the frame-by-frame flickering I experienced previously.
BEFORE: (please note the flickering pinks and purples on some objects)

AFTER:
