cinelerra luts

https://www.foxbusiness.com/personal-finance/what-net-worth-does-one-need-rich-2024-heres-what-americans-think?dicbo=v2-qSV6OHX

If it takes $3 mil to be rich nowadays, why did you vote to tax the everliving daylights out of everyone making $200k as if renters are the incarnation of satan?

 -------------------------------------------------------------------------------------------------------

 

 

The flexor digitorum longus did its last flex at age 48, but it still managed a pretty impressive 8 miles, considering the number of animals who can't make it 8 miles.  It hasn't recovered in the last 6 months, but it's neither getting worse nor better.  New techniques have enabled higher exertion without full recovery.  There was a plan to start entertaining a tenotomy in September, but with metronome training & diet modification, it might be less impactful just to keep running on it in its reduced capacity than to knock it out completely for an extended time with surgery.

 Of all dietary supplements, whole chicken seems to be the most beneficial in terms of manetaining its current state.  It's the staple food in Kenya.  It seems to have the optimum amino acid & electrolyte concentration for running animals.  Whey protein might be 2nd, benefitting muscles but not tendons.  Jello seems to be a placebo. 

------------------------------------------------------------------------------------------------------------------------


 The gender data is inferred on this, but it matches gootube. Remarkably unchanged from 30 years ago.  The internet is still a place for 24-30 year old males who haven't started families yet.  There were theories that the decreasing technical barrier, more connected devices & increasing variety of uses would get more women online & keep more users online after 30, but that didn't happen.

The 45-54 generation was the generation that created the dot com boom & they've almost entirely disappeared.  It could be said that every generation is bigger than the last, so the previous generation's numbers may be intact but shrink as a percentage next to the larger generation behind it.  Lions still don't see individuals from 30 years ago staying online.  The women who were online 30 years ago are definitely all gone.

More commerce is done online than 30 years ago, but animals buy most of their stuff between ages 25-35.  Most cars & houses are still not bought online & that's where most of the money goes after 35.

 ----------------------------------------------------------------------------------------------------------------------

Something a lot of gootubers have been doing for the last 10 years is using LUTs. It seems most animals nowadays are using LUTs on a daily basis.  When the lowly Ross Guy uses them, it's time to ponder support.  LUTs as lions see them are manely a way to condense a bunch of color effects into a single table that is easily transferred between users.

Animals always say they imported a LUT into davinci resolve or they imported a LUT into OBS, as if there were some intrinsic way the program applied a LUT to all data at the point it was ingested.  That usage came from the original primary purpose of LUTs: performing the color correction on raw footage that would normally be done in camera, which only became relevant when consumer cameras started outputting raw HDR.  Since then, they've definitely evolved into a way to replace a bunch of color effects.

The last 30 years have seen an entire color science evolve out of nothing & then work its way down to the consumer level.  30 years ago, there was just whatever the film stock gave you & camera filters.

 The way lions see it, there could be a plugin which applies a LUT to whatever data is passed to it.  It would just take a filename of the LUT & that would be all.  It would also need a way to generate a LUT, possibly by comparing a before & after buffer.

Lions would never use them & there is no user base that would, but it might be an educational jellybean feature.  Gootubers do look better than they did 15 years ago & a big reason is LUTs.

Ross Guy said he used a LUT in OBS.  FFmpeg & premiere pro also accept the same LUTs.  There is a cinelerra plugin which uses ffmpeg for LUTs.  They all import a PNG image as a LUT.  

 


The PNG image contains the entire color cube in a 512x512 RGB image.  There are 64 squares arranged in an 8x8 grid.  Each square is 64x64.  X inside each square contains 64 shades of red.  Y inside each square contains 64 shades of green.  Each square increments 1 shade of blue from left to right & top to bottom.
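A rough sketch of the indexing in C, assuming the tile layout is exactly as described & the PNG has been decoded into packed 8 bit RGB rows:

#include <stdint.h>

/* Look up 1 entry in the 512x512 tiled LUT: 64 tiles of 64x64 in an 8x8 grid,
   red along X inside a tile, green along Y, blue selecting the tile left to
   right & top to bottom.  lut points to the decoded PNG, 512 * 3 bytes per row. */
static void lut_lookup(const uint8_t *lut,
                       uint8_t r, uint8_t g, uint8_t b,
                       uint8_t *out_r, uint8_t *out_g, uint8_t *out_b)
{
    int r6 = r >> 2;                /* quantize 0-255 down to 64 steps */
    int g6 = g >> 2;
    int b6 = b >> 2;

    int tile_x = b6 % 8;            /* which 64x64 tile holds this shade of blue */
    int tile_y = b6 / 8;

    int x = tile_x * 64 + r6;       /* pixel coordinates in the 512x512 image */
    int y = tile_y * 64 + g6;

    const uint8_t *p = lut + (y * 512 + x) * 3;
    *out_r = p[0];
    *out_g = p[1];
    *out_b = p[2];
}

Doing only this nearest lookup visibly bands smooth gradients, which is where the interpolation problem comes from.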

The challenge for the program is to interpolate from the LUT to the full color cube.  The leading plan is to quantize the input to 64 steps, look up 1 LUT entry using the floor of the RGB & 1 using the ceiling of the RGB, then interpolate the output based on the hypotenuse distance between the floor & ceiling.  The 64x64x64 color cube needs 786KB for 8 bit output & 3MB for float output.  Expanding the LUT to 256 steps would make 8 bit input faster but need 50MB for 8 bit & 201MB for floating point.
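A sketch of that plan, assuming the PNG has already been expanded into a 64x64x64 float table with a blue-major memory layout.  A full trilinear interpolation would use 8 corners; this follows the cheaper 2-sample plan as stated.

#include <math.h>

typedef struct { float r, g, b; } rgb_f;

/* fetch one entry from the expanded 64x64x64 table (786KB at 8 bits,
   3MB as floats); the blue-major layout here is an assumption */
static rgb_f lut_fetch(const float *table, int r6, int g6, int b6)
{
    const float *p = table + ((b6 * 64 + g6) * 64 + r6) * 3;
    rgb_f v = { p[0], p[1], p[2] };
    return v;
}

/* inputs are 0.0 - 1.0 */
static rgb_f lut_interpolate(const float *table, float r, float g, float b)
{
    float rf = r * 63.0f, gf = g * 63.0f, bf = b * 63.0f;

    int r0 = (int)floorf(rf), g0 = (int)floorf(gf), b0 = (int)floorf(bf);
    int r1 = (int)ceilf(rf),  g1 = (int)ceilf(gf),  b1 = (int)ceilf(bf);

    rgb_f lo = lut_fetch(table, r0, g0, b0);
    rgb_f hi = lut_fetch(table, r1, g1, b1);

    /* fraction of the way from the floor corner to the ceiling corner,
       measured along the diagonal ("hypotenuse") between them */
    float dx = rf - (float)r0, dy = gf - (float)g0, dz = bf - (float)b0;
    float span = sqrtf((float)((r1 - r0) * (r1 - r0) +
                               (g1 - g0) * (g1 - g0) +
                               (b1 - b0) * (b1 - b0)));
    float frac = span > 0.0f ? sqrtf(dx * dx + dy * dy + dz * dz) / span : 0.0f;

    rgb_f out = {
        lo.r + (hi.r - lo.r) * frac,
        lo.g + (hi.g - lo.g) * frac,
        lo.b + (hi.b - lo.b) * frac,
    };
    return out;
}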

  This is not useful for HDR, but it's what the Linux users are using.  Lions really need a LUT for CR3.

Generating a LUT is probably best done in a dedicated program that compares stills of a test video.
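A sketch of how that comparison might work, assuming packed 8 bit RGB stills of the same frame before & after grading.  Every pixel of the ungraded still votes for one cell of the 64x64x64 cube & the graded pixel is averaged into it.  Cells no pixel lands in would still need to be filled by interpolating from their neighbors.

#include <stdint.h>
#include <stdlib.h>

void build_lut(const uint8_t *before, const uint8_t *after,
               int w, int h, uint8_t *lut /* 64*64*64*3 bytes out */)
{
    uint32_t *sum = calloc(64 * 64 * 64 * 3, sizeof(uint32_t));
    uint32_t *count = calloc(64 * 64 * 64, sizeof(uint32_t));
    if (!sum || !count) { free(sum); free(count); return; }

    for (int i = 0; i < w * h; i++) {
        const uint8_t *src = before + i * 3;
        const uint8_t *dst = after + i * 3;
        /* blue-major cell index, matching the table layout assumed earlier */
        int cell = ((src[2] >> 2) * 64 + (src[1] >> 2)) * 64 + (src[0] >> 2);
        sum[cell * 3 + 0] += dst[0];
        sum[cell * 3 + 1] += dst[1];
        sum[cell * 3 + 2] += dst[2];
        count[cell]++;
    }

    for (int cell = 0; cell < 64 * 64 * 64; cell++)
        for (int c = 0; c < 3; c++)
            lut[cell * 3 + c] = count[cell] ?
                (uint8_t)(sum[cell * 3 + c] / count[cell]) : 0;

    free(sum);
    free(count);
}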

Davinci resolve uses a different format.  

https://github.com/xtremestuff/protune-transforms 

There are very few free LUTs for davinci resolve among a sea of adware.  They're distributed as C-like programs.  Blackmagic seems to call their C variant DCTL, the DaVinci Color Transform Language.  It's intended to run only on a GPU in floating point.

 Canon uses yet another LUT format.  It's an ASCII table of every possible input RGB value with 65 steps of each component & 274625 total entries.  The outputs are floating point despite the inputs being quantized to 65 steps.
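A sketch of reading that kind of table, assuming the file is just 65*65*65 = 274625 lines of floating point RGB triples with the input implied by each entry's position.  The exact header lines & which component varies fastest are assumptions that would have to be confirmed against a real Canon file.

#include <stdio.h>
#include <stdlib.h>

#define LUT_STEPS 65
#define LUT_ENTRIES (LUT_STEPS * LUT_STEPS * LUT_STEPS)   /* 274625 */

float *read_ascii_lut(const char *path)
{
    FILE *fp = fopen(path, "r");
    if (!fp) return NULL;

    float *table = malloc(LUT_ENTRIES * 3 * sizeof(float));
    if (!table) { fclose(fp); return NULL; }

    char line[256];
    float r, g, b;
    int n = 0;

    /* lines that aren't 3 floats (headers, comments) are skipped */
    while (n < LUT_ENTRIES && fgets(line, sizeof(line), fp)) {
        if (sscanf(line, "%f %f %f", &r, &g, &b) == 3) {
            table[n * 3 + 0] = r;
            table[n * 3 + 1] = g;
            table[n * 3 + 2] = b;
            n++;
        }
    }

    fclose(fp);
    if (n != LUT_ENTRIES) {          /* truncated or unexpected file */
        free(table);
        return NULL;
    }
    return table;
}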

Reading protune output from gopros & CR3 from canons requires a bit more than a PNG image if the full dynamic range is to be retained.  There would have to be a way to generate a floating point TIFF from a DCTL program or a canon table.  Now you're looking at parsing multiple file formats in a plugin.

 --------------------------------

The biggest challenge with LUTs was the opengl support.  That needs to upload the LUT to a texture.  Problems continue to abound when switching between fullscreen & windowed playback.  Some textures still get dropped when the window changes.
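For reference, a minimal sketch of the texture upload & per pixel lookup, outside of the VFrame machinery & assuming a current opengl context.  The nearest neighbor lookup in the shader skips the interpolation entirely & the texture unit wiring is left to the caller.

#include <GL/gl.h>

static const char *lut_frag_src =
    "uniform sampler2D src;\n"
    "uniform sampler2D lut;\n"          /* the 512x512 tiled LUT */
    "void main()\n"
    "{\n"
    "    vec3 c = texture2D(src, gl_TexCoord[0].st).rgb;\n"
    "    float b = floor(c.b * 63.0 + 0.5);\n"          /* which 64x64 tile */
    "    vec2 tile = vec2(mod(b, 8.0), floor(b / 8.0));\n"
    "    vec2 uv = (tile * 64.0 + c.rg * 63.0 + 0.5) / 512.0;\n"
    "    gl_FragColor = vec4(texture2D(lut, uv).rgb, 1.0);\n"
    "}\n";

GLuint upload_lut_texture(const unsigned char *rgb /* 512*512*3 bytes */)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* nearest filtering: the shader picks exact LUT texels */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 512, 512, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgb);
    return tex;
}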

The latest trouble spot was a case where the LUT VFrame wasn't resetting to RAM after changing windows while the other VFrames were, so it was never re-uploading the texture.  If it was set to RAM during 1st creation, it didn't work at all.  It had to specifically be set to RAM only when changing windows.  It's almost like the 1st creation needs another opengl state.  The interpolation is so slow, it desperately needs opengl.  There's a lot of diabolical programming involved for something no-one else uses.

The LUT plugin currently only takes 8 bit PNG.  It would need a new file handler to read floating point TIFF.   LUT creation is still leaning towards a standalone program that compares stills because lions just never use them.

 

 

 

 

 

 

 
