Implement DeepCCompress node #23


Open
dbr opened this issue Jan 9, 2019 · 7 comments

Comments

@dbr

dbr commented Jan 9, 2019

I see this is planned in the README \o/

We have one internally and it has proven very useful - it has two modes:

  1. Merge by z-depth threshold - the obvious case.
  2. A dumb "sample reduction percentage" mode, which merges samples to reduce the count by approximately a given percentage - e.g. 50% removes every other sample, 10% removes every ~10th sample. It can be prone to visual artifacts with certain data, but has proven quite useful, particularly for debugging/performance diagnostics.

We had also discussed some other features which would have been useful at times:

  1. Option to merge samples with similar values (e.g. merge all samples where the R/G/B values are identical) - most obviously useful for things like ID channels.
  2. Allow combining these merge options - e.g. so you can "merge 50% of the samples which are within 0.1 units", or "merge the samples within 0.00001 in rgba.red and within 1 unit".
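A minimal sketch of the "merge samples with similar values" idea above, in Python purely for illustration (the real node would be a C++ Nuke plugin). The sample layout (dicts with `front`/`back` and channel values), the channel names, and the tolerance are all assumptions here, not anything from the DeepC codebase:

```python
def merge_equal_value_runs(samples, channels=("R", "G", "B"), tol=1e-5):
    """Collapse runs of depth-consecutive samples whose values in the given
    channels differ by less than `tol` - useful for ID-style channels where
    long runs of samples carry an identical value."""
    merged = []
    for s in samples:
        if merged and all(abs(s[c] - merged[-1][c]) < tol for c in channels):
            merged[-1]["back"] = s["back"]  # extend the run into one thick sample
        else:
            merged.append(dict(s))
    return merged

# Three identical-valued samples collapse into one thick sample.
samples = [
    {"front": 1.0, "back": 1.0, "R": 0.2, "G": 0.2, "B": 0.2},
    {"front": 1.1, "back": 1.1, "R": 0.2, "G": 0.2, "B": 0.2},
    {"front": 1.2, "back": 1.2, "R": 0.2, "G": 0.2, "B": 0.2},
    {"front": 2.0, "back": 2.0, "R": 0.9, "G": 0.1, "B": 0.1},
]
merged = merge_equal_value_runs(samples)
```

For a truly constant ID channel this collapse is lossless: the merged thick sample covers the same depth range with the same value.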
@charlesangus
Owner

Good thoughts, thanks. I hadn't considered percentage reduction, but was considering an option for clamping the sample count, so you could specify no more than 20 samples, for example. Percentage reduction over a threshold might be good, too - reduce by 50% any samples over 20, say.
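One way the "reduce by 50% any samples over 20" combination could work, as a sketch (Python for illustration only; the function name, rounding, and the choice to keep the nearest samples intact are all assumptions): leave the first `cap` samples untouched and decimate only the excess.

```python
def cap_with_reduction(depths, cap=20, reduction=0.5):
    """Keep the nearest `cap` samples intact; of any samples beyond the
    cap, keep only (1 - reduction) of them, evenly spaced.
    Assumes `depths` is sorted front to back and reduction is in [0, 1)."""
    if len(depths) <= cap:
        return list(depths)
    head, tail = depths[:cap], depths[cap:]
    step = round(1 / (1 - reduction))  # reduction=0.5 -> keep every 2nd
    return head + tail[::step]

pixel = list(range(30))            # 30 samples, here just depth indices
out = cap_with_reduction(pixel)    # 20 kept intact + half of the 10 over the cap
```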

@falkhofmann falkhofmann self-assigned this Nov 21, 2021
@falkhofmann
Collaborator

I started to look into the logic a bit.

Getting the total sample count is doable and all good, so defining a target number of samples works. But the issue I am having is that some samples come out empty in all channels - the new total sample count is correct, just with wrong/empty values.

So far I have 3 options in mind to reduce samples:

  • Percentage: reduce by a given number, i.e. 10 would reduce the samples on each pixel to 90%.
  • Remove every n-th sample: similar to percentage, but offers a different input method.
  • Cap: limit the total number of samples to a user-defined threshold.

However, one thing I want to keep is the original very front and very back sample, so the entire "depth" is kept. Only the samples in between those should be removed, IMHO.
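The percentage mode with the ends pinned could look something like this sketch (Python for illustration; the function name and exact rounding are assumptions, and a real node would merge a dropped sample's contribution into a neighbour rather than simply discard it):

```python
def reduce_by_percentage(depths, percent):
    """Drop roughly `percent` of the samples, always keeping the very
    first (nearest) and very last (furthest) sample so the pixel's full
    depth range is preserved. Assumes `depths` is sorted front to back."""
    if len(depths) <= 2 or percent <= 0:
        return list(depths)
    n = max(2, round(100 / percent))   # percent=50 -> drop every 2nd interior sample
    interior = depths[1:-1]
    kept = [d for i, d in enumerate(interior) if (i + 1) % n != 0]
    return [depths[0]] + kept + [depths[-1]]

out = reduce_by_percentage(list(range(12)), 50)  # 12 samples -> 7
```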

What I don't understand, TBH, is "Merge by z-depth threshold - the obvious case".

@charlesangus
Owner

"Merge by z-depth threshold" is something like, "merge samples which are closer together than .1 units". So if you had samples like:

1
1.1
1.2
1.5
1.6
1.7
2

With merge threshold set to .1, you'd get something like:

1-1.2
1.5-1.7
2

Basically, in pseudo-code:

if sample2.front - sample1.back < threshold:
     mergeOver(sample1, sample2)
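That pseudo-code could be extended into a runnable sketch along these lines (Python purely for illustration - the actual node would do this in C++, and `mergeOver` would also composite the channel values, which is omitted here; only the depth ranges are clustered). The demo uses a threshold of 0.11 rather than exactly 0.1 to sidestep floating-point comparisons right at the boundary:

```python
def merge_by_threshold(samples, threshold):
    """Greedy front-to-back clustering: while the next sample's front is
    within `threshold` of the current merged sample's back, extend the
    merged ("thick") sample; otherwise start a new one.
    `samples` is a depth-sorted list of (front, back) tuples."""
    merged = []
    for front, back in samples:
        if merged and front - merged[-1][1] <= threshold:
            merged[-1] = (merged[-1][0], max(merged[-1][1], back))
        else:
            merged.append((front, back))
    return merged

depths = [1, 1.1, 1.2, 1.5, 1.6, 1.7, 2]
merged = merge_by_threshold([(d, d) for d in depths], 0.11)
# -> three thick samples: 1-1.2, 1.5-1.7, and 2
```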

@charlesangus
Owner

And in my conception, you would get "thick" samples out, where the new sample's front is the first sample's front and its back is the furthest sample's back.

But that may or may not be the ideal way to do it.

@charlesangus
Owner

The reason I say that's the "obvious" case is that 1) it's how renderers handle aggregating deep samples (Mantra I know for sure does it like this, and I'm pretty sure others do too), and 2) it will keep the spatial distribution of the samples in a sensible way. If there are three "clusters" of samples, there will still be three "clusters" out, so long as the clusters are further apart than the threshold.

It solves cases where the threshold wasn't set correctly in the renderer, so your hard-surface render has like 600 samples per pixel, all within .0001 units of each other.

@falkhofmann
Collaborator

That makes a lot of sense, I have to admit.

@falkhofmann falkhofmann removed their assignment Dec 5, 2021
@charlesangus charlesangus changed the title DeepCompress node Implement DeepCCompress node Dec 7, 2021
@CMG99
Collaborator

CMG99 commented Sep 13, 2022

Is there any code related to this plugin? I need to implement similar functionality to speed up my Deep nodes.

If not, I was thinking of just creating lots of helper functions in a separate file that can be shared between plugins.


4 participants