Conversation

@tbekolay tbekolay commented Jun 5, 2023

This aids downstream repos that implement fixes for various cloning issues by making this function monkey-patchable.

For context, I am part of @hunse's team, which is affected by #994. We are successfully using the workaround he posted in his final comment. However, to implement that workaround we have to copy/paste the entirety of `quantize_apply` into our project in order to monkey-patch it. While this works, it is brittle: we will have to update our copied `quantize_apply` whenever it changes in this repo.

While we are happy to make a pull request with our full fix, it would add to your maintenance burden, so we thought we would instead start with this very minimal change. It does not increase your maintenance burden, but it still allows us to do a minor, surgical monkey patch that requires no copy/pasting in our project. If you would like us to contribute that fix instead, or in addition, please let me know (but since #994 has not received the "contributions welcome" tag, I assume it is not of interest at the moment).

@github-actions github-actions bot added the technique:qat Regarding tfmot.quantization.keras (for quantization-aware training) APIs and docs label Jun 5, 2023