Revert "SQS: Support Async JSON SQS Protocol & Message Attributes" #2260
Conversation
)" This reverts commit 7e501a0.
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@           Coverage Diff            @@
##             main    #2260      +/-  ##
==========================================
- Coverage   81.57%   81.54%   -0.03%
==========================================
  Files          77       77
  Lines        9537     9523      -14
  Branches     1161     1157       -4
==========================================
- Hits         7780     7766      -14
  Misses       1564     1564
  Partials      193      193

View full report in Codecov by Sentry.
Hey bro, I've already completely investigated the situation; see my comment with the reproduction script.
This code is legit (in terms of the performance hit); we don't need to revert it.
You can close the PR :)
EDIT:
Referring to this: #2258
@Nusnus or @auvipy I am having a hard time following the conversation across multiple tickets. Please excuse me if I am missing something and please correct me if my understanding is wrong --
So there seem to be two issues:
Is that right? I can dig into (2) but I am not familiar with (1). In addition, if you need a very fast fix for (2) feel free to revert #2226 and I can do debugging while not blocking anyone.
Yes.
Thank you, that will be very helpful!
Thank you. For now I prefer to avoid reverting it unless we know for sure it causes the problem.
@Nusnus thanks for the confirmation. I will work to repro today and get back to you. Do you have a time frame for when you need a revert/keep decision?
I aim to resolve everything for a fixed release by Monday. I’m available during this time for assistance and review if needed.
@Nusnus So far I have not been able to reproduce this issue without forcing an illegal combination of package versions between celery, kombu, boto3, and botocore. So far my findings are that if you stay within the prescribed version ranges (as encoded in install_requires), you should not hit this bug. I will spend more time tomorrow hunting for valid ranges that also exhibit the bug. You might have to consider whether Kombu wants to support running out-of-spec versions of botocore. If you want to support running a very old botocore, Celery itself will have to lower its boto3 version requirement. Otherwise, if you want to preserve the current minimum required botocore, we may have to instruct users having issues to upgrade to at least the minimum supported botocore version, which is already 2.5 years old. I am frankly surprised that whatever is assembling these illegal package combinations is not throwing version-compatibility errors and is instead naively allowing incompatible packages to run together. That seems like something that would require a very high bar to support.
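For reference, here is a minimal sketch (standard library only) for checking which versions of the relevant packages are actually installed, alongside the requirement strings kombu itself declares. This is illustrative and not part of the original report; interpreting the output against the prescribed ranges is up to the reader.

```python
# Print the installed versions of the packages involved in this report,
# plus the requirement strings kombu declares (including its extras),
# so an environment can be checked against the prescribed ranges.
from importlib.metadata import PackageNotFoundError, requires, version

for pkg in ("celery", "kombu", "boto3", "botocore"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")

# Requirement strings declared by kombu (the [sqs] extra pulls in boto3).
for req in requires("kombu") or []:
    if "boto" in req or "sqs" in req.lower():
        print("kombu declares:", req)
```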
We are facing the same error with respect to containers crashing. Error: [CRITICAL/MainProcess] Unrecoverable error: Exception("Request HTTP Error HTTP 404 Not Found (b'\n')")
@sarswamanishpl Can you report the exact installed version of botocore? You can find this with, for example, the snippet below.
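A minimal way to check (illustrative; pip show botocore reports the same information):

```python
# Print the installed botocore version.
import botocore

print(botocore.__version__)
```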
I think we should bump the minimum version of botocore, is that right?
@auvipy I would agree with that, but at least one of these folks is running a botocore version that is already out of spec, so bumping would not fix their issue (though I would hesitate to officially support out-of-spec botocore versions). I noted in the original PR what the optimal version of botocore would be.
Update:
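If the decision is to bump the minimum, a runtime guard along these lines would turn the confusing HTTP 404 into an explicit error. This is only a sketch; the minimum version used here is a placeholder, not the value Kombu would actually pin.

```python
# Sketch: fail fast when botocore is older than a chosen minimum.
# MINIMUM_BOTOCORE is a hypothetical placeholder for illustration only.
from packaging.version import Version

import botocore

MINIMUM_BOTOCORE = Version("1.34.0")  # placeholder, not the real pin

if Version(botocore.__version__) < MINIMUM_BOTOCORE:
    raise RuntimeError(
        f"botocore {botocore.__version__} is older than the minimum "
        f"supported version {MINIMUM_BOTOCORE}; upgrade boto3/botocore."
    )
```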
@hfern this is the botocore version we are using: botocore==1.29.165
@sarswamanishpl thank you for that info! You have led me to the first quality repro case! For anyone else following along, you can repro with:

from celery import Celery

# Celery app pointed at the SQS broker transport.
app = Celery(
    'kombudebug.tasks',
    broker='sqs://',
    broker_transport_options={
        'region': 'us-west-2',
    },
)

app.conf.update(
    task_default_queue='kombu-debug-queue',
)

@app.task
def add(x, y):
    print("Got task with args:", x, y)
    return x + y

if __name__ == '__main__':
    print("Sending task...")
    add.delay(1, 2)
    print("Task sent.")

And the error occurs when running the worker:
So clearly, something is going wrong with that call construction for receiving messages. I'll investigate further.
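For completeness, here is one way to start the worker against the script above, assuming it is saved as kombudebug/tasks.py. This is a sketch of the programmatic equivalent of running celery -A kombudebug.tasks worker --loglevel=INFO from the command line, not the exact command from the report.

```python
# Start a Celery worker programmatically against the repro app above.
# Assumes the repro script is importable as kombudebug.tasks.
from kombudebug.tasks import app

if __name__ == '__main__':
    app.worker_main(['worker', '--loglevel=INFO'])
```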
@auvipy I can confirm that adding
This does not need the revert anymore, as we found a fix.
Reverts #2226 to see if it fixes a reported regression.