
feat(load-generator): Adapt OpenFeature OFREP instead of IN_PROCESS #2114

Conversation

@aepfli (Contributor) commented Mar 10, 2025

Changes

OpenFeature supports multiple different modes.
The newest implementation relies on threads and other Python internals, which appear to be monkey-patched by Locust.
Because of this, the IN_PROCESS mode no longer works properly and is causing issues.

Instead, we suggest switching to OFREP (OpenFeature Remote Evaluation Protocol) for this use case. It is a simple HTTP call that returns all evaluated flags, and the provider regularly polls for changes.
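
For illustration only, a minimal sketch of what the switch could look like in the load generator, assuming the openfeature-provider-ofrep contrib package; the exact import path, constructor arguments, and default port are assumptions, not code from this PR:

```python
import os

from openfeature import api
# Assumption: the OFREP provider from open-feature/python-sdk-contrib
# (package "openfeature-provider-ofrep"); check its docs for the exact
# import path and constructor signature.
from openfeature.contrib.provider.ofrep import OFREPProvider

# FLAGD_OFREP_PORT is the env var discussed later in this PR; 8016 is used
# here only as an illustrative default for flagd's OFREP endpoint.
flagd_host = os.getenv("FLAGD_HOST", "flagd")
ofrep_port = os.getenv("FLAGD_OFREP_PORT", "8016")

# Instead of the gRPC/IN_PROCESS flagd provider, evaluate flags via plain
# HTTP using OFREP; the provider polls the flag configuration periodically.
api.set_provider(OFREPProvider(base_url=f"http://{flagd_host}:{ofrep_port}"))

client = api.get_client()
# "exampleFeatureFlag" is a placeholder key, not a real demo flag.
enabled = client.get_boolean_value("exampleFeatureFlag", False)
```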

relates:

Merge Requirements

For new feature contributions, please make sure you have completed the following
essential items:

  • CHANGELOG.md updated to document new feature additions
  • Appropriate documentation updates in the docs
  • Appropriate Helm chart updates in the helm-charts

Maintainers will not merge until the above have been completed. If you're unsure
which docs need to be changed, ping the
@open-telemetry/demo-approvers.

linux-foundation-easycla bot commented Mar 10, 2025

CLA Signed

The committers listed above are authorized under a signed CLA.

@github-actions bot added the helm-update-required label (Requires an update to the Helm chart when released) Mar 10, 2025
@aepfli force-pushed the feat/openfeature-ofrep-instead-of-inprocess-for-loadgenerator branch from 98c1ea3 to a70d7f3 on March 10, 2025 09:36
@aepfli (Contributor, Author) commented Mar 10, 2025

I created open-telemetry/opentelemetry-helm-charts#1563, but my Helm knowledge is limited. If there is anything I can or should improve, please let me know.

@aepfli (Contributor, Author) commented Mar 10, 2025

I checked the documentation, but I could not find any deeper explanation of the ports or of the OpenFeature modes in use, so I do not see any need for a documentation update. WDYT?

@julianocosta89 marked this pull request as ready for review March 11, 2025 07:33
@julianocosta89 requested a review from a team as a code owner March 11, 2025 07:33
@julianocosta89 (Member) commented:
@aepfli this one should be good to go.
Regarding the Helm chart, it requires a couple of other steps, but I'll get back to you on the other PR.

I've added the flagd ports to the docker-compose.minimal.yml file in your PR, hope you don't mind.

Is there anything here you want us to wait for before merging it?

@julianocosta89 self-requested a review March 11, 2025 07:40
@julianocosta89 (Member) left a comment

I've noticed that in the Helm update, you are adding FLAGD_OFREP_PORT to the load-generator, but not in here.
Is that missing?

@aepfli (Contributor, Author) commented Mar 11, 2025

I've noticed that in the Helm update, you are adding FLAGD_OFREP_PORT to the load-generator, but not in here. Is that missing?

Thank you, nice catch. I added the env var to the load generator.
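
For reference, the compose wiring would look roughly like this; the service names follow the demo, but the port value and exact keys are illustrative assumptions rather than the actual diff:

```yaml
# Sketch only, not the exact change from this PR.
services:
  load-generator:
    environment:
      - FLAGD_HOST=flagd
      - FLAGD_OFREP_PORT=8016   # assumed flagd OFREP port
  flagd:
    expose:
      - "8016"                  # make the OFREP endpoint reachable inside the compose network
```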

@aepfli (Contributor, Author) commented Mar 11, 2025

Is there anything here you want us to wait for before merging it?

From my side we can go ahead with this PR ;) thank you

@julianocosta89 (Member) commented:
@aepfli last question before getting this merged.

I was testing it out and noticed that the traces for the evaluation are not connected with the load-generator:

[Screenshot 2025-03-11 at 11:11:38]

Is there a way to connect the load-generator with the flagd call?

@aepfli (Contributor, Author) commented Mar 11, 2025

Okay, here comes my limited OpenTelemetry know-how ;)

Yes, there is. Our OpenTelemetryHook implementation is really simple: we "attach" ourselves to the current span. The question is who is responsible for creating that current span, and whether this should perhaps be done by the load-generator. See https://github.com/open-feature/python-sdk-contrib/blob/345e7934b9de3879d3aff45c8213ece1a98e3711/hooks/openfeature-hooks-opentelemetry/src/openfeature/contrib/hook/opentelemetry/__init__.py#L16-L42

As I said, my knowledge here is limited, but am I on the right track, and should we generate a new span for each WebsiteUser invocation?
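
For context, here is a simplified paraphrase of what the linked hook does (not the exact contrib code): it never starts a span itself, it only adds a feature_flag event to whatever span is currently active.

```python
from openfeature.hook import Hook
from opentelemetry import trace


class SimplifiedTracingHook(Hook):
    """Paraphrase of the linked OpenTelemetry hook, for illustration only."""

    def after(self, hook_context, details, hints):
        # No span is created here; if nothing is recording, the event is
        # silently dropped on the non-recording "current" span.
        current_span = trace.get_current_span()
        current_span.add_event(
            "feature_flag",
            {
                "feature_flag.key": details.flag_key,
                "feature_flag.variant": str(details.variant),
                "feature_flag.provider_name": hook_context.provider_metadata.name,
            },
        )
```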

@aepfli (Contributor, Author) commented Mar 11, 2025

We never create spans ourselves, we only attach to existing ones. So I am not sure whether we should extract this into its own topic and think about what a good span for a load test looks like, when to create a new one, etc. WDYT?

@julianocosta89 (Member) commented:
Had some time again to look at this.
It seems that the Loadgen is creating the span:
[image]

But it looks like the context is not being propagated with the hook.

@aepfli (Contributor, Author) commented Mar 12, 2025

Most likely the span is generated by self.client.get("/") (https://github.com/open-telemetry/opentelemetry-demo/blob/main/src/load-generator/locustfile.py#L172), but that happens after the flag evaluation, so we do not have a span yet when the flag is evaluated; or am I wrong here? Question: did you see the span correctly attached in the previous version?

There are actually two different things ;)
The first is the request that fetches our flag configuration, which happens every 5 s via polling but is detached from any load-generator behavior.
The second is the actual evaluation of the flag; it is not related to the first and happens before the self.client.get call.

I do not think it makes sense to combine the two into one span. The first is only responsible for having the flag data in memory, whereas the second is the one that is actually interesting for the span, when we are generating the load.
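
To make the ordering concrete, a stripped-down task in the style of the demo's locustfile (flag key, endpoint, and numbers are placeholders): the flag is evaluated first, while no request span exists yet, and the span around the HTTP call only comes afterwards.

```python
from locust import HttpUser, between, task
from openfeature import api


class WebsiteUser(HttpUser):
    wait_time = between(1, 10)

    @task
    def browse_home(self):
        client = api.get_client()

        # 1) Flag evaluation: answered from the flag set the OFREP provider
        #    polls in the background (roughly every 5 s). No request span is
        #    active at this point, so the OpenTelemetry hook has nothing to
        #    attach to.
        flood = client.get_boolean_value("exampleFloodFlag", False)  # placeholder key

        # 2) The HTTP call: this is (most likely) where the load-generator
        #    span gets created, after the flag has already been evaluated.
        for _ in range(10 if flood else 1):
            self.client.get("/")
```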

@julianocosta89 (Member) commented:
Yeah, it makes sense to keep them as separate spans.

I wonder if the call shouldn't have a span and be linked to flagd, though.
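
Sketch of that idea, not something this PR implements: if the evaluation had its own span, it could reference a span context from the flagd side via a span link rather than a parent/child relationship (the helper name and how such a context would be obtained are hypothetical).

```python
from opentelemetry import trace

tracer = trace.get_tracer("load-generator")  # illustrative tracer name


def start_evaluation_span(flagd_span_context: trace.SpanContext):
    # Hypothetical helper: opens a span for the flag evaluation and links it
    # to a span context that would somehow be propagated back from flagd.
    return tracer.start_as_current_span(
        "feature_flag_evaluation",
        links=[trace.Link(flagd_span_context)],
    )
```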

@beeme1mr (Contributor) commented:
I wonder if the call shouldn't have a span and be linked to flagd, though.

What's the goal? The current behavior seems correct unless we want to manually create a span within the load generator.

@julianocosta89 (Member) commented:
@beeme1mr the loadgen span is already being created no?!
#2114 (comment)

@aepfli (Contributor, Author) commented Mar 13, 2025

@beeme1mr the loadgen span is already being created no?! #2114 (comment)

I was just assuming that client.get creates a span, but if it does, that span is only generated inside that method, not before we evaluate flags. Hence I think this is an additional possible improvement, which should most likely be extracted into its own topic.

@julianocosta89 (Member) commented:
@beeme1mr the loadgen span is already being created no?! #2114 (comment)

I was just assuming that client.get creates a span, but if it does, that span is only generated inside that method, not before we evaluate flags. Hence I think this is an additional possible improvement, which should most likely be extracted into its own topic.

Totally! That would indeed be a different issue; I'm just discussing it here while we're at it.
But let me get this one merged.

@julianocosta89 merged commit 06e6e46 into open-telemetry:main Mar 13, 2025
31 checks passed
@aepfli (Contributor, Author) commented Mar 13, 2025

Totally! That would indeed be a different issue; I'm just discussing it here while we're at it. But let me get this one merged.

I will do a short exploration; maybe I can create an easy-to-use solution ;)

@aepfli (Contributor, Author) commented Mar 13, 2025

See #2119, where I add spans around each task of the Locust load generator.
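
Not the actual #2119 implementation, just a sketch of the general idea: wrap each task invocation in its own span so that flag evaluations (and the hook above) always have a current span to attach to.

```python
import functools

from opentelemetry import trace

tracer = trace.get_tracer("load-generator")  # illustrative tracer name


def traced_task(func):
    """Hypothetical decorator: opens a span around a Locust task so anything
    evaluated inside it, including feature flags, sees a current span."""

    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        with tracer.start_as_current_span(func.__name__):
            return func(self, *args, **kwargs)

    return wrapper


# Usage inside an HttpUser class:
#
#     @task
#     @traced_task
#     def browse_home(self):
#         ...
```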

Labels: helm-update-required (Requires an update to the Helm chart when released)