Replies: 1 comment
-
Would this be the solution: in the POST body parsing plugin, use

```yaml
_format_version: "3.0"
_transform: true

services:
  - name: gpt-4o
    host: gpt-4o.internal.cluster
    port: 80
    protocol: http
    plugins:
      - name: rate-limiting
        config:
          minute: 10
          policy: local
  - name: gpt-35
    host: gpt-35.internal.cluster
    port: 80
    protocol: http
    plugins:
      - name: rate-limiting
        config:
          minute: 50
          policy: local
  - name: dummy-service
    host: dummy.internal
    port: 80
    protocol: http

routes:
  - name: chat-route
    paths:
      - /v1/chat/completions
    strip_path: false
    service: dummy-service
  - name: gpt-4o-route
    paths:
      - /v1/chat/completions/gpt-4o
    strip_path: true
    service: gpt-4o
  - name: gpt-35-route
    paths:
      - /v1/chat/completions/gpt-3.5-turbo
    strip_path: true
    service: gpt-35

plugins:
  - name: body-model-router
    route: chat-route
```
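The `body-model-router` custom plugin referenced above could be sketched roughly like this. This is a hypothetical handler, not an existing plugin: the model-to-host mapping is an assumption for illustration, and it uses `kong.request.get_body()` and `kong.service.set_target()` from the Kong PDK to re-target the request in the `access` phase. Note that `set_target()` only overrides the upstream host/port of the matched Service (`dummy-service` here); it does not switch to a different Service entity, so plugins attached to the other Services would not run for this request.

```lua
-- handler.lua: hypothetical "body-model-router" plugin sketch
local BodyModelRouter = {
  PRIORITY = 800,
  VERSION = "0.1.0",
}

function BodyModelRouter:access(conf)
  -- Parse the JSON request body (requires the buffered body to be available)
  local body, err = kong.request.get_body("application/json")
  if err or type(body) ~= "table" or not body.model then
    return kong.response.exit(400, { message = "missing or unreadable 'model' field" })
  end

  -- Hypothetical mapping from the OpenAI "model" parameter to backend hosts,
  -- mirroring the services in the declarative config above
  local targets = {
    ["gpt-4o"]        = { host = "gpt-4o.internal.cluster", port = 80 },
    ["gpt-3.5-turbo"] = { host = "gpt-35.internal.cluster", port = 80 },
  }

  local target = targets[body.model]
  if not target then
    return kong.response.exit(400, { message = "unknown model: " .. tostring(body.model) })
  end

  -- Override the upstream host/port of the matched Service for this request
  kong.service.set_target(target.host, target.port)
end

return BodyModelRouter
```

Because this code runs inside the Kong runtime, it cannot be executed standalone; it is meant only to show where in the request lifecycle the re-targeting would happen.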
-
Hey,
I wanted to ask whether it is possible in Kong to have a single entrypoint route that uses a custom POST-body-parsing plugin to route traffic to the corresponding backend service, while still applying the default plugins configured on those backend services. The following config YAML exemplifies this.
Essentially, I want Kong to act as a unified API gateway for OpenAI API specifics, routing traffic to a specific backend based on the `model` parameter in the POST body.
I guess this comes down to: can `kong.service.set_target()` be used to reference a service object defined in the config, together with its service-specific plugins? Or is there another way for the route plugin to trigger the service plugins, instead of directly setting the upstream host and port?