docs/docs/advanced/verified-inference.md
+4 −4
@@ -6,15 +6,15 @@ sidebar_position: 18
 
 ## Overview
 
-With verified inference, you can turn your Eliza agent fully verifiable on-chain on Solana with an OpenAIcompatible TEE API. This proves that your agent’s thoughts and outputs are free from human control thus increasing the trust of the agent.
+With verified inference, you can turn your Eliza agent fully verifiable on-chain on Solana with an OpenAI-compatible TEE API. This proves that your agent’s thoughts and outputs are free from human control thus increasing the trust of the agent.
 
 Compared to [fully deploying the agent in a TEE](https://elizaos.github.io/eliza/docs/advanced/eliza-in-tee/), this is a more light-weight solution which only verifies the inference calls and only needs a single line of code change.
 
 The API supports all OpenAI models out of the box, including your fine-tuned models. The following guide will walk you through how to use verified inference API with Eliza.
 
 ## Background
 
-The API is built on top of [Sentience Stack](https://github.com/galadriel-ai/Sentience), which cryptographically verifies agent's LLM inferences inside TEEs, posts those proofs on-chain on Solana, and makes the verified inference logs available to read and display to users.
+The API is built on top of [Sentience Stack](https://github.com/galadriel-ai/Sentience), which cryptographically verifies the agent's LLM inferences inside TEEs, posts those proofs on-chain on Solana, and makes the verified inference logs available to read and display to users.
 
 Here’s how it works:
 
@@ -23,7 +23,7 @@ Here’s how it works:
 2. The TEE securely processes the request by calling the LLM API.
 3. The TEE sends back the `{Message, Proof}` to the agent.
 4. The TEE submits the attestation with `{Message, Proof}` to Solana.
-5. The Proof of Sentience SDK is used to read the attestation from Solana and verify it with `{Message, Proof}`. The proof log can be added to the agent website/app.
+5. The Proof of Sentience SDK is used to read the attestation from Solana and verify it with `{Message, Proof}`. The proof log can be added to the agent's website/app.
 
 To verify the code running inside the TEE, use instructions [from here](https://github.com/galadriel-ai/sentience/tree/main/verified-inference/verify).
 
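To make step 5 above more concrete, here is a minimal sketch of what checking a `{Message, Proof}` pair can look like. It does not use the actual Proof of Sentience SDK; the `Attestation` shape, the field names, and the assumption that the proof is an ed25519 signature over a SHA-256 digest of the message are illustrative only, so consult the Sentience repository linked above for the real API.

```typescript
// Minimal sketch only. The real Proof of Sentience SDK reads the attestation
// from Solana and performs its own verification; the types, field names, and
// the ed25519-over-SHA-256 scheme below are assumptions for illustration.
import nacl from "tweetnacl";
import { createHash } from "crypto";

interface Attestation {
  message: string;      // the LLM response returned to the agent
  proof: string;        // base64-encoded signature produced inside the TEE
  teePublicKey: string; // base64-encoded public key attested for the TEE
}

// Check that `message` was signed by the TEE's attested key.
export function verifyAttestation(att: Attestation): boolean {
  const digest = createHash("sha256").update(att.message).digest();
  return nacl.sign.detached.verify(
    digest,
    Buffer.from(att.proof, "base64"),
    Buffer.from(att.teePublicKey, "base64"),
  );
}
```

In the actual flow, the SDK also handles reading the attestation transaction from Solana before running a check of this kind.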
@@ -48,7 +48,7 @@ To verify the code running inside the TEE, use instructions [from here](https://
 ```
 4. **Run your agent.**
 
-Reminder how to run an agent is [here](https://elizaos.github.io/eliza/docs/quickstart/#create-your-first-agent).
+Reminder of how to run an agent is [here](https://elizaos.github.io/eliza/docs/quickstart/#create-your-first-agent).