Add Private AI PII masking example in notebook
letmerecall committed Dec 11, 2024
1 parent 85409c5 commit eebe8c4
Showing 1 changed file with 177 additions and 7 deletions.
184 changes: 177 additions & 7 deletions examples/notebooks/privateai_pii_detection.ipynb
@@ -6,19 +6,26 @@
"source": [
"# Private AI PII detection example\n",
"\n",
"This notebook shows how to use Private AI for PII detection in NeMo Guardrails."
"This notebook shows how to use Private AI for PII detection and PII masking in NeMo Guardrails."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import libraries"
"## PII Detection"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Import libraries"
]
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
@@ -29,7 +36,7 @@
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
@@ -42,7 +49,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create rails with Private AI PII detection\n",
"### Create rails with Private AI PII detection\n",
"\n",
"For this step you'll need your OpenAI API key & Private AI API key.\n",
"\n",
@@ -98,7 +105,123 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Input rails"
"### Input rails"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = rails.generate(messages=[{\"role\": \"user\", \"content\": \"Hello! I'm John. My email id is text@gmail.com. I live in California, USA.\"}])\n",
"\n",
"info = rails.explain()\n",
"\n",
"print(\"Response\")\n",
"print(\"----------------------------------------\")\n",
"print(response[\"content\"])\n",
"\n",
"\n",
"print(\"\\n\\nColang history\")\n",
"print(\"----------------------------------------\")\n",
"print(info.colang_history)\n",
"\n",
"print(\"\\n\\nLLM calls summary\")\n",
"print(\"----------------------------------------\")\n",
"info.print_llm_calls_summary()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Output rails"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = rails.generate(messages=[{\"role\": \"user\", \"content\": \"give me a sample email id\"}])\n",
"\n",
"info = rails.explain()\n",
"\n",
"print(\"Response\")\n",
"print(\"----------------------------------------\\n\\n\")\n",
"print(response[\"content\"])\n",
"\n",
"\n",
"print(\"\\n\\nColang history\")\n",
"print(\"----------------------------------------\")\n",
"print(info.colang_history)\n",
"\n",
"print(\"\\n\\nLLM calls summary\")\n",
"print(\"----------------------------------------\")\n",
"info.print_llm_calls_summary()\n",
"\n",
"\n",
"print(\"\\n\\nCompletions where PII was detected!\")\n",
"print(\"----------------------------------------\")\n",
"print(info.llm_calls[0].completion)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## PII Masking\n",
"\n",
"Note: This example uses a locally served Ollama model (llama3.2)."
]
},
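Before wiring masking into Guardrails, the effect of a `mask pii` flow can be illustrated standalone. This is a minimal sketch with a hypothetical regex detector — real detection is done server-side by Private AI — covering only the `EMAIL_ADDRESS` entity used in the configs below:

```python
import re

# Hypothetical stand-in for the Private AI detector: replace detected
# spans with entity-type markers before the text reaches the LLM.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def mask_pii(text: str) -> str:
    """Replace email addresses with an [EMAIL_ADDRESS] marker."""
    return EMAIL_RE.sub("[EMAIL_ADDRESS]", text)

print(mask_pii("Hello! I'm John. My email id is text@gmail.com."))
# -> Hello! I'm John. My email id is [EMAIL_ADDRESS].
```

The masking rails below perform the same kind of substitution on the user message (input rail) or the LLM completion (output rail), with the marker types determined by the `entities` list in the config.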
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Input rails"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [],
"source": [
"os.environ[\"PAI_API_KEY\"] = \"YOUR PRIVATE AI API KEY\" # Visit https://portal.private-ai.com to get your API key\n",
"\n",
"YAML_CONFIG = \"\"\"\n",
"# Alternative main model (OpenAI):\n",
"# models:\n",
"#   - type: main\n",
"#     engine: openai\n",
"#     model: gpt-3.5-turbo-instruct\n",
"\n",
"models:\n",
" - type: main\n",
" engine: ollama\n",
" model: llama3.2\n",
" parameters:\n",
" base_url: http://localhost:11434\n",
"\n",
"rails:\n",
" config:\n",
" privateai:\n",
" server_endpoint: https://api.private-ai.com/cloud/v3/process/text\n",
" input:\n",
" entities:\n",
" - LOCATION\n",
" - EMAIL_ADDRESS\n",
" input:\n",
" flows:\n",
" - mask pii on input\n",
"\"\"\"\n",
"\n",
"config = RailsConfig.from_content(yaml_content=YAML_CONFIG)\n",
"rails = LLMRails(config)"
]
},
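The nesting in this config is easy to get wrong: the entity list lives under `rails.config.privateai.input`, while the flow activation lives under `rails.input.flows`. A quick shape check with PyYAML (assumed available; it is a NeMo Guardrails dependency) makes the distinction explicit:

```python
import yaml  # PyYAML, assumed available alongside nemoguardrails

YAML_CONFIG = """
models:
  - type: main
    engine: ollama
    model: llama3.2
    parameters:
      base_url: http://localhost:11434

rails:
  config:
    privateai:
      server_endpoint: https://api.private-ai.com/cloud/v3/process/text
      input:
        entities:
          - LOCATION
          - EMAIL_ADDRESS
  input:
    flows:
      - mask pii on input
"""

cfg = yaml.safe_load(YAML_CONFIG)

# Entities to mask are configured under rails.config.privateai.input ...
assert cfg["rails"]["config"]["privateai"]["input"]["entities"] == [
    "LOCATION",
    "EMAIL_ADDRESS",
]
# ... while the flow itself is activated under rails.input.flows.
assert cfg["rails"]["input"]["flows"] == ["mask pii on input"]
print("config shape OK")
```

If the two `input` keys are put at the same level by mistake, the rail either never runs or masks nothing, with no error raised, so a shape check like this is cheap insurance.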
{
@@ -129,7 +252,47 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Output rails"
"### Output rails"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [],
"source": [
"os.environ[\"PAI_API_KEY\"] = \"YOUR PRIVATE AI API KEY\" # Visit https://portal.private-ai.com to get your API key\n",
"\n",
"YAML_CONFIG = \"\"\"\n",
"# Alternative main model (OpenAI):\n",
"# models:\n",
"#   - type: main\n",
"#     engine: openai\n",
"#     model: gpt-3.5-turbo-instruct\n",
"\n",
"models:\n",
" - type: main\n",
" engine: ollama\n",
" model: llama3.2\n",
" parameters:\n",
" base_url: http://localhost:11434\n",
"\n",
"rails:\n",
" config:\n",
" privateai:\n",
" server_endpoint: https://api.private-ai.com/cloud/v3/process/text\n",
" output:\n",
" entities:\n",
" - LOCATION\n",
" - EMAIL_ADDRESS\n",
" output:\n",
" flows:\n",
" - mask pii on output\n",
"\"\"\"\n",
"\n",
"config = RailsConfig.from_content(yaml_content=YAML_CONFIG)\n",
"rails = LLMRails(config)"
]
},
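Both masking configs assume an Ollama server at `http://localhost:11434`; if it is down, rail invocations fail with connection errors. A small stdlib reachability probe (illustrative only — not part of the notebook's API) can catch that up front:

```python
import urllib.error
import urllib.request

def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP GET to the Ollama base URL succeeds."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if not ollama_is_up():
    print("Ollama is not reachable; start it with `ollama serve` before running the rails.")
```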
{
@@ -160,6 +323,13 @@
"print(\"----------------------------------------\")\n",
"print(info.llm_calls[0].completion)"
]
}
],
"metadata": {