An API-driven organizational AI system backend offering assistants, ontology, integrations, workflows, and analytics in a unified framework.
- Overview
- Getting Started
- Standard Operating Procedure
- Development Guide
- Additional Resources
- Development Tools
- Help & Support
The ABI (Augmented Business Intelligence) project is a Python-based backend framework designed to serve as the core infrastructure for building an Organizational AI System. This system empowers businesses to integrate, manage, and scale AI-driven operations with a focus on ontology, assistant-driven workflows, and analytics. Designed for flexibility and scalability, ABI provides a customizable framework suitable for organizations aiming to create intelligent, automated systems tailored to their needs.
- Assistants: Configurable AI assistants to handle specific organizational tasks and interact with users.
- Ontology Management: Define and manage data relationships, structures, and semantic elements.
- Integrations: Seamlessly connect to external data sources and APIs for unified data access.
- Pipelines: Define data processing pipelines to handle and transform data efficiently into the ontological layer.
- Workflows: Automate complex business processes and manage end-to-end workflows.
- Analytics: Access insights through integrated analytics and real-time data processing.
- Data: Handle diverse datasets and manage schema, versioning, deduplication, and change data capture.
ABI Framework is open-source and available for non-production use under the AGPL license. For production deployments, a commercial license is required. Please contact us at support@naas.ai for details on licensing options.
- Install Docker Desktop
Choose one of the following options:
a. Clone the Repository (for personal use)
git clone https://github.com/jupyter-naas/abi.git
cd abi
b. Fork the Repository (to contribute changes)
# 1. Fork via GitHub UI
# 2. Clone your fork
git clone https://github.com/YOUR-USERNAME/abi.git
cd abi
c. Create a Private Fork (for private development)
# 1. Create private repository via GitHub UI
# 2. Clone your private repository
git clone https://github.com/YOUR-USERNAME/abi-private.git
cd abi-private
git remote add upstream https://github.com/jupyter-naas/abi.git
git pull --rebase upstream main
git push
- Copy the `.env.example` file to `.env`
cp .env.example .env
- Replace placeholder values with your actual credentials
- Uncomment (remove the `#` from) the lines you want to activate. These variables are used to configure the assistant.
Note: The `.env` file should never be committed to version control, as it contains sensitive credentials.
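For illustration, an activated `.env` might contain entries like the following. The variable names below match the secrets referenced later in this guide, but your `.env.example` is the authoritative list:

```
OPENAI_API_KEY=sk-...               # OpenAI API key
NAAS_CREDENTIALS_JWT_TOKEN=eyJ...   # NAAS Credentials JWT Token
ABI_API_KEY=my-secret-api-key       # key required to call your deployed API
```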
- Copy the example file to config.yaml
cp config.yaml.example config.yaml
- Edit the file with your configuration
- The config.yaml file is used to configure your workflows, pipelines, and the API:
  - `workspace_id`: Workspace ID linked to all components: assistants, ontologies, pipelines, workflows, etc.
  - `github_project_repository`: Your GitHub repository name (e.g. jupyter-naas/abi). It will be used in the documentation and API as the registry name.
  - `github_support_repository`: A GitHub repository name (e.g. jupyter-naas/abi) used to store support issues. The support agent will use it to create requests and report bugs. It can be the same as `github_project_repository`.
  - `github_project_id`: Your GitHub project number as it appears in the GitHub URL (e.g. 1 for https://github.com/jupyter-naas/abi/projects/1). It will be used to assign all your issues to your GitHub project.
  - `api_title`: API title (e.g. ABI API) displayed in the documentation.
  - `api_description`: API description (e.g. "API for ABI, your Artificial Business Intelligence") displayed in the documentation.
  - `logo_path`: Path to the logo (e.g. assets/logo.png) used in the API documentation.
  - `favicon_path`: Path to the favicon (e.g. assets/favicon.ico) used in the API documentation.
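For illustration, a filled-in config.yaml might look like this (all values below are examples only; the `workspace_id` is a placeholder):

```yaml
workspace_id: "00000000-0000-0000-0000-000000000000"
github_project_repository: "jupyter-naas/abi"
github_support_repository: "jupyter-naas/abi"
github_project_id: 1
api_title: "ABI API"
api_description: "API for ABI, your Artificial Business Intelligence"
logo_path: "assets/logo.png"
favicon_path: "assets/favicon.ico"
```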
Once you have forked and created your own version of the ABI repository, you need to establish a Git remote. This will enable you to push and pull to and from the original ABI repository. Doing so will allow you to update your project with the latest changes, or contribute back to the open-source project.
Execute the following commands in your terminal:
# Access your repo
cd "your_directory_name"
# Add remote
git remote add abi https://github.com/jupyter-naas/abi.git
# Push to main branch
git push abi main
# Pull from main branch
git pull abi main
About the Git default remote

When you clone a Git repository from GitHub or any other provider, it always creates a default remote for you named `origin`. You might have wondered what this `origin` is: it is your default Git remote.

This means that, assuming you are on the `main` branch, executing `git push` is the same as `git push origin main`.

So by default, Git will use:
- The branch you are currently on
- The `origin` remote. Even if other remotes exist, it will always use `origin` by default.
The API deployment is automated through GitHub Actions. Every time you push to the main branch with a last commit message starting with `feat:` or `fix:`, the API is deployed as follows:
- A new container is built (via the "Build ABI Container" workflow)
- The deployment workflow creates/updates a NAAS space with the latest container image
- Once deployment is complete, the API is accessible through the NAAS platform at the following URL:
https://<github_project_repository.name>-api.default.space.naas.ai/
- Create a GitHub Classic Personal Access Token:
  - Go to GitHub Settings > Developer Settings > Personal Access Tokens > Tokens (classic)
  - Generate a new token with the following permissions:
    - `repo` (Full control of private repositories)
    - `read:packages` and `write:packages` (For container registry access)
  - Copy the token value
- Get the required API keys:
  - OpenAI API key from the OpenAI Platform
  - NAAS Credentials JWT Token from your NAAS account
- Navigate to your repository's Settings > Secrets and variables > Actions and add the following secrets:
  - `ACCESS_TOKEN`: Your GitHub Classic Personal Access Token
  - `OPENAI_API_KEY`: Your OpenAI API key
  - `NAAS_CREDENTIALS_JWT_TOKEN`: Your NAAS Credentials JWT Token
  - `ABI_API_KEY`: Your key to access the API
- Go to your repository's Actions tab
- Look for the "ABI API" workflow
- Check the latest workflow run for deployment status and logs
This standard operating procedure explains how to answer a user intent using the ABI framework.
Begin by identifying the user's business problem and core question they want answered. Understanding this clearly will help guide the solution design. For example, "What are my top priorities?"
Map your business problem to ontological concepts:

1. Identify Domain Concepts
   - Use the `src/ontologies/domain-level` ontology
   - Example for "What are my top priorities?":
     - Task (core concept)
     - Properties: assignee, creator, due date, status, priority, labels
2. Map to Application Concepts
   - Use the `src/ontologies/application-level` ontology
   - Map domain concepts to your tools:
     - Tasks → GitHub Issues, CRM Tasks, Marketing Campaigns
   - Create subclasses that inherit from domain classes:
     - abi:GitHubIssue ⊂ abi:Task
     - abi:GithubUser ⊂ abi:User
     - abi:GithubProject ⊂ abi:Project
3. Write SPARQL Query
   - Create the query from `src/ontologies/ConsolidatedOntology.ttl`
   - Use the schema to retrieve data from all relevant subclasses
   - This ensures the solution remains tool-agnostic and reusable
Once you have your ontological concepts, build your solution in three steps:

1. Integration: Create or update integrations in `src/integrations` to connect with the required data sources. Please check out `src/integrations/GithubIntegration` or `src/integrations/GithubGraphqlIntegration` for more details.
2. Pipeline: Create a pipeline to map data from integrations to ontological concepts. Keep mapping logic modular by:
   - Building small pipelines for specific data transformations
   - Combining smaller pipelines into larger ones as needed

   You will be able to use functions to easily create mappings to the ontology. Please check out `src/data/pipelines/GithubIssuePipeline` for more details.
3. Workflow: Create a workflow that uses pipeline results via SPARQL queries. Workflows should focus on business logic rather than data transformation. Please check out `src/workflows/operations_assistant/GetTopPrioritiesWorkflow` for more details.
- Create or use an existing assistant in `src/assistants`.
- Set up the workflow that answers the user intent as a tool in the assistant. We recommend using the user intent as the description of your workflow so the assistant can understand it better.
- You can also add your pipeline and integration functions as tools if you want to trigger them from the assistant.
- Set up your assistant to validate your solution from your terminal. See Chat with Assistant for detailed instructions.
- Ask the user intent and check whether the solution works as expected.
- If not, update your assistant configuration, workflow, pipeline, or integration and test again.
Merge your branch into main.
- Your assistant will be deployed to production, and you will be able to use it through the API as well as in the Naas platform.
- Your workflows, pipelines, and integrations will also be deployed as APIs.
- Your pipelines will run on the schedule defined in your configuration.
The project is divided into two main parts:
- `src`: Contains the core components of the framework, including integrations, pipelines, workflows, and assistants.
- `lib/abi`: Contains the shared libraries and utilities used across the project.
You can start an agent by running the following command:
make
Agents are connected to tools through integrations, workflows, or pipelines. You will only have access to tools registered in the `.env` file, so remember to add your tools to the `.env` file before starting an agent.
Here is the list of all agents you can start:
# Start default agent (chat-supervisor-agent), which can access all domain agents and tools
make
# Or start a specific foundation agent:
make chat-support-agent # Support agent
# Or start a specific domain agent:
make chat-content-agent # Content agent
make chat-finance-agent # Finance agent
make chat-growth-agent # Growth agent
make chat-opendata-agent # Open Data agent
make chat-operations-agent # Operations agent
make chat-sales-agent # Sales agent
# Or start a specific custom agent (List available in Makefile)
make chat-airtable-agent # Airtable agent
make chat-agicap-agent # Agicap agent
make chat-aws-s3-agent # AWS S3 agent
make chat-brevo-agent # Brevo agent
make chat-clockify-agent # Clockify agent
make chat-discord-agent # Discord agent
make chat-github-agent # Github agent
To change the default agent, update `.DEFAULT_GOAL := chat-supervisor-agent` in the Makefile.
To run a Python script, use the `__main__` block pattern in the script file and run it with: `poetry run python YourScriptPath.py`
Here is an example of how to run a pipeline from your terminal:

# src/data/pipelines/YourPipeline.py
if __name__ == "__main__":
    from src import secret
    from src.integrations import YourIntegration, YourIntegrationConfiguration
    from abi.services.ontology_store import OntologyStoreService

    # Setup dependencies
    integration = YourIntegration(YourIntegrationConfiguration(...))
    ontology_store = OntologyStoreService()

    # Create pipeline configuration
    config = YourPipelineConfiguration(
        integration=integration,
        ontology_store=ontology_store
    )

    # Initialize and run pipeline
    pipeline = YourPipeline(config)
    result = pipeline.run(YourPipelineParameters(
        parameter_1="test",
        parameter_2=123
    ))

    # Print results in Turtle format to verify ontology mapping
    print(result.serialize(format="turtle"))

Terminal command:
poetry run python src/data/pipelines/YourPipeline.py
If you need to add a new Python dependency to the `src` project, use the following command:
make add dep=<library-name>
This will automatically:
- Add the dependency to your `pyproject.toml`
- Update the `poetry.lock` file
- Install the package in your virtual environment

To add a dependency to the `lib/abi` library instead, use:
make abi-add dep=<library-name>
To create a new integration, follow these steps:

1. Create Integration File: Create a new file in `src/integrations/YourIntegration.py` using the template `src/integrations/__IntegrationTemplate__.py`.
2. Add Required Methods: Implement the necessary methods for your integration. Common patterns include:
   - Authentication methods
   - API endpoint wrappers
   - Data transformation utilities
3. Add Configuration: If your integration requires API keys or other configuration:
   - Add the required variables to `.env.example`
   - Update your local `.env` file with actual values
4. Test Integration: Create tests in `tests/integrations/` to verify your integration works as expected.

For more detailed examples, check the existing integrations in the `src/integrations/` directory.
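As a hedged sketch of the configuration-plus-integration pattern suggested by the template naming, consider the following. All class and method names here are hypothetical, and the real abi base classes are omitted:

```python
from dataclasses import dataclass

@dataclass
class WeatherIntegrationConfiguration:
    """Hypothetical configuration object: holds credentials and settings."""
    api_key: str
    base_url: str = "https://api.example.com"  # placeholder endpoint

class WeatherIntegration:
    """Hypothetical integration showing the common patterns listed above."""

    def __init__(self, configuration: WeatherIntegrationConfiguration):
        self.__configuration = configuration

    def _headers(self) -> dict:
        # Authentication method: build auth headers from the configuration
        return {"Authorization": f"Bearer {self.__configuration.api_key}"}

    def get_forecast(self, city: str) -> dict:
        # API endpoint wrapper: a real integration would perform an HTTP call
        # here (e.g. with requests); stubbed out to stay self-contained.
        return {"city": city, "auth": self._headers()["Authorization"]}

integration = WeatherIntegration(WeatherIntegrationConfiguration(api_key="test-key"))
print(integration.get_forecast("Paris"))
```

Keeping credentials in a dedicated configuration object makes the integration easy to construct from `.env` values and easy to mock in `tests/integrations/`.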
Pipelines in ABI are used to process and transform data. Here's how to create a new pipeline:
1. Create Pipeline File: Create a new file in `src/data/pipelines/YourPipeline.py` using the template `src/data/pipelines/__PipelineTemplate__.py`.
2. Implement Pipeline Logic:
   - Add your data processing logic in the `run()` method
   - Use the integration to fetch data
   - Transform data into RDF graph format
   - Store results in the ontology store if needed
3. Test Pipeline: Create tests in `tests/pipelines/` to verify your pipeline:
   - Test data transformation
   - Test integration with the ontology store
   - Test error handling

For examples, see existing pipelines in the `src/data/pipelines/` directory.
To create a new workflow in ABI, follow these steps:
1. Create Workflow File: Create a new file in `src/workflows/YourWorkflow.py` using the template `src/workflows/__WorkflowTemplate__.py`.
2. Implement Workflow Logic:
   - Add your business logic in the `run()` method
   - Use integrations to interact with external services
   - Process and transform data as needed
   - Return results in the required format
3. Test Workflow: Create tests in `tests/workflows/` to verify your workflow:
   - Test business logic
   - Test integration with external services
   - Test error handling
   - Test API endpoints
4. Use the Workflow: The workflow can be used in multiple ways:
   - As a standalone script: `python -m src.workflows.YourWorkflow`
   - As an API endpoint: Import and use the `api()` function
   - As a LangChain tool: Import and use the `as_tool()` function

For examples, see existing workflows in the `src/workflows/` directory.
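To illustrate the business-logic focus, here is a hedged sketch. The configuration and workflow class names are hypothetical, and in ABI the task list would come from a SPARQL query against the ontology store rather than an in-memory list:

```python
from dataclasses import dataclass

@dataclass
class GetTopPrioritiesConfiguration:
    """Hypothetical configuration: how many tasks to return."""
    limit: int = 3

class GetTopPrioritiesWorkflow:
    """Hypothetical workflow: pure business logic, no data transformation."""

    def __init__(self, configuration: GetTopPrioritiesConfiguration):
        self.__configuration = configuration

    def run(self, tasks: list) -> list:
        # Rank tasks by priority (1 = most urgent) and keep the top N
        ranked = sorted(tasks, key=lambda task: task["priority"])
        return ranked[: self.__configuration.limit]

workflow = GetTopPrioritiesWorkflow(GetTopPrioritiesConfiguration(limit=2))
top = workflow.run([
    {"name": "Ship release", "priority": 1},
    {"name": "Write docs", "priority": 3},
    {"name": "Fix CI", "priority": 2},
])
print([task["name"] for task in top])  # → ['Ship release', 'Fix CI']
```

Because the ranking logic is independent of any tool, the same workflow works whether the tasks originate from GitHub Issues, CRM tasks, or marketing campaigns.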
To create a new assistant, follow these steps:
Create a new file in `src/assistants/custom/YourAssistant.py` using the template `src/assistants/custom/__TemplateAssistant__.py`.
- Import the necessary integrations, pipelines, and workflows
- Configure integrations with the required credentials
- Add tools using the `as_tools()` method (`Class.as_tools(Configuration)`)
- Create a function to run the new assistant in `src/apps/terminal_agent/main.py`, following the pattern of existing assistants
- Register the function in pyproject.toml: `chat-<assistant-name>-agent = "src.apps.terminal_agent.main:run_<assistant-name>-agent"`
- Add a new target in the Makefile: `make chat-<assistant-name>-agent`
- Run the new assistant: `make chat-<assistant-name>-agent`
- Add route handlers in `src/api.py`
- Example: Adding a new assistant
  - Import your assistant from `src/assistants/custom/YourAssistant.py`
  - Add it to the `assistants_router` variable as follows:
from src.assistants.custom.YourAssistant import create_your_assistant
your_assistant = create_your_assistant()
your_assistant.as_api(assistants_router)
Remember to add the `as_api()` method to your new assistant in its file.
The API uses the secrets stored in your GitHub repository secrets. If you want to add new secrets, do the following:
- Navigate to your repository's Settings > Secrets and variables > Actions and add the new secrets
- Open `.github/workflows/deploy_api.yml`
- Add your GitHub secrets in the env section of the "Push latest abi container" step
- Pass the secrets to the space environment in `ENV_CONFIG`
- Commit and push your changes
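For orientation, the relevant part of the workflow file might look roughly like this (standard GitHub Actions syntax; the step name comes from this guide, and MY_NEW_SECRET is a placeholder for your own secret):

```yaml
- name: "Push latest abi container"
  env:
    ACCESS_TOKEN: ${{ secrets.ACCESS_TOKEN }}
    OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
    NAAS_CREDENTIALS_JWT_TOKEN: ${{ secrets.NAAS_CREDENTIALS_JWT_TOKEN }}
    MY_NEW_SECRET: ${{ secrets.MY_NEW_SECRET }}  # your newly added secret
```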
- lib/abi: lib/abi/README.md
- src: src/README.md
For Cursor users, the `.cursorrules` file is already configured to help you create new Integrations, Pipelines, and Workflows.
More will be added as we add more components to the framework.
For any questions or support requests, please reach out via support@naas.ai or on our community forum on Slack.