Integration with SAP Advanced Financial Closing is not yet published. Please refer to What's New for SAP Advanced Financial Closing for the latest updates.
SAP Advanced Financial Closing SDK for CDS is an SDK for SAP Advanced Financial Closing, to be consumed with the SAP Cloud Application Programming Model (Node.js).
- Requirements and Setup
- Getting Started
- Architecture
- Usage
- Advanced Setup
- Support, Feedback, Contributing
- Code of Conduct
- Licensing
- To develop and test applications built with this SDK, you need a CAP Node.js project
- To integrate you need access to an instance of SAP Advanced Financial Closing
- Run `npm add @cap-js-community/sap-afc-sdk` in a `@sap/cds` project
- Execute `npm start` to start the server
- Access the welcome page at http://localhost:4004
- Access Applications
- /launchpad.html: Sandbox Launchpad
- /scheduling.monitoring.job/webapp: Standalone Scheduling Monitoring Job UI
- Access Service Endpoints
- Public API
- /api/job-scheduling/v1: Scheduling Provider API (OpenAPI Swagger UI)
- OData API (UI)
- /odata/v4/job-scheduling/monitoring: Scheduling Monitoring API ($metadata)
- WebSocket API
- /ws/job-scheduling: Scheduling WebSocket endpoint
- REST API
- /rest/feature: Feature Toggle API
- CDS Internal API
  - `SchedulingProcessingService`: Scheduling Processing service
    - `const schedulingProcessingService = await cds.connect.to("SchedulingProcessingService");`
  - `SchedulingWebsocketService`: Scheduling Websocket service
    - `const schedulingWebsocketService = await cds.connect.to("SchedulingWebsocketService");`
SAP Advanced Financial Closing (AFC) lets you define, automate, process, and monitor the entity close for your organization.
The SAP Advanced Financial Closing SDK for CDS provides a plugin for the SAP Cloud Application Programming Model (CAP) (Node.js) to extend and integrate with SAP Advanced Financial Closing (AFC). Specifically, it provides an out-of-the-box implementation of the SAP Advanced Financial Closing Scheduling Service Provider Interface to expose a Scheduling Provider service to manage Job definitions and Jobs. Furthermore, it brings the following out-of-the-box features:
- API: Exposes a RESTful API implementing the AFC Scheduling Provider Interface to manage Job definitions and Jobs
- Event-Queue: Provides an Event Queue to process and sync Jobs (periodically) asynchronously and resiliently (circuit breaker, retry, load-balancing, etc.)
- Websocket: Provides websocket connection support to monitor Job processing live
- Feature-Toggle: Provides a feature toggle library to control the execution of the Event Queue
- UI: Provides a UI5 application to monitor and cancel Jobs
- Broker: Implements a service broker to manage service keys for API access
The SAP Advanced Financial Closing SDK for CDS is built on the following open-source building blocks, as depicted in the following diagram:
- WebSocket Adapter for CDS (https://github.com/cap-js-community/websocket)
- Exposes a WebSocket protocol via WebSocket standard or Socket.IO for CDS services. Runs in context of the SAP Cloud Application Programming Model (CAP) using @sap/cds (CDS Node.js).
- Event Queue for CDS (https://github.com/cap-js-community/event-queue)
- The Event-Queue is a framework built on top of CAP Node.js, designed specifically for efficient and streamlined asynchronous event processing
- Feature Toggle Library for CDS (https://github.com/cap-js-community/feature-toggle-library)
- SAP BTP feature toggle library enables Node.js applications using the SAP Cloud Application Programming Model to maintain live-updatable feature toggles via Redis
You can develop a 3rd-Party Scheduling Provider for SAP Advanced Financial Closing using the SAP Advanced Financial Closing SDK, built on the SAP Cloud Programming Model and enhanced with @cap-js-community open-source plugins, leveraging SAP Build Code for a seamless and scalable solution.
Requesting job scheduling, synchronizing status and results, and updating job definitions between SAP Advanced Financial Closing (AFC) and a 3rd-party scheduling provider can be easily implemented using the AFC SDK.
The open source components are shared between SAP Advanced Financial Closing and the SAP Advanced Financial Closing SDK for CDS.
The architectural design of the SAP Advanced Financial Closing (AFC) SDK for implementing a Scheduling Provider is based on the SAP Cloud Application Programming Model (CAP) and SAP Build Code. It leverages the @cap-js-community open-source components to enable scheduling services in AFC.
The following diagram illustrates the high-level architecture of the SAP Advanced Financial Closing SDK for CDS:
Key components and processing flow:
- SAP Advanced Financial Closing (AFC):
- Sends scheduling requests via AFC Scheduling Service Provider Interface using REST API (OpenAPI)
- Scheduling Provider Service:
- Handles incoming scheduling requests
- Creates scheduling jobs synchronously and places asynchronous requests into the Event Queue
- Scheduling Processing Service:
- Processes scheduled jobs asynchronously
- Retrieves job requests from the Event Queue and executes them.
- Scheduling WebSocket Service:
- Listens for status updates of scheduled jobs
- Notifies the Monitoring Scheduling Job UI via WebSockets when job statuses change
- Scheduling Monitoring Service:
- Monitoring Scheduling Job UI (SAP Fiori Elements V4 / SAP UI5 application)
- Reads scheduling job details from the database
- Supports monitoring via OData V4 API
- Displays scheduling job statuses and updates in real-time via WebSockets
- Event Queue & Feature Toggles:
- Event Queue (using CDS Outbox) facilitates asynchronous job execution
- Feature Toggles allow influencing Job and Event Queue processing dynamically
- Database & Redis Caching:
- Stores job scheduling data in the database
- Redis is used for information distribution (e.g. Event Queue, WebSockets, Feature Toggles)
Options can be passed to the SDK via CDS environment in the `cds.requires.sap-afc-sdk` section:

- `endpoints: Object`: Endpoint configuration. Default: `{}`
- `endpoints.approuter: String`: Url of approuter. Default: `null` (derived from convention `<app>-srv`)
- `endpoints.server: String`: Url of server. Default: `null` (derived from environment, e.g. CF)
- `api: Object`: API configuration. Default: `{}`
- `ui: Object | Boolean`: UI configuration. Use `false` to disable UI. Default: `{}`
- `ui.path: String`: Path to the served UI5 application. Default: `''`
- `ui.link: Boolean`: Fill link of jobs to served UI5 launchpad, if `null`. Default: `true`
- `ui.swagger: Boolean | Object`: Serve API docs via Swagger UI. Default: `true`
- `ui.swagger.SchedulingProviderService: Boolean`: Serve API docs of Scheduling Provider via Swagger UI. Default: `true`
- `ui.launchpad: Boolean`: Serve launchpad. Default: `true`
- `ui."scheduling.monitoring.job": Boolean`: Serve Scheduling Monitoring Job UI separately, if no launchpad is served. Default: `true`
- `broker: Boolean | Object`: Broker configuration. Serve broker endpoint, if truthy. Default: `false` (`true` in `production`)
- `mockProcessing: Boolean | Object`: Activate mocked job processing. Default: `false`
- `mockProcessing.default: String`: Default processing status. Default: `completed`
- `mockProcessing.min: Number`: Minimum processing time in seconds. Default: `0`
- `mockProcessing.max: Number`: Maximum processing time in seconds. Default: `30`
- `mockProcessing.status: Object`: Status distribution values
- `mockProcessing.status.completed: Number`: Completed status distribution value
- `mockProcessing.status.completedWithWarning: Number`: Completed With Warning status distribution value
- `mockProcessing.status.completedWithError: Number`: Completed With Error status distribution value
- `mockProcessing.status.failed: Number`: Failed status distribution value
- `config: Object`: Advanced SDK configuration. See `config.json`
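For example, a minimal configuration in `package.json` could look as follows (the endpoint URL and option values are illustrative only):

```json
{
  "cds": {
    "requires": {
      "sap-afc-sdk": {
        "endpoints": {
          "approuter": "https://my-app.example.com"
        },
        "ui": {
          "swagger": true,
          "launchpad": true
        },
        "mockProcessing": false
      }
    }
  }
}
```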
A new CDS project can be initialized using the SAP Build Code tools on SAP Business Technology Platform (BTP), or the `@sap/cds-dk` CLI command `cds init` can be used to bootstrap a new CAP application.
SAP Build Code:

- Open SAP Build Lobby
- Press `Create`
- Select objective `Application`
- Choose category `Full-Stack`
- Select type `Full-Stack Node.JS`
- Provide project name
- Press `Review`
- Press `Create`
- Open project in SAP Business Application Studio
- Continue with `cds` CLI, adding features
CDS Command-Line-Interface:

- Install the CLI:
  - Terminal: `npm install -g @sap/cds-dk`
- Init a new CDS project:
  - Terminal: `cds init <name>`
- Switch to project folder:
  - Terminal: `cd <name>`
- Install dependencies:
  - Terminal: `npm install`
- Add AFC SDK:
  - Terminal: `npm install @cap-js-community/sap-afc-sdk`
- Use `afc` command:
  - Add globally:
    - Terminal: `npm install -g @cap-js-community/sap-afc-sdk`
  - Use locally:
    - Terminal: `npx afc`
- Init target environment:
  - Cloud Foundry:
    - Terminal: `afc init cf`
  - Kyma:
    - Terminal: `afc init kyma`
- Optionally add AFC SDK features:
  - Terminal: `afc add broker,sample,http`
- Test:
  - Terminal: `npm start`
  - Browser: http://localhost:4004
The library includes mocked job processing to jump-start development, which is disabled by default via option `cds.requires.sap-afc-sdk.mockProcessing: false`. Setting option `cds.requires.sap-afc-sdk.mockProcessing: true` enables a basic mocked job processing that completes jobs after a random time between 0 and 10 seconds.
The project can be adjusted to use basic mock processing automatically via command:
- Terminal: `afc add -b mock`
A more advanced mocked Job processing can be configured by setting the following CDS env options (as described in options):
{
"cds": {
"requires": {
"sap-afc-sdk": {
"mockProcessing": {
"min": 0,
"max": 10,
"default": "completed",
"status": {
"completed": 0.5,
"completedWithWarning": 0.2,
"completedWithError": 0.2,
"failed": 0.1
}
}
}
}
}
}
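The status values are relative weights. As a minimal sketch (illustration only, not the SDK's actual implementation), such a weighted distribution can be sampled like this:

```javascript
// Illustration only (not SDK code): sampling a job status from the
// distribution weights configured above. Weights are relative and
// do not need to sum to 1.
function pickStatus(distribution) {
  const entries = Object.entries(distribution);
  const total = entries.reduce((sum, [, weight]) => sum + weight, 0);
  let roll = Math.random() * total;
  for (const [status, weight] of entries) {
    roll -= weight;
    if (roll <= 0) return status;
  }
  return entries[entries.length - 1][0]; // guard against floating-point edge cases
}

// Sampling 10000 times roughly reproduces the 0.5 / 0.2 / 0.2 / 0.1 split
const counts = { completed: 0, completedWithWarning: 0, completedWithError: 0, failed: 0 };
for (let i = 0; i < 10000; i++) {
  counts[
    pickStatus({ completed: 0.5, completedWithWarning: 0.2, completedWithError: 0.2, failed: 0.1 })
  ]++;
}
```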
This default advanced mocked Job processing can also be configured by using CDS profile `mock` via `--profile mock` or `CDS_ENV=mock`.
The project can be adjusted to use advanced mock processing (without an additional `mock` profile) automatically via command:

- Terminal: `afc add -a mock`

Mock configuration can be adjusted in `package.json` afterwards.
To disable mock processing, remove CDS env `cds.requires.sap-afc-sdk.mockProcessing`, e.g. via command:

- Terminal: `afc add -x mock`
The default implementation of the job processing is already provided by the SDK, so focus can be put on custom processing logic and processing status update handling.
To implement a custom job processing extend the job processing service definition as follows:
CDS file: /srv/scheduling-processing-service.cds
using SchedulingProcessingService from '@cap-js-community/sap-afc-sdk';
annotate SchedulingProcessingService with @impl: '/srv/scheduling-processing-service.js';
Implementation file: /srv/scheduling-processing-service.js
const { SchedulingProcessingService, JobStatus } = require("@cap-js-community/sap-afc-sdk");
class CustomSchedulingProcessingService extends SchedulingProcessingService {
async init() {
const { processJob, updateJob, cancelJob, syncJob } = this.operations;
this.on(processJob, async (req, next) => {
// Your logic goes here. Check req.data.testRun
// await this.processJobUpdate(req, JobStatus.completed, [{ ... }]);
await next();
});
this.on(updateJob, async (req, next) => {
// Your logic goes here
await next();
});
this.on(cancelJob, async (req, next) => {
// Your logic goes here
await next();
});
this.on(syncJob, async (req, next) => {
// Your logic goes here
await next();
});
super.init();
}
}
module.exports = CustomSchedulingProcessingService;
A stub implementation for a custom scheduling processing service can be generated via command:

- Terminal: `afc add stub`
As part of the custom scheduling processing service implementation, the following operations can be implemented:

- `on(processJob)`:
  - A new job instance was created and needs to be processed
  - The job is due (start date time is reached), and the job is ready for processing
  - Implement your custom logic, how the job should be processed
  - Job ID is accessible via `req.data.ID` and job data can be accessed via `req.job`
  - Test run can be identified via flag `req.data.testRun` (if job definition supports test mode)
  - Call `await next()` to perform default implementation (set status to `running`)
  - Job update can be performed via `this.processJobUpdate()` providing the new status and job results
    - e.g. `await this.processJobUpdate(req, JobStatus.completed, [{...}])`
    - `processJobUpdate` result `data` property shall contain stream objects to prevent data materialization
  - Throwing exceptions will automatically trigger the retry process in the Event Queue
  - Disable mocked job processing via `cds.requires.sap-afc-sdk.mockProcessing: false` (default)
- `on(updateJob)`:
  - A job status update is requested and the job results are stored
  - Implement your custom logic, how the job status should be updated
  - Job data can be retrieved via `req.job`
  - Job status transition is validated via `async checkStatusTransition(req, statusBefore, statusAfter)`
    - Valid status transitions are defined in `this.statusTransitions`
    - Check function and status transitions can be customized
  - Job results are checked and processed via `async checkJobResults(req, results)`
    - Results are valid according to the job results signature constraints (see below)
    - Returns the processed job results to be inserted
  - Call `await next()` to perform default implementation (update status to requested status)
- `on(cancelJob)`:
  - A job cancellation is requested
  - Implement your custom logic, how the job should be canceled
  - Job data can be retrieved via `req.job`
  - Call `await next()` to perform default implementation (update status to `canceled`)
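To illustrate how a status transition check of this kind works, here is a minimal self-contained sketch. The transition map below is an assumption for demonstration only; the SDK defines the actual valid transitions in `this.statusTransitions`:

```javascript
// Assumed transition map (illustration only; the SDK's real
// this.statusTransitions may differ in states and edges).
const statusTransitions = {
  requested: ["running", "canceled"],
  running: ["completed", "completedWithWarning", "completedWithError", "failed", "canceled"],
  completed: [], // terminal states allow no further transitions
  completedWithWarning: [],
  completedWithError: [],
  failed: [],
  canceled: [],
};

// Minimal check in the spirit of checkStatusTransition():
// a transition is valid only if listed for the current status.
function isValidTransition(statusBefore, statusAfter) {
  return (statusTransitions[statusBefore] ?? []).includes(statusAfter);
}

console.log(isValidTransition("running", "completed")); // true
console.log(isValidTransition("completed", "running")); // false
```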
The job results signature is defined as follows:
type ResultTypeCode : String enum {
link;
data;
message;
};
type MessageSeverityCode : String enum {
success;
info;
warning;
error;
};
type JobResult {
name : String(255) not null;
type : ResultTypeCode not null;
link : String(5000);
mimeType : String(255);
filename : String(5000);
data : LargeBinary;
messages : many JobResultMessage;
};
type JobResultMessage {
code : String(255) not null;
text : String(5000);
severity : MessageSeverityCode not null;
createdAt : Timestamp;
texts : many JobResultMessageText;
};
type JobResultMessageText {
locale : Locale not null;
text : String(5000) not null;
};
Multiple job results can be passed for job update. The following constraints apply for each job result type:
- `link`:
  - Properties `name` and `link` need to be provided
  - Other properties are not allowed
- `data`:
  - Properties `name`, `mimeType`, `filename` and `data` need to be provided
  - Data needs to be provided as a base64-encoded string
  - Other properties are not allowed
- `message`:
  - Properties `name` and `messages` need to be provided
  - Messages need to be provided as an array of job result messages
  - Other properties are not allowed
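For illustration, job results satisfying these constraints could look as follows (all names and values are invented examples). Such objects could, for instance, be passed as the results argument of `this.processJobUpdate()`:

```javascript
// Example job results (illustrative values, not SDK fixtures).

// Type "link": only name and link are allowed.
const linkResult = {
  name: "Processing Log",
  type: "link",
  link: "https://example.com/logs/4711",
};

// Type "data": name, mimeType, filename and base64-encoded data are required.
const dataResult = {
  name: "Result File",
  type: "data",
  mimeType: "text/csv",
  filename: "result.csv",
  data: Buffer.from("id,status\n1,completed\n").toString("base64"),
};

// Type "message": name and an array of job result messages are required.
const messageResult = {
  name: "Messages",
  type: "message",
  messages: [
    {
      code: "JOB_COMPLETED",
      text: "Job completed successfully",
      severity: "success",
      createdAt: new Date().toISOString(),
    },
  ],
};

const results = [linkResult, dataResult, messageResult];
```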
Job processing is performed as part of the Event Queue processing. The Event Queue is a framework built on top of CAP Node.js, designed specifically for efficient and streamlined asynchronous event processing. In case of errors, the Event Queue provides resilient processing (circuit breaker, retry, load-balancing, etc.).
In addition to overwriting the default implementation via an `on` handler, additional `before` and `after` handlers can be registered.
A job provider service is already provided per default by the SDK, implementing the SAP Advanced Financial Closing Scheduling Service Provider Interface. Therefore, focus can be put on additional custom provider logic (e.g. streaming of data from a remote location).
The SAP Advanced Financial Closing Scheduling Service Provider Interface is published on SAP Business Accelerator Hub under package SAP Advanced Financial Closing at https://api.sap.com/api/SSPIV1.
To implement a custom job provider extend the job provider service definition as follows:
CDS file: /srv/scheduling-provider-service.cds
using SchedulingProviderService from '@cap-js-community/sap-afc-sdk';
annotate SchedulingProviderService with @impl: '/srv/scheduling-provider-service.js';
Implementation file: /srv/scheduling-provider-service.js
const { SchedulingProviderService } = require("@cap-js-community/sap-afc-sdk");
class CustomSchedulingProviderService extends SchedulingProviderService {
async init() {
const { Job, JobResult } = this.entities;
this.on("CREATE", Job, async (req, next) => {
// Your logic goes here
await next();
});
this.on(Job.actions.cancel, Job, async (req, next) => {
// Your logic goes here
await next();
});
this.on(JobResult.actions.data, JobResult, async (req, next) => {
// Your logic goes here
await next();
});
super.init();
}
}
module.exports = CustomSchedulingProviderService;
A stub implementation for a custom scheduling provider service can be generated via command:

- Terminal: `afc add stub`
As part of the custom scheduling provider service implementation, the following operations can be implemented:
- `on("CREATE", Job)`:
  - Validates and creates a new job instance
  - Call `await next()` to perform default implementation
  - `after`: Calls scheduling processing service function `processJob`
- `on(Job.actions.cancel, Job)`:
  - Cancels a job
  - Call `await next()` to perform default implementation
  - `after`: Calls scheduling processing service function `cancelJob`
- `on(JobResult.actions.data, JobResult)`:
  - Call `await next()` to perform default implementation
  - Streams data of a job result (type `data`) from DB to response
In addition to overwriting the default implementation via an `on` handler, additional `before` and `after` handlers can be registered.
The Scheduling Provider Service can be restricted for authorization by adding the `@requires` annotation:
using SchedulingProviderService from '@cap-js-community/sap-afc-sdk';
annotate SchedulingProviderService with @requires: 'JobScheduling';
Details can be found in CDS-based Authorization.
A periodic scheduling job synchronization event named `SchedulingProcessingService.syncJob` runs by default every minute in the Event Queue to perform job synchronization from an external source. The default implementation is a no-op.

The event `syncJob` is registered automatically with cron interval `*/1 * * * *` in the Event Queue configuration.
To change the cron interval, the Event Queue configuration can be adjusted in the CDS env:
CDS Env:
{
"cds": {
"requires": {
"SchedulingProcessingService": {
"outbox": {
"events": {
"syncJob": {
"cron": "*/2 * * * *"
}
}
}
}
}
}
}
The `cron` interval option defines the periodicity of the scheduling job synchronization.
CDS file: /srv/scheduling-processing-service.cds
using SchedulingProcessingService from '@cap-js-community/sap-afc-sdk';
annotate SchedulingProcessingService with @impl: '/srv/scheduling-processing-service.js';
Implementation file: /srv/scheduling-processing-service.js
const { SchedulingProcessingService, JobStatus } = require("@cap-js-community/sap-afc-sdk");
class CustomSchedulingProcessingService extends SchedulingProcessingService {
async init() {
const { syncJob } = this.operations;
this.on(syncJob, async (req, next) => {
// Your logic goes here
await next();
});
super.init();
}
}
module.exports = CustomSchedulingProcessingService;
A stub implementation for the periodic job sync can be generated via command:

- Terminal: `afc add stub`
Details on how to implement periodic event via Event Queue can be found in Event-Queue documentation on Periodic Events.
The application can be tested locally using the following steps:
- Start application
  - Terminal: `npm start`
- Open welcome page
  - Browser: http://localhost:4004
To add sample job definitions and job instances run:
- Terminal: `afc add sample`

Test data will be placed at `/db/data`.
To add unit-tests for testing the API endpoints run:
- Terminal: `afc add test`

Test files will be placed at `/test`.
To add `.http` files for testing the API endpoints run:

- Terminal: `afc add http`

HTTP files will be placed at `/http`.
To fully test the application, also accessing APIs from external, a deployment needs to be performed. BTP offers different deployment options, depending on the target environment (Cloud Foundry or Kyma).
- Add MTA feature (already part of Bootstrap for CF)
  - Terminal: `afc init cf`
- Build MTA
  - Terminal: `mbt build`
- Deploy MTA
  - Terminal: `cf deploy mta_archives/<mta>.mtar`
- For details see guide Deployment to CF
- Add helm feature (already part of Bootstrap for Kyma)
  - Terminal: `afc init kyma`
- Configuration
  - Set global domain in `chart/values.yaml`
  - Set global image registry in `chart/values.yaml`
  - Set repository in `containerize.yaml`
  - Set endpoints to `approuter` and `server` in cds env (see Options) to Kyma API rule hosts
- Containerize
  - Terminal: `ctz containerize.yaml --push`
- Upgrade
  - Terminal: `helm upgrade --install <name> ./gen/chart -n <namespace>`
- Rollout
  - Terminal: `kubectl rollout restart deployment -n <namespace>`
- For details see guide Deployment to Kyma
An Open Service Broker compliant broker implementation can be added to the CAP project. The broker is used to manage service keys for API access.
- Add broker and service configuration (already part of Bootstrap)
  - Terminal: `afc add broker`
- Deploy to CF (see Deployment to Cloud Foundry)
- Get API key credentials
  - Terminal: `afc api key`
- Use API key credentials
  - Swagger UI:
    - Open URL: https://<server-url>/api-docs/api/job-scheduling/v1/
    - Click `Authorize` and provide key credentials for `client_id` and `client_secret`
    - Try out endpoints
  - HTTP Client:
    - Add `.http` files
    - Update `.http` files placeholders
      - Terminal: `afc api key -h`
    - Perform OAuth token request using key credentials (clientId, clientSecret)
      - See http/auth/uaa.cloud.http for obtaining an OAuth token
      - Via CLI:
        - Terminal: `afc api key -t`
    - Call API using OAuth token
      - See `.http` files in /http to call API endpoints
      - See `.http` files in /http/scheduling to call scheduling provider API endpoints
    - Clear credentials in `.http` files:
      - Terminal: `afc api key -c`
  - Destination:
    - A destination file for an API endpoint can be created via command:
      - Terminal: `afc add key -d -e <endpoint>`
    - A destination file for the Job Scheduling Provider API can be created via command:
      - Terminal: `afc add key -d -j`
- Reset API management in CF
  - Terminal: `afc api key -r`
For development and testing purposes UIs are served as part of the server. Exposed UIs can be accessed via the server welcome page. For productive usage, UIs should be served via HTML5 repo:
- Add WorkZone and HTML5 Repo features (already part of Bootstrap)
  - Terminal: `cds add workzone,html5-repo`
- Setup and configure SAP WorkZone instance using HTML5 Apps Content Channel
  - Add `Monitor Scheduling Jobs` app to Content Explorer
  - Assign app to a group, role and site to be accessible
- Disable UI served in server via CDS env: `cds.requires.sap-afc-sdk.ui: false`
- (Optional): Apps from AFC SDK can also be copied over into the project at `/app` for further adjustments:
  - Terminal: `afc add app`
You can scale the application by adding a Redis cache to distribute workload across application instances:
Add Redis to the project (already part of Bootstrap):
- Terminal: `cds add redis`
Redis is used by the `event-queue`, `websocket` and `feature-toggle-library` modules to process events, distribute websocket messages, and store and distribute feature toggle values.
The Feature Toggle Library is used to control the execution of the Event Queue. It exposes endpoints to manage feature toggles.
- `GET /rest/feature/state()`: Read current feature toggle state
- `POST /rest/feature/redisUpdate`: Update feature toggle state

See `.http` files in /http/toggles to call feature toggle endpoints.

An internal OAuth token needs to be fetched via `/http/auth/uaa.internal.cloud.http` providing credentials from the XSUAA instance, or via calling:

- Terminal: `afc api key -i`
The project can be enabled for multitenancy by following the guide: https://cap.cloud.sap/docs/guides/multitenancy/#enable-multitenancy
The MTX Tool is used to manage the application lifecycle. It can be used to manage the application in Cloud Foundry. Details can be found at https://github.com/cap-js-community/mtx-tool.
This project is open to feature requests/suggestions, bug reports etc. via GitHub issues. Contribution and feedback are encouraged and always welcome. For more information about how to contribute, the project structure, as well as additional contribution information, see our Contribution Guidelines.
We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone. By participating in this project, you agree to abide by its Code of Conduct at all times.
Copyright 2025 SAP SE or an SAP affiliate company and sap-afc-sdk contributors. Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.