AWS for SAP
Simplify SAP Jobs Scheduling using AWS Native Tooling
Introduction
Scheduling jobs is one of the routine operational tasks for customers running SAP workloads. SAP customers often use transaction SM37 to run batch jobs within their SAP systems, and when they need complex job scheduling with dependencies across multiple SAP systems, they typically use a third-party batch scheduling tool. This becomes a significant challenge when that scheduling tool is down or otherwise unavailable and jobs do not run. The impact of missed jobs is sometimes isolated to one system; however, if multiple jobs have not run, recovery can be expensive and time-consuming.
This blog walks you through how SAP batch jobs can be scheduled and triggered with AWS native services, using a job scheduling solution called "Simple Scheduler".
Pre-Requisites
This blog assumes that you are familiar with SAP job scheduling and its concepts. The only pre-requisite is that the SAP systems run on AWS, or that a direct network path is available to connect to them.
Scope
AWS SAP Professional Services has developed a cloud-native job scheduling solution called "Simple Scheduler" to simplify the routine operational task of scheduling jobs, without maintaining additional infrastructure and software to manage those SAP jobs. Customers can use Simple Scheduler to schedule SAP jobs using AWS serverless services. Simple Scheduler runs the jobs at the defined intervals and sends out notifications when jobs complete or fail.
Solution Overview
Simple Scheduler is built using Amazon DynamoDB, Amazon S3, Amazon API Gateway, AWS Lambda, and AWS Step Functions. The majority of the AWS services used in this solution are serverless, which means there is no infrastructure to maintain. Another feature of Simple Scheduler is the ability to manage dependent batch jobs that need to run in sequence, for example, running Job 2 once Job 1 completes.
- The job is scheduled via a front end developed by AWS Professional Services, which is hosted in an Amazon S3 bucket.
- The related job parameters and the SAP system information are entered in the front end and stored in Amazon DynamoDB.
- A time-based Amazon CloudWatch Event Rule triggers the job at the scheduled time.
- The orchestration of the jobs is performed using AWS Step Functions.
- The Amazon CloudWatch Event triggers an AWS Step Functions execution, which invokes a sequence of AWS Lambda functions to read the job parameters from Amazon DynamoDB.
- The tool retrieves the SAP credentials from AWS Secrets Manager and then calls an AWS Lambda function, which connects to SAP and triggers the job in SM37 with the configured variant.
- The AWS Lambda function connects to SAP using node-RFC and starts the job.
- Some jobs have dependencies wherein Job X needs to be executed first, followed by Job Y. The flow of these conditions is maintained in the AWS Step Functions state machine.
- The dependency/sequencing of jobs is defined using the input parameters in the front end and stored in DynamoDB. When the time-based event fires, a Lambda function reads the job definitions and starts the Step Functions execution with the job sequence included as input (see the sketch after this list).
- On successful completion or failure of the job, the user can be notified via email using Amazon SNS, if required (see the notification sketch after this list).
- Simple Scheduler can orchestrate jobs across different SAP systems such as S/4HANA, ECC, SCM, and BW.
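To make the flow above concrete, here is a minimal sketch of what the time-triggered Lambda function could look like, assuming a DynamoDB table that holds the job definitions and a Step Functions state machine that runs them. The table name, key schema, item attributes, and environment variables are placeholders for illustration; the actual Simple Scheduler implementation may structure these differently.

```typescript
// Hypothetical trigger Lambda: reads a job definition from DynamoDB and starts
// the Step Functions execution with the job sequence as input.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";
import { SFNClient, StartExecutionCommand } from "@aws-sdk/client-sfn";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const sfn = new SFNClient({});

// The table name and state machine ARN below are illustrative placeholders.
const TABLE_NAME = process.env.JOB_TABLE ?? "simple-scheduler-jobs";
const STATE_MACHINE_ARN = process.env.STATE_MACHINE_ARN ?? "";

export const handler = async (event: { jobId: string }) => {
  // Read the job definition (SAP system, program, variant, dependent jobs)
  // that was captured through the front end and stored in DynamoDB.
  const { Item: jobDefinition } = await ddb.send(
    new GetCommand({ TableName: TABLE_NAME, Key: { jobId: event.jobId } })
  );
  if (!jobDefinition) {
    throw new Error(`No job definition found for ${event.jobId}`);
  }

  // Start the state machine, passing the job sequence so that dependent
  // jobs (Job 2 after Job 1, and so on) are executed in order.
  await sfn.send(
    new StartExecutionCommand({
      stateMachineArn: STATE_MACHINE_ARN,
      input: JSON.stringify({
        jobId: event.jobId,
        sapSystem: jobDefinition.sapSystem,
        jobSequence: jobDefinition.jobSequence ?? [],
      }),
    })
  );
};
```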
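Similarly, the email notification mentioned above could be sent by a small Lambda step that publishes to an Amazon SNS topic with email subscribers; the topic ARN and message fields below are placeholders.

```typescript
// Hypothetical notification step: publishes the job outcome to an SNS topic
// whose email subscribers are the users who asked to be notified.
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});

export const notify = async (jobName: string, status: "FINISHED" | "ABORTED") => {
  await sns.send(
    new PublishCommand({
      TopicArn: process.env.NOTIFICATION_TOPIC_ARN, // illustrative placeholder
      Subject: `SAP job ${jobName}: ${status}`,
      Message: `The SAP batch job ${jobName} ended with status ${status}.`,
    })
  );
};
```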
Solution in Detail
- As the first step, the user enters the job details in the front end, as shown in the screenshot below.
- The screenshots below show the end state of the solution, using AWS Step Functions for the batch jobs.
- Here is an example of a Step Functions workflow for a job without dependencies (a minimal sketch of a comparable state machine definition follows this list).
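To relate the screenshot to an actual state machine, the following is a minimal sketch of what a comparable definition could look like, assuming one Lambda function that starts the job and another that checks its status every 300 seconds (the polling interval used in the cost estimate later in this post). The state names, function ARNs, and status values are assumptions for illustration.

```typescript
// Minimal sketch of an Amazon States Language definition for a single SAP job,
// expressed as a TypeScript object literal (illustrative, not the actual
// Simple Scheduler definition). It could be serialized with JSON.stringify()
// and deployed as a Step Functions state machine.
const singleJobStateMachine = {
  Comment: "Run one SAP job and poll its status every 300 seconds",
  StartAt: "Execute SAP Job",
  States: {
    "Execute SAP Job": {
      Type: "Task",
      // Hypothetical Lambda function ARNs for illustration only.
      Resource: "arn:aws:lambda:us-east-1:123456789012:function:execute-sap-job",
      Next: "Wait",
    },
    Wait: {
      Type: "Wait",
      Seconds: 300, // matches the polling interval assumed in the cost estimate
      Next: "Check Job Status",
    },
    "Check Job Status": {
      Type: "Task",
      Resource: "arn:aws:lambda:us-east-1:123456789012:function:check-sap-job-status",
      Next: "Job Finished?",
    },
    "Job Finished?": {
      Type: "Choice",
      Choices: [
        { Variable: "$.jobStatus", StringEquals: "FINISHED", Next: "Notify Outcome" },
        { Variable: "$.jobStatus", StringEquals: "ABORTED", Next: "Notify Outcome" },
      ],
      Default: "Wait", // keep polling until the job reaches a final status
    },
    "Notify Outcome": {
      Type: "Task",
      Resource: "arn:aws:lambda:us-east-1:123456789012:function:notify-job-outcome",
      End: true,
    },
  },
};

export default singleJobStateMachine;
```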
For the Simple Scheduler, we developed a custom AWS Lambda function that connects to SAP and executes BAPIs (Business Application Programming Interfaces) to start the job within SAP (the Execute SAP Job step in the screenshot above). Another Lambda function then checks the status of the job. When the SAP job has completed and there are no dependent jobs, the Step Functions execution reaches its end state. If there are dependent jobs, the Step Function runs another iteration of the "Execute SAP Job" step until all dependent jobs are processed.
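As an illustration, the sketch below shows how such a Lambda function might fetch the SAP credentials from AWS Secrets Manager and use node-RFC to start the job. This post does not prescribe specific BAPIs; the XBP calls (BAPI_XMI_LOGON, BAPI_XBP_JOB_START_IMMEDIATELY), the secret name, and the payload fields shown here are assumptions, and the exact interface depends on how the jobs are defined in your SAP systems.

```typescript
// Hypothetical "Execute SAP Job" Lambda: fetches credentials from Secrets Manager,
// opens an RFC connection with node-rfc, and starts a pre-defined SM37 job.
import { SecretsManagerClient, GetSecretValueCommand } from "@aws-sdk/client-secrets-manager";
import { Client } from "node-rfc";

const secrets = new SecretsManagerClient({});

export const handler = async (event: { jobName: string; jobCount: string }) => {
  // Secret name and structure are illustrative placeholders.
  const secret = await secrets.send(
    new GetSecretValueCommand({ SecretId: "simple-scheduler/sap-credentials" })
  );
  const sap = JSON.parse(secret.SecretString ?? "{}");

  const client = new Client({
    user: sap.user,
    passwd: sap.password,
    ashost: sap.host,
    sysnr: sap.sysnr,
    client: sap.client,
  });
  await client.open();

  try {
    // Log on to the XBP external job scheduling interface before calling XBP BAPIs.
    await client.call("BAPI_XMI_LOGON", {
      EXTCOMPANY: "AWS",
      EXTPRODUCT: "SIMPLE_SCHEDULER",
      INTERFACE: "XBP",
      VERSION: "3.0",
    });

    // Start the previously defined job (identified by name and job count) immediately.
    await client.call("BAPI_XBP_JOB_START_IMMEDIATELY", {
      JOBNAME: event.jobName,
      JOBCOUNT: event.jobCount,
      EXTERNAL_USER_NAME: sap.user,
    });

    // A separate "check status" Lambda could poll BAPI_XBP_JOB_STATUS_GET with the
    // same job name and count until the job reaches a final status.
    return { jobName: event.jobName, jobCount: event.jobCount, started: true };
  } finally {
    await client.close();
  }
};
```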
- Below is an example of a job with dependencies (a sample execution input for such a chain is sketched below).
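The execution input for such a dependent chain could look like the following sketch; the attribute names, job names, and system identifiers are illustrative only.

```typescript
// Hypothetical Step Functions execution input for a dependent job chain:
// the second job is started only after the first one finishes successfully.
const executionInput = {
  jobSequence: [
    { jobName: "Z_EXTRACT_SALES_ORDERS", variant: "DAILY", sapSystem: "ECC_PRD" },
    { jobName: "Z_LOAD_BW_SALES_CUBE", variant: "DAILY", sapSystem: "BW_PRD" },
  ],
};

export default executionInput;
```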
Cost Estimates
Simple Scheduler is built using AWS services including AWS Step Functions, AWS Lambda, and Amazon DynamoDB. The AWS Free Tier includes 4,000 free AWS Step Functions state transitions and 1 million free AWS Lambda requests per month.
For cost estimation purposes, let's assume the following:
- Number of jobs scheduled and executed: 1,000 per month
- Region: us-east-1 (N. Virginia)
- Each job runs for 30 minutes
- Job completion status is evaluated every 300 seconds
AWS Step Functions
| Number of Jobs | State Transitions (ST) | Estimated Cost |
|---|---|---|
| 1,000 | 32 × 1,000 = 32,000 | 32,000 × $0.000025 = $0.80 |
AWS Lambda
| Parameter | Value |
|---|---|
| Number of jobs per month | 1,000 |
| Lambda functions per job | 3 |
| Functions executed for 1,000 jobs (R) | 3 × 1,000 = 3,000 |
| Memory allocated per call (M) | 128 MB |
| Estimated duration per call (D) | 10,000 ms (10 seconds) |
| Total compute in seconds (S) | R × D = 3,000 × 10 = 30,000 seconds |
| Total compute in GB-s (G) | S × M / 1,024 = 30,000 × 0.125 = 3,750 GB-s |
| Monthly compute charges | G × $0.00001667 ≈ $0.06 |
Amazon DynamoDB
With on-demand capacity, the Amazon DynamoDB read and write requests generated by 1,000 jobs per month typically amount to only a few cents. Based on the assumptions above, the total estimated monthly cost is roughly $1, or approximately $12 per year. Please refer to the following pricing pages for each of the services discussed above, as AWS routinely implements price reductions:
AWS Lambda Pricing
AWS Step Functions Pricing
Amazon DynamoDB Pricing
Conclusion
This offering serves as a turnkey solution, delivered as native infrastructure as code, and as an accelerator for building highly customized solutions for customer-specific requirements.
With Simple Scheduler, you only pay for the AWS services used to schedule the jobs, helping to reduce the operational, licensing, and infrastructure costs of third-party tools. You can run SAP batch jobs using an AWS cloud-native solution without depending on a third-party job scheduling tool.
Note: This is not an SAP-certified solution but does demonstrate how SAP batch jobs can be scheduled using AWS cloud-native tools.
The solution can be customized and extended to orchestrate non-SAP jobs as well, making it a single, inexpensive tool for operating jobs across your environment. If you would like a better understanding of how the Simple Scheduler solution works, please connect with us via this link.