An integral part of Zato, its scalable, service-oriented scheduler makes it possible to execute high-level API integration processes as background tasks. The scheduler runs periodic jobs which, in turn, trigger services, and services are what integrates systems.
In this article, we will see how to use the scheduler with its three kinds of jobs: one-time, interval-based and Cron-style ones.
What we want to achieve is a sample yet fairly common use-case: periodically download the newest data from a remote REST endpoint, send it as an e-mail attachment to selected recipients and store it in Redis for other systems to make use of.
Instead of, or in addition to, Redis or e-mail, we could use SQL and SMS, or MongoDB and AMQP, or anything else - Redis and e-mail are just example technologies, frequently found in data synchronisation processes, chosen here to highlight the workings of the scheduler.
No matter the input and output channels, the scheduler always works the same way - a job definition is created and the job's underlying service is invoked according to the schedule. It is then up to the service to perform all the actions required in a given integration process.
Our integration service will read as below:
# -*- coding: utf-8 -*-

# Zato
from zato.common.api import SMTPMessage
from zato.server.service import Service

class SyncData(Service):
    name = 'api.scheduler.sync'

    def handle(self):

        # Which REST outgoing connection to use
        rest_out_name = 'My Data Source'

        # Which SMTP connection to send an email through
        smtp_out_name = 'My SMTP'

        # Who the recipient of the email will be
        smtp_to = 'hello@example.com'

        # Who to put on CC
        smtp_cc = 'hello.cc@example.com'

        # Now, let's get the new data from a remote endpoint ..

        # .. get a REST connection by name ..
        rest_conn = self.out.plain_http[rest_out_name].conn

        # .. download newest data ..
        data = rest_conn.get(self.cid).text

        # .. construct a new e-mail message ..
        message = SMTPMessage()
        message.subject = 'New data'
        message.body = 'Check attached data'

        # .. add recipients ..
        message.to = smtp_to
        message.cc = smtp_cc

        # .. attach the new data to the message ..
        message.attach('my.data.txt', data)

        # .. get an SMTP connection by name ..
        smtp_conn = self.email.smtp[smtp_out_name].conn

        # .. send the e-mail message with newest data ..
        smtp_conn.send(message)

        # .. and now store the data in Redis.
        self.kvdb.conn.set('newest.data', data)
Now, we just need to make it run periodically in the background.
In the next steps, we will use the Zato Dashboard to configure new jobs for the scheduler.
Keep in mind that any date and time that you enter in web-admin is always interpreted to be in your web-admin user's timezone, and this applies to the scheduler too - by default, the timezone is UTC. You can change it by clicking Settings and picking the right timezone to make sure that the scheduled jobs run as expected.
It does not matter what timezone your Zato servers are in - they may be in different ones than the user that is configuring the jobs.
First, let's use web-admin to define the endpoints that the service uses. Note that Redis does not need an explicit declaration because it is always available under "self.kvdb" in each service.
Now, we can move on to the actual scheduler jobs.
To cover different integration needs, three types of jobs are available:
Select one-time if the job should not be repeated after it runs once.
Select interval-based if the job should be repeated periodically. Note that such a job will, by default, run indefinitely, but you can also specify after how many times it should stop, letting you express concepts such as "Execute once per hour but for the next seven days".
Select cron-style if you are already familiar with the syntax of Cron or if you have some Cron tasks that you would like to migrate to Zato.
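Jobs are typically created in the Dashboard but, if needed, the same can also be done programmatically by invoking the scheduler's admin API from another service. Below is a minimal sketch that defines an interval-based job for our service, assuming the zato.scheduler.job.create admin service and input field names modeled on what the Dashboard form exposes - treat them as an illustration rather than a complete reference.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class CreateSyncJob(Service):
    """ A hypothetical one-off service - defines an interval-based job
    that will run api.scheduler.sync every 30 minutes.
    """
    name = 'api.scheduler.sync.create-job'

    def handle(self):

        # The field names below mirror the Dashboard's job creation form
        # and are given for illustration only.
        request = {
            'cluster_id': self.server.cluster_id,
            'name': 'Sync newest data',
            'is_active': True,
            'job_type': 'interval_based',
            'service': 'api.scheduler.sync',
            'start_date': '2024-01-01 00:00:00',
            'minutes': 30, # Run every half an hour
        }

        # Invoke the scheduler's own admin service to create the job
        self.invoke('zato.scheduler.job.create', request)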
At times, it is convenient to run a job on demand, no matter what its schedule is and regardless of its type. Web-admin always lets you execute a job directly - simply find the job in the listing, click "Execute" and it will run immediately.
It is often useful to provide additional context data to a service that the scheduler runs - to achieve it, simply enter any arbitrary value in the "Extra" field when creating or editing a job in web-admin.
Afterwards, that information will be available as self.request.raw_request in the service's handle method.
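For instance, assuming a job was created with a hypothetical value of skip_cc=1 in its Extra field, a service could read and act on it as in the sketch below - the flag's name and meaning are made up purely for illustration, the Extra data is an arbitrary string that the service interprets as it sees fit.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class SyncDataWithExtra(Service):
    """ A minimal sketch - reads whatever was entered in the job's Extra field.
    """
    name = 'api.scheduler.sync.extra'

    def handle(self):

        # The Extra field arrives as an arbitrary string
        extra = self.request.raw_request

        # Log it for reference ..
        self.logger.info('Extra data received: %s', extra)

        # .. and act on it, e.g. skip the CC recipient if the
        # hypothetical flag below was provided in the job's definition.
        if extra == 'skip_cc=1':
            pass # Here goes the actual integration logic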
There is nothing else required - everything is in place and the service will run in accordance with the job's schedule.
Yet, before concluding, observe that our integration service is completely reusable - there is nothing scheduler-specific in it despite the fact that we currently run it from the scheduler.
We could now invoke the service from the command line. Or we could mount it on a REST, AMQP or WebSocket channel, or trigger it from any other channel - exactly the same Python code will run in exactly the same fashion, without any new programming effort needed.
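As an example, here is a minimal sketch of another, hypothetical service that triggers the synchronisation on demand by invoking our service by name - the calling service's name is made up for illustration, and the same api.scheduler.sync code runs unchanged.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class TriggerSync(Service):
    """ A hypothetical caller - runs the data synchronisation on demand.
    """
    name = 'api.scheduler.sync.trigger'

    def handle(self):

        # Invoke the very same service that the scheduler runs -
        # no changes to its code are needed.
        self.invoke('api.scheduler.sync')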
➤ Python API integration tutorial
➤ What is an integration platform?
➤ Python Integration platform as a Service (iPaaS)
➤ What is an Enterprise Service Bus (ESB)? What is SOA?
➤ Visit the support center for more articles and FAQ
➤ Open-source iPaaS in Python