In this article, we cover details of how Zato SQL connection pools can be configured to take advantage of features and options specific to a particular driver or to the SQLAlchemy library.

SQL connection pools

First, let's review the basic Zato Dashboard form that creates a new SQL connection pool.

Zato Dashboard SQL connections menu

Zato Dashboard SQL connection pool form

Above, we find options that are common to all the supported databases:

  • Name of the connection as it is referenced in your Python code
  • Whether the connection is active or not
  • How big the pool should be
  • What kind of database it is (here, Oracle DB)
  • Where the database is located - host and port
  • What is the name of the database
  • The username to connect with (password is changed using a separate form)

More options

The basic form covers the most commonly required options but there is more to it, and that comes in two flavours:

  • Options specific to the driver library for a given database
  • Options specific to SQLAlchemy, which is the underlying toolkit used for SQL connections in Zato

As to how to declare them:

  • Options from the first group are specified using a query string appended to the database's name, e.g. mydb?option=value&another_option=another_value.

  • Options from the second group go to the Extra field in the form.

We will cover both of them using a few practical examples.

Specifying encoding in MySQL connections

When connecting to MySQL, there may arise a need to be explicit about the character set to use when issuing queries. This is a driver-level setting so it is configured by adding a query string to the database's name, such as mydb?charset=utf8.

Zato Dashboard Setting MySQL encoding

Using a service name when connecting to Oracle DB

Oracle DB connections will at times require a service name alone to connect to, i.e. without a database name. This is also a driver-level option but this time around the query string is the sole element that is needed, for instance: ?service_name=MYDB.

Zato Dashboard Setting Oracle DB service name

Echoing all SQL queries to server logs

It is often convenient to be able to quickly check exactly what queries are being sent to the database - this time, it is an SQLAlchemy-level setting, which means that it goes into the Extra field.

Each entry in Extra is a key=value pair, e.g. echo=True. If more than one is needed, each entry goes on its own line.
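For instance, an Extra field that enables query echoing and raises the connection overflow limit would contain these two lines (max_overflow is another keyword argument that SQLAlchemy's create_engine accepts):

```
echo=True
max_overflow=10
```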

Zato Dashboard Setting the echo option

Knowing which options can be set

At this point, you may wonder how to learn which options each driver supports and what can go to the Extra field.

The driver-specific options will be in that particular library's documentation - each one will have a class called Connection whose __init__ method will contain all such arguments. This is what the query string is built from.

As for the Extra field - it accepts all the keyword arguments that SQLAlchemy's sqlalchemy.create_engine function accepts; in addition to echo, these may include max_overflow, isolation_level and others.
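To make the mapping concrete, here is a minimal, standalone sketch of the idea - this is not Zato's actual implementation, but it shows how key=value lines from the Extra field could become keyword arguments for sqlalchemy.create_engine:

```python
def parse_extra(extra_text):
    """ Turn key=value lines from the Extra field into a dict of keyword
    arguments, converting well-known literals such as True and False.
    """
    literals = {'True': True, 'False': False}
    kwargs = {}
    for line in extra_text.strip().splitlines():
        key, _, value = line.strip().partition('=')
        kwargs[key] = literals.get(value, value)
    return kwargs

# The resulting dict could then be passed on as
# sqlalchemy.create_engine(url, **kwargs)
print(parse_extra('echo=True\nmax_overflow=10'))
# {'echo': True, 'max_overflow': '10'}
```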

And that sums up this quick how-to - now, you can configure more advanced SQL options that are specific either to each driver or to SQLAlchemy as such.

Next steps

  • Start the tutorial to learn more technical details about Zato, including its architecture, installation and usage. After completing it, you will have a multi-protocol service representing a sample scenario often seen in banking systems with several applications cooperating to provide a single and consistent API to its callers.

  • Visit the support page if you would like to discuss anything about Zato with its creators

  • To learn more about Zato and API integrations in Spanish, click here

Today, we are looking at how environment variables can make the configuration of your Zato-based API services reusable across environments - this helps you centralise all of your configuration artefacts without the need for changes when code is promoted from one environment to another.

Making code and configuration reusable

Let's say we have three environments - development, test and production - and each of them uses a different address for an external endpoint. The endpoint's functionality is the same but the provider of this API also has several environments matching yours.

What we want to achieve is twofold:

  • Ensuring that the code does not need changes when you deploy it in each of your environments
  • Making configuration artefacts, such as files with details of where the external APIs are, independent of a particular environment that executes your code

In this article, we will see how it can be done.

Python code

In the snippet below, we have an outgoing REST connection called CRM which the Python code uses to access an external system.

Zato knows that CRM maps to a specific set of parameters, like host, port, URL path or security credentials to invoke the remote endpoint with.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class GetUserDetails(Service):
    """ Returns details of a user by the person's name.
    """
    name = 'api.user.get-details'

    def handle(self):

        # Name of the connection to use
        conn_name = 'CRM'

        # Get a connection object
        conn = self.out.rest[conn_name].conn

        # Create a request
        request = {
            "user_id": 123
        }

        # Invoke the remote endpoint
        response = conn.get(self.cid, request)

        # Skip the rest of the implementation
        pass

The above fulfils the first requirement - our code only ever uses the name CRM and never has to consider what actually lies behind it, what CRM in reality points to.

All it cares about is that there is some CRM that gives it the required user data - if the connection details to the CRM change, the code will continue to work uninterrupted.

This is already good but we still have the other requirement - that CRM point to a different external endpoint depending on which of your environments the code currently runs in.

This is where environment variables come into play. They let you move the details of a configuration to another layer - instead of keeping them along with your code, they can be specified for each execution environment separately. In this way, neither code nor configuration need any modifications - only environment variables change.
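The idea can be pictured with a short, self-contained sketch - again, this is only an illustration of the concept of resolving $-prefixed values, not Zato's actual code:

```python
import os

def resolve_value(value):
    """ Return the configured value, reading it from the environment
    if it is prefixed with a dollar sign.
    """
    if isinstance(value, str) and value.startswith('$'):
        return os.environ[value[1:]]
    return value

# With CRM_ADDRESS set in the environment ..
os.environ['CRM_ADDRESS'] = 'https://example.net'

# .. a $-prefixed value resolves to the variable's contents
print(resolve_value('$CRM_ADDRESS'))  # https://example.net

# .. while a regular value is returned as-is
print(resolve_value('/api'))  # /api
```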

Zato Dashboard, enmasse CLI and Docker

When Dashboard is used for configuration, enter environment variables prefixed with a dollar sign "$" instead of an actual value. Zato will understand that the real value is to be read from the server's environment.

For instance, if the CRM is available under the address of https://example.net, create a variable called CRM_ADDRESS with the value of "https://example.net" and store it in your ~/.bashrc or a similar file. Source this file or log in to a new Linux shell, and then log in to Dashboard to fill out the corresponding form as here.

Zato Dashboard

Zato Dashboard

Once you click OK, restart your Zato servers for the changes to take effect. From now on, each time the CRM connection is used by any service, its address will be read from $CRM_ADDRESS.

In this way, you can assign different values to as many environment variables as needed and Zato will read them when the server starts or when you edit a given connection's definition in Dashboard or via enmasse.

If you use enmasse for DevOps automation, the same can be done in your YAML configuration files, for instance:

  - connection: outgoing
    name: CRM
    address: "$CRM_ADDRESS"
    url_path: /api

Note that the same can be used if you start Zato servers in Docker containers.

For instance, in one environment you can start it as such:

sudo docker run   \
  --env CRM_ADDRESS=https://example.net \
  --publish 17010:17010 \
  --detach \
  mycompany/myrepo

In another environment, the address may be different:

sudo docker run   \
  --env CRM_ADDRESS=https://example.com \
  --publish 17010:17010 \
  --detach \
  mycompany/myrepo

Finally, keep in mind that environment variables can be used in place of any string or integer value, no matter the connection type. This means that parameters such as pool sizes, usernames or timeouts can always be specified using variables specific to each environment, without the need to maintain separate config files per environment.
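For instance, extending the earlier enmasse YAML, a connection's pool size could be read from its own variable - note that both the CRM_POOL_SIZE variable and the exact pool_size key are illustrative here, serving only to show the pattern:

```yaml
  - connection: outgoing
    name: CRM
    address: "$CRM_ADDRESS"
    url_path: /api
    pool_size: "$CRM_POOL_SIZE"
```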


Using Zato, it is easy to make IBM MQ queues available to Python applications - this article will lead you step-by-step through the process of setting up the Python integration platform to listen for MQ messages and to send them to MQ queue managers.

Zato and IBM MQ integrations

Prerequisites

  • First, install Zato - pick the system of your choice and follow the installation instructions
  • In the same operating system where Zato runs, install an IBM MQ Client matching the version of the remote queue managers that you will be connecting to - an MQ Client is a redistributable package that lets applications, such as Zato, connect to IBM MQ; if your queue manager is at version 9, you also need an MQ Client v9.

Further steps will assume that Zato and MQ Client are installed.

Configuring Zato

  • Install PyMQI - this is a low-level package that Zato uses for connections to IBM MQ. You can install it using pip - for instance, assuming that the 'zato' command is in /opt/zato/current/bin/zato, pip can be used as below. Note that we are using the same pip version that Zato ships with, not the system one.

    $ cd /opt/zato/current/bin
    $ ./pip install pymqi
    
  • Now, we need to enable IBM MQ connections in your Zato server - this needs to be done in a file called server.conf, e.g. assuming that your server is in the /opt/zato/env/dev/server1 directory, the file will be in /opt/zato/env/dev/server1/config/repo/server.conf

  • Open the file and locate the [component_enabled] stanza

  • Make sure that there is an entry reading "ibm_mq=True" in the stanza (by default it is False)

  • Save and close the file

  • If the server was running while you were editing the file, use 'zato stop' to stop it

  • Start the server with 'zato start'

  • If you have more than one Zato server, all the steps need to be repeated for each one
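Putting the steps above together, after the edit the relevant fragment of server.conf should read as below - any other entries in the stanza are left unchanged:

```ini
[component_enabled]
ibm_mq=True
```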

Understanding definitions, channels and outgoing connections

Most Zato connection types, including IBM MQ ones, are divided into two broad classes:

  • Channels - for messages sent to Zato from external applications and data sources
  • Outgoing connections - for messages sent from Zato to external applications and data sources

Moreover, certain connection types - including IBM MQ - make use of connection definitions which are reusable pieces of configuration that can be applied to other parts of the configuration.

For instance, IBM MQ credentials are used by both channels and outgoing connections so they can be defined once, in a definition, and reused in many other places.

Note that, if you are familiar with IBM MQ, you may already know what an MQ channel is - the term is the same but the concept does not map 1:1, because in Zato a channel always relates to incoming messages, never to outgoing.

Let's configure Zato using its web-admin dashboard. We shall assume that your queue manager's configuration is the following:

  • Queue manager: QM1
  • Host: localhost
  • Port: 1414
  • Channel: DEV.APP.SVRCONN
  • Username: app
  • Queue 1: DEV.QUEUE.1 (for messages to Zato)
  • Queue 2: DEV.QUEUE.2 (for messages from Zato)

But first, we will need some Python code.

Python API services

Let's deploy this module with two sample Zato services that will handle messages from and to IBM MQ.

You will note two aspects:

  • A Zato channel service is invoked each time a new message arrives in the queue that the channel listens on - there is no MQ programming involved. Note that you can mount the same service on multiple Zato channels - this means that the service is reusable and a single one can wait for messages from multiple queues simultaneously.

  • The other service is a producer - it uses an outgoing connection to put messages on MQ queues. Again, you just invoke a method and Zato sends your message, there is no low-level MQ programming here. Just like with channels, it can be used for communication with multiple queues at a time.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class APIMQChannel(Service):
    """ Receives messages from IBM MQ queues.
    """
    name = 'api.mq.channel'

    def handle(self):

        # Our handle method is invoked for each message taken off a queue
        # and we can access the message as below.

        # Here is the business data received
        data = self.request.ibm_mq.data

        # Here is how you can access lower-level details,
        # such as MQMD or CorrelId.
        mqmd = self.request.ibm_mq.mqmd
        correl_id = self.request.ibm_mq.correlation_id

        # Let's log the message and some of its details
        self.logger.info('Data: %s', data)
        self.logger.info('Sent by: %s', mqmd.PutApplName)
        self.logger.info('CorrelId: %s', correl_id)

class APIMQProducer(Service):
    """ Sends messages to IBM MQ queues.
    """
    name = 'api.mq.producer'

    def handle(self):

        # Message to send as received on input,
        # without any transformations or deserialisation,
        # hence it is considered 'raw'.
        msg = self.request.raw_request

        # Outgoing connection to use
        conn = 'My MQ Connection'

        # Queue to send the message to
        queue = 'DEV.QUEUE.2'

        # Send the message
        self.outgoing.ibm_mq.send(msg, conn, queue)

        # And that's it, the message is already sent!

Connection definition

In web-admin, go to Connections -> Definitions -> IBM MQ and fill out the form as below. Afterwards, make sure to change your user's password by clicking Change password for the connection definition you have just created.

Zato web-admin connection definitions menu

Zato web-admin connection definitions creation form

Channel

Let's create a new Zato channel to receive messages sent from IBM MQ. In web-admin, create it via Connections -> Channels -> IBM MQ. In the Service field, use the channel service deployed earlier.

Zato web-admin channel creation form

Outgoing connections

Zato web-admin outgoing connection creation form

Testing the channel

We have configured everything as far as Zato goes and we can try it out now - let's start with channels. We can use IBM's MQ Explorer to put a message on a queue:

MQ Explorer message received

As expected, here is an entry from the Zato server log confirming that it received the message:

INFO - Data: This is a test message
INFO - Sent by: b'MQ Explorer 9.1.5           '
INFO - CorrelId: b'\x00\x00\x00\x00\x00\x00\x00\x00\x00...'

Testing the outgoing connection

The next step is to send a message from Zato to IBM MQ. We already have our producer deployed and we need a way to invoke it.

This means that the producer service itself needs a channel - for instance, if you want to make it available to REST clients, head over to these articles for more information about using REST channels in Zato.

For the purposes of this guide, though, it will suffice if we invoke our service from web-admin. To that end, navigate to Services -> Find "api.mq.producer" -> Click its name -> Invoker, and a form will show.

Enter any test data and click Submit - data format and transport can be left empty.

Zato web-admin message invoker

The message will go to your service and Zato will deliver it to the queue manager that "My MQ Connection" uses. We confirm it using MQ Explorer again:

MQ Explorer message received

Sending messages from the dashboard

At this point, everything is already configured but we can still go one better. It is often useful to be able to send test messages to queue managers directly from servers, without any service, which is exactly what can be done from an outgoing connection's definition page, as in these screenshots:

Zato web-admin send IBM MQ message listing

Zato web-admin send IBM MQ message form

Note that the message is sent from a Zato server, not from the dashboard - the latter is just a GUI that delivers your message to the server. This means that it is a genuine test of connectivity from your servers to remote queue managers.

Log files

Finally, it is worth keeping in mind that there are two server log files with details pertaining to communication with IBM MQ:

  • server.log - general messages, including entries related to IBM MQ
  • ibm-mq.log - low-level details about communication with queue managers

With that, we conclude this blog post - everything is set up, tested and you are ready to integrate with IBM MQ in your Python projects now!


This Zato article is a companion to an earlier post - previously, we covered accepting REST API calls and now we will look at how Zato services can invoke external REST endpoints.

Outgoing connections

Similar to how channels are responsible for granting access to your services via REST or other communication means, it is outgoing connections (outconns for short) that let services access resources external to Zato, including REST APIs.

Here is a sample definition of a REST outgoing connection:

Zato web-admin outgoing connections

Zato web-admin creating a REST outgoing connection

The Python implementation will follow soon but, for now, let's observe that keeping the two separate has a couple of prominent advantages:

  • The same outgoing connection can be used by multiple services

  • The configuration is maintained in one place only - any change is immediately reflected on all servers and services can make use of the new configuration without any interruptions

Most of the options of REST outconns have the same meaning as with channels but TLS CA certs may require particular attention. This option dictates how the remote end's TLS certificate is checked when a REST endpoint is invoked using HTTPS rather than HTTP.

The option can be one of:

  • Default bundle - a built-in bundle of CA certificates will be used for validation. This is the same bundle that Mozilla uses and is a good choice if the API you are invoking is a well-known, public one with endpoints signed by one of the public certificate authorities.
  • If you upload your own CA certificates, they can be used for validation of external REST APIs - for instance, your company or a business partner may have their own internal CAs
  • Skip validation - no validation will be performed at all, any TLS certificate will be accepted, including self-signed ones. Usually, this option should not be used for non-development purposes.

Python code

A sample service making use of the outgoing connection is below.

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class GetUserDetails(Service):
    """ Returns details of a user by the person's name.
    """
    name = 'api.user.get-details'

    def handle(self):

        # In practice, this would be an input parameter
        user_name = 'john.doe'

        # Name of the connection to use
        conn_name = 'My REST Endpoint'

        # Get a connection object
        conn = self.out.rest[conn_name].conn

        # A single dictionary with all the parameters,
        # Zato will know where each one goes to.
        params = {
            'name': user_name,
            'app_id': 'Zato',
            'app_version': '3.2'
        }

        # We are responsible for catching exceptions
        try:

            # Invoke the endpoint
            response = conn.get(self.cid, params)

            # Log the response
            self.logger.info('Response `%s`', response.data)

        # We caught an exception - log its details
        except Exception as e:
            self.logger.warn('Endpoint could not be invoked due to `%s`', e.args[0])

First, we obtain a connection handle, then the endpoint is invoked and a response is processed, which in this case means merely outputting its contents to server logs.

In the example, we use a constant user name and query string but in practice, they would likely be produced based on user input.

Note the 'params' dictionary - when you invoke this service, Zato will know that 'name' should go to the URL path but all the remaining parameters will go to the request's query string. Again, this gives you additional flexibility, e.g. if the endpoint's URL changes from path parameters to query string, the service will continue to work without any changes.
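The splitting can be sketched in a few lines - assuming the URL path uses {name}-style placeholders, the logic is conceptually similar to this (not Zato's actual implementation):

```python
import string

def split_params(url_path, params):
    """ Split a single dict of parameters into URL path parameters and
    query-string parameters, based on the placeholders in the path.
    """
    path_names = {name for _, name, _, _ in string.Formatter().parse(url_path) if name}
    path_params = {key: value for key, value in params.items() if key in path_names}
    query_params = {key: value for key, value in params.items() if key not in path_names}
    return path_params, query_params

# With a URL path of /user/{name}, 'name' goes to the path
# and all the remaining parameters form the query string
path, query = split_params('/user/{name}', {'name': 'john.doe', 'app_id': 'Zato', 'app_version': '3.2'})
print(path)   # {'name': 'john.doe'}
print(query)  # {'app_id': 'Zato', 'app_version': '3.2'}
```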

Observe, too, that we are responsible for catching and handling potential exceptions arising from invoking REST endpoints.

Finally - because the outgoing connection's data format is JSON, you are not required to de-/serialise it yourself, the contents of 'response.data' is already a Python dict read from a JSON response.

At this point, the service is ready to be invoked - let's say, through REST, AMQP or from the scheduler - and when you do it, here is the output that will be seen in server logs:

INFO - Response `{'user_id': 123, 'username': 'john.doe', 'display_name': 'John Doe'}`

Now, you can extend it to invoke other systems, get data from an SQL database or integrate with other APIs.


This article is an excerpt from the broader set of changes to our documentation in preparation for Zato.

High-level overview

Zato and Python logo

Zato is a highly scalable, Python-based integration platform for APIs, SOA and microservices. It is used to connect distributed systems or data sources and to build API-first, backend applications. The platform is designed and built specifically with Python users in mind.

Zato is used for enterprise, business integrations, data science, IoT and other scenarios that require integrations of multiple systems.

Real-world, production Zato environments include:

  • A platform for processing payments from consumer devices

  • A system for a telecommunication operator integrating CRM, ERP, Billing and other systems as well as applications of the operator's external partners

  • A data science system for processing of information related to securities transactions (FIX)

  • A platform for public administration systems, helping achieve healthcare data interoperability through the integration of independent data sources, databases and health information exchanges (HIE)

  • A global IoT platform integrating medical devices

  • A platform to process events produced by early warning systems

  • Backend e-commerce systems managing multiple suppliers, marketplaces and process flows

  • B2B platforms to accept and process multi-channel orders in cooperation with backend ERP and CRM systems

  • Platforms integrating real-estate applications, collecting data from independent data sources to present unified APIs to internal and external applications

  • A system for the management of hardware resources of an enterprise cloud provider

  • Online auction sites

  • E-learning platforms

Zato offers connectors to all the popular technologies, such as REST, SOAP, AMQP, IBM MQ, SQL, Odoo, SAP, HL7, Redis, MongoDB, WebSockets, S3 and many more.

Running on premises, in the cloud, or under Docker, Kubernetes and other container technologies, Zato services are optimised for high performance - it is easily possible to run hundreds and thousands of services on typical server instances as offered by Amazon, Google Cloud, Azure or other cloud providers.

Zato servers offer high availability and no-downtime deployment. Servers form clusters that are used to scale systems both horizontally and vertically.

The software is 100% Open Source with commercial and community support available.

A platform and language for interesting, reusable and atomic services

Zato promotes the design of, and helps you build, solutions composed of services which are interesting, reusable and atomic (IRA):

  • I for Interesting - each service should make its clients want to use it more and more. People should immediately see the value of using the service in their processes. An interesting service is one that strikes everyone as immediately useful in wider contexts, preferably with few or no conditions, prerequisites and obligations. An interesting service is aesthetically pleasing, both in terms of its technical usage as well as in its potential applicability in fields broader than originally envisaged. If people check the service and say "I know, we will definitely use it" or "Why don't we use it" you know that the service is interesting. If they say "Oh no, not this one again" or "No, thanks, but no" then it is the opposite.
  • R for Reusable - services can be used in different, independent business processes
  • A for Atomic - each service fulfils a single, atomic business need

Each service is deployed independently and, as a whole, they constitute an implementation of business processes taking place in your company or organisation.

With Zato, developers use Python to focus on the business logic exclusively and the platform takes care of scalability, availability, communication protocols, messaging, security or routing. This lets developers concentrate only on what is the very core of systems integrations - making sure their services are IRA.

Python is the perfect choice for API integrations, SOA and microservices, because it hits the sweet spot under several key headings:

  • It is a very high-level language, with syntax close to the grammar of natural languages, which makes it easy to translate business requirements into implementation
  • Yet, it is a solid, mainstream and full-featured, real programming language rather than a domain-specific one, which means that it offers developers a great degree of flexibility and choice in expressing their needs
  • Many Python developers have a strong web programming / open source background, which means that it takes little effort to go a step further, towards API integrations and backend servers. In turn, this means that it is easy to find good people for API projects.
  • Many Python developers have knowledge of multiple programming languages - this is very useful in the context of integration projects where one is typically faced with dozens of technologies, vendors or integration methods and techniques
  • Lower maintenance costs - thanks to the language's unique design, Python programmers tend to produce code that is easy to read and understand. From the perspective of multi-year maintenance, reading and analysing code, rather than writing it, is what most programmers do most of the time so it makes sense to use a language which makes it easy to carry out the most common tasks.

In short, Python can be construed as executable pseudo-code, with many of its users already having roots in modern server-side programming. This makes Zato - a platform built in the language and designed for Python developers from day one - a natural choice for complex and sophisticated API solutions, both from a technical and a strategic perspective.

More than services

Systems integrations commonly require two more features that Zato offers as well:

  • File transfer - allows you to move batch data between locations and to distribute it among systems and APIs

  • Single Sign-On (SSO) - a convenient REST interface lets you easily provide authentication and authorisation to users across multiple systems
