Incorporating Prismatic Integrations Into Your Dev Processes

Taylor Reece, Developer Advocate
July 23, 2020 • 8 min read

In our previous two posts, we examined a use case for Prismatic, and dove into how to assemble an integration. Today I'd like to look at incorporating integration development with Prismatic into your existing software development processes, first discussing reasons why you should consider saving your integrations as YAML within source control, and then looking at how to codify the integration from the previous post into a single YAML file.

Why should I consider storing integrations as YAML?

Let's first take a step back and think about how we manage cloud infrastructure and server configuration today.

When I create a proof-of-concept for a new tech stack in AWS, GCP, or Azure, I like to futz with cloud resources using the cloud providers' respective web consoles. The visual representations of how my resources interconnect help me quickly debug and tweak infrastructure. When it comes time to replicate my tech stack for production, though, I definitely don't do it by hand. Replicating a complex environment for CI, QA, beta, staging, and prod would be a nightmare to build and maintain manually. So, I reach for something like CloudFormation or Terraform so I can readily cookie-cutter out environments.

The same holds true for configuring servers. I like to play around with installing packages and editing Apache or HAProxy configuration manually until I get my environment set up how I like it, but in the end I write up my configuration using Ansible playbooks or Chef recipes so I can replicate my work readily.

Assembling Prismatic integrations is no different. The integration designer is a powerful tool that allows you to assemble an integration from scratch. You can test your integration from within the designer as you build it and observe inputs and outputs of your steps as you go. It's a great way to prototype and debug a new integration. Once you're satisfied with your integration within the integration designer, you can even export it to a YAML file with a simple prism integrations:export command.

Screenshot of the Prismatic integration designer with components

Now, you could maintain your cloud environments by hand and configure every server you touch manually, and you can certainly maintain all of your integrations from within Prismatic's web app. But just as with infrastructure and server configuration definitions, saving your integration definitions in the same code repository as your core product gives you several advantages:

  1. It's saved in source control. The normal advantages of source control (having code reviews, feature branches, merge requests, etc.) now apply to your integrations. Your team can see how an integration has changed over time, and can read through commit messages to figure out what changed when, and why. This gives you the added benefit of keeping your integrations in lock-step with your APIs. If your APIs change, your integrations that consume them can be modified and shipped out at the same time.
  2. Easy to replicate and QA. If your integration is saved in source control, your QA team can easily import your new integration definition into a test QA tenant to verify that it works alongside any new code you've written.
  3. Fits into existing CI/CD pipelines. Your integration can be shipped automatically with minimal changes to your CI/CD pipeline. Just use Prismatic's CLI tool and add a simple prism integrations:import command to your build pipeline. You can build CI tests around your integrations with prism integrations:test to verify that the integration works as expected. When it comes time to deploy to production, simply have your production pipeline run prism integrations:import on your YAML file, and deploy instances of the integration to customers who need it. Deployments of your integration are entirely scriptable and testable (see the pipeline sketch after this list).
  4. Developers work efficiently in code. This sounds like an obvious thing to say, but remember back to the first time you watched an experienced developer wrangle a code base using Vim or Emacs. I remember being astounded at just how quickly a good developer can search through files, pull up dependencies, and make changes to multiple lines in multiple files with a few keyboard strokes. If you're supporting hundreds of Prismatic integrations, you'll similarly want to be able to check which integrations use a particular API endpoint or configuration variable, and that's made easy in most code editors if your integrations are saved as code.

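As promised in point 3, here's a minimal, hypothetical pipeline sketch. It assumes GitHub Actions and that the prism CLI installs from npm as @prismatic-io/prism; the file path, flag names, and authentication step are placeholders, so check prism's --help output and our docs for the exact invocations your pipeline needs:

name: Ship AcmeERP Fuel Integration

on:
  push:
    branches: [main]

jobs:
  import-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
      # The prism CLI ships as an npm package.
      - run: npm install --global @prismatic-io/prism
      # Authenticate the CLI here; the exact mechanism depends on your setup.
      # Import the integration YAML that lives alongside your product code.
      # The --path flag and file location are assumptions -- check `prism integrations:import --help`.
      - run: prism integrations:import --path integrations/acme-erp-fuel.yml
      # Verify the imported integration still behaves as expected.
      - run: prism integrations:test
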
Now that we've touched on why saving integrations as YAML is advantageous, let's look at how to assemble an integration in YAML.

Progix's ERP Integration as YAML

The Progix ERP integration, introduced previously, consisted of five pieces: a trigger and four action steps. Let's look at the entirety of the YAML code first (also available on GitHub), and then highlight specific features:

---
name: AcmeERP Fuel Integration
description: >
  After a rocket is launched, fuel data is sent to this integration via
  a trigger payload.  The payload is verified, fuel data is converted
  from pounds to gallons, and XML-formatted data is sent to the
  customer's AcmeERP instance.

requiredConfigVars:
  acmeErpEndpoint: https://api.acmeerp.com/
  secret: secret

trigger:
  name: trigger
  description: Expects a data payload and X-Progix-Signature header

steps:
  - name: verifysignature
    description: Verify that the X-Progix-Signature is valid
    action:
      key: verifySignature
      componentKey: progix-sig-check
    inputs:
      signature: outputs.trigger.all.headers."x-progix-signature"
      body: outputs.trigger.all.body
      secret: configVars.secret

  - name: compute_gallons_fuel
    description: Convert incoming fuel data from pounds to gallons
    action:
      key: runCode
      componentKey: code
    inputs:
      code: |
        'module.exports = async (context, params) => {
          const gallonsToPoundsConversion = {
            Hydrazine: 8.38,
            Kerosene: 6.68,
            Nitromethane: 9.49,
            O2: 9.52,
          };
          const fuelUsed = JSON.parse(params.trigger.all.body).fuelUsed;
          return {
            fuelGallonsUsed: fuelUsed.reduce((obj, item) => {
              return {
                ...obj,
                [item.type]: item.pounds / gallonsToPoundsConversion[item.type],
              };
            }, {}),
          };
        };'

  - name: convert_json_to_xml
    description: >
      Convert JSON data from the code component to the
      XML that AcmeERP expects.
    action:
      key: jsonToXml
      componentKey: reformat
    inputs:
      data: outputs.compute_gallons_fuel.all.fuelGallonsUsed

  - name: send_data_to_acmeerp
    description: >
      HTTP POST XML data to AcmeERP endpoint using OAuth 2.0
    action:
      key: httpPost
      componentKey: http
    inputs:
      url: 'configVars.acmeErpEndpoint & "/fuelUsed"'
      data: outputs.convert_json_to_xml.all
      responseType: text

Right away, notice that the file starts with name and description blocks, which are pretty straightforward. If you have done any AWS CloudFormation templating or Ansible playbook creation, YAML should be familiar to you. In YAML we can take advantage of multi-line strings using the block scalar indicators > and |, so long descriptions like this stay readable in the file but render as a single cohesive sentence in the Prismatic web app:

description: >
  After a rocket is launched, fuel data is sent to this integration via
  a trigger payload.  The payload is verified, fuel data is converted
  from pounds to gallons, and XML-formatted data is sent to the
  customer's AcmeERP instance.
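
As a quick YAML refresher (nothing Prismatic-specific here, and the keys below are just placeholders): the folded style > joins wrapped lines with spaces, while the literal style | preserves newlines, which is why it's used for the code input later in the file:

# Folded style: the two source lines render as one line of text.
summary: >
  This text is written on two lines
  but renders as a single sentence.

# Literal style: line breaks are preserved, which keeps embedded code intact.
snippet: |
  first line stays its own line
  second line stays its own line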

Next, we define our requiredConfigVars like we did within the web app. That's a simple key-value pairing of variable names and their default values (so acmeErpEndpoint defaults to https://api.acmeerp.com/, for example):

requiredConfigVars:
  acmeErpEndpoint: https://api.acmeerp.com/
  secret: secret

After that we have the integration's trigger. Our trigger is simple, and contains a name and optional description. By default a trigger creates a webhook, though you can instead configure the trigger to fire on a schedule using crontab notation - see our docs, Specifying a Scheduled Trigger in YAML:

trigger:
  name: trigger
  description: Expects a data payload and X-Progix-Signature header
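
The trigger above waits for a webhook. If you instead wanted the integration to fire on a schedule, the definition would look roughly like the hypothetical sketch below; the schedule field is an assumption on my part, so see the linked doc for the exact schema:

trigger:
  name: trigger
  description: Runs every five minutes instead of waiting for a webhook
  # Hypothetical field name -- consult "Specifying a Scheduled Trigger in YAML"
  # for the real schema. Standard crontab order: minute hour day-of-month month day-of-week.
  schedule: '*/5 * * * *'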

Finally, and most importantly, our steps block defines the series of steps that make up the integration. Each step contains a name and optional description. Then we declare what component action to invoke by listing a component key and action key, which we can get by referencing our component catalog or by running prism components:actions:list -x.

We then choose values for inputs that the action step takes (inputs are enumerated in our component catalog docs). Inputs can be strings or variables (like outputs or configuration variables). Steps can also contain outputs blocks, though we don't use them in this integration - check out our YAML quickstart for an example of how to use step outputs.

In this step, for example, we invoke the httpPost action against a URL that is defined by a configuration variable, and post the output data from the step named convert_json_to_xml to that endpoint:

- name: send_data_to_acmeerp
  description: >
    HTTP POST XML data to AcmeERP endpoint using OAuth 2.0
  action:
    key: httpPost
    componentKey: http
  inputs:
    url: 'configVars.acmeErpEndpoint & "/fuelUsed"'
    data: outputs.convert_json_to_xml.all
    responseType: text

We can also define code component steps using YAML multi-line syntax to pass in custom JavaScript code that handles vertical-specific business logic:

- name: compute_gallons_fuel
  description: Convert incoming fuel data from pounds to gallons
  action:
    key: runCode
    componentKey: code
  inputs:
    code: |
      'module.exports = async (context, params) => {
        const gallonsToPoundsConversion = {
          Hydrazine: 8.38,
          Kerosene: 6.68,
          Nitromethane: 9.49,
          O2: 9.52,
        };
        const fuelUsed = JSON.parse(params.trigger.all.body).fuelUsed;
        return {
          fuelGallonsUsed: fuelUsed.reduce((obj, item) => {
            return {
              ...obj,
              [item.type]: item.pounds / gallonsToPoundsConversion[item.type],
            };
          }, {}),
        };
      };'
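
To make the transformation concrete, here's a hypothetical trigger body and the value this step would return for it (the fuel weights are invented so the arithmetic comes out evenly):

# Hypothetical incoming trigger body
{"fuelUsed": [{"type": "Hydrazine", "pounds": 838}, {"type": "O2", "pounds": 952}]}

# Value returned by the step: pounds divided by pounds-per-gallon
{"fuelGallonsUsed": {"Hydrazine": 100, "O2": 100}}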

That's it! With just 72 lines of YAML (26% of which is whitespace or optional description text), we have a fully functional integration that we can import into our Prismatic account. From there, we can rapidly make changes to our code and then run prism integrations:import and prism integrations:test to verify that our changes work as expected.

Learn More

We've looked at the why and how of defining Prismatic integrations as YAML. For more information on writing your integrations as code, see our Defining Integrations as Code docs article, and the accompanying Writing an Integration in YAML with Trigger Payloads, Inputs, and Outputs quickstart. For any other questions, check out our docs or reach out - we'd love to hear from you!


About Prismatic

Prismatic is the dev-first integration platform for B2B software companies and the easiest way to build, deploy, and support integrations. A complete toolkit for the whole organization, Prismatic includes an integration designer, testing framework, customer deployment management, logging, monitoring, alerting, and an embeddable customer integration portal. Prismatic is a solution for the real world, designed to handle messy, complex integration scenarios and work with existing toolchains. Flexible and extensible, Prismatic empowers teams to tackle bespoke and vertical-specific integrations between applications of all kinds, SaaS or legacy, with or without a modern API, regardless of protocol or data format. Born out of its founders’ experience scaling a software company with hundreds of unique integrations, Prismatic aims to help teams spend less time on integrations and more time driving core product innovation. Learn more at prismatic.io.
