Stop doing that!

Some habits are hard to break. But sometimes it’s worth the effort to learn new, more efficient ways of doing things. So leave your preconceived ideas behind and let’s take a look at how you can improve the way you work and the solutions you implement using OL Connect Workflow.

Branches

Let’s start with what’s probably the single most irritating thing I see in countless Workflow configurations:

You’re probably looking at this and thinking “uhm… what’s wrong with that?!?“. Glad you asked! But before I show you the proper way of doing the exact same thing, we need to talk about branches.

As you know, a workflow process works on temporary data files. The file is initially picked up from its input location (be it a folder, an HTTP request, an email, etc.) and copied into a temporary working folder, under a temporary and unique name (the ubiquitous %f system variable). Now each branch in Workflow has properties that allow you to specify whether the process should retain the same file it had before entering the branch, or whether it should use the file that comes out of the branch. Both are perfectly valid use cases, mind you.
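
To make those temporary-file mechanics a bit more concrete, here’s a minimal Run Script (JScript) sketch that logs the job file %f points to, along with its size on disk – the very bytes that every default branch duplicates. It assumes Watch.GetJobFileName() returns the full path of the current job file and that a log level of 2 is appropriate; take it as an illustration, not gospel.

```javascript
// Run Script (JScript) sketch: %f names a real temporary file on disk,
// and every branch left at its default setting makes a full copy of it.
var fso  = new ActiveXObject("Scripting.FileSystemObject"); // standard Windows scripting object
var path = Watch.GetJobFileName();                          // assumed to return the job file's full path
var size = fso.GetFile(path).Size;                          // size in bytes duplicated by each default branch
Watch.Log("Job file " + Watch.ExpandString("%f") + " is " + size + " bytes", 2);
```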

By default, a Workflow branch is always set up to retain the file it had before entering the branch. Which means it needs to make a copy of that file. You see the problem now? Every time you enter a branch, by default, a temporary copy of the file is made. And in the above example, the branch is simply meant to copy the file to a destination folder, which could have been done just as well with the original file!

So the proper way to do this is simply to use the same Send to Folder task, but to insert it as an action instead of as an output:

Imagine you’re picking up a few thousand files and simply making a backup copy of those files before processing them. If your process simply uses a Send To Folder output task inside a branch, it actually needs to create double the number of files that it picked up. If those files are sizable, it’s a huge waste of time, I/O and disk space!

And not only that, but your process is now one task shorter because you removed the branch. I recently had to work on such a process that had around 280 tasks in all, which I trimmed down to about 240 just with this technique. That’s 40 branches that were each making a useless copy of every single file!

So please… stop doing that!

Comments

Ok, we all know comments are a good thing. In fact, we should always comment more, since it makes maintenance so much easier. However, adding comment tasks everywhere is a waste of real estate in the Workflow Configuration tool:

Instead, use each individual task’s own comment section. For a few years now, an option has let you specify that these comments should be used as the task’s description, so the above can be reworked to look like this:

Granted, this won’t make your process run any faster. But it will save you a whole lot of scrolling when you edit it!

That’s not to say comment tasks are all bad: use them to describe what a series of tasks does (for instance, to summarize an entire branch), just don’t use them for individual tasks.

So please… stop doing that!

Use variables

Many times, I see series of conditions that are checked sequentially to determine which OL Connect resource to use, most of the time based on a value in the data file. Something like this:

Now that’s perfectly valid but, again, it’s a waste of real estate. It’s also slightly slower: for each condition, the process must open the data file, extract the value and check whether it matches, so the last item in the list always runs a little slower than the ones before it.

Instead, you could use something much more compact and efficient:

Here, we extract one value from the data file, store it in a variable, and then use that variable’s value as the name of our OL Connect resource. Of course, you need to name those resources appropriately, but even if you don’t, you can achieve the same thing with a very simple script that looks at an extracted value and sets the value of a variable to the proper resource name. And by the way, notice how the task comments make it easy to understand what’s going on in the process.
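
As an illustration, such a script in a Run Script task could look something like the JScript sketch below. The variable name (“template”) and the resource names are made up for the example, and the sketch assumes the “template” local variable already exists in the process; Watch.GetVariable(), Watch.SetVariable() and Watch.Log() are the standard Workflow scripting calls I’d expect to use here.

```javascript
// Run Script (JScript) sketch: map an extracted value to an OL Connect resource name.
// The variable and template names are illustrative only.
var docType = Watch.GetVariable("docType");   // value extracted earlier in the process

// Simple lookup table: extracted value -> template resource name
var templates = {
  "INV": "Invoice.OL-template",
  "STM": "Statement.OL-template",
  "LTR": "Letter.OL-template"
};

var resource = templates[docType];
if (!resource) {
  resource = "Default.OL-template";           // fallback for unexpected values
  Watch.Log("Unknown document type '" + docType + "', falling back to default template", 2);
}

Watch.SetVariable("template", resource);      // downstream OL Connect task references %{template}
```

The OL Connect task further down the process can then simply reference %{template} (or whatever you called your variable) in its template field, and the whole ladder of conditions disappears.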

Bottom line: repetitive conditions are slightly less efficient at run time, and a whole lot more cumbersome to manage when designing your configuration.

So please… stop doing that!

Conclusion

These are just a few of my pet peeves when I review Workflow configurations. We’ve all been guilty of some of the things I outlined here – myself included! – but hopefully, after reading this, you will have recognized some of these bad habits and will start applying more efficient techniques to your processes.

If you have your own simple tricks that make your Workflow processes easier to manage or more efficient, feel free to add them in the comments below!




All comments (6)

  • Pete

    Ouch, that stings a little. LOL. I’m familiar with each of these, but habits require effort to change. I can always use a reminder.

    Thanks for another great post Philippe 🙂

  • John Price

    Guilty as charged – I never thought of using the Output to Folder task as an action, nor of the impact it would have. Thanks for that.

    As for comments, I do wish we could have a bit more “bling” within the task comments – maybe some text formatting: bold, colour, maybe even a few fonts, and support for links.

    However many thanks for the blogs, they always have some good stuff in them.

    • Philippe Fontan

      You can use html elements in your comments and they will get interpreted correctly. Things like <b>, <i>, <u>.

  • Jean-Cédric Hamel

    When you have multiple processes that share a lot of common steps, turn those steps into a sub-process. This way, it is easier to apply changes that impact many processes, and each process stays cleaner and less cluttered with steps you see everywhere else.

    It also cuts down on scrolling when going through your processes. Even if many of your processes are only similar, with small variations, it is still a good idea to use sub-processes and add a few conditions in them to adapt to their calling processes.

    **Upcoming OL Connect 2022.1 feature improvement**
    In the coming version 2022.1, we no longer need to store values in JobInfo variables to pass them down to sub-processes. We can now pass them directly to the variables defined in the sub-process, as runtime parameters in the Go Sub plugin… soooo neat!

  • Lou Pace

    Great article! How about proper use of global variables vs. local and job info variables within a process? This is a mistake I see often from end users.

    • Philippe Fontan

      Yep, global variables are often abused and can wreak havoc on a system if several processes write to them simultaneously. I think it’s something I should address in a future article. Thanks for the suggestion!