Script Connector
Version 25.2.9314
The Script connector provides the ability to perform functions or operations that are not otherwise available in the built-in connectors of CData Arc.
Overview
Script connectors use the powerful ArcScript scripting language to accomplish a wide range of custom behavior. The script configured in the connector is run each time a document is placed in the Transactions tab for the connector. The input document can be read and modified in the script, or the input document can simply trigger an external function configured in the connector. Script connectors can also be configured to run on a schedule via Receive automation on the Automation tab.
These are a few of the tasks you can configure a Script connector to do:
- Replace values in the input document with data returned from custom SQL queries
- Call a custom Web service to invoke a specialized workflow
- Run an external script or executable via the sysExecute operation
After you decide the appropriate task for your Script connector, read the Scripting documentation to explore how to call various operations using ArcScript.
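For example, a minimal sketch of the third task above (running an external executable through sysExecute) might look like the following. The input attribute names (file and args) and the script path are assumptions for illustration only; check the sysExecute operation reference for the exact inputs it expects.
<!-- Sketch only: 'file' and 'args' are assumed input names for sysExecute, and the path is a placeholder -->
<arc:set attr="file" value="C:\scripts\process.bat" />
<arc:set attr="args" value="[filepath]" />
<!-- Attributes set on the default item are passed to the called operation as its inputs -->
<arc:call op="sysExecute" />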
Connector Configuration
This section contains all of the configurable connector properties.
Script Tab
Options related to writing and testing the script.
- Action dropdown Choose the connector’s action. (See Actions for details on the three options.)
- ArcScript editor Use the editor to write your script. The toolbar contains standard editing options such as undo, redo, delete, copy, and format. It also includes an option to hide the Insert Options in the right pane.
- Insert Options The panel in the right pane makes it easy to insert code into your script with one click. The dropdown includes Inputs and Outputs (such as ConnectorId or Encoding), Snippets, Keywords, Operators and Formatters. Find the code you want to insert, and click to add it.
- Testing tab Specify whether to view the script input and/or output as the message body or headers. Options on this tab vary based on the script’s action type (for example, trigger scripts do not have inputs). Click Test to test the script.
- Log tab Displays the test log.
- Last update on Displays the datetime the script was last modified and saved.
- Auto update Use the toggle to automatically test the script after each modification.
Actions
This connector can perform any of the three connector action types:
- Trigger executes a script on a schedule and might produce files to be sent down the flow. This action serves as the starting point of a flow.
- Transform accepts messages from the flow as input for a script and produces output. This action serves as the middle of a flow.
- Terminal accepts messages from the flow as input for a script and acts as the terminal point of the flow.
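As a minimal illustration of the Trigger action, the following sketch (using the same arc:push pattern shown later on this page) produces a small file each time the scheduled receive runs; the filename and content are placeholders.
<arc:set attr="output.filename" value="heartbeat.txt" />
<arc:set attr="output.data" value="Trigger fired" />
<arc:push item="output" />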
Settings Tab
Configuration
Settings related to the core operation of the connector.
- Connector Id The static, unique identifier for the connector.
- Connector Type Displays the connector name and a description of what it does.
- Connector Description An optional field to provide a free-form description of the connector and its role in the flow.
Advanced Settings
- Local File Scheme A scheme for assigning filenames to messages that are output by the connector. You can use macros in your filenames dynamically to include information such as identifiers and timestamps. For more information, see Macros.
Message
- Save to Sent Folder Check this to copy files processed by the connector to the Sent folder for the connector.
- Sent Folder Scheme Instructs the connector to group messages in the Sent folder according to the selected interval. For example, the Weekly option instructs the connector to create a new subfolder each week and store all messages for the week in that folder. The blank setting tells the connector to save all messages directly in the Sent folder. For connectors that process many messages, using subfolders helps keep messages organized and improves performance.
Logging
- Log Level The verbosity of logs generated by the connector. When you request support, set this to Debug.
- Log Subfolder Scheme Instructs the connector to group files in the Logs folder according to the selected interval. For example, the Weekly option instructs the connector to create a new subfolder each week and store all logs for the week in that folder. The blank setting tells the connector to save all logs directly in the Logs folder. For connectors that process many transactions, using subfolders helps keep logs organized and improves performance.
- Log Messages Check this to have the log entry for a processed file include a copy of the file itself. If you disable this, you might not be able to download a copy of the file from the Input or Output tabs.
Miscellaneous
Miscellaneous settings are for specific use cases.
- Other Settings Enables you to configure hidden connector settings in a semicolon-separated list (for example, setting1=value1;setting2=value2). Normal connector use cases and functionality should not require the use of these settings.
Automation Tab
Automation Settings
Settings related to the automatic processing of files by the connector.
- Send Whether messages arriving at the connector are automatically processed.
- Retry Interval The number of minutes that the connector waits before retrying a failed script execution.
- Max Attempts The number of times that the connector retries a failed script execution before raising an error.
- Receive Whether the connector should automatically run the script on a configured schedule.
- Receive Execution Interval The unit of measure to use for the receive automation schedule (such as Minutes, Hours, or Days).
- Minutes Past the Hour The minutes offset for an hourly schedule. Only applicable when the interval setting above is set to Hourly. For example, if this value is set to 5, the automation service runs at 1:05, 2:05, 3:05, and so on.
- Time The time of day that the attempt should occur. Only applicable when the interval setting above is set to Daily, Weekly, or Monthly.
- Day The day on which the attempt should occur. Only applicable when the interval setting above is set to Weekly or Monthly.
- Minutes The number of minutes to wait before attempting the download. Only applicable when the interval setting above is set to Minute.
- Cron Expression A five-position string representing a cron expression that determines when the attempt should occur. Only applicable when the interval setting above is set to Advanced.
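For example, assuming the standard five-field ordering (minute, hour, day of month, month, day of week), a cron expression that runs the attempt at 2:00 AM every day would look like this:
0 2 * * *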
Performance
Settings related to the allocation of resources to the connector.
- Max Workers The maximum number of worker threads consumed from the threadpool to process files on this connector. If set, this overrides the default setting on the Settings > Automation page.
- Max Files The maximum number of files sent by each thread assigned to the connector. If set, this overrides the default setting on the Settings > Automation page.
Alerts Tab
Settings related to configuring alerts.
Before you can execute Service Level Agreements (SLAs), you need to set up email alerts for notifications. By default, Arc uses the global settings on the Alerts tab. To use other settings for this connector, toggle Override global setting on.
By default, error alerts are enabled, which means that emails are sent whenever there is an error. To turn them off, uncheck the Enable checkbox.
Enter a Subject (mandatory), then optionally enter a comma-separated list of Recipient emails.
SLAs Tab
Settings related to configuring Service Level Agreements (SLAs).
SLAs enable you to configure the volume you expect connectors in your flow to send or receive, and to set the time frame in which you expect that volume to be met. When an SLA is not met, CData Arc sends a warning email and marks the SLA as At Risk, meaning that if the SLA is not met soon it will be marked as Violated. This gives the user an opportunity to step in, determine why the SLA is not being met, and take appropriate action. If the SLA is still not met at the end of the at-risk period, the SLA is marked as Violated and the user is notified again.
To define an SLA, toggle Expected Volume on, then click the Settings tab.
- If your connector has separate send and receive actions, use the radio buttons to specify which direction the SLA pertains to.
- In the Expect at least portion of the window:
- Set the minimum number of transactions you expect to be processed (the volume)
- Use the Every fields to specify the time frame
- Indicate when the SLA should go into effect. If you choose Starting on, complete the date and time fields.
- Check the boxes for the days of the week that you want the SLA to be in effect. Use the dropdown to choose Everyday if necessary.
- In the Set status to ‘At Risk’ portion of the window, specify when the SLA should be marked as at risk.
- By default, notifications are not sent until an SLA is in violation. To change that, check Send an ‘At Risk’ notification.
The following example shows an SLA configured for a connector that expects to receive 1000 files per day, Monday through Friday. An at-risk notification is sent 1 hour before the end of the time period if the 1000 files have not been received.
Transactions Tab
This tab lists all messages associated with the connector. Use the search bar to find specific messages, or click the funnel icon to apply a filter. You can filter by time, message direction, and/or status.
Options on this tab vary depending on the connector’s action type:
- If your connector is a Trigger, use the Receive button to start the flow.
- If your connector is a Transform or Terminal, use the Upload Files button to upload files to the flow.
Using the arc:push Keyword
The arc:push keyword is commonly used in Script connectors to pass items to the output feed of the script. Following are three common use cases.
1. Create and output a file:
<arc:set attr="output.filename" value="results.txt" />
<arc:set attr="output.data" value="This is my result" />
<arc:push item="output" />
2. Pass an input file as output:
<arc:set attr="output.filepath" value="[filepath]" />
<arc:push item="output" />
3. Write entries to the Script connector log file:
<arc:set attr="logger.log#" value="Custom Log entry line 1" />
<arc:set attr="logger.log#" value="Custom Log entry line 2" />
<arc:push item="logger" />
Macros
Using macros in file naming strategies can enhance organizational efficiency and contextual understanding of data. By incorporating macros into filenames, you can dynamically include relevant information such as identifiers, timestamps, and header information, providing valuable context to each file. This helps ensure that filenames reflect details important to your organization.
CData Arc supports these macros, which all use the following syntax: %Macro%.
| Macro | Description |
| --- | --- |
| ConnectorID | Evaluates to the ConnectorID of the connector. |
| Ext | Evaluates to the file extension of the file currently being processed by the connector. |
| Filename | Evaluates to the filename (extension included) of the file currently being processed by the connector. |
| FilenameNoExt | Evaluates to the filename (without the extension) of the file currently being processed by the connector. |
| MessageId | Evaluates to the MessageId of the message being output by the connector. |
| RegexFilename:pattern | Applies a RegEx pattern to the filename of the file currently being processed by the connector. |
| Header:headername | Evaluates to the value of a targeted header (headername) on the current message being processed by the connector. |
| LongDate | Evaluates to the current datetime of the system in long-handed format (for example, Wednesday, January 24, 2024). |
| ShortDate | Evaluates to the current datetime of the system in yyyy-MM-dd format (for example, 2024-01-24). |
| DateFormat:format | Evaluates to the current datetime of the system in the specified format (format). See Sample Date Formats for the available datetime formats. |
| Vault:vaultitem | Evaluates to the value of the specified vault item. |
Examples
Some macros, such as %Ext% and %ShortDate%, do not require an argument, but others do. All macros that take an argument use the following syntax: %Macro:argument%
Here are some examples of the macros that take an argument:
- %Header:headername%: Where headername is the name of a header on a message. For example, %Header:mycustomheader% resolves to the value of the mycustomheader header set on the input message, and %Header:ponum% resolves to the value of the ponum header set on the input message.
- %RegexFilename:pattern%: Where pattern is a regex pattern. For example, %RegexFilename:^([\w][A-Za-z]+)% matches and resolves to the first word in the filename and is case insensitive (test_file.xml resolves to test).
- %Vault:vaultitem%: Where vaultitem is the name of an item in the vault. For example, %Vault:companyname% resolves to the value of the companyname item stored in the vault.
- %DateFormat:format%: Where format is an accepted date format (see Sample Date Formats for details). For example, %DateFormat:yyyy-MM-dd-HH-mm-ss-fff% resolves to the date and timestamp on the file.
You can also create more sophisticated macros, as shown in the following examples:
- Combining multiple macros in one filename: %DateFormat:yyyy-MM-dd-HH-mm-ss-fff%%EXT%
- Including text outside of the macro: MyFile_%DateFormat:yyyy-MM-dd-HH-mm-ss-fff%
- Including text within the macro: %DateFormat:'DateProcessed-'yyyy-MM-dd_'TimeProcessed-'HH-mm-ss%