- fileNameRegex added to the Azure reader components, for filtering file names using a regular expression.
- Shared Variables: You may now create and manage variables that can be shared between different partners, data centers, and sites. This is useful for credentials (for example, to an SFTP repository) or any variable that is reused in different flows. It saves retyping, minimizes manual errors, and lets you update a variable's value in a single location instead of editing each dataflow manually. For more information, see Shared Variables.
- The Google Cloud reader, datasource.read.googlecloud, has a new parameter fileNameRegex for filtering files by name, using regex.
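As an illustrative sketch only, a reader step using this parameter might look like the following; the step shape (id/type/params) follows the usual IdentitySync dataflow structure, and the bucket name and regex are made-up values:

```json
{
  "id": "read_gcs",
  "type": "datasource.read.googlecloud",
  "params": {
    "bucketName": "my-export-bucket",
    "fileNameRegex": "accounts_export_.*\\.csv"
  }
}
```

With this configuration, only files whose names match the regex are picked up by the reader.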
- Bug fix: when a file export was retried, subsequent export attempts could overwrite the original file. Retry files now use a distinct naming structure.
- New parameter addResponseHeaders added to datasource.write.external.generic, for passing the response headers on to the next step.
- You can now encrypt and decrypt files in IdentitySync flows using GPG, in addition to PGP that was previously supported.
- You can now create a dataflow based on an "Empty" template that is not pre-populated with steps.
- The dataflow editor now includes an actions menu with the following options:
- Run Test
- Job Status
- Updates to the SAP Marketing Cloud writer:
- Support for phoneField and faxField, for mapping an SAP Customer Data Cloud field that contains the contact's phone or fax number to the corresponding Marketing field.
- Support for PHONE and FAX values in the consent communicationType.
- New authenticationVersion parameter added to the Salesforce Marketing Cloud writer, to support using version 2 of the Marketing Cloud API.
- salt parameter added to the hashing components, field.hash.md5 and field.hash.sha2, for adding salt to the hashing algorithm.
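As a minimal, illustrative sketch of a hashing step with a salt, assuming the usual IdentitySync step shape; the field name and salt value are made up, and any parameter besides salt shown here is an assumption:

```json
{
  "id": "hash_email",
  "type": "field.hash.sha2",
  "params": {
    "fields": "email",
    "salt": "my-static-salt"
  }
}
```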
- You can now read more than 1000 files from Amazon S3.
- You can now transfer files between SAP Customer Data Cloud and Google Cloud Storage using IdentitySync.
- SAP Marketing Cloud integration: You can now send a ContactOrigin field with your user records. If included, this will update the ContactOrigin in SAP Marketing Cloud. Otherwise, by default, this integration writes a value of "GIGYA_ID" as the ContactOrigin.
- New parameters in datasource.read.gigya.audit allow using IdentitySync to export audit log items from a different SAP Customer Data Cloud site.
- New notifyLastRecord parameter added to the record.evaluate script used for creating custom scripts. This is used to indicate that the last record in the batch has been handled.
- New synchronous parameter added to the SAP Marketing Cloud writer, enabling asynchronous operation. Asynchronous mode is faster, but you will not receive error feedback or job status; these should be handled in SAP Marketing Cloud.
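A hedged sketch of the writer step with asynchronous mode enabled; only the synchronous parameter comes from this note, and the rest of the step shape is the usual IdentitySync structure with illustrative values:

```json
{
  "id": "write_marketing",
  "type": "datasource.write.hybrismarketing",
  "params": {
    "synchronous": false
  }
}
```

With synchronous set to false, errors and job status must be monitored on the SAP Marketing Cloud side.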
- maxConnections parameter added to datasource.write.external.generic.
- datasource.read.gigya.account has a new parameter, keepFieldNamesWithDotAsIs, that enables different handling of field names that contain a dot.
- updateDateField parameter added to the subscription and consent objects in datasource.write.hybrismarketing, adding support for syncing the original subscription or consent date into SAP Marketing Cloud.
- When using the generic API writer, you can now pass a field value to parameters and headers, rather than just a hard-coded value. This makes the generic API writer more flexible.
- New addFilename parameter added to the parse components, file.parse.dsv and file.parse.json, for adding a "_filename" field to each record. This is usually used for debugging.
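An illustrative sketch of a parse step with this parameter enabled, assuming the usual IdentitySync step shape; parameter values other than addFilename are made up:

```json
{
  "id": "parse_csv",
  "type": "file.parse.dsv",
  "params": {
    "addFilename": true
  }
}
```

Each parsed record would then carry a "_filename" field identifying the source file, which is useful when debugging flows that read many files.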
- New look and feel of the IdentitySync Studio in Gigya's Console.
- New component, datasource.write.external.generic, for writing user record data to an external service endpoint. This writer complements existing writers that write data to specific target platforms (both 3rd party services and file storage platforms), and greatly increases the flexibility of the IdentitySync platform.
- New error parameter when building IdentitySync Custom Scripts, for sending a failed record to the error path.
- New writer, datasource.write.gigya.importComment, for importing comments into the comment storage.
- New 'field' steps for hashing a field value:
- New parameters in datasource.lookup.gigya.account:
- isCaseSensitive allows performing a case-insensitive lookup. Note that regardless of this parameter's value, lookups of 'basic-string' values are always case sensitive.
- matchBehavior determines what to do when the source field matches the Gigya field. This can be used when you wish to import only those records that do not exist on the target platform.
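As a hedged sketch of a lookup step using these parameters, assuming the usual IdentitySync step shape; the matchBehavior value shown is illustrative, not a documented enum:

```json
{
  "id": "lookup_account",
  "type": "datasource.lookup.gigya.account",
  "params": {
    "isCaseSensitive": false,
    "matchBehavior": "error"
  }
}
```

In a setup like this, matched records could be routed away from the import path, so only records that do not yet exist in Gigya are imported.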
- New parameters in datasource.read.gigya.comment: apiKey, userKey and secret added for reading Gigya comments from a different source site.
- batchSize parameter added to datasource.write.hybrismarketing, for setting the size of the record batch. Contacts can now be synced in batches, rather than one by one as previously.
- When a job fails after processing some files, it will now go into "retry" mode and attempt to handle the remaining files, while ignoring those already processed.
- fileNameRegex parameter added to the Amazon S3 reader, for filtering files by their name.
- Updates to the SAP Marketing Cloud writer (datasource.write.hybrismarketing):
- New timeout parameter for configuring the time to wait for a response from the platform
- New mobileField parameter for passing a contact's mobile phone number into SAP Marketing Cloud
- New communicationType field in the consent object, for passing the communication type to which the contact consented
- timeout parameter added to SFTP and FTP writers and readers.
- In custom scripts, setSessionParameter is now limited to 100 lines.
- New templates available in IdentitySync Studio, when creating a dataflow:
- Import Full Accounts from SFTP
- Import Lite Accounts from SFTP
- New reader, datasource.read.azure.blob_token, for reading data "blobs" from the Azure Blob cloud storage using an access token.
- New parameters added to datasource.write.gigya.generic: apiKey, userKey and secret, enabling use of the generic writer in a Gigya-to-Gigya data transfer scenario. These parameters provide the credentials of the source site from which data is read.
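A hedged sketch of a generic writer step carrying these credentials; the placeholder values are obviously illustrative, and any parameter other than apiKey, userKey and secret is an assumption about the step's shape:

```json
{
  "id": "write_generic",
  "type": "datasource.write.gigya.generic",
  "params": {
    "apiKey": "<source-site-api-key>",
    "userKey": "<application-user-key>",
    "secret": "<application-secret>"
  }
}
```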
- New component, datasource.delete.hybrismarketing, for deleting end-users from the SAP Marketing Cloud (Hybris Marketing) database, following a deletion from the Gigya database.
- New from parameter added to datasource.read.gigya.audit, for selecting the audit log from which to query (Gigya's main Audit Log, or the Consent Vault).
- New marketingAreaField in the datasource.write.hybrismarketing component, for passing the marketing area associated with a record into SAP Hybris Marketing.
- Support for adding an error path after datasource.write.salesforce.
- New component, datasource.lookup.gigya.account, allows you to perform a lookup of users in Gigya's database during an import flow, and make real-time decisions regarding the import process for existing users.
- New component, datasource.read.gigya.comment, executes a search in Gigya's comment database.
- New parameters added to datasource.write.hybrismarketing, for supporting writing subscription and consent information to Hybris Marketing.
- New addResponse parameter in datasource.write.gigya.generic enables including Gigya's API response in the output file, which can then be used in a later step.
- You can now connect the generic writer (datasource.write.gigya.generic) to a next step that follows a successful run, and not just a failed one.
- You can now stop a job mid-run, by hitting the Stop icon in the Job History window.
- The inferColumns parameter was removed from file.format.dsv, as column names are inferred automatically, or set manually using the columns parameter.
- Permissions to run an IdentitySync job are now granted automatically on the worker, for partners and users with the relevant permissions.
- When creating custom scripts using the record.evaluate component in IdentitySync Studio, you can now expand to full-screen mode for easier code editing.
- When reviewing the details of a job in the Job History page, you can now sort by each one of the step metrics (e.g., by duration, step name, number of errors).
- New newsletterField in datasource.write.silverpop enables writing to Silverpop's built-in status field, rather than to a custom field.
- Updated Silverpop templates in IdentitySync studio use newsletterField by default.
- Partner ID and API key added to the email notification sent after a job executes.
- Bug fixes.
- You can now use IdentitySync to copy accounts from one Gigya site to another. For more information, see IdentitySync.
- New step metrics for advanced debugging and monitoring of dataflows that fail or take a long time to execute. For more information about monitoring dataflows, see IdentitySync.
- IdentitySync Studio can now be opened in full-screen mode.
- Status column added to the IdentitySync scheduler, displaying a status of "busy" or "ready".
- Step parameters in IdentitySync Studio now include a link to the developer's guide.
- New consent parameter added to datasource.read.gigya.account to be used in implementations of Customer Consent. Enables retrieving users from Gigya's database based on the status of their consent to a given consent statement.
- New component, datasource.write.hybrismarketing, for writing user data directly to the SAP Hybris Marketing platform. For more information, see SAP Marketing Cloud.
- New action and sync_fields parameters in datasource.write.silverpop support choosing the method for handling existing user data, and specifying a unique ID for rows in Silverpop.
- Bug fixes
- Support for Mailchimp interest categories includes new interestsMapping parameter in datasource.write.mailchimp. See also the updated sample data flow.
- New Connector Type added to IdentitySync Studio, used for writing failed records to a separate file.
- IdentitySync Studio is the new visual data flow editor, used when creating and editing flows in the IdentitySync dashboard. With IdentitySync Studio, you can:
- Add parameters to each step in a convenient, friendly UI.
- Drag-and-drop new data flow steps and connect them to the flow by dragging arrows.
- Add a record.evaluate custom step and add the code in the JSON editor UI, complete with built-in code testing capabilities.
- Delete a step by selecting it and clicking "Delete".
- You can now delete data flows directly from the IdentitySync dashboard.
- Support for flexible time notation in the WHERE clause ("where" parameter) of datasource.read.gigya.account.
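As an illustrative sketch of a reader step with a WHERE clause; the relative-time notation shown ("-1d") is an assumed example of the flexible notation, not a documented literal, and the rest of the step shape follows the usual IdentitySync structure:

```json
{
  "id": "read_accounts",
  "type": "datasource.read.gigya.account",
  "params": {
    "where": "SELECT * FROM accounts WHERE lastUpdated > \"-1d\""
  }
}
```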
- Support for Silverpop: New components for reading from and writing to Silverpop, including dataflow examples. See Component Repository here and here, and inbound and outbound flows.
- Support for writing to Salesforce Marketing Cloud (Exacttarget). See Component Repository and outbound Dataflow.
- New maxFileSize parameter added to file.format.dsv and file.format.json, used to split the output files into multiple files of smaller size.
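A hedged sketch of a formatting step that splits its output; the file name and size value are illustrative, and the size unit is an assumption (not stated in this note):

```json
{
  "id": "format_csv",
  "type": "file.format.dsv",
  "params": {
    "fileName": "accounts_export.csv",
    "maxFileSize": 100
  }
}
```

When the output exceeds the configured size, the writer produces multiple smaller files instead of one large file.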
- New updatePolicy added to datasource.write.gigya.account for defining how to handle existing values in Gigya's database - whether to override them, or append to existing values.
- New formatting parameters added to file.format.krux: quoteFields and separator.
- New components and dataflow templates for integration with Campaign Monitor. See outbound and inbound dataflows, and the Component Repository
- Support for Constant Contact: New components for reading from and writing to Constant Contact, including dataflow examples. See Component Repository, here and here.
- New generic component that can call any of Gigya's APIs, including those that write data to a Gigya database: datasource.write.gigya.generic.
- New status parameter added to datasource.read.mailchimp.
- New datasource.write.exacttarget component for writing subscriber data to Salesforce Marketing Cloud (Exacttarget).
- Email notifications sent at the end of the dataflow execution now also include the site ID.
- Support for Marketo inbound and outbound flows. See templates, Marketo Dataflow - Inbound and Marketo Dataflow - Outbound, and Component Repository.
- Bug fix: in export flows that include PGP encryption and run on a large number of accounts, the job failed with a heartbeat error.
- Improved export performance to support parallel search.
- New parameter maxConcurrency for improving performance of parallel search added to datasource.read.gigya.account and datasource.read.gigya.ds scripts. See Component Repository.
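An illustrative sketch of a reader step with this parameter, assuming the usual IdentitySync step shape; the concurrency value is a made-up example, not a recommended setting:

```json
{
  "id": "read_accounts",
  "type": "datasource.read.gigya.account",
  "params": {
    "maxConcurrency": 5
  }
}
```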
- New parameter from added to datasource.read.gigya.account for specifying the data source. See Component Repository.
- Bug fix: PGP decryption failed when trying to decrypt a compressed file.
- Console enhancement: Added support for job history and job details.
- Change in the retry mechanism when writing to Gigya: if some of the records fail to update, a retry mechanism handles those records, and the job status is set to completed_with_errors.
- Bug fix: in the case of multiple next steps, records are now cloned so that manipulating a record does not affect the rest of the flow.
- Bug fix: in datasource.write.ftp and datasource.write.sftp, when defining a hierarchical path for the remotePath parameter, the job failed and a 'No such file' error was displayed.
- Optional blobPrefix parameter added to script datasource.read.azure.sas: If specified, only blobs whose names begin with this prefix will be extracted.
- New components in the Component Repository for reading and writing to Azure Blob cloud storage using shared access signature (SAS).
- Full dataflow sample for writing data to Azure Blob cloud storage using SAS, here: Azure SAS Dataflow.
- Bug fix: Corrected the format for likes in the Salesforce DMP Dataflow when using file.format.krux.
- New maxRetry parameter in datasource.write.mailchimp sets the maximum number of retry attempts before the job fails. The default is 30.
- New components in the Component Repository for reading and writing to Mailchimp.
- The field.array.extract script supports new types of arrays.
- Bug fix: When refreshing worker sites, the admin domain is used.
- New component: datasource.read.gigya.audit for retrieving audit log items using an SQL-like WHERE clause.
- New sortBy and sortOrder parameters added to datasource.read.ftp and to datasource.read.sftp, for sorting the files by a selected field (e.g. time).
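A hedged sketch of an SFTP reader step using these sorting parameters; the host and credentials are placeholders, and the sortBy and sortOrder values shown are illustrative assumptions:

```json
{
  "id": "read_sftp",
  "type": "datasource.read.sftp",
  "params": {
    "host": "sftp.example.com",
    "username": "idx-user",
    "sortBy": "time",
    "sortOrder": "ASC"
  }
}
```

Sorting by time can be useful to ensure that files are processed in the order they were uploaded.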
- Developer custom components: Developers who have the _idx_script_developers permission can write custom IDX scripts in JS and add them to the Component Repository. To get this permission, open a Salesforce case.
- New fileName parameter replaces the filePrefix, fileDateFormat and fileExtension parameters, for more flexible formatting of file names. See the relevant file formatting components in the Component Repository.