- IdentitySync Studio can now be opened in full-screen mode.
- A Status column was added to the IdentitySync scheduler, displaying a status of "busy" or "ready".
- Step parameters in IdentitySync Studio now include a link to the developer's guide.
- New consent parameter added to datasource.read.gigya.account to be used in implementations of Enterprise Preference Manager. Enables retrieving users from Gigya's database based on the status of their consent to a given consent statement.
- New component, datasource.write.hybrismarketing, for writing user data directly to the SAP Hybris Marketing platform. For more information, see Hybris Marketing.
- New action and sync_fields parameters in datasource.write.silverpop support choosing the method for handling existing user data, and specifying a unique ID for rows in Silverpop.
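As an illustrative sketch only, a Silverpop write step using these parameters might look like the following. The step structure follows the standard IdentitySync dataflow JSON; the parameter values shown (the action name and the sync field) are assumptions, not documented values:

```json
{
  "id": "write.silverpop",
  "type": "datasource.write.silverpop",
  "params": {
    "action": "addOrUpdate",
    "sync_fields": ["EMAIL"]
  }
}
```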
- Bug fixes
- Support for MailChimp interest categories includes new interestsMapping parameter in datasource.write.mailchimp. See also the updated sample data flow.
- New Connector Type added to IdentitySync Studio, used for writing failed records to a separate file.
- IdentitySync Studio is the new visual data flow editor, used when creating and editing flows in the IdentitySync dashboard. With IdentitySync Studio, you can:
- Add parameters to each step in a convenient, friendly UI.
- Drag-and-drop new data flow steps and connect them to the flow by dragging arrows.
- Add a record.evaluate custom step and add the code in the JSON editor UI, complete with built-in code testing capabilities.
- Delete a step by selecting it and clicking "Delete".
- You can now delete data flows directly from the IdentitySync dashboard.
- Support for flexible time notation in the WHERE clause ("where" parameter) of datasource.read.gigya.account.
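For illustration, a read step whose "where" parameter uses relative time notation might look as follows. The step structure follows the standard dataflow JSON; the exact relative-time token syntax shown here is a hypothetical placeholder, so check the Component Repository for the supported notation:

```json
{
  "id": "read.account",
  "type": "datasource.read.gigya.account",
  "params": {
    "select": "UID, profile.email",
    "where": "lastUpdatedTimestamp > \"-1d\""
  }
}
```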
- Support for Silverpop: New components for reading from and writing to Silverpop, including dataflow examples. See Component Repository here and here, and inbound and outbound flows.
- Support for writing to Salesforce Marketing Cloud (Exacttarget). See Component Repository and outbound Dataflow.
- New maxFileSize parameter added to file.format.dsv and file.format.json, used to split the output files into multiple files of smaller size.
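A minimal sketch of a formatting step that splits output using the new parameter; the column list is a placeholder, and the unit of maxFileSize is an assumption to be confirmed in the Component Repository:

```json
{
  "id": "format.dsv",
  "type": "file.format.dsv",
  "params": {
    "columns": ["UID", "profile.email"],
    "maxFileSize": 50
  }
}
```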
- New updatePolicy parameter added to datasource.write.gigya.account for defining how to handle existing values in Gigya's database: whether to override them or append to them.
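Sketched below is a write step using the new parameter; the value name "append" is inferred from the description above and should be verified against the Component Repository:

```json
{
  "id": "write.account",
  "type": "datasource.write.gigya.account",
  "params": {
    "updatePolicy": "append"
  }
}
```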
- New formatting parameters added to file.format.krux: quoteFields and separator.
- New components and dataflow templates for integration with Campaign Monitor. See outbound and inbound dataflows, and the Component Repository.
- Support for Constant Contact: New components for reading from and writing to Constant Contact, including dataflow examples. See Component Repository, here and here.
- New generic component that can call any of Gigya's APIs, including those that write data to a Gigya database: datasource.write.gigya.generic.
- New status parameter added to datasource.read.mailchimp.
- New datasource.write.exacttarget component for writing subscriber data to Salesforce Marketing Cloud (Exacttarget).
- Email notifications sent at the end of a dataflow execution now also include the site ID.
- Support for Marketo inbound and outbound flows. See templates, Marketo Dataflow - Inbound and Marketo Dataflow - Outbound, and Component Repository.
- New dataflow template for Game Mechanics lookup: Game Mechanics Dataflow.
- Bug fix: An export flow that included PGP encryption and ran on a large number of accounts failed with a heartbeat error.
- Improved export performance to support parallel search.
- New parameter maxConcurrency for improving performance of parallel search added to datasource.read.gigya.account and datasource.read.gigya.ds scripts. See Component Repository.
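As a sketch, a read step using the new parameter might look like the following; the field selection and the concurrency value are illustrative placeholders:

```json
{
  "id": "read.account",
  "type": "datasource.read.gigya.account",
  "params": {
    "select": "UID, profile.email",
    "maxConcurrency": 4
  }
}
```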
- New parameter from added to datasource.read.gigya.account for specifying the data source. See Component Repository.
- Bug fix: PGP decryption failed when trying to decrypt a compressed file.
- Console enhancement: Added support for job history and job details.
- Change in the retry mechanism when writing to Gigya: If some records fail to update, a retry mechanism handles those records, and the job status is set to completed_with_errors.
- Bug fix: When a step has multiple next steps, records are now cloned so that manipulating a record in one branch does not affect the rest of the flow.
- Bug fix: In datasource.write.ftp and datasource.write.sftp, defining a hierarchical path for the remotePath parameter caused the job to fail with a 'No such file' error.
- Optional blobPrefix parameter added to script datasource.read.azure.sas: If specified, only blobs whose names begin with this prefix will be extracted.
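A minimal sketch of a read step using blobPrefix; the container parameter name and both values are assumptions for illustration:

```json
{
  "id": "read.azure",
  "type": "datasource.read.azure.sas",
  "params": {
    "container": "exports",
    "blobPrefix": "daily/"
  }
}
```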
- New components in the Component Repository for reading and writing to Azure Blob cloud storage using shared access signature (SAS).
- Full dataflow sample for writing data to Azure Blob cloud storage using SAS, here: Azure SAS Dataflow.
- Bug fix: Corrected the format for likes in the Krux Dataflow when using file.format.krux.
- New maxRetry parameter in datasource.write.mailchimp sets the maximum number of retry attempts before the job fails. The default is 30.
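For example, a write step that lowers the retry limit from the default of 30 might be configured as follows (the step structure follows the standard dataflow JSON):

```json
{
  "id": "write.mailchimp",
  "type": "datasource.write.mailchimp",
  "params": {
    "maxRetry": 10
  }
}
```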
- New components in the Component Repository for reading and writing to Mailchimp.
- The field.array.extract script supports new types of arrays.
- Bug fix: When refreshing worker sites, the admin domain is used.
- New component: datasource.read.gigya.audit for retrieving audit log items using an SQL-like WHERE clause.
- New sortBy and sortOrder parameters added to datasource.read.ftp and to datasource.read.sftp, for sorting the files by a selected field (e.g. time).
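Sketched below is an SFTP read step sorted by time; the host and path values are placeholders, and the "ASC" sort-order value is an assumption to be confirmed in the Component Repository:

```json
{
  "id": "read.sftp",
  "type": "datasource.read.sftp",
  "params": {
    "host": "sftp.example.com",
    "remotePath": "/exports",
    "sortBy": "time",
    "sortOrder": "ASC"
  }
}
```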
- Developer custom components: Developers who have the _idx_script_developers permission can write custom IDX scripts in JS and add them to the Component Repository. To get this permission, open a Salesforce case.
- New fileName parameter replaces the filePrefix, fileDateFormat, and fileExtension parameters, allowing more flexible formatting of file names. See the relevant file formatting components in the Component Repository.
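As a sketch, a formatting step using the single fileName parameter in place of the former prefix/date/extension trio might look like this; the file name shown is a plain placeholder, and any supported date-placeholder syntax within fileName should be taken from the Component Repository:

```json
{
  "id": "format.dsv",
  "type": "file.format.dsv",
  "params": {
    "fileName": "gigya_accounts.csv"
  }
}
```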