
Salesforce DMP Dataflow


The dataflow that loads data from Gigya to Salesforce DMP (previously Krux) renames fields, formats the file in a Salesforce-compatible format, compresses it, and writes the data to Amazon S3. The individual dataflow steps are described in greater detail here: Component Repository
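
Every step in a dataflow declares an id, a type, optional params, and the ids of its successors in a next array. As a hedged illustration of that chaining (the helper below is not part of IdentitySync; it is a minimal Python sketch), the following snippet checks that every next reference in a flow shaped like the sample further down points to an existing step:

def validate_links(dataflow):
    """Raise if any step's "next" references a step id that does not exist."""
    ids = {step["id"] for step in dataflow["steps"]}
    for step in dataflow["steps"]:
        for successor in step.get("next", []):
            if successor not in ids:
                raise ValueError(
                    "step %r points to unknown step %r" % (step["id"], successor)
                )

# Abbreviated mirror of the sample flow below: account -> rename -> krux -> lzo -> s3
validate_links({
    "name": "krux",
    "steps": [
        {"id": "account", "type": "datasource.read.gigya.account", "next": ["rename"]},
        {"id": "rename", "type": "field.rename", "next": ["krux"]},
        {"id": "krux", "type": "file.format.krux", "next": ["lzo"]},
        {"id": "lzo", "type": "file.compress.lzo", "next": ["s3"]},
        {"id": "s3", "type": "datasource.write.amazon.s3"},
    ],
})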

Note that IdentitySync jobs are scheduled in UTC. The platform participating in the flow should therefore be set to the UTC timezone to ensure that file requests are handled properly.
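
If a desired run time is expressed in a local timezone, converting it to UTC first avoids off-by-hours scheduling. The snippet below is a minimal sketch of that conversion (the timezone and date are illustrative assumptions, not taken from IdentitySync):

from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical example: a job meant to run at 02:00 New York time on 15 Jan 2024
local_run = datetime(2024, 1, 15, 2, 0, tzinfo=ZoneInfo("America/New_York"))
utc_run = local_run.astimezone(ZoneInfo("UTC"))
print(utc_run.isoformat())  # 2024-01-15T07:00:00+00:00 -> schedule the job for 07:00 UTC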

Following is a code sample of a full dataflow for passing data from Gigya to Salesforce DMP: 

{
      "name": "krux",
      "steps": [
        {
          "id": "account",
          "type": "datasource.read.gigya.account",
          "params": {
            "select": "UID,profile.gender,profile.age"
          },
          "next": [
            "rename"
          ]
        },
        {
          "id": "rename",
          "type": "field.rename",
          "params": {
            "fields": [
              {
                "sourceField": "UID",
                "targetField": "id"
              },
              {
                "sourceField": "profile.gender",
                "targetField": "g_gender"
              },
              {
                "sourceField": "profile.age",
                "targetField": "g_age"
              },
              {
                "sourceField": "profile.likes",
                "targetField": "likes"
              }
            ]
          },
          "next": [
            "krux"
          ]
        },
        {
          "id": "krux",
          "type": "file.format.krux",
          "params": {
            "fileName": "Gigya_Krux_M6.csv",
            "createEmptyFile": true
          },
          "next": [
            "lzo"
          ]
        },
        {
          "id": "lzo",
          "type": "file.compress.lzo",
          "params": {
            "createIndexFile": true
          },
          "next": [
            "s3"
          ]
        },
        {
          "id": "s3",
          "type": "datasource.write.amazon.s3",
          "params": {
            "bucketName": "...",
            "accessKey": "...",
            "secretKey": "...",
            "objectKeyPrefix": "..."
          }
        }
      ]
}
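
Once assembled, the dataflow JSON is typically registered through the IdentitySync REST API. The following is a hypothetical Python sketch of such a call; the idx.createDataflow endpoint name, the us1 data-center host, and the parameter names are assumptions to verify against your site's configuration and data center:

import json
import requests

dataflow = {
    "name": "krux",
    "steps": [
        # ... the steps exactly as in the sample above ...
    ],
}

response = requests.post(
    "https://idx.us1.gigya.com/idx.createDataflow",  # assumed us1 data center
    data={
        "apiKey": "YOUR_API_KEY",      # placeholder credentials
        "secret": "YOUR_SECRET_KEY",
        "data": json.dumps(dataflow),  # the dataflow definition as a JSON string
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())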