Create ScheduleJob

Create a schedule job via the Analytics Environment API.

Creating a job

📘

Supported jobs

Only PySpark and Python jobs can be created via the API. To create a BigQuery scheduled job, use Google BigQuery directly; once created, the BigQuery scheduled job is synchronized with LiveRamp within 15 minutes.

Before you create a job, do the following:

  • Create the Python code file that you want to run in your code repository bucket, and note its path so that you can specify it with the mainFile parameter.
  • Decide when the job should run and at what interval (specify startDate, frequency (a cron expression), and timeZone). If you want to run the job immediately, specify onetimeJob: true.
  • (Optional) If you want to use any of the supported Python packages, verify the correct syntax.
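The frequency value is a standard five-field cron expression (minute, hour, day of month, month, day of week); for example, "25 0 * * 2" means 00:25 every Tuesday. As a rough sketch (not the validator the API itself uses), you can sanity-check a simple expression before submitting it:

```python
# Sanity check for a five-field cron expression (sketch only: accepts '*',
# plain numbers, and comma lists; real cron also allows ranges, steps, names).
FIELD_RANGES = [(0, 59), (0, 23), (1, 31), (1, 12), (0, 7)]

def is_valid_cron(expr: str) -> bool:
    fields = expr.split()
    if len(fields) != len(FIELD_RANGES):
        return False
    for field, (lo, hi) in zip(fields, FIELD_RANGES):
        if field == "*":
            continue  # wildcard matches every value in range
        for part in field.split(","):
            if not part.isdigit() or not lo <= int(part) <= hi:
                return False
    return True

print(is_valid_cron("25 0 * * 2"))   # True  (00:25 every Tuesday)
print(is_valid_cron("61 0 * * 2"))   # False (minute out of range)
```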

Request example for Python or PySpark

curl --location 'https://api.liveramp.com/analytics-environment/v1/schedule-jobs' \
--header 'Tenant-ID: ${tenantId}' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer ${access_token}' \
--data-raw '{
  "name": "test",
  "description": "test",
  "jobType": "PySpark",
  "clusterType": "Large",
  "dataprocVersion": "dataproc2.1",
  "mainFile": "gs://lranalytics-eu-123456-cert-awob-coderepo/testFolder01/111.py",
  "pyFiles": [
    "gs://lranalytics-eu-123456-cert-awob-coderepo/1.py"
  ],
  "enabled": true,
  "jobArgs": "",
  "additionalPackages": "",
  "onetimeJob": false,
  "startDate": "",
  "frequency": "25 0 * * 2",
  "timeZone": "Asia/Shanghai",
  "emailAlert": "failed",
  "tenantId": 123456,
  "userEmail": "[email protected]"
}'
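The same request can be issued from Python with the standard library. The sketch below mirrors the curl example above; the helper name, tenant ID, token, and bucket path are illustrative placeholders, and only a subset of the fields is shown:

```python
import json
import urllib.request

def build_schedule_job_payload(name: str, main_file: str,
                               frequency: str, tenant_id: int) -> dict:
    """Assemble a request body for POST /schedule-jobs (subset of fields)."""
    return {
        "name": name,
        "jobType": "PySpark",
        "mainFile": main_file,
        "enabled": True,
        "onetimeJob": False,
        "frequency": frequency,
        "timeZone": "Asia/Shanghai",
        "emailAlert": "failed",
        "tenantId": tenant_id,
    }

payload = build_schedule_job_payload(
    "test",
    "gs://lranalytics-eu-123456-cert-awob-coderepo/testFolder01/111.py",
    "25 0 * * 2",
    123456,
)

req = urllib.request.Request(
    "https://api.liveramp.com/analytics-environment/v1/schedule-jobs",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Tenant-ID": "123456",                     # placeholder
        "Content-Type": "application/json",
        "Authorization": "Bearer <access_token>",  # placeholder
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```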


For information on how this is done in Analytics Environment's user interface, see "Create a Job" in the Safe Haven help.
