Parameters

Describing parameters

Both datasets and recipes take an array of parameters that describes the settings the user can configure for the component (an example follows the list below). Each parameter is a JSON object with the following fields:

  • name: Name of the parameter in the configuration dict. We highly recommend that you use slug-like names.
  • type: Type of the parameter. The most common types are STRING, INT, DOUBLE, BOOLEAN, PASSWORD, SELECT and DATASET. See below for the full documentation of available types.
  • label: The user-visible name that appears in the form.
  • description: User-visible additional help, appears at the right of the form.
  • defaultValue: Prefill value of the parameter. Its type must match the parameter type.
  • mandatory (boolean): Is this parameter required?
  • visibilityCondition: Show/hide this parameter depending on a condition. See Other topics.
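
For illustration, here is a minimal sketch of a single parameter declaration using some of these fields (the name, label and values are made up for this example):

{
    "name" : "retry_count",
    "label" : "Number of retries",
    "description" : "How many times to retry before failing",
    "type" : "INT",
    "defaultValue" : 3,
    "mandatory" : true
}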

In addition, these fields can be used for specific types:

  • canSelectForeign (boolean, default false): Should this parameter show foreign elements? For DATASET(S), MODEL and FOLDER only.
  • columnRole: For COLUMN and COLUMNS only. See below.
  • selectChoices: For SELECT only. See below.
  • datasetParamName: For DATASET_COLUMN only. Parameter name of the related dataset. See below.
  • apiServiceParamName: For API_SERVICE_VERSION only. Parameter name of the related API Service. See below.

Important notes:

  • In Python recipes, the parameters are the result of JSON deserialization. As such, you’ll only get the following data types: string, float, bool (in other words, an INT parameter is received as a float in Python).
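
For example, a Python recipe can read its parameters with dataiku.customrecipe.get_recipe_config(). A minimal sketch, assuming the hypothetical retry_count parameter declared earlier:

# Minimal sketch of reading parameters in a Python recipe
from dataiku.customrecipe import get_recipe_config

config = get_recipe_config()
# INT parameters arrive as floats after JSON deserialization, so cast explicitly
retry_count = int(config.get("retry_count", 3))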

Available parameter types

These are the main types, available for any custom piece of code written in DSS (a short example follows the list):

  • STRING: A simple string.
  • INT: An integer.
  • DOUBLE: A decimal.
  • BOOLEAN: A boolean.
  • PASSWORD: A simple string, but the UI hides the typing.
  • SELECT: Select a value among possible choices. See below.
  • MAP: A key -> value mapping.
  • TEXTAREA: A string, but the UI shows a multi-line “textarea” control. Good for entering long and multi-line values.
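
As an illustration, here is a hypothetical excerpt declaring a MAP and a TEXTAREA parameter (the names are made up):

{
    "name" : "http_headers",
    "label" : "HTTP headers",
    "type" : "MAP"
},
{
    "name" : "query",
    "label" : "Query",
    "type" : "TEXTAREA"
}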

Types for custom recipes

  • COLUMN: Select one column among the columns of an input dataset.
  • COLUMNS: Select multiple columns among the columns of an input dataset.

In recipes, it’s common to want to select one or several columns from one of the input datasets. This is done using the COLUMN and COLUMNS types.

You will need to give the name of the role from which you want to select a column. Note that if the given role is multi-dataset, only columns from the first dataset of the role can be selected.

You declare it like this:

{
    "name" : "incol",
    "label" : "Input column",
    "type" : "COLUMN",
    "columnRole" : "in_role_1"
}
  • If you use COLUMN, you get the answer as a string
  • If you use COLUMNS, you get the answer as a list.
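
For example, the recipe's Python code can then retrieve the selection from its configuration. A sketch, where incol is the COLUMN parameter declared above and in_role_1 is its input role:

# Sketch of reading a COLUMN parameter and the dataset it refers to
import dataiku
from dataiku.customrecipe import get_recipe_config, get_input_names_for_role

config = get_recipe_config()
selected_column = config["incol"]  # a single column name, as a string

# The proposed columns come from the first dataset of the "in_role_1" role
input_dataset = dataiku.Dataset(get_input_names_for_role("in_role_1")[0])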

Types for custom macros

  • DATASET: Select exactly one dataset.
  • DATASETS: Select one or more datasets.
  • DATASET_COLUMN: Select a column from the specified dataset.

This type requires a datasetParamName indicating which parameter holds the dataset whose columns should be listed. See example below.

  • FOLDER: Select a DSS managed folder.
  • MODEL: Select a DSS saved model.
  • API_SERVICE: Select an API service.
  • API_SERVICE_VERSION: Select a version package from the specified API service.

This type requires an apiServiceParamName indicating which parameter holds the API service whose versions should be listed. See example below.

  • BUNDLE: Select a bundle.
  • VISUAL_ANALYSIS: Select a visual analysis.

For example:

{
    "name": "input_ds",
    "label" : "Input dataset",
    "type": "DATASET",
    "canSelectForeign": true,
    "mandatory" : true
},
{
    "name": "input_ds_column",
    "label" : "Column from the chosen dataset",
    "type": "DATASET_COLUMN",
    "datasetParamName": "input_ds",
    "mandatory" : true
},
{
    "name": "input_api_service",
    "label" : "API service",
    "type": "API_SERVICE",
    "mandatory" : true
},
{
    "name": "input_api_service_version",
    "label" : "API service package version",
    "type": "API_SERVICE_VERSION",
    "apiServiceParamName": "input_api_service",
    "mandatory" : true
}

Selects

Selects allow you to propose multiple choices and have the user select one (and only one). Each choice has an identifier and a user-visible label.

For example:

{
    "name" : "egg_type",
    "label" : "Choose your eggs",
    "type" : "SELECT",
    "selectChoices" : [
        { "value" : "scrambled", "label" : "Scrambled"},
        { "value" : "sunny_up", "label" : "Sunny-side up"}
    ]
}
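
In the code, the parameter is received as the identifier (the value field) of the chosen item, for example "scrambled", not its user-visible label. A minimal sketch for a Python recipe:

from dataiku.customrecipe import get_recipe_config

egg_type = get_recipe_config()["egg_type"]
if egg_type == "scrambled":
    # handle the scrambled case
    pass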

Plugin-level configuration

Just like each dataset and recipe can accept params, so can a plugin. Plugin-level configuration allows you to have a centralized configuration that is shared by all dataset and recipe instances of this plugin.

Another characteristic of plugin-level config is that it’s only readable and writable by the Administrator. As such, it can be the right place to store API keys, credentials, connection strings, …

Add settings to a plugin

To add settings to a plugin, edit the plugin.json file and add a "params" array in the JSON top-level object. The structure of this params array is the same as the one used for datasets and recipes.
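
For example, a plugin.json could declare a shared credential roughly like this (a sketch; the id and parameter name are illustrative):

{
    "id" : "my-plugin",
    "version" : "0.0.1",
    "meta" : {
        "label" : "My plugin"
    },
    "params" : [
        {
            "name" : "api_key",
            "label" : "API key",
            "type" : "PASSWORD",
            "mandatory" : true
        }
    ]
}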

Read settings of a plugin

  • Datasets receive the plugin config (as a Python dict) in the constructor of their connector class. See the documentation of the Connector class or the automatically generated sample for more information.
  • Python recipes can read the plugin config (as a Python dict) by calling the dataiku.customrecipe.get_plugin_config() function (see the sketch after this list)
  • R recipes can read the plugin config by calling the dataiku::dkuPluginConfig() function
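
A minimal sketch for a Python recipe, assuming the hypothetical api_key plugin-level parameter declared above:

from dataiku.customrecipe import get_plugin_config, get_recipe_config

plugin_config = get_plugin_config()  # plugin-level settings, set by the administrator
api_key = plugin_config["api_key"]

recipe_config = get_recipe_config()  # settings of this recipe instance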