You can design and package a Dataiku flow into a reusable recipe for other projects.
Using an Application-as-recipe¶
To create a recipe from an existing Application-as-recipe, click the New recipe button in the Flow. Application-as-recipes are grouped by category in this menu.
Application-as-recipes can only be run by users who are allowed to instantiate the application (as configured in the Application header panel).
Developing an Application-as-recipe¶
Only users who are administrators of the project can contribute to the development of an Application-as-recipe. In addition, only users with the Develop plugins permission can configure project variables through the recipe settings with custom code.
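That custom code takes the values entered in the recipe settings and turns them into project variables for the application instance. A minimal sketch of such a mapping, assuming hypothetical parameter names (`threshold`, `region`) and a plain-dict contract — the exact hook Dataiku exposes may differ:

```python
# Hypothetical sketch: map recipe settings to project variables.
# Parameter names and the function contract are assumptions,
# not the actual Dataiku hook.

def params_to_variables(params):
    """Map recipe settings (as entered by the user) to project variables."""
    variables = {
        "threshold": float(params.get("threshold", 0.5)),  # numeric setting
        "region": params["region"],                         # required setting
    }
    # Derived variable computed from the raw settings
    variables["is_strict"] = variables["threshold"] >= 0.9
    return variables

print(params_to_variables({"threshold": "0.95", "region": "EU"}))
```

The application flow can then read these variables like any other project variables.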
To convert a project into an Application-as-recipe, click Application designer in the project menu. A project can be converted either into a Dataiku application or into an Application-as-recipe. Once the project has been converted, the project menu opens the Application designer directly.
The Application header panel allows you to configure:
- the recipe name and description;
- which users can instantiate the application.
The Included content panel lets you choose which additional data from the original project (the one containing the application) is included in each application instance.
The Icon setting defines the icon displayed for the Application-as-recipe in the New recipe menu. Available icons can be found in Font Awesome v3.2.1.
Application-as-recipes with the same category are grouped under the same section in the New recipe menu.
This panel lets you define the inputs and outputs of the recipe, that is, a mapping between the elements of the project using the Application-as-recipe and the corresponding elements in the Application-as-recipe flow. Each element consists of:
- a label: this label is displayed in the recipe editor to identify the element;
- a type: an element can be a Dataset, a Managed folder or a Saved model;
- the corresponding element in the Application-as-recipe flow.
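The mapping above can be pictured as a small data structure; this is an illustrative sketch only (the class and field names are assumptions, not a Dataiku API):

```python
# Illustrative model of one Application-as-recipe input/output element.
# Names are hypothetical; Dataiku stores this mapping internally.
from dataclasses import dataclass

ALLOWED_TYPES = {"Dataset", "Managed folder", "Saved model"}

@dataclass
class RecipeElement:
    label: str         # shown in the recipe editor to identify the element
    type: str          # Dataset, Managed folder or Saved model
    flow_element: str  # matching object in the application's own flow

    def __post_init__(self):
        if self.type not in ALLOWED_TYPES:
            raise ValueError(f"Unsupported element type: {self.type}")

# One input and one output, as a user of the recipe would map them
inputs = [RecipeElement("Transactions", "Dataset", "raw_transactions")]
outputs = [RecipeElement("Scored data", "Dataset", "scored_transactions")]
```

The type check mirrors the rule above: only Datasets, Managed folders and Saved models can be mapped.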
You must specify the scenario that builds the outputs of the recipe; this scenario is executed every time the Application-as-recipe runs.
Note the following limitations:

- Partitioned inputs and outputs are not supported.
- Outputs must be writable by DSS (e.g. they should not be BigQuery or Redshift datasets).