Other administration tasks

Here are some more global administration tasks that can be performed using the DSS Public API:

  • Reading and writing general instance settings
  • Managing user and group impersonation rules for Multi-user security
  • Managing (creating / modifying) code environments
  • Listing long running tasks, getting their status, aborting them
  • Listing running notebooks, getting their status, unloading them
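All of these tasks start from a `dataikuapi.DSSClient`. As a minimal sketch (the host URL, API key and code env name are placeholders, and `list_futures()` / `list_running_notebooks()` are the client-level listing calls these sections assume):

```python
def get_admin_handles(client):
    """Gather the admin entry points covered in this section from a DSSClient."""
    return {
        "settings": client.get_general_settings(),            # DSSGeneralSettings
        "code_env": client.get_code_env("PYTHON", "my_env"),  # hypothetical env name
        "futures": client.list_futures(),                     # list of DSSFuture
        "notebooks": client.list_running_notebooks(),         # list of DSSNotebook
    }

if __name__ == "__main__":
    import dataikuapi
    # Placeholder URL and key; use your own instance and an admin API key
    client = dataikuapi.DSSClient("http://localhost:11200", "YOUR_API_KEY")
    handles = get_admin_handles(client)
```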

Reference documentation

class dataikuapi.dss.admin.DSSGeneralSettings(client)

The general settings of the DSS instance. Do not create this directly, use dataikuapi.DSSClient.get_general_settings()

save()

Save the changes that were made to the settings on the DSS instance.

Note: this call requires an API key with admin rights

get_raw()

Get the settings as a dictionary
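The usual pattern is read-modify-write: get_raw() hands back the settings dict, which can be edited in place before calling save(). A sketch (the setting key below is hypothetical; real keys depend on your instance):

```python
def update_setting(settings, key, value):
    """Edit one entry of the instance settings and persist it (admin key required)."""
    raw = settings.get_raw()   # the settings as a plain dict
    raw[key] = value           # mutate in place
    settings.save()            # push the change back to the DSS instance
    return raw
```

For example, `update_setting(client.get_general_settings(), "someSettingKey", 10)`.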

add_impersonation_rule(rule, is_user_rule=True)

Add a rule to the impersonation settings

Parameters:
  • rule – the impersonation rule to add, built with DSSUserImpersonationRule or DSSGroupImpersonationRule
  • is_user_rule – True if the rule is a user-level rule, False if it is a group-level rule
get_impersonation_rules(dss_user=None, dss_group=None, unix_user=None, hadoop_user=None, project_key=None, scope=None, rule_type=None, is_user=None)

Retrieve the user or group impersonation rules that match the given parameters

Parameters:
  • dss_user – a DSS user or regular expression to match DSS user names
  • dss_group – a DSS group or regular expression to match DSS groups
  • unix_user – a name to match the target UNIX user
  • hadoop_user – a name to match the target Hadoop user
  • project_key – a project_key
  • scope – project-scoped (‘PROJECT’) or global (‘GLOBAL’)
  • rule_type – the rule user or group matching method (‘IDENTITY’, ‘SINGLE_MAPPING’, ‘REGEXP_RULE’)
  • is_user – True if only user-level rules should be considered, False for only group-level rules, None to consider both
remove_impersonation_rules(dss_user=None, dss_group=None, unix_user=None, hadoop_user=None, project_key=None, scope=None, rule_type=None, is_user=None)

Remove from the settings the user or group impersonation rules that match the given parameters

Parameters:
  • dss_user – a DSS user or regular expression to match DSS user names
  • dss_group – a DSS group or regular expression to match DSS groups
  • unix_user – a name to match the target UNIX user
  • hadoop_user – a name to match the target Hadoop user
  • project_key – a project_key
  • scope – project-scoped (‘PROJECT’) or global (‘GLOBAL’)
  • rule_type – the rule user or group matching method (‘IDENTITY’, ‘SINGLE_MAPPING’, ‘REGEXP_RULE’)
  • is_user – True if only user-level rules should be considered, False for only group-level rules, None to consider both
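For example, to inspect only the project-scoped, user-level rules that target a given DSS user before removing them, the same filter parameters can be combined (a sketch over the parameters above):

```python
def project_user_rules(settings, dss_user, project_key):
    """Return only user-level, project-scoped rules matching one DSS user."""
    return settings.get_impersonation_rules(
        dss_user=dss_user,
        project_key=project_key,
        scope="PROJECT",   # project-scoped rules only
        is_user=True,      # user-level rules only
    )
```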
class dataikuapi.dss.admin.DSSUserImpersonationRule(raw=None)

Helper to build user-level rule items for the impersonation settings

scope_global()

Make the rule apply to all projects

scope_project(project_key)

Make the rule apply to a given project

Args:
  project_key : the project this rule applies to
user_identity()

Make the rule map each DSS user to a UNIX user of the same name

user_single(dss_user, unix_user, hadoop_user=None)

Make the rule map a given DSS user to a given UNIX user

Args:
  dss_user : a DSS user
  unix_user : a UNIX user
  hadoop_user : a Hadoop user (optional, defaults to unix_user)
user_regexp(regexp, unix_user, hadoop_user=None)

Make the rule map DSS users matching a given regular expression to a given UNIX user

Args:
  regexp : a regular expression to match DSS user names
  unix_user : a UNIX user
  hadoop_user : a Hadoop user (optional, defaults to unix_user)
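Put together, a user-level mapping can be built with this helper and added to the impersonation settings. A sketch (the user and UNIX account names are placeholders):

```python
def build_user_rule(rule, dss_user, unix_user, project_key=None):
    """Configure a DSSUserImpersonationRule passed in as `rule`."""
    rule.user_single(dss_user, unix_user)  # hadoop_user defaults to unix_user
    if project_key is not None:
        rule.scope_project(project_key)    # restrict to one project
    else:
        rule.scope_global()                # apply to all projects
    return rule

if __name__ == "__main__":
    import dataikuapi
    from dataikuapi.dss.admin import DSSUserImpersonationRule
    client = dataikuapi.DSSClient("http://localhost:11200", "YOUR_API_KEY")
    settings = client.get_general_settings()
    settings.add_impersonation_rule(
        build_user_rule(DSSUserImpersonationRule(), "alice", "svc_alice"))
    settings.save()  # requires an admin API key
```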
class dataikuapi.dss.admin.DSSGroupImpersonationRule(raw=None)

Helper to build group-level rule items for the impersonation settings

group_identity()

Make the rule map each DSS user to a UNIX user of the same name

group_single(dss_group, unix_user, hadoop_user=None)

Make the rule map a given DSS group to a given UNIX user

Args:
  dss_group : a DSS group
  unix_user : a UNIX user
  hadoop_user : a Hadoop user (optional, defaults to unix_user)
group_regexp(regexp, unix_user, hadoop_user=None)

Make the rule map DSS groups matching a given regular expression to a given UNIX user

Args:
  regexp : a regular expression to match DSS groups
  unix_user : a UNIX user
  hadoop_user : a Hadoop user (optional, defaults to unix_user)
class dataikuapi.dss.admin.DSSCodeEnv(client, env_lang, env_name)

A code env on the DSS instance. Do not create this directly, use dataikuapi.DSSClient.get_code_env()

delete()

Delete the code env.

Note: this call requires an API key with admin rights

get_definition()

Get the code env’s definition

Note: this call requires an API key with admin rights

Returns: the code env definition, as a dict
set_definition(env)

Set the code env’s definition. The definition should come from a call to get_definition()

Fields that can be updated in design node:

  • env.permissions, env.usableByAll, env.desc.owner
  • env.specCondaEnvironment, env.specPackageList, env.externalCondaEnvName, env.desc.installCorePackages, env.desc.installJupyterSupport, env.desc.yarnPythonBin

Fields that can be updated in automation node (where {version} is the updated version):

  • env.permissions, env.usableByAll, env.owner
  • env.{version}.specCondaEnvironment, env.{version}.specPackageList, env.{version}.externalCondaEnvName, env.{version}.desc.installCorePackages, env.{version}.desc.installJupyterSupport, env.{version}.desc.yarnPythonBin

Note: this call requires an API key with admin rights

Parameters: env (dict) – a code env definition
Returns: the updated code env definition, as a dict
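set_definition() follows the same read-modify-write pattern as the general settings. A sketch that opens a code env to all users, using usableByAll, one of the updatable fields listed above:

```python
def open_env_to_all(env):
    """Make a code env usable by everyone (requires an admin API key)."""
    definition = env.get_definition()   # fetch the current definition dict
    definition["usableByAll"] = True    # one of the fields that can be updated
    return env.set_definition(definition)
```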
set_jupyter_support(active)

Update the code env’s Jupyter support

Note: this call requires an API key with admin rights

Parameters: active – True to activate Jupyter support, False to deactivate
update_packages()

Update the code env packages so that it matches its spec

Note: this call requires an API key with admin rights
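A sketch of a typical maintenance pass that toggles Jupyter support and then reinstalls packages to match the spec (both calls need an admin API key):

```python
def refresh_code_env(env, jupyter=True):
    """Toggle Jupyter support, then bring installed packages in line with the spec."""
    env.set_jupyter_support(jupyter)  # True activates Jupyter integration
    env.update_packages()             # reinstall packages to match the env spec
```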

class dataikuapi.dss.future.DSSFuture(client, job_id, state=None)

A future on the DSS instance

abort()

Abort the future

get_state()

Get the status of the future, and its result if it’s ready

peek_state()

Get the status of the future, and its result if it’s ready

get_result()

Get the future result if it’s ready, raises an Exception otherwise

has_result()

Checks whether the future has a result ready

wait_for_result()

Wait and get the future result
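These methods support a simple poll-with-timeout loop: has_result() avoids blocking indefinitely in wait_for_result() when the task may hang. A sketch (the timeout policy is illustrative, not part of the API):

```python
import time

def result_or_abort(future, timeout_s=60.0, poll_s=1.0):
    """Poll a future; return its result, or abort it once timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if future.has_result():
            return future.get_result()
        time.sleep(poll_s)
    future.abort()   # give up and cancel the long-running task
    return None
```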

class dataikuapi.dss.notebook.DSSNotebook(client, project_key, notebook_name, state=None)

A Python/R/Scala notebook on the DSS instance

unload(session_id=None)

Stop the notebook and release its resources

get_state()

Get the status of the notebook

get_sessions()

Get the list of the running sessions of this notebook
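Combined with a client-level listing call (list_running_notebooks(), assumed here), running notebooks can be swept and unloaded in one pass. A sketch:

```python
def unload_all_notebooks(client):
    """Unload every running notebook on the instance, freeing its resources."""
    unloaded = []
    for notebook in client.list_running_notebooks():
        notebook.unload()            # stop the notebook and release its resources
        unloaded.append(notebook)
    return unloaded
```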