Shortcuts

ding.utils

autolog

Please refer to ding/utils/autolog for more details.

TimeMode

RangedData

TimeRangedData

LoggedModel

BaseTime

NaturalTime

TickTime

TimeProxy

LoggedValue

data.structure

Please refer to ding/utils/data/structure for more details.

Cache

LifoDeque

data.base_dataloader

Please refer to ding/utils/data/base_dataloader for more details.

IDataLoader

data.collate_fn

Please refer to ding/utils/data/collate_fn for more details.

ttorch_collate

default_collate

timestep_collate

diff_shape_collate

default_decollate

data.dataloader

Please refer to ding/utils/data/dataloader for more details.

AsyncDataLoader

data.dataset

Please refer to ding/utils/data/dataset for more details.

DatasetStatistics

NaiveRLDataset

D4RLDataset

HDF5Dataset

D4RLTrajectoryDataset

D4RLDiffuserDataset

FixedReplayBuffer

PCDataset

load_bfs_datasets

BCODataset

SequenceDataset

hdf5_save

naive_save

offline_data_save_type

create_dataset

bfs_helper

Please refer to ding/utils/bfs_helper for more details.

get_vi_sequence

collection_helper

Please refer to ding/utils/collection_helper for more details.

iter_mapping

ding.utils.collection_helper.iter_mapping(iter_: Iterable[_IterType], mapping: Callable[[_IterType], _IterTargetType])[source]
Overview:

Map each element of an iterable with the given callable.

Arguments:
  • iter_ (_IterType list): The iterable to map over.

  • mapping (Callable[[_IterType], _IterTargetType]): A callable that maps each element to the target type.

Return:
  • (iter_mapping object): Iteration results

Example:
>>> iterable_list = [1, 2, 3, 4, 5]
>>> _iter = iter_mapping(iterable_list, lambda x: x ** 2)
>>> print(list(_iter))
[1, 4, 9, 16, 25]

compression_helper

Please refer to ding/utils/compression_helper for more details.

CloudPickleWrapper

class ding.utils.compression_helper.CloudPickleWrapper(data: Any)[source]
Overview:

CloudPickleWrapper makes it possible to pickle a wider range of Python objects (e.g., objects containing lambda expressions).

Interfaces:

__init__, __getstate__, __setstate__.

__init__(data: Any) None[source]
Overview:

Initialize the CloudPickleWrapper using the given arguments.

Arguments:
  • data (Any): The object to be dumped.
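
Example (a minimal usage sketch, not from the source; it assumes the wrapped object is exposed through the data attribute after unpickling):
>>> import pickle
>>> wrapped = CloudPickleWrapper(lambda x: x ** 2)   # a plain lambda cannot be pickled directly
>>> restored = pickle.loads(pickle.dumps(wrapped)).data
>>> restored(3)
9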

dummy_compressor

ding.utils.compression_helper.dummy_compressor(data: Any) Any[source]
Overview:

Return the raw input data.

Arguments:
  • data (Any): The input data of the compressor.

Returns:
  • output (Any): This compressor will exactly return the input data.

zlib_data_compressor

ding.utils.compression_helper.zlib_data_compressor(data: Any) bytes[source]
Overview:

Compress the input data with zlib and return the compressed result in binary format.

Arguments:
  • data (Any): The input data of the compressor.

Returns:
  • output (bytes): The compressed byte-like result.

Examples:
>>> zlib_data_compressor("Hello")

lz4_data_compressor

ding.utils.compression_helper.lz4_data_compressor(data: Any) bytes[source]
Overview:

Compress the input data with lz4 and return the compressed result in binary format.

Arguments:
  • data (Any): The input data of the compressor.

Returns:
  • output (bytes): The compressed byte-like result.

Examples:
>>> lz4.block.compress(pickle.dumps("Hello"))
b'...'  # an lz4-compressed byte string

jpeg_data_compressor

ding.utils.compression_helper.jpeg_data_compressor(data: ndarray) bytes[source]
Overview:

To reduce memory usage, we can store the JPEG string of an image instead of the numpy array in the buffer. This function encodes the observation numpy array into a JPEG string.

Arguments:
  • data (np.array): The observation numpy array.

Returns:
  • img_str (bytes): The compressed byte-like result.

get_data_compressor

ding.utils.compression_helper.get_data_compressor(name: str)[source]
Overview:

Get the data compressor according to the input name.

Arguments:
  • name (str): Name of the compressor; supports ['lz4', 'zlib', 'jpeg', 'none'].

Return:
  • compressor (Callable): Corresponding data compressor, which takes input data and returns compressed data.

Example:
>>> compress_fn = get_data_compressor('lz4')
>>> compressed_data = compress_fn(input_data)

dummy_decompressor

ding.utils.compression_helper.dummy_decompressor(data: Any) Any[source]
Overview:

Return the input data unchanged.

Arguments:
  • data (Any): The input data of the decompressor.

Returns:
  • output (Any): The decompressed result, which is exactly the input.

lz4_data_decompressor

ding.utils.compression_helper.lz4_data_decompressor(compressed_data: bytes) Any[source]
Overview:

Decompress lz4-compressed data and return the original object.

Arguments:
  • compressed_data (bytes): The input data of the decompressor.

Returns:
  • output (Any): The decompressed object.

zlib_data_decompressor

ding.utils.compression_helper.zlib_data_decompressor(compressed_data: bytes) Any[source]
Overview:

Decompress zlib-compressed data and return the original object.

Arguments:
  • compressed_data (bytes): The input data of the decompressor.

Returns:
  • output (Any): The decompressed object.

jpeg_data_decompressor

ding.utils.compression_helper.jpeg_data_decompressor(compressed_data: bytes, gray_scale=False) ndarray[source]
Overview:

To reduce memory usage, we can store the JPEG string of an image instead of the numpy array in the buffer. This function decodes the observation numpy array from the JPEG string.

Arguments:
  • compressed_data (bytes): The JPEG string.

  • gray_scale (bool): Set to True if the observation is grayscale, or False if it is RGB.

Returns:
  • arr (np.ndarray): The decompressed numpy array.

get_data_decompressor

ding.utils.compression_helper.get_data_decompressor(name: str) Callable[source]
Overview:

Get the data decompressor according to the input name.

Arguments:
  • name (str): Name of the decompressor; supports ['lz4', 'zlib', 'none'].

Note

For all the decompressors, a bytes-like object is required as input.

Returns:
  • decompressor (Callable): Corresponding data decompressor.

Examples:
>>> decompress_fn = get_data_decompressor('lz4')
>>> origin_data = decompress_fn(compressed_data)
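
A round-trip sketch (not from the source) pairing get_data_compressor with get_data_decompressor; it assumes the 'zlib' pair serializes and restores arbitrary picklable Python objects:
>>> compress_fn = get_data_compressor('zlib')
>>> decompress_fn = get_data_decompressor('zlib')
>>> data = {'obs': [1, 2, 3], 'reward': 1.0}
>>> decompress_fn(compress_fn(data)) == data
True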

default_helper

Please refer to ding/utils/default_helper for more details.

get_shape0

ding.utils.default_helper.get_shape0(data: List | Dict | torch.Tensor | ttorch.Tensor) int[source]
Overview:

Get shape[0] of the data's torch tensor or treetensor.

Arguments:
  • data (Union[List, Dict, torch.Tensor, ttorch.Tensor]): Data to be analysed.

Returns:
  • shape[0] (int): First dimension length of the data, usually the batch size.
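
An illustrative example (not from the source), assuming dict inputs are supported as the signature suggests and nested containers share the same first dimension:
>>> import torch
>>> get_shape0(torch.zeros(4, 3))
4
>>> get_shape0({'obs': torch.zeros(4, 3), 'action': torch.zeros(4, 1)})
4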

lists_to_dicts

ding.utils.default_helper.lists_to_dicts(data: List[dict | NamedTuple] | Tuple[dict | NamedTuple], recursive: bool = False) Mapping[object, object] | NamedTuple[source]
Overview:

Transform a list of dicts to a dict of lists.

Arguments:
  • data (Union[List[Union[dict, NamedTuple]], Tuple[Union[dict, NamedTuple]]]):

    The list of dicts to be transformed.

  • recursive (bool): Whether to recursively transform dict elements.

Returns:
  • newdata (Union[Mapping[object, object], NamedTuple]): The resulting dict of lists.

Example:
>>> from ding.utils import *
>>> lists_to_dicts([{1: 1, 10: 3}, {1: 2, 10: 4}])
{1: [1, 2], 10: [3, 4]}

dicts_to_lists

ding.utils.default_helper.dicts_to_lists(data: Mapping[object, List[object]]) List[Mapping[object, object]][source]
Overview:

Transform a dict of lists to a list of dicts.

Arguments:
  • data (Mapping[object, list]): The dict of lists to be transformed.

Returns:
  • newdata (List[Mapping[object, object]]): The resulting list of dicts.

Example:
>>> from ding.utils import *
>>> dicts_to_lists({1: [1, 2], 10: [3, 4]})
[{1: 1, 10: 3}, {1: 2, 10: 4}]

override

ding.utils.default_helper.override(cls: type) Callable[[Callable], Callable][source]
Overview:

Annotation for documenting method overrides.

Arguments:
  • cls (type): The superclass that provides the overridden method. If this

    cls does not actually have the method, an error is raised.

squeeze

ding.utils.default_helper.squeeze(data: object) object[source]
Overview:

Squeeze data from a tuple, list, or dict into a single object.

Arguments:
  • data (object): Data to be squeezed.

Example:
>>> a = (4, )
>>> a = squeeze(a)
>>> print(a)
4

default_get

ding.utils.default_helper.default_get(data: dict, name: str, default_value: Any | None = None, default_fn: Callable | None = None, judge_fn: Callable | None = None) Any[source]
Overview:

Get a value generically from the input dict. If name exists in data, return the value stored at name; otherwise, add name to the default-get set and return a default value generated by default_fn (or given directly as default_value), which judge_fn checks to be legal.

Arguments:
  • data (dict): Data input dictionary.

  • name (str): Key name.

  • default_value (Optional[Any]): The default value to return when name is missing.

  • default_fn (Optional[Callable]): A callable that generates the default value; defaults to returning default_value.

  • judge_fn (Optional[Callable]): A callable used to check that the default value is legal.

Returns:
  • ret (Any): The value stored at name, or the (checked) default value when name is missing.
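
An illustrative example of the intended behavior (not from the source); the config keys are hypothetical:
>>> cfg = {'batch_size': 64}
>>> default_get(cfg, 'batch_size', default_value=32)
64
>>> default_get(cfg, 'learning_rate', default_value=0.001)
0.001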

list_split

ding.utils.default_helper.list_split(data: list, step: int) List[list][source]
Overview:

Split a list of data into chunks of the given step size.

Arguments:
  • data (list): The list of data to split.

  • step (int): The chunk size used for splitting.

Returns:
  • ret (list): The list of split chunks.

  • residual (list): The residual list; this value is None when step divides the length of data evenly.

Example:
>>> list_split([1,2,3,4],2)
([[1, 2], [3, 4]], None)
>>> list_split([1,2,3,4],3)
([[1, 2, 3]], [4])

error_wrapper

ding.utils.default_helper.error_wrapper(fn, default_ret, warning_msg='')[source]
Overview:

Wrap the function so that any Exception raised inside it is caught and default_ret is returned instead.

Arguments:
  • fn (Callable): The function to be wrapped.

  • default_ret (obj): The default return value when an Exception occurs in the function.

Returns:
  • wrapper (Callable): the wrapped function

Examples:
>>> # Used to check for FakeLink (refer to utils.linklink_dist_helper.py)
>>> def get_rank():  # Get the rank of linklink model, return 0 if use FakeLink.
>>>    if is_fake_link:
>>>        return 0
>>>    return error_wrapper(link.get_rank, 0)()

LimitedSpaceContainer

class ding.utils.default_helper.LimitedSpaceContainer(min_val: int, max_val: int)[source]
Overview:

A container that simulates a limited amount of space.

Interfaces:

__init__, get_residual_space, release_space

__init__(min_val: int, max_val: int) None[source]
Overview:

Set min_val and max_val of the container, also set cur to min_val for initialization.

Arguments:
  • min_val (int): Min volume of the container, usually 0.

  • max_val (int): Max volume of the container.

acquire_space() bool[source]
Overview:

Try to acquire one piece of space. If one is available, return True; otherwise return False.

Returns:
  • flag (bool): Whether there is any piece of residual space.

decrease_space() None[source]
Overview:

Decrease one piece in space. Decrement max_val.

get_residual_space() int[source]
Overview:

Get all residual pieces of space and set cur to max_val.

Returns:
  • ret (int): The residual space, calculated as max_val - cur.

increase_space() None[source]
Overview:

Increase one piece in space. Increment max_val.

release_space() None[source]
Overview:

Release only one piece of space. Decrement cur, but ensure it won’t be negative.
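
An illustrative acquire/release cycle (not from the source), assuming cur starts at min_val:
>>> container = LimitedSpaceContainer(0, 2)
>>> container.acquire_space()
True
>>> container.acquire_space()
True
>>> container.acquire_space()  # no residual space left
False
>>> container.release_space()
>>> container.acquire_space()
True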

deep_merge_dicts

ding.utils.default_helper.deep_merge_dicts(original: dict, new_dict: dict) dict[source]
Overview:

Merge two dicts by calling deep_update

Arguments:
  • original (dict): The original dict to be merged into.

  • new_dict (dict): The dict whose values are merged in.

Returns:
  • merged_dict (dict): A new dict that is the deep merge of original and new_dict.
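
An illustrative example (not from the source), assuming values from new_dict override those in original while untouched nested keys are kept:
>>> d1 = {'model': {'lr': 0.001, 'hidden_size': 64}}
>>> d2 = {'model': {'lr': 0.01}}
>>> deep_merge_dicts(d1, d2)
{'model': {'lr': 0.01, 'hidden_size': 64}}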

deep_update

ding.utils.default_helper.deep_update(original: dict, new_dict: dict, new_keys_allowed: bool = False, whitelist: List[str] | None = None, override_all_if_type_changes: List[str] | None = None)[source]
Overview:

Update original dict with values from new_dict recursively.

Arguments:
  • original (dict): Dictionary with default values.

  • new_dict (dict): Dictionary with values to be updated

  • new_keys_allowed (bool): Whether new keys are allowed.

  • whitelist (Optional[List[str]]):

    List of keys that correspond to dict values where new subkeys can be introduced. This is only at the top level.

  • override_all_if_type_changes(Optional[List[str]]):

    List of top level keys with value=dict, for which we always simply override the entire value (dict), if the “type” key in that value dict changes.

Note

If a new key is introduced in new_dict and new_keys_allowed is not True, an error will be thrown. Further, for sub-dicts, if the key is in the whitelist, then new subkeys can be introduced.

flatten_dict

ding.utils.default_helper.flatten_dict(data: dict, delimiter: str = '/') dict[source]
Overview:

Flatten the dict, see example

Arguments:
  • data (dict): Original nested dict

  • delimiter (str): Delimiter of the keys of the new dict

Returns:
  • data (dict): Flattened nested dict

Example:
>>> a
{'a': {'b': 100}}
>>> flatten_dict(a)
{'a/b': 100}

set_pkg_seed

ding.utils.default_helper.set_pkg_seed(seed: int, use_cuda: bool = True) None[source]
Overview:

Side-effect function that sets the seed for random, numpy random, and torch's manual seed. It is usually called in the entry script, in the section that sets the random seed for all packages and instances.

Arguments:
  • seed (int): The seed to set.

  • use_cuda (bool): Whether CUDA is used.

Examples:
>>> # ../entry/xxxenv_xxxpolicy_main.py
>>> ...
# Set random seed for all package and instance
>>> collector_env.seed(seed)
>>> evaluator_env.seed(seed, dynamic_seed=False)
>>> set_pkg_seed(seed, use_cuda=cfg.policy.cuda)
>>> ...
# Set up RL Policy, etc.
>>> ...

one_time_warning

ding.utils.default_helper.one_time_warning(warning_msg: str) None[source]
Overview:

Print warning message only once.

Arguments:
  • warning_msg (str): Warning message.

split_fn

ding.utils.default_helper.split_fn(data, indices, start, end)[source]
Overview:

Split data by indices

Arguments:
  • data (Union[List, Dict, torch.Tensor, ttorch.Tensor]): data to be analysed

  • indices (np.ndarray): indices to split

  • start (int): start index

  • end (int): end index

split_data_generator

ding.utils.default_helper.split_data_generator(data: dict, split_size: int, shuffle: bool = True) dict[source]
Overview:

Split data into batches.

Arguments:
  • data (dict): The data to be split.

  • split_size (int): The size of each split batch.

  • shuffle (bool): Whether to shuffle the data before splitting.
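
An illustrative sketch (not from the source), assuming the function yields dicts of mini-batches that share the first (batch) dimension:
>>> import torch
>>> data = {'obs': torch.randn(8, 4), 'action': torch.randn(8, 2)}
>>> for batch in split_data_generator(data, split_size=4, shuffle=False):
...     print(batch['obs'].shape)
torch.Size([4, 4])
torch.Size([4, 4])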

RunningMeanStd

class ding.utils.default_helper.RunningMeanStd(epsilon=0.0001, shape=(), device=device(type='cpu'))[source]
Overview:

Maintains a running mean, variance, and count that are updated incrementally from batches of data.

Interfaces:

__init__, update, reset, new_shape

Properties:
  • mean, std, _epsilon, _shape, _mean, _var, _count

__init__(epsilon=0.0001, shape=(), device=device(type='cpu'))[source]
Overview:

Initialize self. See help(type(self)) for accurate signature; set up the properties.

Arguments:
  • epsilon (Float): The epsilon used when computing the std output.

  • shape (:obj: np.array): The shape used for the mean and variance attributes.

  • device (torch.device): The device used for computation; defaults to CPU.

property mean: ndarray
Overview:

The running mean, read from self._mean.

static new_shape(obs_shape, act_shape, rew_shape)[source]
Overview:

Get the new shape of observation, action, and reward; in this case unchanged.

Arguments:

obs_shape (Any), act_shape (Any), rew_shape (Any)

Returns:

obs_shape (Any), act_shape (Any), rew_shape (Any)

reset()[source]
Overview:

Reset the properties _mean, _var, and _count to their initial values.

property std: ndarray
Overview:

The running standard deviation, calculated from self._var and self._epsilon.

update(x)[source]
Overview:

Update the running mean, variance, and count with a new batch.

Arguments:
  • x: the batch
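
An illustrative sketch (not from the source) of tracking reward statistics and normalizing with them; it assumes update accepts a numpy batch along the first dimension:
>>> import numpy as np
>>> rms = RunningMeanStd(shape=(1, ))
>>> rms.update(np.random.randn(100, 1))   # feed a batch of samples
>>> normalized = (np.random.randn(10, 1) - rms.mean) / rms.std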

make_key_as_identifier

ding.utils.default_helper.make_key_as_identifier(data: Dict[str, Any]) Dict[str, Any][source]
Overview:

Make the keys of a dict into legal Python identifier strings so that they are compatible with Python magic methods such as __getattr__.

Arguments:
  • data (Dict[str, Any]): The original dict data.

Return:
  • new_data (Dict[str, Any]): The new dict data with legal identifier keys.

remove_illegal_item

ding.utils.default_helper.remove_illegal_item(data: Dict[str, Any]) Dict[str, Any][source]
Overview:

Remove illegal items, such as str values, which are not compatible with Tensor, from the dict.

Arguments:
  • data (Dict[str, Any]): The original dict data.

Return:
  • new_data (Dict[str, Any]): The new dict data without illegal items.

design_helper

Please refer to ding/utils/design_helper for more details.

SingletonMetaclass

class ding.utils.design_helper.SingletonMetaclass(name, bases, namespace, **kwargs)[source]
Overview:

Metaclass that ensures each class using it has at most one instance, returning the existing instance on subsequent calls.

Interfaces:

__call__

instances = {}

fast_copy

Please refer to ding/utils/fast_copy for more details.

_FastCopy

file_helper

Please refer to ding/utils/file_helper for more details.

read_from_ceph

ding.utils.file_helper.read_from_ceph(path: str) object[source]
Overview:

Read file from ceph

Arguments:
  • path (str): File path in ceph, start with "s3://"

Returns:
  • (data): Deserialized data

_get_redis

ding.utils.file_helper._get_redis(host='localhost', port=6379)[source]
Overview:

Get a Redis client with the given host and port.

Arguments:
  • host (str): Host string

  • port (int): Port number

Returns:
  • (Redis(object)): Redis object with given host, port, and db=0

read_from_redis

ding.utils.file_helper.read_from_redis(path: str) object[source]
Overview:

Read file from redis

Arguments:
  • path (str): File path in redis; could be a string key

Returns:
  • (data): Deserialized data

_ensure_rediscluster

ding.utils.file_helper._ensure_rediscluster(startup_nodes=[{'host': '127.0.0.1', 'port': '7000'}])[source]
Overview:

Ensure a RedisCluster client is available, built from the given startup nodes.

Arguments:
  • List of startup nodes (dict) of
    • host (str): Host string

    • port (int): Port number

Returns:
  • (RedisCluster(object)): RedisCluster object with given host, port, and False for decode_responses in default.

read_from_rediscluster

ding.utils.file_helper.read_from_rediscluster(path: str) object[source]
Overview:

Read file from rediscluster

Arguments:
  • path (str): File path in rediscluster; could be a string key

Returns:
  • (data): Deserialized data

read_from_file

ding.utils.file_helper.read_from_file(path: str) object[source]
Overview:

Read file from local file system

Arguments:
  • path (str): File path in local file system

Returns:
  • (data): Deserialized data

_ensure_memcached

ding.utils.file_helper._ensure_memcached()[source]
Overview:

Ensures memcache usage

Returns:
  • (MemcachedClient instance): MemcachedClient’s class instance built with current memcached_client’s server_list.conf and client.conf files

read_from_mc

ding.utils.file_helper.read_from_mc(path: str, flush=False) object[source]
Overview:

Read file from memcache, file must be saved by torch.save()

Arguments:
  • path (str): File path in local system

Returns:
  • (data): Deserialized data

read_from_path

ding.utils.file_helper.read_from_path(path: str)[source]
Overview:

Read file from ceph or the local file system, depending on the path.

Arguments:
  • path (str): File path in ceph, start with "s3://", or use local file system

Returns:
  • (data): Deserialized data

save_file_ceph

ding.utils.file_helper.save_file_ceph(path, data)[source]
Overview:

Save pickle-dumped data to ceph.

Arguments:
  • path (str): File path in ceph, starting with "s3://"; the local file system is used otherwise.

  • data (Any): Could be dict, list or tensor etc.

save_file_redis

ding.utils.file_helper.save_file_redis(path, data)[source]
Overview:

Save pickle dumped data file to redis

Arguments:
  • path (str): File path (could be a string key) in redis

  • data (Any): Could be dict, list or tensor etc.

save_file_rediscluster

ding.utils.file_helper.save_file_rediscluster(path, data)[source]
Overview:

Save pickle dumped data file to rediscluster

Arguments:
  • path (str): File path (could be a string key) in redis

  • data (Any): Could be dict, list or tensor etc.

read_file

ding.utils.file_helper.read_file(path: str, fs_type: None | str = None, use_lock: bool = False) object[source]
Overview:

Read file from path

Arguments:
  • path (str): The path of file to read

  • fs_type (str or None): The file system type, support {'normal', 'ceph'}

  • use_lock (bool): Whether to use a file lock when reading from the local file system

save_file

ding.utils.file_helper.save_file(path: str, data: object, fs_type: None | str = None, use_lock: bool = False) None[source]
Overview:

Save data to the file at path

Arguments:
  • path (str): The path of file to save to

  • data (object): The data to save

  • fs_type (str or None): The file system type, support {'normal', 'ceph'}

  • use_lock (bool): Whether to use a file lock when saving to the local file system
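
An illustrative round trip on the local file system (not from the source); the path is hypothetical:
>>> save_file('./demo_ckpt.pth.tar', {'step': 100}, fs_type='normal')
>>> read_file('./demo_ckpt.pth.tar', fs_type='normal')['step']
100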

remove_file

ding.utils.file_helper.remove_file(path: str, fs_type: None | str = None) None[source]
Overview:

Remove file

Arguments:
  • path (str): The path of file you want to remove

  • fs_type (str or None): The file system type, support {'normal', 'ceph'}

import_helper

Please refer to ding/utils/import_helper for more details.

try_import_ceph

ding.utils.import_helper.try_import_ceph()[source]
Overview:

Try to import the ceph module; if it fails, return None.

Returns:
  • (Module): Imported module, or None when ceph not found

try_import_mc

ding.utils.import_helper.try_import_mc()[source]
Overview:

Try to import the mc module; if it fails, return None.

Returns:
  • (Module): Imported module, or None when mc not found

try_import_redis

ding.utils.import_helper.try_import_redis()[source]
Overview:

Try to import the redis module; if it fails, return None.

Returns:
  • (Module): Imported module, or None when redis not found

try_import_rediscluster

ding.utils.import_helper.try_import_rediscluster()[source]
Overview:

Try to import the rediscluster module; if it fails, return None.

Returns:
  • (Module): Imported module, or None when rediscluster not found

import_module

ding.utils.import_helper.import_module(modules: List[str]) None[source]
Overview:

Import several modules given as a list of names.

Arguments:
  • modules (List[str]): List of module names
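
An illustrative call (not from the source); the module names are placeholders:
>>> import_module(['ding.policy', 'ding.model'])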

k8s_helper

Please refer to ding/utils/k8s_helper for more details.

get_operator_server_kwargs

exist_operator_server

pod_exec_command

K8sType

K8sLauncher

lock_helper

Please refer to ding/utils/lock_helper for more details.

LockContextType

class ding.utils.lock_helper.LockContextType(value)[source]
Overview:

Enum to express the type of the lock.

PROCESS_LOCK = 2
THREAD_LOCK = 1

LockContext

class ding.utils.lock_helper.LockContext(lock_type: LockContextType = LockContextType.THREAD_LOCK)[source]
Overview:

Generate a LockContext to ensure thread or process safety.

Interfaces:

__init__, __enter__, __exit__.

Example:
>>> with LockContext() as lock:
>>>     print("Do something here.")
__init__(lock_type: LockContextType = LockContextType.THREAD_LOCK)[source]
Overview:

Init the lock according to the given type.

Arguments:
  • lock_type (LockContextType): The type of lock to be used. Defaults to LockContextType.THREAD_LOCK.

acquire()[source]
Overview:

Acquires the lock.

release()[source]
Overview:

Releases the lock.

get_rw_file_lock

ding.utils.lock_helper.get_rw_file_lock(name: str, op: str)[source]
Overview:

Get a read-write file lock generated from the given name and operation.

Arguments:
  • name (str): Lock’s name.

  • op (str): Assigned operator, i.e. read or write.

Returns:
  • (RWLockFairD): Generated rwlock

FcntlContext

class ding.utils.lock_helper.FcntlContext(lock_path: str)[source]
Overview:

A context manager that acquires an exclusive lock on a file using fcntl. This is useful for preventing multiple processes from running the same code.

Interfaces:

__init__, __enter__, __exit__.

Example:
>>> lock_path = "/path/to/lock/file"
>>> with FcntlContext(lock_path) as lock:
>>>    # Perform operations while the lock is held
__init__(lock_path: str) None[source]
Overview:

Initialize the FcntlContext object.

Arguments:
  • lock_path (str): The path to the lock file.

get_file_lock

ding.utils.lock_helper.get_file_lock(name: str, op: str) FcntlContext[source]
Overview:

Acquires a file lock for the specified file.

Arguments:
  • name (str): The name of the file.

  • op (str): The operation to perform on the file lock.
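
An illustrative usage sketch (not from the source), assuming the returned FcntlContext is used as a context manager and op follows the read/write convention of get_rw_file_lock; the file name is hypothetical:
>>> with get_file_lock('dataset.bin', 'read'):
...     pass  # read the shared file while the lock is held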

log_helper

Please refer to ding/utils/log_helper for more details.

build_logger

TBLoggerFactory

LoggerFactory

pretty_print

log_writer_helper

Please refer to ding/utils/log_writer_helper for more details.

DistributedWriter

enable_parallel

normalizer_helper

Please refer to ding/utils/normalizer_helper for more details.

DatasetNormalizer

flatten

Normalizer

GaussianNormalizer

CDFNormalizer

CDFNormalizer1d

empirical_cdf

atleast_2d

LimitsNormalizer

orchestrator_launcher

Please refer to ding/utils/orchestrator_launcher for more details.

OrchestratorLauncher

create_components_from_config

wait_to_be_ready

profiler_helper

Please refer to ding/utils/profiler_helper for more details.

Profiler

pytorch_ddp_dist_helper

Please refer to ding/utils/pytorch_ddp_dist_helper for more details.

get_rank

get_world_size

allreduce

allreduce_async

reduce_data

allreduce_data

get_group

dist_mode

dist_init

dist_finalize

DDPContext

simple_group_split

to_ddp_config

registry

Please refer to ding/utils/registry for more details.

Registry

render_helper

Please refer to ding/utils/render_helper for more details.

render_env

render

get_env_fps

fps

scheduler_helper

Please refer to ding/utils/scheduler_helper for more details.

Scheduler

segment_tree

Please refer to ding/utils/segment_tree for more details.

njit

SegmentTree

SumSegmentTree

MinSegmentTree

_setitem

_reduce

_find_prefixsum_idx

slurm_helper

Please refer to ding/utils/slurm_helper for more details.

get_ip

get_manager_node_ip

get_cls_info

node_to_partition

node_to_host

find_free_port_slurm

system_helper

Please refer to ding/utils/system_helper for more details.

get_ip

get_pid

get_task_uid

PropagatingThread

find_free_port

time_helper_base

Please refer to ding/utils/time_helper_base for more details.

TimeWrapper

time_helper_cuda

Please refer to ding/utils/time_helper_cuda for more details.

get_cuda_time_wrapper

time_helper

Please refer to ding/utils/time_helper for more details.

build_time_helper

EasyTimer

TimeWrapperTime

WatchDog

loader.base

Please refer to ding/utils/loader/base for more details.

ILoaderClass

loader.collection

Please refer to ding/utils/loader/collection for more details.

CollectionError

collection

tuple

length

length_is

contains

cofilter

tpselector

loader.dict

Please refer to ding/utils/loader/dict for more details.

DictError

dict

loader.exception

Please refer to ding/utils/loader/exception for more details.

CompositeStructureError

loader.mapping

Please refer to ding/utils/loader/mapping for more details.

MappingError

mapping

mpfilter

mpkeys

mpvalues

mpitems

item

item_or

loader.norm

Please refer to ding/utils/loader/norm for more details.

_callable_to_norm

norm

normfunc

_unary

_binary

_binary_reducing

INormClass

lcmp

loader.number

Please refer to ding/utils/loader/number for more details.

numeric

interval

is_negative

is_positive

non_negative

non_positive

negative

positive

_math_binary

plus

minus

minus_with

multi

divide

divide_with

power

power_with

msum

mmulti

_msinglecmp

mcmp

loader.string

Please refer to ding/utils/loader/string for more details.

enum

_to_regexp

rematch

regrep

loader.types

Please refer to ding/utils/loader/types for more details.

is_type

to_type

is_callable

prop

method

fcall

fpartial

loader.utils

Please refer to ding/utils/loader/utils for more details.

keep

raw

optional

check_only

check
