# llm.config

## HParamT (module-attribute)

Supported hyperparameter types (i.e., JSON types).
## Config

`Config(mapping: Mapping[str, Any] | Iterable[tuple[str, Any]] | None = None, /, **kwargs: Any)`

Dict-like configuration class with attribute access.
Parameters:

- `mapping` (`Mapping[str, Any] | Iterable[tuple[str, Any]] | None`, default: `None`) – Initial mapping or iterable of key-value tuples.
- `kwargs` (`Any`) – Keyword arguments to add as key-value pairs to the config.
Source code in llm/config.py
## flattened_config

Convert a config to a flat JSONable dictionary.
Note: If `torch.distributed.is_initialized()`, the `world_size` will be added to the config.
Parameters:

Returns:

- `dict[str, HParamT]` – Flat dictionary containing only `bool`, `float`, `int`, `str`, or `None` values.
## flatten_mapping

`flatten_mapping(d: Mapping[str, Any], parent: str | None = None, sep: str = "_") -> dict[str, Any]`

Flatten a mapping into a dict by joining nested keys with a separator.
Warning: This function does not check for key collisions; distinct nested paths may flatten to the same key.
Parameters:

- `d` (`Mapping[str, Any]`) – Input mapping. All keys and nested keys must be strings.
- `parent` (`str | None`, default: `None`) – Parent key to prepend to top-level keys in `d`.
- `sep` (`str`, default: `'_'`) – Separator between keys.
Returns:

- `dict[str, Any]` – Flattened dictionary.
## load_config

Load a Python file as a Config.
Note: Attributes starting with `_`, modules, classes, functions, and builtins will not be loaded from the Python file.
Parameters:

Returns:

- `Config` – Configuration attributes loaded from the Python file.
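The parameters were elided on this page, so the sketch below only illustrates the load-and-filter behavior the note describes; the function name, the path argument, and the use of a plain dict in place of `Config` are all assumptions:

```python
import pathlib
import tempfile
import types
from typing import Any

# Attribute kinds the note says are excluded: modules, classes, functions, builtins.
_EXCLUDED_TYPES = (
    types.ModuleType,
    type,
    types.FunctionType,
    types.BuiltinFunctionType,
)


def load_config_sketch(path: str) -> dict[str, Any]:
    """Execute a Python file and keep only its plain, non-underscore attributes."""
    namespace: dict[str, Any] = {}
    exec(pathlib.Path(path).read_text(), namespace)
    return {
        name: value
        for name, value in namespace.items()
        if not name.startswith("_") and not isinstance(value, _EXCLUDED_TYPES)
    }


with tempfile.TemporaryDirectory() as tmp:
    cfg_file = pathlib.Path(tmp) / "config.py"
    cfg_file.write_text("import math\n_private = 0\nlr = 3e-4\nepochs = 10\n")
    config = load_config_sketch(str(cfg_file))

# config == {"lr": 0.0003, "epochs": 10}; `math` and `_private` are filtered out.
```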