multigrid.rllib package#

This package provides tools for using MultiGrid environments with the RLlib MultiAgentEnv API.

Usage#

Use a specific environment configuration from multigrid.envs by name:

>>> import multigrid.rllib # registers environment configurations with RLlib
>>> from ray.rllib.algorithms.ppo import PPOConfig
>>> algorithm_config = PPOConfig().environment(env='MultiGrid-Empty-8x8-v0')

Wrap an environment instance with RLlibWrapper:

>>> import gymnasium as gym
>>> import multigrid.envs
>>> env = gym.make('MultiGrid-Empty-8x8-v0', agents=2, render_mode='human')
>>> from multigrid.rllib import RLlibWrapper
>>> env = RLlibWrapper(env)

Wrap an environment class with to_rllib_env():

>>> from multigrid.envs import EmptyEnv
>>> from multigrid.rllib import to_rllib_env
>>> MyEnv = to_rllib_env(EmptyEnv, default_config={'size': 8})
>>> config = {'agents': 2, 'render_mode': 'human'}
>>> env = MyEnv(config)
class multigrid.rllib.RLlibWrapper[source]#

Bases: Wrapper, MultiAgentEnv

Wrapper for a MultiGridEnv environment that implements the RLlib MultiAgentEnv interface.

__init__(env: MultiGridEnv)[source]#

Wraps an environment to allow a modular transformation of the step() and reset() methods.

Args:

env: The environment to wrap

get_agent_ids()[source]#

Returns a set of agent ids in the environment.

Returns:

Set of agent ids.

step(*args, **kwargs)[source]#

Calls step() on the wrapped environment. Subclasses can override this method to change the returned data.
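The return values follow RLlib's MultiAgentEnv convention: each of the five step() results is a dict keyed by agent ID, and the terminated/truncated dicts additionally carry an '__all__' key for the whole episode. The sketch below only illustrates that dict shape; the agent IDs and values are hypothetical, not output from RLlibWrapper:

```python
# Illustrative shape of a MultiAgentEnv step() return for two agents.
# All keys and values here are hypothetical placeholders.
obs = {0: [0.0, 1.0], 1: [1.0, 0.0]}                 # per-agent observations
rewards = {0: 0.0, 1: 1.0}                           # per-agent rewards
terminateds = {0: False, 1: True, '__all__': False}  # '__all__' covers the episode
truncateds = {0: False, 1: False, '__all__': False}
infos = {0: {}, 1: {}}

# The episode ends for the whole environment only when '__all__' is True:
episode_done = terminateds['__all__'] or truncateds['__all__']
```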

multigrid.rllib.to_rllib_env(env_cls: type[multigrid.base.MultiGridEnv], *wrappers: Wrapper, default_config: dict = {}) → type[ray.rllib.env.multi_agent_env.MultiAgentEnv][source]#

Convert a MultiGridEnv environment class to an RLlib MultiAgentEnv class.

Note that this is a wrapper around the environment class, not environment instances.

Parameters:
env_cls : type[MultiGridEnv]

MultiGridEnv environment class

wrappers : gym.Wrapper

Gym wrappers to apply to the environment

default_config : dict

Default configuration for the environment

Returns:
rllib_env_cls : type[MultiAgentEnv]

RLlib MultiAgentEnv environment class
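The default_config parameter is merged with the config dict passed to the returned class at construction time, with per-instance keys taking precedence. A minimal sketch of this class-conversion pattern (not the actual implementation; a real conversion would also subclass MultiAgentEnv and forward its API):

```python
# Hypothetical sketch of the to_rllib_env pattern: merge default_config
# with the per-instance config, then build and wrap the environment.
def to_rllib_env_sketch(env_cls, *wrappers, default_config={}):
    class RLlibEnv:
        def __init__(self, config=None):
            merged = {**default_config, **(config or {})}
            self.env = env_cls(**merged)
            for wrapper in wrappers:
                self.env = wrapper(self.env)
    RLlibEnv.__name__ = env_cls.__name__
    return RLlibEnv

# Usage with a stand-in environment class (DummyEnv is illustrative only):
class DummyEnv:
    def __init__(self, size=5, agents=1):
        self.size, self.agents = size, agents

MyEnv = to_rllib_env_sketch(DummyEnv, default_config={'size': 8})
env = MyEnv({'agents': 2})  # 'agents' extends the defaults; 'size' comes from them
```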

Submodules#