Can not copy a GymEnv environment when using a LightSim2Grid backend #97
Hi, I can confirm this issue for the following environment.
This issue has been open for about four months. I was wondering if there are any plans to work on it or if there is an estimated timeline for a fix. Any information would be greatly appreciated, as it would help me plan accordingly. Thank you for your time and attention to this matter!
Hello, Indeed grid2op and deepcopy do not work really well together. At the moment, grid2op uses `env.copy()`, which is rather well tested currently. I have no "bandwidth" at the moment to fix this. I'll see if I can reproduce it and provide a quick fix.
Hello @Borroot, Are you trying to initialize an `l2rpn_baselines.PPO_SB3.utils.SB3Agent` instance? (see the first item of my notes) Have a good day
Thank you both for your fast responses! @EBoguslawski Thank you for pointing this out, however it concerns a modified version of the `GymEnv` class. @BDonnot Thank you for your response. I will use:

```python
import copy

import grid2op
from grid2op.gym_compat import GymEnv
from lightsim2grid import LightSimBackend

env = grid2op.make("l2rpn_case14_sandbox", backend=LightSimBackend())
env_gym = GymEnv(env)

# This line raises an error
copy.deepcopy(env_gym)
```
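For readers wondering why `copy.deepcopy` fails here while plain Python objects copy fine: the LightSim2Grid backend wraps a native (C++) solver object, and `deepcopy`'s generic pickling fallback cannot duplicate such a handle. The stdlib-only sketch below mimics the situation with a hypothetical `RawBackend` holding a non-picklable resource, and shows how a `__deepcopy__` hook that rebuilds the backend (roughly the spirit of `env.copy()`) avoids the error. All class names here are illustrative stand-ins, not grid2op or lightsim2grid internals.

```python
import copy
import threading

class RawBackend:
    """Stands in for a native (C++) solver handle that cannot be pickled."""
    def __init__(self, grid="case14"):
        self.grid = grid
        self._handle = threading.Lock()  # non-copyable resource, like a C++ object

class FixedBackend(RawBackend):
    """Same backend, but it knows how to duplicate itself."""
    def __deepcopy__(self, memo):
        # Instead of copying the native handle, rebuild a fresh backend
        # on the same grid.
        return FixedBackend(self.grid)

# deepcopy chokes on the raw native handle...
try:
    copy.deepcopy(RawBackend())
    failed = False
except TypeError:
    failed = True
assert failed

# ...but works once the backend defines __deepcopy__
clone = copy.deepcopy(FixedBackend("case14"))
assert clone.grid == "case14"
```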
Hello, I made a quick fix which will work correctly for lightsim2grid, but it is not efficient from the grid2op perspective (it will unnecessarily copy things that should not be copied). I'll push the modifications to pypi tomorrow if all tests pass, and you'll be able to install it with:

This fix will be part of the next release of lightsim2grid.
Thank you @BDonnot, I will check it out tomorrow. I know this is off-topic, and let me know if I should ask this question elsewhere, but do you have another recommendation on how to execute multiple actions in the same state? Currently I copy the environment, but since this is inefficient like you said, another method would be preferred. I know it is also possible using
Without a proper understanding of your use case, and depending on what exactly you want to do, there is:

- copying the environment (`env.copy()`)
- `obs.simulate`
- the `Simulator`

All have pros and cons, and depending on what you want to do it can be more efficient to use any of the above.

Copying the env is the most expensive option, and the one that is the least "reinforcement learning", as I don't think it is easy to fit into an MDP framework.

`obs.simulate` is faster than the copy. The modeling is the same as the one in the environment. The major difference between `obs.simulate` and an env copy is that `obs.simulate` does not assume perfect knowledge of the future: it relies on provided forecasts (part of the env, and part of what the agent can observe).

The `Simulator` is even simpler, in the sense that it only provides access to the... power grid simulator. It does not assume anything about the future or anything else. It is only a simple object that can perform actions on a given grid state.
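The "fork the state, try each action, keep the best" pattern behind the copy-based option can be sketched without grid2op at all. The toy `state`, `apply_action`, and `score` below are hypothetical stand-ins, not grid2op API; the point is only that forking the state lets several candidate actions be evaluated from the same starting point while the original stays untouched.

```python
import copy

# Hypothetical grid snapshot: line statuses plus per-line capacities.
# (Illustrative only -- real grid2op observations are far richer.)
state = {"line_status": [True, True, True]}
capacity = [5.0, 3.0, 1.0]

def apply_action(st, line_to_disconnect):
    """Fork the state (like copying the env), then apply one action."""
    st = copy.deepcopy(st)
    st["line_status"][line_to_disconnect] = False
    return st

def score(st):
    """Hypothetical reward: total capacity still in service."""
    return sum(c for c, ok in zip(capacity, st["line_status"]) if ok)

# Evaluate several candidate actions from the SAME starting state.
scores = {a: score(apply_action(state, a)) for a in range(3)}
best = max(scores, key=scores.get)

assert best == 2                                    # drop the smallest line
assert state["line_status"] == [True, True, True]   # original state untouched
```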
Hello @Borroot, I just made the release on pypi, you can use it with:

(as it is a development version, pypi will not automatically download it, so you have to add the appropriate flag)
Hi @BDonnot Thank you so much for your quick action, I can confirm it works now! Also thanks for elaborating on the options available, this helps! |
Environment
- Python version: 3.12
- Grid2op version: 1.10.5
- LightSim2Grid version: 0.9.2.post2
- System: ubuntu
Bug description

Hello,

I create a `GymEnv` environment from a grid2op environment. If I use a `LightSimBackend` backend when creating the grid2op environment, then I obtain an error when I try to copy the `GymEnv` environment with `copy.deepcopy`. It works on an older environment with older versions (python 3.8, Grid2op 1.9.8, LightSim2Grid 0.7.0.post1).

Notes:
- I use a modified version of the `GymEnv` class

How to reproduce
Code snippet
Current output