Use Eval With Dictionary Without Losing Imported Modules In Python2
I have a string to be executed inside my Python program, and I want to replace some variables in the string, like x[1] and x[2], with something else. I had previously used eval with two arguments.
Solution 1:
Build your globals dict with globals() as a base:
from math import log
# Copy the globals() dict so changes don't affect real globals
eval_globals = globals().copy()
# Tweak the copy to add desired new global
eval_globals[x[1]] = 1
# eval using the updated copy
eval('log(x[1])', eval_globals)
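As a self-contained sketch of the copied-globals approach (using a concrete variable name y in place of x[1], which is an assumption about what the original string contains):

```python
from math import log

def eval_with_extra_global(expr, extra):
    # Copy globals() so the real module namespace is untouched,
    # then layer the caller's extra names on top of the copy
    env = globals().copy()
    env.update(extra)
    return eval(expr, env)

# log is visible via the copied globals; y comes from the extras
result = eval_with_extra_global('log(y)', {'y': 1})
print(result)  # log(1) == 0.0
```

Because the copy is discarded after the call, neither log nor y leaks into or out of the real module namespace.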
Alternatively, you can use three-argument eval to use globals() unmodified, but also supply a locals dict that will be checked (and modified) first, in preference to global values:
eval('log(x[1])', globals(), {x[1]: 1})
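A runnable sketch of the three-argument form, again assuming a concrete name y for illustration:

```python
from math import log

# globals() supplies log unmodified; the throwaway locals dict
# supplies y without ever touching the module namespace
value = eval('log(y)', globals(), {'y': 1})
print(value)  # 0.0
```

Note that the locals mapping here is just an ordinary dict you construct at the call site, so nothing needs to be cleaned up afterwards.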
In theory, the latter approach could allow the expression to mutate the original globals, so adding .copy() to make it eval('log(x[1])', globals().copy(), {x[1]: 1}) minimizes the risk of that happening accidentally. But pathological or malicious code could work around that; eval is dangerous after all, so don't trust it with arbitrary input no matter how sandboxed you make it.
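To make the lookup order concrete: when a name appears in both mappings, the locals dict wins. A minimal sketch:

```python
# A name present in both dicts resolves from the locals dict first
g = {'n': 'global'}
l = {'n': 'local'}
print(eval('n', g, l))  # local
```

This is why supplying {x[1]: 1} as the third argument overrides any same-named entry in globals() without modifying it.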