class InferenceContext: (source)
Provide context for inference.
Store already inferred nodes to save time. Account for already visited nodes to stop infinite recursion.
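The recursion guard can be pictured with a minimal sketch: the context remembers the nodes it has already visited in path and refuses to revisit them. Everything below (the SketchContext class and the convention that push() returns True on a revisit) is an illustrative assumption, not this class's actual implementation.

    class SketchContext:
        __slots__ = ("path",)

        def __init__(self, path=None):
            # ``path`` plays the role described above: the set of already-visited nodes.
            self.path = path if path is not None else set()

        def push(self, node):
            # Assumed convention for this sketch: return True when ``node`` was
            # already visited, signalling the caller to stop and avoid infinite recursion.
            if node in self.path:
                return True
            self.path.add(node)
            return False

    ctx = SketchContext()
    assert ctx.push("node-a") is False   # first visit: keep inferring
    assert ctx.push("node-a") is True    # revisit: abort this inference branch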
Method | __init__ | Do not instantiate directly; use copy_context() or Parser._new_context().
Method | __str__ | Undocumented
Method | clone | Clone inference path.
Method | nodes | Undocumented
Method | push | Push node into inference path.
Class Variable | __slots__ | Undocumented
Class Variable | max | Undocumented
Instance Variable | path | type: set(NodeNG)
Property | inferred | Inferred (cached) nodes to their mapped results.
Property | nodes | Number of nodes inferred in this context and all its clones/descendants.
Instance Variable | _cache | Store cache here instead of using a global variable.
Instance Variable | _nodes | Undocumented
Clone inference path.
For example, each side of a binary operation (BinOp) starts with the same context, but the sides diverge as each is inferred, so the InferenceContext needs to be cloned.
Returns | InferenceContext | Undocumented
Note | If a new cache is needed for this context, use copy_context() with the argument cache={}.
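A hedged illustration of the BinOp example above, assuming (for this sketch only) that cloning copies the visited-node path so the two operands can be inferred independently; the names below are not the library's real API.

    class SketchContext:
        __slots__ = ("path",)

        def __init__(self, path=None):
            # Assumption: the clone gets its own copy of the visited-node path.
            self.path = set(path or ())

        def clone(self):
            return SketchContext(self.path)

    base = SketchContext({"binop-node"})
    left_ctx, right_ctx = base.clone(), base.clone()
    left_ctx.path.add("left-operand")            # visiting the left side...
    assert "left-operand" not in right_ctx.path  # ...does not affect the right side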
Number of nodes inferred in this context and all its clones/descendants.
Wrap the inner value in a mutable cell to allow mutating a class variable in the presence of __slots__.
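One possible reading of the mutable-cell remark, sketched with hypothetical names: the counter behind the nodes property is kept in a one-element list so a context and its clones can share and update it, even though __slots__ rules out adding ordinary instance attributes. Whether the real cell lives on the class or is handed to each clone is not specified here; the trick is the same either way.

    class SketchContext:
        __slots__ = ("_nodes",)

        def __init__(self, nodes_cell=None):
            # The cell is created once and then handed to every clone.
            self._nodes = nodes_cell if nodes_cell is not None else [0]

        def clone(self):
            return SketchContext(self._nodes)     # clones share the same cell

        @property
        def nodes(self):
            # "Number of nodes inferred in this context and all its clones/descendants"
            return self._nodes[0]

        def _record_inferred_node(self):          # hypothetical helper
            self._nodes[0] += 1                   # mutate the cell, no rebinding needed

    root = SketchContext()
    child = root.clone()
    child._record_inferred_node()
    assert root.nodes == child.nodes == 1         # the counter is shared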
Store cache here instead of using a global variable.
This dict is shared by all InferenceContext instances created by the method parser.Parser._new_context. Two different parser.Parser instances will use two different caches.
More on the cache: https://github.com/PyCQA/astroid/pull/1009
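A hedged sketch of the cache-scoping idea (names and signatures are illustrative, not the library's exact API): the parser owns one dict and hands it to every context it creates, so the cache lives as long as the parser rather than as a process-wide global.

    class SketchContext:
        __slots__ = ("_cache", "path")

        def __init__(self, cache, path=None):
            self._cache = cache                  # parser-owned, shared dict
            self.path = path if path is not None else set()

    class SketchParser:
        def __init__(self):
            self._inference_cache = {}           # one cache per parser instance

        def _new_context(self):
            # Every context made by this parser shares the parser's cache dict.
            return SketchContext(self._inference_cache)

    p1, p2 = SketchParser(), SketchParser()
    assert p1._new_context()._cache is p1._new_context()._cache       # shared within a parser
    assert p1._new_context()._cache is not p2._new_context()._cache   # isolated between parsers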