Description
Algorithms can have traited attributes with range boundaries.
When such a field gets populated from the GUI with a value outside the range, we do not launch the operation, but throw an error because the parameter is out of the specified range.
We should not throw an error, but rather a warning, and let the operation run.
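A minimal sketch of the proposed behaviour (the standalone helper and names below are hypothetical, not the actual TVB traits API): an out-of-range value is only logged as a warning and the launch proceeds.

```python
import logging

LOG = logging.getLogger(__name__)

def check_range(name, value, lo, hi):
    """Warn, instead of raising, when `value` falls outside [lo, hi]."""
    if lo <= value <= hi:
        return
    LOG.warning("Parameter '%s'=%s is outside the declared range [%s, %s]; "
                "launching the operation anyway.", name, value, lo, hi)

# e.g. I_ext = 2.1 with an illustrative declared range of [-5.0, 1.0]
check_range("I_ext", 2.1, -5.0, 1.0)
```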
Original discussion with Paula:
Paula:
I'm launching some simulations in the cluster. I guess that some restrictions
have been added with respect to the parameter ranges. The generic 2D
oscillator is failing for a value of I_ext = 2.1, which is a value I normally
use. I know I should probably change the range in the code.
However, while the current ranges are supposed to give sensible defaults, it is
not always the case. In some cases it is just a guesstimate. While restricting
the values to the range for PSEs seems ok, if I wish to try one specific value
out of the range, this should also be possible.
What I meant is that it is ok to check whether the default value is within the specified range, so we have some consistency.
That's how Mihai found those bad ranges in the model; it was really useful and could be part of the test modules.
However, imposing that restriction through the interface without a scientific validation of the ranges – somewhat related to TVB-987 – doesn't seem like a good idea to me.
Lia:
I think I understand what you mean.
From a functional point of view, here is a proposal on how to put this into practice:
- keep the ranges in the traited definition of the attributes
- possibly add another boolean flag Range.Strict
- when Range.Strict=True, throw an error when trying to launch an algorithm with an input value outside the range
- when Range.Strict=False and the input value is outside the range's min/max interval, only display a warning message but let the operation run
- default could be Range.Strict=False, until the scientific validation related to TVB-987 is done
Do you think that something like that could better accommodate this part?
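A rough sketch of the Range.Strict idea, assuming a small standalone Range class (hypothetical names, not the existing traits code): strict ranges raise an error, non-strict ranges only warn.

```python
import logging
from dataclasses import dataclass

LOG = logging.getLogger(__name__)

@dataclass
class Range:
    lo: float
    hi: float
    strict: bool = False  # proposed default until the TVB-987 validation is done

    def check(self, name, value):
        """Raise for strict ranges, only warn otherwise, when value is out of range."""
        if self.lo <= value <= self.hi:
            return
        msg = "%s=%s is outside the declared range [%s, %s]" % (name, value, self.lo, self.hi)
        if self.strict:
            raise ValueError(msg)
        LOG.warning(msg)

# Non-strict: the launch continues, only a warning is logged (illustrative bounds).
Range(lo=-5.0, hi=1.0).check("I_ext", 2.1)
```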
Paula:
I have mixed feelings about adding too many flags. It could be a solution though.
Since I don't have a clear timeline for the scientific validation issue, it feels like adding a flag will be one of those things that starts as a temporary solution and then stays there forever.
Why was this restriction added?
Lia:
I agree. We should avoid "temporary" solutions.
The restriction was added to avoid having NaN values generated...
although the restriction cannot completely solve the NaN issue unless the ranges hold meaningful values,
and even so, it also needs some correlation between fields, or else we risk becoming too restrictive.
Another possibility would be to remove the restriction entirely.
The range min/max values could then be used just for delimiting the PSE in the web UI.
Do you think that would be a better medium-term solution?
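If the restriction were removed, the min/max values could still delimit the PSE axes; a small sketch under that assumption (numpy-based, illustrative bounds):

```python
import numpy as np

def pse_axis(lo, hi, n_points):
    """Sample one PSE axis strictly inside the declared [lo, hi] interval."""
    return np.linspace(lo, hi, n_points)

# The PSE grid stays within the declared range, while a single manual launch
# could still use any value (e.g. I_ext = 2.1).
i_ext_axis = pse_axis(-5.0, 1.0, 13)
```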
Paula:
As you noticed, NaN values cannot be entirely avoided. We may restrict parameter ranges, but for some other combination of connectivity matrix or integration time-step size, NaNs will still be generated.
Speaking of NaNs, they are not that bad; in fact I don't want to get rid of them. In some publications, the PSE maps have a distinct colour for NaNs (e.g., white or some other colour different from the colourmap).
What we could have is a way to track these simulations by saving the time point at which it occurred, and probably tag the simulation or the resulting time-series, so that when we compute the time-series metrics we directly set the value to NaN.
I'm not against that (PSE). This check is useful though.
So, maybe recycle it for the unit tests.
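A possible shape for the NaN-tracking idea, sketched with numpy (hypothetical helper names; the actual metric code may differ): record the first time point with a NaN and let the metric itself evaluate to NaN, so it gets a distinct colour in the PSE map.

```python
import numpy as np

def first_nan_time(time, data):
    """Return the first time point at which the series contains a NaN, or None.

    `data` is assumed to have time as its first axis.
    """
    nan_mask = np.isnan(data.reshape(len(time), -1)).any(axis=1)
    hits = np.flatnonzero(nan_mask)
    return float(time[hits[0]]) if hits.size else None

def variance_metric(time, data):
    """Example metric: propagate NaN instead of failing when the series diverged."""
    if first_nan_time(time, data) is not None:
        return np.nan  # could also be used to tag the simulation / time-series
    return float(np.var(data))
```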
Lia:
Ok. Then I will summarise our little discussion in a Jira issue (to have it archived for historical purposes).
And in the next version this check will no longer throw an error when running from GUI.
I am thinking of still displaying a warning message in the logs. What do you think?
Paula:
That's a good idea. I forgot to ask if showing a warning message to the user is possible.