Implement simulation object validation checks #94
Conversation
Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement (CLA), and it seems that the following contributors have not yet signed it: @rmoretti9. In order for us to review and merge your code, please sign the CLA at meetiqm.com. Once you have signed the CLA, write a comment here and we'll verify. Please note that it may take some time for us to verify it.
I have now signed the CLA
Oh hi :D Thanks for the submission! It will take me some time to receive confirmation that your CLA form went through, but in the meantime I will go through the code and test that it works.
I introduced the function
I'm currently trying to figure out how to simplify the introduction of new validation checks, and to make them less prone to nested if statements. For instance,
@cla-bot check
The cla-bot has been summoned, and re-checked this pull request!
I would recommend changing the function signature of `validate_simulation` so that instead of taking a list of `Simulation`s, it takes a `Simulation` and a `Solution` object, and handles them explicitly instead of addressing them as `simulation[0]` and `simulation[1]`. You can see for example in `elmer_export.py` that if it detects there is no `Solution` object paired with a `Simulation`, it will construct a `Solution` object from `solution_parameters`. But at that point we have already run `validate_simulation`. So maybe `validate_simulation` could instead be invoked at the beginning of the `for sim_sol in simulations` loop, when we have explicitly defined `simulation` and `solution`.
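The suggested signature change could look like the minimal sketch below. The `Simulation` and `Solution` classes here are bare stand-ins for the KQCircuits objects, and `export_loop` is a hypothetical stand-in for the export code, not the real implementation.

```python
# Hedged sketch of the suggested refactoring. Simulation, Solution and
# validate_simulation mirror KQCircuits names, but these minimal
# stand-ins are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Simulation:
    ports: list = field(default_factory=list)


@dataclass
class Solution:
    solution_type: str = "none"


def validate_simulation(simulation, solution):
    """Validate one explicit (simulation, solution) pair.

    Taking the two objects as separate parameters avoids the opaque
    simulation[0] / simulation[1] indexing into a list.
    """
    if not isinstance(simulation, Simulation) or not isinstance(solution, Solution):
        raise ValueError("Expected a Simulation and a Solution object")


def export_loop(simulations, default_solution=None):
    # Pair each simulation with a Solution first (constructing one when
    # it is missing, as the Elmer export does from solution_parameters),
    # then validate once both are explicitly defined.
    for sim_sol in simulations:
        if isinstance(sim_sol, tuple):
            simulation, solution = sim_sol
        else:
            simulation, solution = sim_sol, default_solution or Solution()
        validate_simulation(simulation, solution)
```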
`port_and_simulation_type_compatibility` could probably be separated into two cases: "if no ports, solution can't be X" and "if edge port, solution can't be X". It's good that these validation functions have a single responsibility, and it's better to have a bigger list of clearly defined checks than a small list of checks that do subchecks. One `ValueError` raise per function is probably a good rule of thumb. The check functions could also be separated into a distinct new file so they are more easily maintained.
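The split into single-responsibility checks might look like the following sketch. The check names, the solution-type strings, and the registry list are all hypothetical, not the actual KQCircuits code.

```python
from types import SimpleNamespace

# Hypothetical single-responsibility checks: each tests exactly one
# condition and raises exactly one ValueError. The solution-type
# strings are placeholders.
def check_ports_exist_for_solution(simulation, solution):
    if not simulation.ports and solution.solution_type == "capacitance":
        raise ValueError(f"'{solution.solution_type}' solution requires at least one port")


def check_no_edge_ports_for_solution(simulation, solution):
    if any(p.is_edge for p in simulation.ports) and solution.solution_type == "eigenmode":
        raise ValueError(f"'{solution.solution_type}' solution is incompatible with edge ports")


# Keeping the checks in a flat list (in its own module) makes adding a
# new check a one-line change and keeps the caller free of nested ifs.
ALL_CHECKS = [check_ports_exist_for_solution, check_no_edge_ports_for_solution]


def validate_simulation(simulation, solution):
    for check in ALL_CHECKS:
        check(simulation, solution)
```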
You could maybe introduce a custom exception type instead of using `ValueError`; we might want to treat these errors differently from generic errors in the future.
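A custom exception could be as simple as the sketch below; the name `ValidateSimError` and the extra attribute are made up for illustration. Subclassing `ValueError` keeps any existing `except ValueError` handlers working while letting future code catch validation failures specifically.

```python
# Hypothetical exception type; the name and attribute are illustrative,
# not part of KQCircuits.
class ValidateSimError(ValueError):
    """Raised when a (simulation, solution) pair fails a validation check."""

    def __init__(self, message, validation_type=None):
        super().__init__(message)
        self.validation_type = validation_type  # name of the failed check


def check_ports_exist(simulation):
    # Example check using the dedicated exception type.
    if not getattr(simulation, "ports", []):
        raise ValidateSimError("Simulation has no ports", validation_type="ports_exist")
```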
We would also like some unit tests for these checks. Since the checks don't rely on any real geometry but only on ports, you could probably mock a `Simulation` object without a `Cell` and just insert arbitrary ports in there.
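Assuming the checks only ever read the `ports` attribute, a mock along these lines avoids building any `Cell` or layout at all; the helper function and the port attributes are illustrative.

```python
from unittest.mock import MagicMock

# Stand-in check helper that, like the validation functions assumed
# here, looks only at simulation.ports and never touches geometry.
def count_edge_ports(simulation):
    return sum(1 for port in simulation.ports if port.is_edge)


# MagicMock plays the role of a Simulation without a Cell; arbitrary
# ports are simply inserted as attributes.
mock_simulation = MagicMock()
mock_simulation.ports = [MagicMock(is_edge=True), MagicMock(is_edge=False)]
```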
Thank you @qpavsmi, I think I implemented your suggestions now, but I'm a bit confused about the unit tests since I have very little experience with them. I made a test module in
Took me some time to realise this, but it's actually not your test failing; your pull request breaks one of the existing tests: test_export_produces_output_files. It creates a completely empty simulation (no ports either), and checks that it can be exported without causing errors. Now if the solution type is not specified, it defaults to

It's best to modify this test so that it sets the solution type to something that passes this check. I found one way to fix this test is to fix the fixtures we defined for simulation tests. Notice that in the failing unit test we give

As this is getting very complicated, I'll just say exactly what to do to fix this test. We would like to be able to set which solution type to use by calling
After this change the test should pass. Consider making a similar change for the existing Elmer test fixtures that we have. I would recommend studying the fixtures, since that could help you structure the tests you are writing better.

Generally it's good practice that each test function is written in such a way that it can be run independently, which means it sets up everything needed for the test (and tears it down if needed). Of course it gets annoying when the same set-up is copy-pasted for each unit test. This is where fixtures are useful, and pytest makes sure that the fixture is reconstructed from scratch for every unit test, so that tests never become dependent on their order of execution. So in your example the initialisation of

You will also need to test for negative cases: with this simulation and that solution I expect it to fail. You can do this using
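A hedged sketch of the fixture-plus-negative-test pattern described above. `validate_simulation`, the factory, and the solution objects are minimal stand-ins for the real KQCircuits ones, and pytest is assumed to be available.

```python
import pytest
from types import SimpleNamespace

# Minimal stand-in for the real check; raises when an empty simulation
# is paired with a solution type that needs ports (placeholder string).
def validate_simulation(simulation, solution):
    if not simulation.ports and solution.solution_type == "capacitance":
        raise ValueError("capacitance solution requires ports")


def make_simulation(ports=()):
    """Plain factory so set-up code is not copy-pasted into every test."""
    return SimpleNamespace(ports=list(ports))


@pytest.fixture
def empty_simulation():
    # pytest rebuilds this from scratch for each test, so tests never
    # depend on their order of execution.
    return make_simulation()


def test_empty_simulation_rejects_capacitance(empty_simulation):
    solution = SimpleNamespace(solution_type="capacitance")
    # Negative case: with this simulation and that solution, we expect
    # validation to fail.
    with pytest.raises(ValueError):
        validate_simulation(empty_simulation, solution)
```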
Remember to run
Hi @qpavsmi, thanks for the useful suggestion. I fixed the
I think after addressing this round of suggestions we can approve and get this merged :)
Force-pushed from 161af97 to c3fe9b2
Thanks @qpavsmi, I squashed my commits and marked this pull request as ready for review
I'm not sure if it's visible to you, but there are many pending review points that I outlined based on the code changes. These would be good to address before we merge.
Force-pushed from c3fe9b2 to 97fd20c
It seems like I cannot access these comments (or I don't know how to do that), but I modified the code based on the screenshot's suggestions, except for the one about adding the empty new line; I'm not sure where it should be added.
Should be visible now
Review threads (outdated, resolved) on:
klayout_package/python/kqcircuits/simulations/export/simulation_validate.py
klayout_package/python/kqcircuits/simulations/export/simulation_export.py
tests/simulations/simulation_validate/test_simulation_validate.py
You should be able to see the suggestions now I think: #94 (review)
Force-pushed from 97fd20c to f2dbe58
I applied all the suggested changes. Maybe the usage of `@pytest.mark.parametrize` in the tests is a bit redundant as it is now, but it allows adding new tests quite easily.
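For illustration, the parametrized style referred to above can be sketched as follows; the check logic and the solution-type strings are stand-ins, not the actual test code. The pay-off is that adding a new case is a single tuple.

```python
import pytest
from types import SimpleNamespace

# Stand-in check: empty simulations are incompatible with the
# (placeholder) "capacitance" solution type.
def validate_simulation(simulation, solution):
    if not simulation.ports and solution.solution_type == "capacitance":
        raise ValueError("capacitance solution requires ports")


@pytest.mark.parametrize(
    "ports,solution_type,should_fail",
    [
        ([], "capacitance", True),      # negative case
        (["p1"], "capacitance", False), # positive case
        ([], "eigenmode", False),       # positive case
    ],
)
def test_validate_simulation(ports, solution_type, should_fail):
    sim = SimpleNamespace(ports=ports)
    sol = SimpleNamespace(solution_type=solution_type)
    if should_fail:
        with pytest.raises(ValueError):
            validate_simulation(sim, sol)
    else:
        validate_simulation(sim, sol)
```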
Thanks for sticking through the review process :D Implementation is very clear now and the unit tests are comprehensive. Approving now, will merge after I have run some local tests.
Review threads (outdated, resolved) on:
klayout_package/python/kqcircuits/simulations/export/simulation_validate.py
Force-pushed from f2dbe58 to 3124abb
Approving again after fixing linter issues
Review thread (outdated, resolved) on:
klayout_package/python/kqcircuits/simulations/export/simulation_validate.py
Force-pushed from 3124abb to 3b372c7
This pull request addresses issue #91