Refactored data writing #13
Conversation
# TODO: fails for weird reasons.
# assert (write_function.function_space == self._write_function_space)
Let's figure this out before we merge this; it looks like we are messing up some types in DOLFINx.
I am not sure how you called the `write_data` function, but the assertion failed in my case when I forgot to interpolate to the function space given during initialization. If one simply passes `flux.sub(0)` as the argument to the `write_data` function, while the write function space was specified as `W = V_g.sub(0).collapse()`, everything works as expected (if the assertion is commented out), but the assertion fails. Therefore, my guess would be that before calling the function, you forgot to interpolate to `W`.
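A minimal pure-Python sketch of why that assertion fires: the adapter pins the write function space at initialization, so a function living in a different (e.g. un-collapsed sub-) space trips the check. The class and attribute names below are hypothetical stand-ins for the adapter's internals, not the real DOLFINx/adapter API; the fix in practice is to interpolate onto `W` before calling `write_data`.

```python
class WriteGuardSketch:
    """Hypothetical stand-in for an adapter that pins the write space at init."""

    def __init__(self, write_function_space):
        self._write_function_space = write_function_space

    def write_data(self, write_function):
        # The assertion discussed above: the caller must pass a function
        # that lives in the space given during initialization.
        assert write_function.function_space == self._write_function_space, (
            "write_function lives in a different space; interpolate onto the "
            "write function space (W) before calling write_data"
        )


class FakeFunction:
    """Toy function that only carries its function-space identity."""

    def __init__(self, function_space):
        self.function_space = function_space


W = object()          # stands in for V_g.sub(0).collapse()
sub_space = object()  # stands in for the space of flux.sub(0)

guard = WriteGuardSketch(W)
guard.write_data(FakeFunction(W))  # OK: caller interpolated onto W first

try:
    guard.write_data(FakeFunction(sub_space))  # like passing flux.sub(0)
    assertion_fired = False
except AssertionError:
    assertion_fired = True
```

Passing the toy function built on `W` succeeds, while the one built on the un-collapsed sub-space triggers the assertion, mirroring the situation described above.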
Some minor things are still to be done. I did some formatting.
Generally sounds like a good plan. One thing that alarmed me a bit:
You are talking about the following API here, right? Why is this necessary? Is this the normal use of the (after looking at the code I understood that the
@BenjaminRodenberg
If that's a DOLFINx API thing, then DOLFINx users will also know how to deal with this situation (hopefully). I'm just from the good old FEniCS days, so it confuses me. But this probably just means I should learn more about DOLFINx ;)
precice_data.append(sampled_data[lid])
if space.num_sub_spaces == 0:
    return FunctionType.SCALAR
elif space.num_sub_spaces == 2:
Shouldn't it be `>= 2`?
If we support 3D, yes. It could also simply be `!= 0`, as it is never 1, I think.
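The dispatch being discussed can be sketched in plain Python using the `!= 0` variant, which covers both 2D (`num_sub_spaces == 2`) and 3D (`num_sub_spaces == 3`) vector spaces. `DummySpace` and this `FunctionType` enum are stand-ins for illustration, not the adapter's actual classes:

```python
from enum import Enum


class FunctionType(Enum):
    SCALAR = 0
    VECTOR = 1


class DummySpace:
    """Stand-in for a function space exposing num_sub_spaces."""

    def __init__(self, num_sub_spaces):
        self.num_sub_spaces = num_sub_spaces


def determine_function_type(space):
    # Scalar spaces have no sub-spaces; anything else (2 in 2D, 3 in 3D)
    # is treated as vector-valued. Since num_sub_spaces is never 1 here,
    # "!= 0" behaves the same as "== 2" in 2D but also handles 3D.
    if space.num_sub_spaces == 0:
        return FunctionType.SCALAR
    return FunctionType.VECTOR
```

With this shape, switching the adapter to 3D requires no change to the type detection.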
We should make sure not to forget about this PR. @PhilipHildebrand is currently working on the heat equation. As far as I understand the scope of this PR, it is not related to the heat equation, because this PR only deals with vector-valued quantities.
@NiklasVin, @IshaanDesai and I discussed yesterday to prioritize #28. I'm closing this PR. If there is still something from this PR that we need to migrate (potentially in a different PR than #28), please write here and I will reopen.
(Merging on #1 as it relies on it, but this deserves a separate discussion.)
Fixed the bug where writing vector data failed. More generally, I rewrote the data-writing code to be more flexible. Instead of deciding whether to look at DOFs or geometrical vertices and then doing an ugly conversion, we simply look up where the preCICE vertices are located and evaluate the function at these points (`Function.eval()`). So if at some point we decide to switch from mesh-based mapping to DOF-based mapping, nothing has to change here :)

Also fixed a bug where `determine_function_type` returned the wrong result. The new implementation is shorter and simpler.

I removed the old `convert_fenicsx_to_precice` function, as new ones are used instead. Tests were adapted, and the write_vector_data test now passes.
When calling `eval()`, we must provide the list of points and the cells in which they live (e.g. "sample the function at point (0.5, 0.7), assuming the point is in cell 14"). The cells are computed once at initialization, so the actual evaluation is efficient (there is no nearest-neighbor-like search for the matching cell at each call).