FIX: Misinterpretation of voxel ordering in LTAs #129


Merged
5 commits merged on Jul 21, 2021

Conversation

oesteban
Collaborator

The structarray was used directly, and the extra axis actually changed the order of operations when the direction cosines were scaled by the voxel sizes.

A new test case that exposes this error has been added. This bug caused nipreps/fmriprep#2307, nipreps/fmriprep#2393, and nipreps/fmriprep#2410. nipreps/fmriprep#2444 merely works around the problem by using lta_convert instead of NiTransforms; that lta_convert workaround can now be dropped.

Resolves: #125

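The "extra axis" failure mode described above can be reproduced with plain NumPy. A minimal sketch (illustrative values, not the geometry from this PR's LTA fixture): when the voxel sizes carry a spurious trailing axis, broadcasting scales the *rows* of the direction-cosine matrix instead of its *columns*, which is a different matrix product and therefore a different affine.

```python
import numpy as np

# Illustrative values: anisotropic voxel sizes and oblique direction
# cosines, with the x/y/z direction cosines stored column-wise.
zooms = np.array([2.0, 2.0, 3.0])
cosines = np.array([
    [-1.0, 0.0000, 0.0000],
    [0.0, 0.6857, -0.7278],
    [0.0, 0.7278, 0.6857],
])

# Intended: scale each *column* by its voxel size, i.e. cosines @ diag(zooms)
scaled_cols = cosines * zooms

# With a spurious extra axis (as when a structured-array field keeps its
# storage shape), broadcasting scales the *rows*: diag(zooms) @ cosines
scaled_rows = cosines * zooms[:, np.newaxis]

assert np.allclose(scaled_cols, cosines @ np.diag(zooms))
assert np.allclose(scaled_rows, np.diag(zooms) @ cosines)
# For anisotropic zooms with oblique cosines, the two disagree:
assert not np.allclose(scaled_cols, scaled_rows)
```

With isotropic voxel sizes the two broadcasts coincide, which is why the bug only surfaced on data with anisotropic zooms and oblique acquisitions.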
@oesteban oesteban requested a review from effigies July 19, 2021 16:19
@pep8speaks

pep8speaks commented Jul 19, 2021

Hello @oesteban! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-07-21 14:31:36 UTC

@codecov-commenter

codecov-commenter commented Jul 19, 2021

Codecov Report

Merging #129 (0ebf0e6) into master (0847e98) will decrease coverage by 0.01%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #129      +/-   ##
==========================================
- Coverage   98.89%   98.88%   -0.01%     
==========================================
  Files          12       12              
  Lines        1084     1081       -3     
  Branches      138      138              
==========================================
- Hits         1072     1069       -3     
  Misses          6        6              
  Partials        6        6              
Flag     Coverage Δ
travis   98.88% <100.00%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown. Click here to find out more.

Impacted Files           Coverage Δ
nitransforms/io/lta.py   99.32% <100.00%> (-0.02%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 0847e98...0ebf0e6.

@oesteban oesteban force-pushed the fix/lta-concatenation branch from 78bb9c5 to f615827 on July 20, 2021 09:13
@oesteban oesteban requested a review from effigies July 20, 2021 09:14
@oesteban oesteban force-pushed the fix/lta-concatenation branch from f615827 to 1d1d202 on July 20, 2021 09:15
Comment on lines +44 to +54
oracle = np.loadtxt(StringIO("""\
-3.0000 0.0000 -0.0000 91.3027
-0.0000 2.0575 -2.9111 -25.5251
0.0000 2.1833 2.7433 -105.0820
0.0000 0.0000 0.0000 1.0000"""))

lta_text = "\n".join(
    (data_path / "bold-to-t1w.lta").read_text().splitlines()[13:21]
)
r2r = VG.from_string(lta_text)
assert np.allclose(r2r.as_affine(), oracle, rtol=1e-4)
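The eight LTA lines parsed here encode a volume geometry: voxel sizes, direction cosines, and the RAS coordinate of the volume center. Under the MGH/LTA convention, such a record maps to a vox2ras affine roughly as sketched below (the function name and example values are illustrative, not nitransforms' implementation):

```python
import numpy as np

def vox2ras(shape, zooms, cosines, center):
    """Assemble a vox2ras affine from a volume-geometry record.

    `cosines` holds the x/y/z direction cosines as columns; the center
    voxel (shape / 2) maps to the RAS coordinate `center`, following
    the MGH/LTA convention.
    """
    aff = np.eye(4)
    aff[:3, :3] = cosines * zooms  # scale each column by its voxel size
    aff[:3, 3] = center - aff[:3, :3] @ (np.asarray(shape) / 2.0)
    return aff

# Hypothetical BOLD-like geometry: 64x64x30 grid, 3x3x3.5 mm voxels,
# axis-aligned cosines, volume centered at the RAS origin.
aff = vox2ras((64, 64, 30), np.array([3.0, 3.0, 3.5]), np.eye(3), np.zeros(3))
```

Because the voxel sizes scale the columns, each column of the resulting 3x3 block has a norm equal to the corresponding voxel size; scaling the rows instead (the bug fixed here) breaks that invariant for anisotropic, oblique geometries.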
Member

AFAICT this test doesn't fail before this PR. What exactly is the edge case we're fixing here?

Collaborator Author

Yes, this failed before this PR.

@oesteban
Collaborator Author

Good to go? Do you feel the comments have been successfully addressed?

@effigies
Member

Sorry, just getting back to this this morning. Give me one hour.

Member

@effigies effigies left a comment

Looks good. One tiny suggestion to avoid a call to flatten().

Co-authored-by: Chris Markiewicz <[email protected]>
@oesteban oesteban merged commit 12141b5 into master Jul 21, 2021
@oesteban oesteban deleted the fix/lta-concatenation branch July 21, 2021 15:24
Development

Successfully merging this pull request may close these issues.

BUG: Errors reading LTAs, especially interpreting VOX2VOX type
4 participants