Add running doctest to pytest default #7840


Merged
merged 7 commits into TheAlgorithms:master on Oct 29, 2022

Conversation

Cjkjvfnby
Contributor

Describe your change:

Add default arguments to pytest.ini. This helps when you want to test your code locally.

  -l, --showlocals      Show locals in tracebacks (disabled by default)
  --doctest-modules     Run doctests in all .py modules
  --doctest-continue-on-failure
                        For a given doctest, continue to run after the first failure

  • Add an algorithm?
  • Fix a bug or typo in an existing algorithm?
  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the commit message contains Fixes: #{$ISSUE_NO}.

@algorithms-keeper algorithms-keeper bot added enhancement This PR modified some existing files awaiting reviews This PR is ready to be reviewed labels Oct 29, 2022
@cclauss
Member

cclauss commented Oct 29, 2022

Can you please run ini2toml on this file and also on .coveragerc, then put the results in a new pyproject.toml file so that we can move into the PEP 621 future? flake8 remains in a desperate struggle to stay incompatible with PEP 621, so please leave its config in its own file.
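For context, a PEP 621-style pyproject.toml carrying the converted settings might look roughly like this. This is a sketch only: the table names are the ones pytest and coverage.py read from pyproject.toml, but the coverage keys shown are hypothetical placeholders since the actual contents depend on the repository's .coveragerc.

```toml
[tool.pytest.ini_options]
markers = ["mat_ops: mark a test as utilizing matrix operations."]
addopts = "--durations=10 --doctest-modules --doctest-continue-on-failure --showlocals"

[tool.coverage.report]
# Hypothetical example keys; the real values come from the existing .coveragerc.
sort = "Cover"
omit = [".env/*"]
```

ini2toml can generate this mechanically from the existing .ini files, which avoids hand-translation mistakes in quoting and list syntax.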

@cclauss cclauss changed the title Add running doctest to pytest defualt Add running doctest to pytest default Oct 29, 2022
pytest.ini Outdated
[pytest]
markers =
    mat_ops: mark a test as utilizing matrix operations.
-addopts = --durations=10
+addopts = --durations=10 --doctest-modules --doctest-continue-on-failure --showlocals
Member

Let's leave out continue on failure.

Contributor Author

What is the story behind that?

Usually, it's great to see all problems at once.

Member

CONTRIBUTING.md describes how contributors can run the test on their own computers. This is much preferred to commits to the repo that send repeated notifications to maintainers. We have 200+ open PRs so we need to reduce interrupts.

@cclauss
Member

cclauss commented Oct 29, 2022

I question the utility of coverage -- It generates a TON of output on every test run and no one reads it. All this clutter makes it very difficult (and a lot of scrolling!) for first-time contributors to find their error messages in our GitHub Actions output. If someone really wants to understand the holes in test coverage, they can run it on their own computers and on their own time.

@Cjkjvfnby
Contributor Author

I question the utility of coverage -- It generates a TON of output on every test run and no one reads it.

Totally agree; I only noticed that it exists by accident. If the tests pass, I won't go look at the Actions logs.
I don't like the text format and prefer to use HTML reporting on my machine.

Maybe some aggregator similar to pre-commit.ci could help with it. I have never used such tools. https://about.codecov.io/pricing/

There are a bunch of GitHub Actions that work with coverage; this needs a bit of experimenting: https://github.com/marketplace?type=actions&query=python+coverage+

@algorithms-keeper algorithms-keeper bot added awaiting changes A maintainer has requested changes to this PR and removed awaiting reviews This PR is ready to be reviewed labels Oct 29, 2022
@Cjkjvfnby
Contributor Author

Here is the same report, shown in a bit more detail.

--doctest-continue-on-failure
                        For a given doctest, continue to run after the first failure
def foo():
    """
    int(1)
    >>> 2

    int(2)
    >>> 3
    """

pytest --doctest-modules scratch_7.py

================================================================== FAILURES ==================================================================
__________________________________________________________ [doctest] scratch_7.foo ___________________________________________________________
031 
032     int(1)
033     >>> 2
Expected nothing
Got:
    2

scratch_7.py:33: DocTestFailure
========================================================== short test summary info =========================================================== 

pytest --doctest-modules --doctest-continue-on-failure scratch_7.py

================================================================== FAILURES ================================================================== 
__________________________________________________________ [doctest] scratch_7.foo ___________________________________________________________ 
031
032     int(1)
033     >>> 2
Expected nothing
Got:
    2

scratch_7.py:33: DocTestFailure
031
032     int(1)
033     >>> 2
034
035     int(2)
036     >>> 3
Expected nothing
Got:
    3

scratch_7.py:36: DocTestFailure
========================================================== short test summary info ===========================================================

@algorithms-keeper algorithms-keeper bot added awaiting reviews This PR is ready to be reviewed and removed awaiting changes A maintainer has requested changes to this PR labels Oct 29, 2022
@Cjkjvfnby
Contributor Author

I have moved the pytest and coverage settings to pyproject.toml and brought --doctest-continue-on-failure back (#7840 (comment)).

pyproject.toml Outdated
markers = [
"mat_ops: mark a test as utilizing matrix operations.",
]
addopts = "--durations=10 --doctest-modules --doctest-continue-on-failure --showlocals"
Member

Let's have a list of strings instead if that's possible.
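Since pytest parses addopts as an args-type option, a TOML array should be accepted in pyproject.toml. A sketch of the snippet above rewritten that way (assuming pytest ≥ 6, which introduced pyproject.toml support):

```toml
[tool.pytest.ini_options]
markers = ["mat_ops: mark a test as utilizing matrix operations."]
addopts = [
    "--durations=10",
    "--doctest-modules",
    "--doctest-continue-on-failure",
    "--showlocals",
]
```

One flag per line keeps future diffs to this list down to a single added or removed line.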

Member

@cclauss cclauss left a comment

Nice!

@algorithms-keeper algorithms-keeper bot removed the awaiting reviews This PR is ready to be reviewed label Oct 29, 2022
@cclauss cclauss merged commit a9bd68d into TheAlgorithms:master Oct 29, 2022
@Cjkjvfnby Cjkjvfnby deleted the add-doctest-to-pytest-defualt branch November 5, 2022 22:11