
Add support for Python doctests #20

Open
jakubkaczor opened this issue Oct 13, 2022 · 5 comments
Labels
enhancement New feature or request

Comments

@jakubkaczor

As in the title. For me, the most important thing is being able to run tests from in-code docstrings, but as far as I know, doctests can also live in other text files, for example in documentation. pytest has options to run doctests. It would be nice if one could run doctests alongside the other frameworks, since they are a complement rather than a replacement, unlike the unittest–pytest dichotomy.

https://docs.python.org/3/library/doctest.html
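
For context, a minimal doctest embedded in a docstring looks like the sketch below (an illustrative example, not taken from this issue); pytest collects it when invoked with --doctest-modules:

def add(a, b):
    """Return the sum of a and b.

    >>> add(2, 3)
    5
    """
    return a + b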

@rcarriga rcarriga added the enhancement New feature or request label Oct 17, 2022
@OddBloke
Contributor

OddBloke commented Nov 7, 2022

I went looking at this out of curiosity, and ran into two problems.

Firstly, https://github.com/nvim-neotest/neotest-python/blob/master/lua/neotest-python/base.lua#L13 means that only test_ or _test files will be considered. With addopts=--doctest-modules in pytest.ini, doctests in files matching that naming convention are discovered and run.

Secondly, pytest returns a different exception repr class for doctest failures, so we hit Exception: Unhandled error type (<class '_pytest.doctest.ReprFailDoctest'>), please report to neotest-python repo.

I haven't investigated any further, but that's a starting point for anyone interested.
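
For reference, enabling doctest collection in pytest as described above would look roughly like this in pytest.ini (a standard pytest configuration sketch, not quoted from the issue):

[pytest]
addopts = --doctest-modules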

@OddBloke
Contributor

OddBloke commented Nov 7, 2022

This diff generates errors for the appropriate lines (with somewhat sketchy message content), but those lines don't get annotated: I'm not sure how to debug that further.

--- a/neotest_python/pytest.py
+++ b/neotest_python/pytest.py
@@ -5,6 +5,7 @@ from typing import Callable, Dict, List, Optional, Union
 from .base import NeotestAdapter, NeotestError, NeotestResult, NeotestResultStatus
 
 import pytest
+from _pytest.doctest import ReprFailDoctest
 from _pytest._code.code import ExceptionRepr
 from _pytest.terminal import TerminalReporter
 
@@ -115,6 +116,12 @@ class NeotestResultCollector:
                     if str(traceback_entry.path) == abs_path:
                         error_line = traceback_entry.lineno
                 errors.append({"message": error_message, "line": error_line})
+            elif isinstance(exc_repr, ReprFailDoctest):
+                error_line = error_message = None
+                for reprlocation_line, _ in exc_repr.reprlocation_lines:
+                    error_message = reprlocation_line.message
+                    error_line = reprlocation_line.lineno
+                errors.append({"message": error_message, "line": error_line})
             else:
                 # TODO: Figure out how these are returned and how to represent
                 raise Exception(

@OddBloke
Contributor

OddBloke commented Nov 9, 2022

OK, so I think the lack of annotation is because the doctests are not detected as test nodes by the parsing code. When the results come in, they don't correspond to a known test ID, so they are discarded.

I don't think we'll be able to use pure Treesitter querying to detect doctest nodes: it, unsurprisingly, parses docstrings simply as "string" objects.
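
As a rough sketch of what non-Treesitter discovery could look like, the standard library's doctest module can enumerate doctest locations itself (purely illustrative, not part of neotest-python; the module name below is made up):

import doctest
import importlib

# Hypothetical sketch: list doctests in a module with the stdlib instead of
# Treesitter queries. "mypackage.mymodule" is an illustrative placeholder.
module = importlib.import_module("mypackage.mymodule")
for test in doctest.DocTestFinder().find(module):
    if test.examples:
        # test.lineno is the (zero-based) line where the docstring starts
        print(test.name, test.filename, test.lineno)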

@duguyue100

@OddBloke Any update on this? It would be super useful if this feature were integrated.

@OddBloke
Contributor

Hi @duguyue100! Nothing new on my end, I'm afraid: I don't have a specific need, I just looked into it to see if it was low-hanging fruit (and it wasn't for me, at least!).
