
BUG: Get eval results from metadata #597

Open
wants to merge 11 commits into master

Conversation

flying-sheep
Contributor

Fixes #596

No idea how to elegantly pass the SyntaxNode all the way down to the role, and I’m not super interested in figuring that out; it’s your call how to implement that.

@codecov-commenter

codecov-commenter commented May 7, 2024

Codecov Report

Attention: Patch coverage is 89.47368% with 2 lines in your changes missing coverage. Please review.

Project coverage is 80.98%. Comparing base (af2d197) to head (6d8447d).
Report is 14 commits behind head on master.

Current head 6d8447d differs from pull request most recent head 498cad5

Please upload reports for the commit 498cad5 to get more accurate results.

Files                          | Patch % | Lines
myst_nb/core/execute/base.py   | 87.50%  | 1 Missing ⚠️
myst_nb/core/execute/inline.py | 75.00%  | 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #597      +/-   ##
==========================================
+ Coverage   80.97%   80.98%   +0.01%     
==========================================
  Files          30       30              
  Lines        2702     2714      +12     
==========================================
+ Hits         2188     2198      +10     
- Misses        514      516       +2     
Flag    | Coverage Δ
pytests | 80.98% <89.47%> (+0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@flying-sheep
Contributor Author

@agoose77 is there a way this can make it into this week’s release?

@flying-sheep
Contributor Author

flying-sheep commented Jun 27, 2024

Hi @agoose77, would be great if you’d take a look!

The current doc build failures are not caused by this change; they stem from #610.

@agoose77
Collaborator

I will hold this for now; we need to rectify the different ways that myst-nb and jupyterlab-myst perform caching of output expressions.

@flying-sheep
Contributor Author

This just uses the user expression result that’s already stored in the metadata. It doesn’t touch how it’s being stored.

That doesn’t seem to have any potential for problems, or am I missing something?
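To illustrate the idea being discussed, here is a minimal, hypothetical sketch of reading an already-stored inline-expression result back out of cell metadata. The metadata key name (`"user_expressions"`) and the entry shape (`expression`/`result` pairs in Jupyter display-data form) are assumptions for illustration; check myst-nb's actual schema before relying on them.

```python
# Hypothetical sketch: look up the stored result for an inline expression
# in a cell's metadata, without re-executing the notebook. The key name
# "user_expressions" and the entry layout are assumptions, not myst-nb's
# confirmed schema.

def get_eval_result(cell_metadata: dict, expression: str):
    """Return the stored result dict for `expression`, or None if absent."""
    for entry in cell_metadata.get("user_expressions", []):
        if entry.get("expression") == expression:
            return entry.get("result")
    return None

# A cell's metadata as it might look after a previous execution:
metadata = {
    "user_expressions": [
        {
            "expression": "1 + 1",
            "result": {
                "status": "ok",
                "data": {"text/plain": "2"},
                "metadata": {},
            },
        }
    ]
}

result = get_eval_result(metadata, "1 + 1")
print(result["data"]["text/plain"])  # prints "2"
```

The point of the approach is visible here: the lookup is read-only, so it works even when `nb_execution_mode == 'off'`, as long as the results were written into the metadata by an earlier execution.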

@agoose77
Collaborator

> This just uses the user expression result that’s already stored in the metadata. It doesn’t touch how it’s being stored.
>
> That doesn’t seem to have any potential for problems, or am I missing something?

My initial concerns were around the way that execution should be handled. But your choice of wording has helped me to disambiguate the two cases: reading from a notebook when execution is disabled is something that we can (and should) do. I'll revisit this PR shortly (probably next week).

@flying-sheep
Contributor Author

Hi, it’s been a month! It would be great if you could take a look here.

@agoose77
Collaborator

Thanks @flying-sheep for your continued patience. If we're going to get this one in, I want to add support for inline expressions in jupyter-cache as well; that way, all of our execution modes will support expressions. I'll circle back once the myst-nb release / JB cycle is done.

@flying-sheep
Contributor Author

Thanks for the update. I’d say merging this as-is would probably already add value, as it’s a tiny change with big effects.

Do you know what adding support for jupyter-cache would entail?

@flying-sheep
Contributor Author

ping

@bsipocz bsipocz changed the title Get eval results from metadata BUG: Get eval results from metadata Nov 26, 2024
@bsipocz bsipocz added the bug Something isn't working label Nov 26, 2024
Successfully merging this pull request may close these issues.

{eval}`thing` does not work with nb_execution_mode == 'off'