Colin/sanity check #445

Draft · wants to merge 6 commits into main

Conversation

ColinFay (Contributor)

This is a PR implementing a sanity check for an Rmd. (WIP for now)

@schloerke (Collaborator) left a comment

Good work!

Need to find the true setup chunks (rough sketch below). This can be done in a follow-up PR.

Would like to avoid using knitr::*$get() as much as possible, as there is no real knitr environment while creating the exercise objects.
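
For the setup-chunk follow-up mentioned above, a rough sketch of what detection might look like, assuming learnr's usual conventions (an explicit `exercise.setup` chunk option, otherwise a `<label>-setup` naming pattern); the helper name and chunk structure here are illustrative, not the PR's:

find_setup_label <- function(exercise_label, chunks) {
  opts <- chunks[[exercise_label]]$options
  # an explicit exercise.setup option wins ...
  if (!is.null(opts$exercise.setup)) {
    return(opts$exercise.setup)
  }
  # ... otherwise fall back to the "<label>-setup" naming convention
  implicit <- paste0(exercise_label, "-setup")
  if (implicit %in% names(chunks)) {
    return(implicit)
  }
  NULL
}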

@@ -39,7 +39,8 @@ Imports:
     renv (>= 0.8.0),
     curl,
     promises,
-    digest
+    digest,
+    cli
Collaborator

Let's move this to Suggests and check for it before running the main function
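
For context, a Suggests dependency would be guarded at call time; a minimal sketch, assuming a hypothetical entry point named check_rmd() (not the PR's actual function name):

check_rmd <- function(path) {
  # cli is only needed here, so a runtime guard is enough once it moves to Suggests
  if (!requireNamespace("cli", quietly = TRUE)) {
    stop(
      "The 'cli' package is required to run the sanity check. ",
      "Please install it with install.packages(\"cli\").",
      call. = FALSE
    )
  }
  # ... rest of the sanity-check logic ...
}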

# set global tutorial option which we can use as a basis for hooks
# (this is so we don't collide with hooks set by the user or
# by other packages or Rmd output formats)
knitr::opts_chunk$set(tutorial = TRUE)
Collaborator

Let's also restore the chunk option to its original value with on.exit()
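
A sketch of the save-and-restore pattern, assuming it runs inside the same function that sets the option:

# remember the chunk options as they were before we touched them ...
old_opts <- knitr::opts_chunk$get()
knitr::opts_chunk$set(tutorial = TRUE)
# ... and put them back when the function exits
on.exit(knitr::opts_chunk$restore(old_opts), add = TRUE)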

Comment on lines +112 to +114
# Setting learnr options
learnr::tutorial_options()

Collaborator

This is a no-op

Suggested change
# Setting learnr options
learnr::tutorial_options()

# inside the knitr environment
on.exit({
knitr::knit_code$restore()
})
Collaborator

Never hurts / always safe to use add = TRUE

Suggested change
})
}, add = TRUE)
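
A tiny illustration of why add = TRUE is the safe default here (plain base-R behavior, not code from the PR):

f <- function() {
  on.exit(message("first"))
  on.exit(message("second"))              # replaces "first"
  on.exit(message("third"), add = TRUE)   # runs in addition to "second"
}
f()
#> second
#> third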

# This will register code inside `knitr::knit_code`
# The results is a list containing all the elements from the Rmd
# (i.e code + title + yaml)
res <- knitr:::split_file(
Collaborator

Is there a min version for this knitr function?
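
If a minimum version does turn out to be needed, it would be declared in DESCRIPTION (e.g. knitr (>= x.y.z) under Imports) and could also be guarded at runtime; a sketch with a placeholder threshold, not a verified minimum:

# "1.0" stands in for whatever the real minimum turns out to be
if (utils::packageVersion("knitr") < "1.0") {
  stop("A newer knitr is required for knitr:::split_file().", call. = FALSE)
}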

Comment on lines +363 to +365
setup = {
paste0(grab_chunk("setup")[[1]]$code, collapse = "\n")
},
Collaborator

This is currently not used in evaluate_exercise(). Can set to NULL.

Suggested change
setup = {
paste0(grab_chunk("setup")[[1]]$code, collapse = "\n")
},
setup = NULL,

Comment on lines +163 to +164
setup_chunk <- usefull_chunks[["setup"]]
usefull_chunks[["setup"]] <- NULL
Collaborator

Should only need to evaluate this once. Save the result into an environment

Suggested change
setup_chunk <- usefull_chunks[["setup"]]
usefull_chunks[["setup"]] <- NULL
setup_chunk <- usefull_chunks[["setup"]]
usefull_chunks[["setup"]] <- NULL
setup_chunk_envir <- new.env(parent = globalenv())
eval(parse(text = setup_chunk), envir = setup_chunk_envir)


res <- evaluate_exercise(
exercise_blt,
envir = new.env()
Collaborator

Use the previously sourced 'setup' environment

Suggested change
envir = new.env()
envir = setup_chunk_envir

usefull_chunks[chunk_][[1]]$options
},
engine = {
usefull_chunks[chunk_][[1]]$enginee %||% "r"
Collaborator

Suggested change
usefull_chunks[chunk_][[1]]$enginee %||% "r"
usefull_chunks[chunk_][[1]]$engine %||% "r"

```

## Flight

Collaborator

Suggested change
```{r random_chunk}
# should not be checked
1 + 1
```

}

# Add the options list to the chunk
usefull_chunks[[i]]$options <- options
Contributor

Note: do not overwrite useful chunk options permanently. An exercise could be a setup chunk for another exercise.
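
One way to honor that, sketched with the PR's existing names (the surrounding structure is assumed from the diff):

# work on a local copy so the shared `usefull_chunks` list keeps the
# original options for exercises that reuse this chunk as their setup
chunk <- usefull_chunks[[i]]
chunk$options <- options
# ... use `chunk` for this exercise only, leaving usefull_chunks[[i]] untouched ...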

Collaborator

To help with this, I think we should abstract out a check_exercise_chunk() method that can be lapply()-ed over the exercises.

This inner function should also allow the caller to pass in their own user_code value. If an author has a bad merge, and they are using gradethis::grade_code(), all it would be checking is whether x == x, not whether x is correct.

Both of these functions would serve great purposes. check_exercises() would be a quick and dirty check. check_exercise_chunk() could be used within CI to validate that tutorials work as expected with many different inputs.


Something like...

check_exercise_chunk <- function(
  file,
  ..., # should be ignored
  chunks = file_to_chunks(file),
  user_code = NULL # or expression
) {
  # ....
  if (is.null(user_code)) {
    # default to the exercise's solution code
    user_code <- solution_code
  }
  # ....
  # return feedback object
}

# would use the solutions for every exercise
check_exercises("file.Rmd") 

# would use the supplied user code to 'ex1' in 'file.Rmd'
check_exercise_chunk("file.Rmd", "ex1", user_code = rlang::expr({
  # this is user code
  40 + 2
}))

I would be REALLY happy to have this inside gradethis to validate that gradethis is working as expected with MANY different types of testing documents.

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.
