New Feature: NbGrader extension for automatically generating test code #1633
Replies: 6 comments
-
This is awesome! 🎉 Thank you so much for sharing! In principle I think it would be great to have something like this as part of core nbgrader---I am sure a lot of instructors would find it super useful. However, there's one catch, which is that one of the main design principles of nbgrader is that, like the notebook itself, it should be language agnostic. It sounds like AutoTest is python-specific at the moment, but I have a sense that the way you've designed it, it might be able to be made to support other types of syntax? Do you think that's true? I think if you could add support for hooking in other languages then this would be a great addition to core nbgrader. What I'm thinking is something along the lines of how
-
Hi @jhamrick ! Very glad that you like it :)
It actually isn't! AutoTest is language agnostic. We have implemented all the various bits needed for python and R, and getting Julia in there too would take very little effort -- just adding an item to a few dictionaries in the source. I realize now all of our demos/etc are in python, which may have been a bit misleading -- that's largely because we're currently using AutoTest in developing a python-flavoured course here at UBC. But we do a lot of teaching here in R, so language-agnosticism was a primary objective from the get-go. In fact, instructors don't even have to manually specify the language in a config file -- AutoTest detects which language is being used from the notebook itself, and then uses appropriate language-specific code wherever necessary, e.g. when cleaning outputs from the kernel. But there really aren't too many places where language-specific code is necessary inside AutoTest itself. Most of the language-specific logic lives in the template file that the instructor writes to specify how to dispatch tests based on object types. Example here for python. At one point early in development I had some working R examples, but those must have gotten lost after some repo restructuring... I'll open an issue in the AutoTest repo to add some R examples to the demo container.
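For anyone curious how detecting the language "from the notebook itself" can work: the kernelspec metadata that every `.ipynb` file carries records the kernel's language. Below is a minimal sketch of that idea; `detect_language` is a hypothetical name, and AutoTest's actual detection logic may differ.

```python
import json

def detect_language(notebook_path):
    """Return the language recorded in a notebook's kernelspec metadata.

    Hypothetical helper illustrating notebook-based language detection,
    not AutoTest's actual implementation.
    """
    with open(notebook_path, encoding="utf-8") as f:
        nb = json.load(f)
    # e.g. "python" for ipykernel, "R" for IRkernel, "julia" for IJulia
    kernelspec = nb.get("metadata", {}).get("kernelspec", {})
    return kernelspec.get("language", "unknown")
```

With the language in hand, a tool can look up the matching language-specific snippets (e.g. in a dictionary keyed by language) without any extra configuration from the instructor.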
Ah! I took a look at the
Anyway, given that this seems to be of interest:
-
Ah, fantastic!
Yes, that would be great! And yes,
Yep!
-
Alright, sounds good! It may take us a bit to get around to it -- we'll likely have to do some work to update it from nbgrader 0.6.2 to the current main branch -- but we'll open one as soon as we can.
-
Brief update on this thread for anyone visiting a year later: the relevant PR for this is #1817. There are older PRs open (#1650 and #1656) that are now stale as of 0.9.x.
-
Closing this discussion since #1817 has been merged.
-
Hello nbgrader team!
This is Alireza Iranpour (cc Trevor Campbell @trevorcampbell and Tiffany Timbers @ttimbers) from the UBC Department of Statistics. Just wanted to reach out to let you know about a new extension to nbgrader, called AutoTest, that we have recently developed. AutoTest significantly reduces the workload of creating autograded notebooks by automating the generation of test code.
Please find the link to the repo here
Read below for more details (and see our demo video here), but we were wondering: would there be any interest in integrating AutoTest within nbgrader itself (via a pull request)? We think this could be a very useful tool for the general nbgrader userbase.
(NB: currently AutoTest is based on nbgrader 0.6.2 -- we of course intend to bring it up to date with the current version of nbgrader prior to opening a PR)
How test cases are added currently in standard nbgrader:
Conventionally, to make a notebook auto-graded, instructors have to write custom assert statements for each question and, in almost all cases, hard-code the correct value against which the student's output/answers will be compared. Doing this across one or more assignments is error-prone and time-consuming. And for tests where the correct value should be hidden from the student (e.g., multiple choice questions), the amount of work is even higher: the instructor typically needs to hash the correct solution, paste it into the notebook, and write code to compare the hashed student answer.
AutoTest addresses these challenges by:
Other features of AutoTest:
How AutoTest works:
AutoTest is an additional preprocessor that runs in the generate assignments converter, and transforms `### AUTOTEST` lines into proper assert statements. AutoTest is a non-destructive addition and does not interfere with nbgrader's normal operations. It is only triggered when the AutoTest delimiter is found in an auto-graded cell within a given notebook. Students do not need any additional software -- AutoTest simply inserts code into the release notebook upon generation.
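The delimiter-to-assert transformation can be sketched roughly as follows. This is a simplified illustration under assumed behaviour, not AutoTest's actual implementation (which dispatches on object types via the template file): each `### AUTOTEST` expression is evaluated against the instructor's solution and pinned to its current value.

```python
def expand_autotest(cell_source, solution_namespace):
    """Replace `### AUTOTEST <expr>` lines with assert statements.

    Simplified sketch of the preprocessing step, not AutoTest's real code:
    the expression is evaluated in the instructor's solution namespace,
    and the resulting value is baked into an assert for the release copy.
    """
    out = []
    for line in cell_source.splitlines():
        stripped = line.strip()
        if stripped.startswith("### AUTOTEST"):
            expr = stripped[len("### AUTOTEST"):].strip()
            # Evaluate against the solution to capture the expected value
            expected = repr(eval(expr, solution_namespace))
            out.append(f"assert {expr} == {expected}")
        else:
            out.append(line)
    return "\n".join(out)

# e.g. with the solution defining x = 2:
# expand_autotest("### AUTOTEST x + 1", {"x": 2}) -> "assert x + 1 == 3"
```

Because untouched lines pass through unchanged, a notebook with no `### AUTOTEST` delimiters is left exactly as-is, which matches the non-destructive behaviour described above.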
Thanks -- we look forward to your response!