Final deliverable: Packaging
- Due: No due date
- Points: 0
- Submitting: a website URL or a file upload
This checkpoint is to make sure your work is readily accessible to others who want to use it.
How it will work
You will "package" your work as described below (depending on whether it's an element or not), and another team will try to use it to create an example question. Their experience doing this will inform the documentation that ultimately gets packaged with your element.
Using the following arbitrary list of teams, team N will have their work tested by team (N+1) mod 6:
If your project takes the form of an element
- Make sure all element files are self-contained in a single subdirectory under the elements/ directory of the pl-ucb-star-assessments "course" repo on GitHub. If there are external dependencies (other than a Docker image) that do not belong in this directory, let's talk about it so we can figure out how best to package your work! We will be making these available to other courses via advertising mechanisms to be determined.
- At the top level of your element's subdirectory, include a README.md with clear instructions for using the element, including some code for an example question (i.e. an excerpt that might appear in question.html; a sketch of such an excerpt appears below) and a screenshot of the result.
- Within the questions subdirectory for your project, add one or more example questions to which the README refers.
- In the README, include a link to your project's "in progress slide deck" on Google Drive, used throughout the semester.
- If your element/questions also require a separate Docker image (e.g. for a custom autograder that doesn't run inside PL itself), please include the Dockerfile with your element code, and send Armando a link to your Docker image: we have a Dockerhub account for the ACE Lab where we can host it and make it readily available. Please also include an example info.json file that shows how to reference the image at autograding time (a sketch appears below).
- If your element relies on other specific files being available in clientFilesQuestion/serverFilesQuestion or clientFilesCourse/serverFilesCourse, make sure the README explains that.

Basically, given the README and one of the example questions, someone who has never seen your element should be able to write a new question that uses it.
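As a concrete illustration of the kind of excerpt the README might include, here is a minimal question.html sketch. The tag `pl-my-star-element` and its attribute are hypothetical placeholders; substitute whatever your element actually defines.

```html
<!-- Hypothetical excerpt: <pl-my-star-element> and its attribute are
     placeholders for your element's real tag and options. -->
<pl-question-panel>
  <p>Use the widget below to construct and submit your answer.</p>
</pl-question-panel>

<pl-my-star-element answers-name="response"></pl-my-star-element>
```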
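For the Docker case, a minimal info.json sketch along these lines shows how a question can point at an externally hosted autograder image at grading time. The UUID, title, image name, and entrypoint path are placeholders, not a real ACE Lab image.

```json
{
  "uuid": "REPLACE-WITH-A-REAL-UUID",
  "title": "Example question for pl-my-star-element",
  "topic": "Example",
  "type": "v3",
  "gradingMethod": "External",
  "externalGradingOptions": {
    "enabled": true,
    "image": "acelab/my-star-autograder:latest",
    "entrypoint": "/grader/run.sh",
    "timeout": 60
  }
}
```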
If your project isn't a new element
If your project takes the form of one or more new questions, then in the subdirectory under questions for your project, include one or more example questions (question.html and, if appropriate, an annotated server.py) that are explained in a README at the top level. Indicate clearly how instructors should create additional questions: which parts of your example question.html and/or server.py files will they need to modify? (A minimal server.py sketch appears below.)
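For instance, a typical server.py defines a generate(data) hook that randomizes parameters and records the correct answer. The parameter names below (a, b, sum) are purely illustrative and not tied to any particular project.

```python
import random

def generate(data):
    # Randomize the parameters for this instance of the question.
    a = random.randint(1, 10)
    b = random.randint(1, 10)
    data["params"]["a"] = a
    data["params"]["b"] = b
    # Record the correct answer so PrairieLearn can grade submissions.
    data["correct_answers"]["sum"] = a + b
```

Your annotated server.py should point out exactly which of these lines an instructor would edit to produce a new question.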
Deployment/adoption: open practice
We intend to keep an "open practice server" running for students who want to practice on any of your projects. This will take the form of keeping open the STAR Assessments "course" we created for server deployment, and arranging to allow any student to "enroll" in the course and attempt any of its assessments.
If you'd like to make your project accessible as a general practice tool for all students:
- On your own branch of the repo, create one or more assessments (in courseInstance/assessments/*/infoAssessment.json) and give each a clear title reflecting what it contains ("Practice problems for..."). The assessment title should include both the topic names and (if appropriate) a Berkeley course number tie-in. You can set up the assessment as "Homework", allowing unlimited attempts and so forth; the PrairieLearn documentation describes how to do that, and a sketch appears after this list.
- Within the assessment, make sure the questions are sufficiently self-explanatory for students to practice on their own; there won't really be a way for students to ask questions.
- On your local computer, make sure your entire repo loads without errors and the assessment in PL behaves the way you expect.
- Open a pull request to merge those changes to the main branch. Armando or Dan will review the PR, merge it, and make your changes live.
- Finally, write a 1-paragraph description that we can use to publicize these to EECS students on an ongoing basis. We will create a page on the ACELab Mastery Learning project site that points to your work!
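As a rough sketch of what such an infoAssessment.json might look like, here is a minimal "Homework"-type assessment with unlimited attempts. The UUID, title, and question id are placeholders you would replace with your own.

```json
{
  "uuid": "REPLACE-WITH-A-REAL-UUID",
  "type": "Homework",
  "title": "Practice problems for <your topic> (CS 61C tie-in)",
  "set": "Homework",
  "number": "1",
  "allowAccess": [
    { "credit": 100 }
  ],
  "zones": [
    {
      "questions": [
        { "id": "yourProject/exampleQuestion1", "points": 1, "maxPoints": 5 }
      ]
    }
  ]
}
```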
Deployment/adoption: in courses
If a course instructor needs help onboarding their course to PrairieLearn to use your exercises, let one of us know!