# MemoryLab

MemoryLab is a drag-and-drop web application designed for computer science education. It allows students to visualize Python memory models by arranging blocks representing variables, objects, functions, and values on an interactive canvas.
- Key Features
- Developer Setup (Local)
- Production Deployment
- Adding Questions to the Database
- Technical Details: Validation Algorithm
- Automated Grading with MarkUs
## Key Features

- Interactive Canvas: A Scratch-style interface for building memory diagrams using frames, objects, and values.
- Practice vs. Test Modes:
  - Practice Mode: A guided environment with structured assistance for model construction.
  - Test Mode: A free-form environment with no constraints for independent self-assessment.
- Question Bank: Includes built-in exercises drawn from previous first-year computer science courses and tests at the University of Toronto.
- Automatic Grading: Provides detailed, traceable feedback by treating memory models as graph-isomorphism problems.
- Multi-Format Export: Save models as JSON (for re-importing or uploading to MarkUs), SVG, or PNG.
## Developer Setup (Local)

Follow these steps to set up and run the development environment.

### Clone the Repository

```bash
git clone https://github.com/YOUR_USERNAME/memory-model-editor.git
cd memory-model-editor
```

### Database Setup

MemoryLab requires PostgreSQL 17 or above.

- macOS:

  ```bash
  brew install postgresql@17 && brew services start postgresql@17
  ```

- Ubuntu/Debian:

  ```bash
  sudo apt update && sudo apt install postgresql-17 postgresql-contrib-17 && sudo systemctl start postgresql
  ```

- Windows: Download and install from postgresql.org.
### Configure the Backend Environment

```bash
cd backend
cp .env.example .env
```

Update the `.env` file with your database connection string:

```
DATABASE_URL=postgresql://your_username@localhost:5432/memorylab?sslmode=disable
NODE_ENV=development
```
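As a quick sanity check before starting the backend, Node's built-in WHATWG `URL` class can confirm the connection string is well-formed. This is a hypothetical snippet, not part of the repo; the fallback string is just the example above.

```javascript
// Parse DATABASE_URL to catch obvious typos (wrong port, missing database
// name, malformed sslmode parameter) before the backend tries to connect.
const dbUrl = new URL(
  process.env.DATABASE_URL ??
    "postgresql://your_username@localhost:5432/memorylab?sslmode=disable"
);

console.log(dbUrl.hostname);                  // host the backend will connect to
console.log(dbUrl.port);                      // "5432" for a default install
console.log(dbUrl.pathname.slice(1));         // database name, e.g. "memorylab"
console.log(dbUrl.searchParams.get("sslmode"));
```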
### Import Questions

```bash
cd backend
npm run import-questions
```

This command creates the database, sets up the `practice_questions` and `test_questions` tables, and imports data from the JSON files in `backend/database/`.
### Run the Development Servers

| Component | Commands | URL |
|---|---|---|
| Frontend | `cd frontend && npm install && npm run dev` | http://localhost:3000 |
| Backend | `cd backend && npm install && npm run dev` | http://localhost:3001 |
## Production Deployment

Ensure a PostgreSQL instance (v17+) is running on your server. For OS-specific installation steps, refer to the Database Setup section above.
```bash
cd backend
cp .env.production .env
```

Update the `.env` file with your database connection string:

```
DATABASE_URL=postgresql://username:password@hostname:5432/memorylab?sslmode=require
NODE_ENV=production
```

```bash
# Backend
cd backend
npm install
npm run import-questions

# Frontend
cd frontend
npm install

# Build & Start Backend
cd backend
npm run build
npm start

# Build & Start Frontend
cd frontend
npm run build
npm start
```

## Adding Questions to the Database

1. Edit `backend/database/practiceQuestions.json` or `backend/database/testQuestions.json`.
2. Add your new question to the array following the existing format.
3. Run `npm run import-questions` to re-import.
### Question Formats

Frame:

```json
{
  "id": null,
  "name": "__main__",
  "type": ".frame",
  "order": 1,
  "value": {"variable_name": id_reference}
}
```

Primitive (int, str, bool):

```json
{ "id": 1, "type": "int", "value": 5 }
```

List:

```json
{ "id": 2, "type": "list", "value": [1, 2, 3] }
```

## Technical Details: Validation Algorithm

The validation function (`backend/src/modules/canvasEditor/validateAnswer.ts`) evaluates submissions by treating the memory model as a graph-isomorphism problem. The goal is to determine if the user's canvas is structurally identical to the solution.
- Nodes: Unique "boxes" (primitives, lists, or function frames).
- Edges: References formed by variable names (in frames) or pointers (in containers).
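For instance, under the question formats shown earlier, the one-line program `x = 5` corresponds to a two-node graph: a frame box and an int box connected by the variable `x`. This is a hypothetical illustration, not a question from the bank:

```javascript
// Hypothetical model for `x = 5`. The frame's value maps variable names to
// the ids of the boxes they reference, so `x` is an edge from the __main__
// frame (one node) to the int box with id 1 (another node).
const model = [
  { id: null, name: "__main__", type: ".frame", order: 1, value: { x: 1 } },
  { id: 1, type: "int", value: 5 },
];
```

A structurally identical model that happened to use id 42 instead of 1 for the int box would still be isomorphic to this one.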
The function finds a single, consistent bijection (one-to-one mapping) between the user's box IDs and the answer's box IDs.
1. Top-Level Checks (`validateAnswer`): Retrieves the solution, scans for duplicate IDs, and validates the structure of function frames.
2. Frame and Variable Comparison (`compareFrames`): Iterates through frames. If a variable exists in both the solution and the submission, it calls `compareIds`.
3. Recursive ID Comparison (`compareIds`):
   - Bijection Enforcement: Checks if the current mapping conflicts with existing mappings in `answerToInputMap` and `inputToAnswerMap`.
   - Type Verification: Confirms types (e.g., list vs. int) match exactly.
   - Primitive Value Check: Direct value comparison for primitives.
   - Circular References: Uses a `visited` map to track pairs of IDs to prevent infinite loops in recursive structures.
4. Collection Checks:
   - Arrays (`checkArray`): Order-sensitive comparison of elements.
   - Sets (`checkSet`): Performs tentative recursive matching to find a valid mapping in unordered data.
   - Dictionaries/Objects (`checkDict`/`checkObject`): Matches keys/attributes exactly and recurses on the referenced IDs.
5. Final Checks:
   - Call Stack Order: Verifies that frames are ordered correctly.
   - Orphan Detection: Identifies user boxes not reached during traversal (unmapped boxes).
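The bijection bookkeeping at the heart of these steps can be sketched as follows. This is a simplified illustration, not the real `compareIds`: it assumes each model is a `Map` from box id to box, assumes list elements store box ids, and handles only ints and lists (no frames, sets, dicts, objects, or cycle tracking).

```javascript
// Simplified sketch of the bijection check: a2i and i2a record the
// one-to-one correspondence between answer ids and input ids.
function compareIds(ansId, inpId, answer, input, a2i = new Map(), i2a = new Map()) {
  // Bijection enforcement: an answer id may map to exactly one input id,
  // and vice versa; a conflicting pair fails immediately.
  if (a2i.has(ansId)) return a2i.get(ansId) === inpId;
  if (i2a.has(inpId)) return false;
  a2i.set(ansId, inpId);
  i2a.set(inpId, ansId);

  const ansBox = answer.get(ansId);
  const inpBox = input.get(inpId);
  if (ansBox.type !== inpBox.type) return false; // type verification
  if (ansBox.type === "int") {
    return ansBox.value === inpBox.value; // primitive value check
  }
  // Order-sensitive array comparison, recursing on the referenced ids.
  if (ansBox.value.length !== inpBox.value.length) return false;
  return ansBox.value.every((ref, i) =>
    compareIds(ref, inpBox.value[i], answer, input, a2i, i2a)
  );
}
```

Because the maps record id *pairs* rather than comparing raw ids, two structurally identical models still match even when their boxes carry entirely different id values.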
## Automated Grading with MarkUs

You can use the `validateAnswer.ts` logic to automate grading for student submissions exported as JSON.

- Environment: Ensure the MarkUs testing environment has Node.js and the compiled `validateAnswer.js` (from the `/dist` folder).
- Grading Script: Create a wrapper script that loads the student's exported JSON and the solution JSON from your database.
```js
const fs = require("fs");
const {
  validateAnswer,
} = require("./dist/modules/canvasEditor/validateAnswer");

// Load files
const studentSubmission = JSON.parse(fs.readFileSync(studentJsonPath));
const expectedSolution = JSON.parse(fs.readFileSync(solutionJsonPath));

// Run algorithm
const result = validateAnswer(studentSubmission, expectedSolution);

// Output for MarkUs
if (result.correct) {
  process.stdout.write("Test Passed: 100%");
} else {
  process.stdout.write("Errors identified:\n" + result.errors.join("\n"));
}
```

- Feedback: The `result.errors` array provides specific reasons for failure, which can be piped directly into the MarkUs feedback file.