Project Overview

Description

MovieStripes is a serverless web app where players guess movie titles from “barcode” images generated by stacking color-averaged video frames. It streamlines puzzle generation, delivery and validation using cloud services, enabling a scalable, low-maintenance gaming experience.

Gameplay Concept

  1. The system extracts frames from a movie, averages each frame’s colors vertically and stacks them to form a barcode image.
  2. Players view the barcode and submit their guess of the movie title.
  3. The app verifies answers in real time and tracks scores and progress.

Core Technologies

  • AWS Lambda: runs image-processing and answer-validation functions
  • Amazon S3: stores raw videos, generated barcode images and frontend assets
  • Amazon RDS (PostgreSQL): persists puzzles, users and score data
  • Static Frontend: served via S3/CloudFront, built with modern JavaScript (HTML/CSS/JS)
  • Serverless Framework: defines infrastructure as code for consistent deployments

Why It Matters

  • Scalable Serverless Architecture: auto-scales with demand and minimizes operational overhead
  • Extensible Puzzle Workflow: admins upload new movies and metadata; Lambdas regenerate puzzles automatically
  • Demonstrates Image Processing: showcases real-world use of frame extraction and color averaging
  • Engaging User Experience: offers a unique, visually driven game that encourages repeat play and competition

Getting Started

Get a local copy of MovieStripes up and running in minutes. This setup uses Docker Compose to launch Postgres, the Node.js backend, and the React frontend with live-reload and initial schema import.

Prerequisites

• Docker & Docker Compose (v1.27+)
• Ports 3000, 3001, 5432 free

Clone the Repository

git clone https://github.com/yourusername/moviestripes.git
cd moviestripes

Environment Configuration

Create a .env at project root (optional—defaults are set in docker-compose.yaml):

cp .env.example .env
# Edit .env to override:
# DATABASE_URL=postgresql://movieuser:moviepass@postgres:5432/moviebarcode
# REACT_APP_API_URL=http://localhost:3001

Start the Full Stack

From project root:

docker-compose up --build -d

  • --build rebuilds images when you change Dockerfiles
  • -d runs containers in the background

Verify all services are healthy:

docker-compose ps

Access the App

• Frontend → http://localhost:3000
• API → http://localhost:3001

Inspecting Logs & Containers

Tail backend logs:

docker-compose logs -f backend

Open a shell in the backend container:

docker-compose exec backend sh

Run a one-off backend script (e.g., seed database):

docker-compose run --rm backend node scripts/seed-db.js

Tear Down

Stop containers and remove volumes:

docker-compose down -v

Manual Development Mode

If you prefer running services without Docker:

  1. Backend
    cd backend
    npm install
    export DATABASE_URL=postgresql://movieuser:moviepass@localhost:5432/moviebarcode
    npm run dev
    
  2. Frontend
    cd frontend
    npm install
    export REACT_APP_API_URL=http://localhost:3001
    npm start
    
  3. Postgres
    Install Postgres 15 locally, create user/database matching DATABASE_URL, then import schema:
    psql -U movieuser -d moviebarcode -f backend/schema.sql
    

By following these steps, you’ll have a full local MovieStripes environment that mirrors production’s PostgreSQL schema, Lambda-style API, and React front end—ready for development and testing.

Architecture & Core Concepts

MovieStripes decomposes into dedicated services that collaborate to generate daily movie‐barcode puzzles, serve them to players, and persist data. This section outlines each layer and how they interact in both production (AWS) and local development.

1. System Overview

Components:

  • Frontend: React SPA served on S3 + CloudFront (or Docker Compose port 3000)
  • API Layer: Serverless Lambdas behind API Gateway (or Express in Docker on port 3001)
  • Barcode Generator: MovieBarcodeGenerator invoked by scheduler or CLI
  • Data Store: RDS (Postgres) for metadata, S3 for barcode images
  • Scheduler: CloudWatch Events → Puzzle‐generator Lambda (or cron on local machine)

Visualization:

[User Browser]
      ↓
  React SPA ──→ GET /api/puzzle/today, POST /api/guess
      ↓                                   ↑
  API Gateway / Express                   │
      ↓                                   │
  Lambda Functions / Node.js ─────────────┘
    ├─ puzzle-retrieval
    ├─ guess-validation
    └─ puzzle-generator ──→ MovieBarcodeGenerator
                                  ↓
                          S3 Bucket (barcodes)
                                  ↓
                          RDS (puzzles metadata)

2. Frontend (React SPA)

  • Fetches current puzzle metadata (GET /api/puzzle/today)
  • Renders barcode image URL from S3
  • Submits guesses (POST /api/guess)
  • Provides autocomplete via /api/movies/search?q=

Basic fetch example:

// src/services/api.js
const API_BASE = process.env.REACT_APP_API_URL;

export async function fetchTodayPuzzle() {
  const res = await fetch(`${API_BASE}/api/puzzle/today`);
  if (!res.ok) throw new Error('Failed to load puzzle');
  return res.json(); // { id, title: null, barcodeUrl, scheduledDate }
}
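
A companion helper for guess submission and title autocomplete might look like the sketch below. It assumes the routes and response shapes listed in this section (adjust names to match the deployed API); it is not the repository's actual service module.

// src/services/api.js (continued) - sketch assuming the routes listed above
export async function submitGuess(movieId, guess, guessNumber) {
  const res = await fetch(`${API_BASE}/api/guess`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ movieId, guess, guessNumber })
  });
  if (!res.ok) throw new Error('Guess submission failed');
  return res.json(); // { correct, gameOver, guessNumber, hints?, answer? }
}

export async function searchTitles(query) {
  const res = await fetch(`${API_BASE}/api/movies/search?q=${encodeURIComponent(query)}`);
  if (!res.ok) throw new Error('Search failed');
  return res.json(); // array of matching titles for autocomplete
}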

3. API Layer

In production, each handler is a Lambda behind API Gateway. Locally, an Express app on port 3001 exposes identical routes.

Key routes:

  • GET /api/puzzle/today → returns today’s puzzle record
  • POST /api/guess → validates user guess against stored title
  • GET /api/movies/search?q= → returns title suggestions for autocomplete
  • POST /api/admin/puzzle (secured) → triggers puzzle upload
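
A minimal sketch of the local Express variant is shown below; the two helper functions are placeholder stubs standing in for the real Postgres-backed data access, and the file name is an assumption.

// backend/server-local.js - illustrative sketch of the local Express API.
// getTodayPuzzle/checkGuess are placeholder stubs for the real db helpers.
const express = require('express');
const app = express();
app.use(express.json());

const getTodayPuzzle = async () => ({
  id: 1,
  barcodeUrl: '/api/barcodes/tt1375666',
  scheduledDate: '2025-08-13'
});
const checkGuess = async (movieId, guess, guessNumber) => ({
  correct: false,
  gameOver: guessNumber >= 6,
  guessNumber,
  hints: []
});

app.get('/api/puzzle/today', async (_req, res) => {
  const puzzle = await getTodayPuzzle();
  res.json({ ...puzzle, title: null }); // the answer is never sent to clients
});

app.post('/api/guess', async (req, res) => {
  const { movieId, guess, guessNumber } = req.body;
  if (!movieId || !guess) {
    return res.status(400).json({ message: 'movieId and guess are required' });
  }
  res.json(await checkGuess(movieId, guess, guessNumber));
});

app.listen(3001, () => console.log('API listening on :3001'));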

Example Lambda handler (Node.js):

// backend/lambdas/puzzle-generator.js
const MovieBarcodeGenerator = require('../barcode-generator');
const { insertPuzzle } = require('../db');
const s3 = new (require('aws-sdk/clients/s3'))();

exports.handler = async () => {
  // 1. Generate barcode
  const gen = new MovieBarcodeGenerator();
  const { success, path, filename, frameCount, error } =
    await gen.generateBarcode('/tmp/trailer.mp4', 'Inception (2010)');
  if (!success) throw new Error(error);

  // 2. Upload to S3
  const fileContent = require('fs').readFileSync(path);
  await s3.putObject({
    Bucket: process.env.BARCODE_BUCKET,
    Key: filename,
    Body: fileContent,
    ContentType: 'image/jpeg'
  }).promise();

  // 3. Insert metadata into RDS
  await insertPuzzle({
    title: 'Inception',
    year: 2010,
    genre: 'Sci-Fi',
    director: 'Christopher Nolan',
    barcodeUrl: `https://${process.env.BARCODE_BUCKET}.s3.amazonaws.com/${filename}`,
    scheduledDate: new Date().toISOString().slice(0, 10)
  });

  return { statusCode: 200, body: 'Puzzle generated' };
};
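
The guess-validation route could follow the same pattern. The sketch below is illustrative only: getPuzzleById is an assumed helper alongside insertPuzzle in ../db, and the hint logic is deliberately simplified.

// backend/lambdas/guess-validation.js - illustrative sketch, not the exact handler.
// getPuzzleById is an assumed db helper; hint generation is simplified.
const { getPuzzleById } = require('../db');

const normalize = (s) => (s || '').toLowerCase().replace(/[^a-z0-9]/g, '');

exports.handler = async (event) => {
  const { movieId, guess, guessNumber } = JSON.parse(event.body || '{}');
  const puzzle = await getPuzzleById(movieId);
  if (!puzzle) {
    return { statusCode: 404, body: JSON.stringify({ message: 'Unknown movieId' }) };
  }

  const correct = normalize(guess) === normalize(puzzle.title);
  const gameOver = correct || guessNumber >= 6;
  const body = { correct, gameOver, guessNumber };
  if (gameOver) {
    body.answer = puzzle;                    // reveal on win or after the 6th attempt
  } else {
    body.hints = [`Genre: ${puzzle.genre}`]; // simplified hint
  }
  return { statusCode: 200, body: JSON.stringify(body) };
};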

4. Barcode Generation Pipeline

Encapsulated in MovieBarcodeGenerator.generateBarcode():

  1. Validate input video path
  2. Extract frames via FFmpeg at fps
  3. Compute average color per frame (using sharp)
  4. Composite vertical strips into one image
  5. Save JPEG to ./generated-barcodes/{sanitized_title}_barcode.jpg
  6. Return { success, path, filename, frameCount }

Use quality presets:

const presets = generator.getQualityPreset('medium');
// { fps: 2, width: 800, height: 200 }
await generator.generateBarcode(videoPath, title, presets);
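
Steps 3 and 4 can be approximated with sharp, as in the sketch below. It assumes frames have already been extracted (step 2) and is a simplified stand-in for the generator's actual compositing code.

// Sketch of steps 3-4: one 1px-wide strip per frame, composited side by side.
// Frame extraction (step 2) is assumed to have happened already.
const sharp = require('sharp');

async function composeBarcode(framePaths, outPath, height = 200) {
  const strips = await Promise.all(framePaths.map(async (framePath, i) => {
    const { channels } = await sharp(framePath).stats();     // per-channel means
    const [r, g, b] = channels.map((c) => Math.round(c.mean));
    const strip = await sharp({
      create: { width: 1, height, channels: 3, background: { r, g, b } }
    }).png().toBuffer();
    return { input: strip, left: i, top: 0 };
  }));

  await sharp({
    create: { width: framePaths.length, height, channels: 3, background: '#000000' }
  })
    .composite(strips)
    .jpeg({ quality: 90 })
    .toFile(outPath);
}

A production version would likely process frames in batches to keep memory bounded for long videos.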

5. Data Storage

PostgreSQL schema (loaded from backend/schema.sql):

CREATE TABLE puzzles (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  year INT,
  genre TEXT,
  director TEXT,
  barcode_url TEXT NOT NULL,
  scheduled_date DATE UNIQUE NOT NULL
);
  • RDS (Prod) or local Postgres (Dev via Docker)
  • S3 (Prod) or local stub/minio (Dev) for barcode assets
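
The insertPuzzle helper used in the Lambda example above could wrap node-postgres roughly as follows; this is a sketch against the schema shown here, not the repository's actual backend/db.js.

// backend/db.js - sketch of a helper matching the puzzles table above.
const { Pool } = require('pg');

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function insertPuzzle({ title, year, genre, director, barcodeUrl, scheduledDate }) {
  const { rows } = await pool.query(
    `INSERT INTO puzzles (title, year, genre, director, barcode_url, scheduled_date)
     VALUES ($1, $2, $3, $4, $5, $6)
     RETURNING id`,
    [title, year, genre, director, barcodeUrl, scheduledDate]
  );
  return rows[0].id;
}

module.exports = { insertPuzzle };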

6. Puzzle Scheduling

Production:

  • CloudWatch Event (cron 0 0 UTC) → puzzle-generator Lambda

Local:

  • docker-compose exec backend python upload_puzzle.py --schedule-today …

7. Local Development vs. Production

Local (Docker Compose):

  • Postgres on localhost:5432
  • Express API on localhost:3001
  • React dev server on localhost:3000
  • Local filesystem stub for S3

Production (AWS):

  • RDS instance
  • S3 bucket for barcodes
  • API Gateway + Lambdas
  • CloudFront distribution for SPA

8. Extending the System

  • Adding a new Lambda:
    – Define the handler in backend/lambdas/
    – Update serverless.yml / CloudFormation
  • Modifying the barcode pipeline:
    – Extend MovieBarcodeGenerator options
    – Plug in custom color-extraction logic
  • Schema changes:
    – Add a migration in backend/migrations/
    – Update ORM models or raw SQL helpers

These core concepts ensure clear boundaries between UI, API, compute, and storage, simplifying development, testing, and scaling.

Deployment & Configuration

This section covers local setup with Docker Compose, production deployment to AWS ECS, and managing runtime configuration for the krishjp/movie-barcodes services.

Local Development with Docker Compose

Define and run PostgreSQL, backend and frontend together:

# docker-compose.yaml
version: "3.8"

services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_DB: moviebarcode
      POSTGRES_USER: movieuser
      POSTGRES_PASSWORD: moviepass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./backend/schema.sql:/docker-entrypoint-initdb.d/schema.sql
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U movieuser -d moviebarcode"]
      interval: 5s
      timeout: 5s
      retries: 5

  backend:
    build: ./backend
    image: krishjp/movie-barcodes-backend:local
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      DATABASE_URL: postgres://movieuser:moviepass@postgres:5432/moviebarcode
    ports:
      - "4000:4000"

  frontend:
    build: ./frontend
    image: krishjp/movie-barcodes-frontend:local
    depends_on:
      - backend
    environment:
      REACT_APP_API_URL: http://localhost:4000/graphql
    ports:
      - "3000:3000"

volumes:
  postgres_data:

Commands:

  • docker-compose up --build
  • docker-compose down -v (reset database)

AWS Deployment

1. Build and Push to Amazon ECR

# 1. Create ECR repos
aws ecr create-repository --repository-name movie-barcodes-backend
aws ecr create-repository --repository-name movie-barcodes-frontend

# 2. Authenticate Docker to ECR
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com

# 3. Build & tag images
docker build -t krishjp/movie-barcodes-backend:prod ./backend
docker tag krishjp/movie-barcodes-backend:prod <account-id>.dkr.ecr.us-east-1.amazonaws.com/movie-barcodes-backend:latest

docker build -t krishjp/movie-barcodes-frontend:prod ./frontend
docker tag krishjp/movie-barcodes-frontend:prod <account-id>.dkr.ecr.us-east-1.amazonaws.com/movie-barcodes-frontend:latest

# 4. Push
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/movie-barcodes-backend:latest
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/movie-barcodes-frontend:latest

2. Deploy to ECS Fargate

Minimal CloudFormation snippet for a Task Definition and Service:

Resources:
  MovieBarcodesTaskDef:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Cpu: "256"
      Memory: "512"
      NetworkMode: awsvpc
      RequiresCompatibilities: 
        - FARGATE
      ContainerDefinitions:
        - Name: backend
          Image: !Sub "${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/movie-barcodes-backend:latest"
          PortMappings:
            - ContainerPort: 4000
          Secrets:
            - Name: DATABASE_URL
              ValueFrom: !Sub "arn:aws:ssm:${AWS::Region}:${AWS::AccountId}:parameter/movie-barcodes/db-url"
        - Name: frontend
          Image: !Sub "${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/movie-barcodes-frontend:latest"
          PortMappings:
            - ContainerPort: 80

  MovieBarcodesService:
    Type: AWS::ECS::Service
    Properties:
      Cluster: arn:aws:ecs:us-east-1:<account-id>:cluster/movie-barcodes
      DesiredCount: 2
      LaunchType: FARGATE
      TaskDefinition: !Ref MovieBarcodesTaskDef
      NetworkConfiguration:
        AwsvpcConfiguration:
          Subnets:
            - subnet-abc123
          AssignPublicIp: ENABLED

Deploy with:

aws cloudformation deploy \
  --template-file aws/ecs-stack.yaml \
  --stack-name movie-barcodes \
  --capabilities CAPABILITY_IAM

Runtime Configuration Management

Environment Variables

  • Local: define in .env and reference in docker-compose.yaml.
  • ECS: inject plaintext values via the Task Definition Environment, or pull SSM parameters (SecureStrings go through the container Secrets property).

AWS Systems Manager Parameter Store

Store secrets centrally:

aws ssm put-parameter \
  --name /movie-barcodes/db-url \
  --value "postgres://movieuser:moviepass@mydb.cluster.xxx.us-east-1.rds.amazonaws.com:5432/moviebarcode" \
  --type SecureString

Reference in the Task Definition:

Because the parameter is a SecureString, a plain {{resolve:ssm:…}} dynamic reference will not resolve it; expose it through the container definition's Secrets property instead (the task execution role needs ssm:GetParameters, plus kms:Decrypt if a customer-managed KMS key is used):

Secrets:
  - Name: DATABASE_URL
    ValueFrom: arn:aws:ssm:us-east-1:<account-id>:parameter/movie-barcodes/db-url

Secrets Rotation

Integrate with AWS Secrets Manager and configure automatic rotation for production credentials.
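
For example, the backend could read a rotated credential at startup rather than baking it into the environment. The sketch below uses AWS SDK v3; the secret name is an assumption.

// Sketch: fetch a rotated secret at startup (AWS SDK v3).
// The secret name "movie-barcodes/db-url" is an assumption.
const { SecretsManagerClient, GetSecretValueCommand } = require('@aws-sdk/client-secrets-manager');

async function loadDatabaseUrl() {
  const client = new SecretsManagerClient({ region: process.env.AWS_REGION || 'us-east-1' });
  const { SecretString } = await client.send(
    new GetSecretValueCommand({ SecretId: 'movie-barcodes/db-url' })
  );
  return SecretString; // e.g. a full postgres:// connection string
}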


This configuration ensures seamless transitions from local development to production on AWS, with secure, scalable runtime settings for the krishjp/movie-barcodes application.

API Reference

Detailed reference for all publicly exposed REST endpoints used by the frontend or admin scripts.

GET /api/puzzle/today – Fetch Today’s Puzzle

Purpose
Return today’s movie puzzle metadata and barcode URL.

Endpoint
GET /api/puzzle/today

Request
No parameters.

Response
200 OK

{
  "movieId": 42,
  "title": "Inception",         // hidden from clients; for logging only
  "imdb_id": "tt1375666",       // used to fetch barcode
  "publishedDate": "2025-08-13",
  "barcodeUrl": "/api/barcodes/tt1375666"
}

Example (browser / Node fetch)

const res = await fetch('/api/puzzle/today');
if (!res.ok) throw new Error('Failed to load puzzle');
const puzzle = await res.json();
// puzzle.movieId, puzzle.barcodeUrl

Practical Tips
• Preload the barcode image by setting an src to barcodeUrl.
• Store movieId for subsequent POST /api/guess calls.
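
For example, a minimal preload using the field names from the response above:

// Warm the browser cache so the barcode appears instantly when rendered.
const preload = new Image();
preload.src = puzzle.barcodeUrl;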


POST /api/guess – Submit a Movie Guess

Purpose
Validate a user’s guess, generate hints, or reveal the answer once correct or after 6 attempts.

Endpoint
POST /api/guess

Request Headers
• Content-Type: application/json

Request Body

{
  "guess": "Inception",
  "movieId": 42,
  "guessNumber": 2
}

Response Shapes
Ongoing game (200 OK)

{
  "correct": false,
  "gameOver": false,
  "guessNumber": 2,
  "hints": ["Genre: Sci-fi"]
}

Correct guess (200 OK)

{
  "correct": true,
  "gameOver": true,
  "guessNumber": 3,
  "answer": {
    "title": "Inception",
    "year": 2010,
    "genre": "Sci-fi",
    "director": "Christopher Nolan",
    "imdb_id": "tt1375666"
  }
}

Game over by limit (200 OK)

{
  "correct": false,
  "gameOver": true,
  "guessNumber": 6,
  "answer": { /* same as above */ }
}

Error Responses
  • 400 Bad Request – malformed JSON or missing fields
  • 404 Not Found – invalid movieId
  • 500 Internal Server Error – database error or unexpected crash

Example

const payload = {
  guess: userInput.trim(),
  movieId: storedMovieId,
  guessNumber: currentAttempt
};
const res = await fetch('/api/guess', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload)
});
const data = await res.json();
if (!res.ok) throw new Error(data.message || 'Guess failed');
// use data.hints or data.answer

GET /api/search – Search Movies by Title

Purpose
Return a list of movies matching a search query for admin or autocomplete.

Endpoint
GET /api/search?query={text}

Request
Query Parameters
• query (string): partial or full title

Response
200 OK

[
  { "id": 42, "title": "Inception", "year": 2010, "imdb_id": "tt1375666" },
  { "id": 73, "title": "Inception: The Cobol Job", "year": 2010, "imdb_id": "tt1790736" }
]

Example

async function searchMovies(term) {
  const res = await fetch(`/api/search?query=${encodeURIComponent(term)}`);
  if (!res.ok) throw new Error('Search failed');
  return res.json();
}
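
For autocomplete, you will usually want to debounce calls to searchMovies so a request is not sent on every keystroke. A small sketch follows; the 250 ms delay is arbitrary and setSuggestions stands in for whatever state setter the UI uses.

// Debounce wrapper around searchMovies; 250 ms is an arbitrary choice.
function debounce(fn, delay = 250) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

const debouncedSearch = debounce(async (term, onResults) => {
  if (!term) return onResults([]);
  onResults(await searchMovies(term));
});

// Usage in an input handler: debouncedSearch(e.target.value, setSuggestions);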

GET /api/history – Retrieve Puzzle History

Purpose
List past daily puzzles with their dates and IDs.

Endpoint
GET /api/history

Request
Optional query parameters for pagination (not currently implemented).

Response
200 OK

[
  { "movieId": 39, "publishedDate": "2025-08-12" },
  { "movieId": 38, "publishedDate": "2025-08-11" },
  … 
]

Example

const history = await fetch('/api/history').then(r => r.json());
// Render a list of past dates, each linking to its barcode/answer

GET /api/barcodes/:imdb_id – Fetch Barcode Image

Purpose
Return a PNG barcode for the given IMDB ID.

Endpoint
GET /api/barcodes/:imdb_id

Request
Path Parameter
• imdb_id (string): movie’s IMDB identifier

Response
200 OK
Content-Type: image/png
Binary PNG stream

Example (in React)

function Barcode({ imdbId }) {
  return <img src={`/api/barcodes/${imdbId}`} alt="Movie Barcode" />;
}

GET /health – Health Check

Purpose
Verify server status for uptime monitoring.

Endpoint
GET /health

Request
No parameters.

Response
200 OK

{ "status": "ok" }

Example

const ok = await fetch('/health').then(r => r.json());
if (ok.status !== 'ok') alert('API down');

Admin & Development Guide

This guide covers day-to-day content management, schema evolution, test execution, and best practices for contributing enhancements.

Maintaining Movie & Puzzle Content

Use the Python CLI script to automate uploading barcodes, inserting movie metadata, and scheduling puzzles.

Setup Environment
Ensure AWS and database credentials are set:

export DB_HOST=
export DB_USER=
export DB_PASS=
export DB_NAME=
export S3_BUCKET=

Install dependencies:
cd backend
pip install boto3 psycopg2-binary python-dotenv

Upload & Insert Workflow

python upload_and_insert.py \
  --title "Inception" \
  --year 2010 \
  --genre "Sci-Fi" \
  --director "Christopher Nolan" \
  --barcode ./barcodes/inception_2010.png \
  --release-date 2010-07-16 \
  --puzzle-date 2025-09-01

Key steps in upload_and_insert.py:

  1. Upload PNG to S3 under barcodes/{filename}
  2. Insert into movies table and retrieve movie_id
  3. Schedule in daily_puzzles with specified puzzle_date

Practical Tips

  • Omit --puzzle-date to let the script pick the next open slot.
  • Use --preview to validate inputs without committing.
  • For bulk loads, wrap calls in a shell loop or extend for CSV import.

Evolving the Database Schema

All tables, indexes, and the get_daily_puzzle function live in backend/schema.sql. To modify or extend:

  1. Edit schema.sql, adding your ALTER TABLE or new definitions.
  2. Apply changes in your dev RDS:
    psql "host=$DB_HOST user=$DB_USER dbname=$DB_NAME" \
      -f backend/schema.sql
    
  3. Verify with quick queries:
    -- Check new column
    SELECT column_name
      FROM information_schema.columns
     WHERE table_name = 'movies';
    

Adding a column example:

ALTER TABLE movies
  ADD COLUMN rating NUMERIC(2,1) DEFAULT 5.0;

Then update ingestion scripts (upload_and_insert.py, loader functions) to handle the new field.

Running Tests

Backend (Jest + Supertest)

cd backend
npm install
npm test             # run all tests
npx jest --watch     # watch mode
npm test -- --runInBand  # single-threaded

Sample test (backend/tests/health.test.js):

const request = require('supertest');
const app = require('../server');  // exports Express app

test('GET /health returns OK', async () => {
  const res = await request(app).get('/health');
  expect(res.status).toBe(200);
  expect(res.body).toEqual({ status: 'ok' });
});

Frontend (React Testing Library)

cd frontend
npm install
npm test              # interactive watch
npm test -- --watchAll=false  # single run

Sample test (frontend/src/App.test.js):

import { render, screen } from '@testing-library/react';
import App from './App';

test('renders learn react link', () => {
  render(<App />);
  expect(screen.getByText(/learn react/i)).toBeInTheDocument();
});

Generate coverage:

npm test -- --coverage

Unified Test Command

Add to root package.json:

"scripts": {
  "test": "npm run test --prefix backend && npm run test --prefix frontend"
}

Then:

npm install
npm test

Contributing Enhancements

  1. Fork and clone the repo:
    git clone git@github.com:youruser/movie-barcodes.git
    
  2. Install root dependencies (uses npm workspaces):
    npm install
    
  3. Create a feature branch:
    git checkout -b feature/add-trailer-script
    
  4. Follow code style:
    • Run npm run lint in each package (backend/frontend).
    • Use Prettier for formatting.
  5. Write or update tests for any new functionality.
  6. Push branch and open a PR with:
    • Description of changes
    • Testing instructions
    • Screenshots or logs for non-trivial features

Example: Adding a Batch Trailer Downloader

File: backend/trailer-downloader.js exposes TrailerDownloader for bulk fetching:

const TrailerDownloader = require('./trailer-downloader');

async function downloadAll(ids) {
  const downloader = new TrailerDownloader({
    concurrency: 4,
    outputDir: './trailers',
    manifest: './trailers/manifest.json'
  });
  try {
    await downloader.downloadMany(ids);
    console.log('All trailers downloaded');
  } finally {
    downloader.cleanup();
  }
}

downloadAll(['tt1375666', 'tt0468569']);

  • Add tests in backend/tests/ mocking yt-dlp calls.
  • Update README.md with usage and configuration options.

By following these practices, you ensure smooth content updates, reliable schema migrations, robust testing, and clean, maintainable contributions.