Compare commits


14 Commits

Author SHA1 Message Date
Claude b374d7eb87 fix: resolve lint and formatting failures
- Fix ESLint errors in compare.ts: remove unused params, use toSorted(),
  restructure negated conditions, move imgTag to module scope, replace
  process.exit with throw
- Fix ESLint errors in analyze-deps.ts: use toSorted(), wrap callback
  in arrow function, replace process.exit with throw
- Run Prettier on compare.ts, run-scenarios.ts, and navigation-bar.svelte

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 22:22:33 +00:00
Claude d20def9f66 fix: base screenshots + switch to artifact-based image hosting
- Fix blank base screenshots by aggressively killing the preview server
  between PR and base steps (fuser -k on port 4173) and clearing the
  SvelteKit build cache before rebuilding the base version
- Replace git branch push with GitHub Actions artifact upload:
  - Upload full screenshots as a zipped artifact for download
  - Upload an HTML report with embedded base64 images as a non-zipped
    artifact (archive: false) for direct browser viewing
- Update compare.ts to generate both a text-only markdown summary
  (for the PR comment) and a self-contained HTML visual comparison
- Downgrade permissions from contents:write to contents:read since
  we no longer push to the repository

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 20:38:58 +00:00
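The port-cleanup idea described above can be sketched as follows. This is illustrative, not lifted from the diff; `fuser` comes from the psmisc package, which is commonly available on ubuntu-latest runners, hence the `|| true` guards:

```shell
# Kill whatever still holds TCP port 4173 (the vite preview server can
# survive killing its parent pnpm process), then let the port release.
fuser -k 4173/tcp 2>/dev/null || true
sleep 1
# Drop SvelteKit build caches so the base build can't reuse PR artifacts.
rm -rf web/.svelte-kit web/build
```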
Claude 13e8a0121f fix: disable pnpm verifyDepsBeforeRun for base steps and restore PR source
The workspace has verifyDepsBeforeRun: install which triggers pnpm install
before every pnpm exec/run. After checking out the base web/SDK code, the
package.json files no longer match the lockfile, causing pnpm to re-install
and corrupt the workspace state. This broke both the base build and the
compare step.

- Set PNPM_VERIFY_DEPS_BEFORE_RUN=false on all steps after the base checkout
- Add a "Restore PR source" step to reset web/SDK to the PR version before
  running compare, so subsequent pnpm commands work normally

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:59:29 +00:00
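The fix above amounts to overriding pnpm's `verifyDepsBeforeRun` setting through its environment-variable form on every step that runs after the base checkout. A sketch of one such step (taken from the workflow added in this PR):

```yaml
- name: Build web (base)
  env:
    # pnpm settings can be overridden via PNPM_* environment variables;
    # this stops pnpm from re-installing when package.json and the
    # lockfile diverge after the base checkout.
    PNPM_VERIFY_DEPS_BEFORE_RUN: 'false'
  run: pnpm build
  working-directory: ./web
```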
Claude fe4c0a95d5 fix: use pnpm build without install for base and ensure dirs exist
- Remove pnpm install from base build steps since it modifies the workspace
  lockfile and can break tsx/other deps used by later steps
- Just run pnpm build using the existing node_modules from the PR install
- Ensure screenshot directories exist before compare step runs

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude c5abd18a64 fix: handle base build failures and add resilient API mocking
- Drop --frozen-lockfile for base builds since the PR's lockfile may not
  match the base branch's package.json
- Add continue-on-error and step dependencies so the workflow continues
  gracefully when base builds fail
- Add catch-all /api/** mock to return empty JSON for unmocked endpoints
- Block socket.io WebSocket connections that prevent networkidle
- Add networkidle timeout fallback in screenshot runner

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude cf1a9ed3f5 fix: add catch-all API mock and socket.io block for base screenshots
Base screenshots showed loading spinners or were missing entirely because:
1. Unmocked API calls (e.g. /api/people, /api/search/explore) hit the static
   preview server which returns HTML instead of JSON, preventing networkidle
2. Socket.io WebSocket connections never complete handshake, blocking networkidle

Add a catch-all /api/** mock (registered first, so specific mocks take priority)
that returns empty JSON for any unmocked endpoint. Block socket.io connections.
Also add a networkidle timeout fallback in run-scenarios.ts so screenshots are
still captured even if networkidle doesn't resolve within 15s.

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
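Playwright consults route handlers in reverse registration order — the most recently registered matching handler wins — which is why registering the catch-all first makes it a fallback rather than an override. A hypothetical simulation of that precedence rule (not Immich code; the handler shape and bodies are invented for illustration):

```typescript
// Simulate Playwright's route precedence: later registrations are
// consulted first, so a catch-all registered *before* the specific
// mocks only fires when nothing more specific matches.
type Handler = { pattern: RegExp; body: string };

const handlers: Handler[] = [];
const route = (pattern: RegExp, body: string) => handlers.push({ pattern, body });

const resolveRoute = (url: string): string => {
  // Walk newest-to-oldest, mirroring Playwright's matching order.
  for (let i = handlers.length - 1; i >= 0; i--) {
    if (handlers[i].pattern.test(url)) return handlers[i].body;
  }
  return '404';
};

// Catch-all registered first...
route(/\/api\//, '{}');
// ...specific mock registered later, so it takes priority.
route(/\/api\/people/, '[{"name":"Alice"}]');

console.log(resolveRoute('/api/people')); // → [{"name":"Alice"}] (specific mock)
console.log(resolveRoute('/api/search/explore')); // → {} (catch-all fallback)
```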
Claude feacf9b134 fix: improve screenshot waiting, add inline images to PR comments
- Increase waitForSelector timeout from 5s to 15s for slower page loads
- Add explicit wait for loading spinners to disappear before screenshot
- Push screenshot images to a temporary branch for inline display
- Read report.md from compare.ts with raw.githubusercontent.com URLs
- Accept optional image base URL in compare.ts CLI
- Upgrade contents permission to write for pushing screenshot branch

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude 67eb33b3a7 fix: use GitHub API for changed files detection and replace add-pr-comment
- Use pulls.listFiles API instead of git diff to detect changed web files
  (resolves issue where git diff returned empty due to checkout strategy)
- Replace mshick/add-pr-comment with actions/github-script for all PR
  comments (fixes "Unexpected input 'github-token'" warning)
- Skip pnpm/playwright setup when no web changes detected (faster skip)
- Add diagnostic logging for changed file detection

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude 23e3d43578 fix: add tsx dependency and fix workflow step ordering
- Add tsx as an e2e devDependency (was missing, causing silent failures)
- Replace `npx tsx` with `pnpm exec tsx` throughout
- Move `pnpm install` before `playwright install` (correct ordering)
- Remove `2>/dev/null` that was hiding analyzer errors
- Add debug logging to route analysis step
- Set default working-directory for the job

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude 028c8a2276 fix: use env vars instead of template expansion in visual-review workflow
Moves all ${{ }} expressions out of `run:` blocks and into `env:` to
prevent potential code injection via template expansion (zizmor finding).
Also switches the initial PR comment to use github-script with env vars
instead of add-pr-comment with inline template interpolation.

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
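This follows standard GitHub Actions hardening guidance: never splice a `${{ }}` expression directly into a `run:` script, where an attacker-controlled value (a PR title, a branch name) is expanded before the shell sees it; bind it to an environment variable and reference the shell variable instead. An illustrative sketch:

```yaml
# Unsafe: a crafted PR title can inject shell commands.
#   run: echo "${{ github.event.pull_request.title }}"
# Safe: the value reaches the script only as an environment variable.
- name: Print title safely
  env:
    PR_TITLE: ${{ github.event.pull_request.title }}
  run: echo "$PR_TITLE"
```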
Claude b7f4cc8171 test: add visible navbar changes to test visual review workflow
Adds a red background tint and [VISUAL TEST] label to the navigation bar.
This commit is intended to be reverted after testing the visual-review
workflow — it exercises the dependency analyzer by changing a shared
component used across many pages.

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Claude 5c11d15008 feat: add visual review workflow for automated PR screenshot comparison
Adds a label-triggered GitHub Actions workflow that automatically generates
before/after screenshots when web UI changes are made in a PR. Uses smart
dependency analysis to only screenshot pages affected by the changed files.

Key components:
- Reverse dependency analyzer: traces changed files through the import graph
  to find which +page.svelte routes are affected
- Screenshot scenarios: Playwright tests using existing mock-network infrastructure
  (no Docker/backend needed) for fast, deterministic screenshots
- Pixel comparison: generates diff images highlighting changed pixels
- GitHub Actions workflow: triggered by 'visual-review' label, posts results
  as a PR comment with change percentages per page

https://claude.ai/code/session_01XSTqDJXuR4jaLN7SGm3uES
2026-03-01 19:52:59 +00:00
Luis Nachtigall f4e156494f feat(mobile): add playbackStyle to local asset entity and related database schema (#26596)
* feat: add playbackStyle to local asset entity and related database schema

* implement conversion function for playbackStyle in local sync service

* implement conversion function for playbackStyle in local sync service

* refactor: remove deducedPlaybackStyle from TrashedLocalAssetEntityData

* add playbackStyle column to trashed local asset entity

* make playbackStyle non-nullable across the mobile codebase

* Streamline playbackStyle backfill:
- only backfill local assets playbackStyle in flutter/dart code
- only update trashed local assets in db migration

* bump target database version to 23 and update migration logic for playbackStyle

* set playback_style to 0 in merged_asset.drift as it's a getter in the base asset

* run make pigeon

* Populate playbackStyle for trashed assets during native migration
2026-03-01 14:50:21 -05:00
Min Idzelis 84abad564e fix(server): deduplicate shared links in getAll query (#26395) 2026-03-01 14:41:15 -05:00
61 changed files with 11480 additions and 917 deletions
+406
@@ -0,0 +1,406 @@
name: Visual Review
on:
pull_request:
types: [labeled, synchronize]
permissions: {}
jobs:
visual-diff:
name: Visual Diff Screenshots
runs-on: ubuntu-latest
if: >-
(github.event.action == 'labeled' && github.event.label.name == 'visual-review') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'visual-review'))
permissions:
contents: read
pull-requests: write
defaults:
run:
working-directory: ./e2e
steps:
- id: token
uses: immich-app/devtools/actions/create-workflow-token@05e16407c0a5492138bb38139c9d9bf067b40886 # create-workflow-token-action-v1.0.1
with:
app-id: ${{ secrets.PUSH_O_MATIC_APP_ID }}
private-key: ${{ secrets.PUSH_O_MATIC_APP_KEY }}
- name: Checkout PR branch
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
token: ${{ steps.token.outputs.token }}
fetch-depth: 0
- name: Determine changed web files
id: changed-files
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ steps.token.outputs.token }}
script: |
const files = [];
const perPage = 100;
let page = 1;
while (true) {
const { data } = await github.rest.pulls.listFiles({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number,
per_page: perPage,
page,
});
files.push(...data);
if (data.length < perPage) break;
page++;
}
const webPrefixes = ['web/', 'i18n/', 'open-api/typescript-sdk/'];
const webFiles = files
.map(f => f.filename)
.filter(f => webPrefixes.some(p => f.startsWith(p)));
console.log(`Total PR files: ${files.length}`);
console.log(`Web-related files: ${webFiles.length}`);
for (const f of webFiles) {
console.log(` ${f}`);
}
core.setOutput('files', webFiles.join('\n'));
core.setOutput('has_changes', webFiles.length > 0 ? 'true' : 'false');
- name: Setup pnpm
if: steps.changed-files.outputs.has_changes == 'true'
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Setup Node
if: steps.changed-files.outputs.has_changes == 'true'
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
with:
node-version-file: './e2e/.nvmrc'
cache: 'pnpm'
cache-dependency-path: '**/pnpm-lock.yaml'
- name: Install e2e dependencies
if: steps.changed-files.outputs.has_changes == 'true'
run: pnpm install --frozen-lockfile
- name: Install Playwright
if: steps.changed-files.outputs.has_changes == 'true'
run: pnpm exec playwright install chromium --only-shell
- name: Analyze affected routes
if: steps.changed-files.outputs.has_changes == 'true'
id: routes
env:
CHANGED_FILES: ${{ steps.changed-files.outputs.files }}
run: |
echo "Changed files:"
echo "$CHANGED_FILES"
echo "---"
ROUTES=$(echo "$CHANGED_FILES" | xargs pnpm exec tsx src/screenshots/analyze-deps.ts 2>&1 | tee /dev/stderr | grep "^ /" | sed 's/^ //' || true)
echo "routes<<EOF" >> "$GITHUB_OUTPUT"
echo "$ROUTES" >> "$GITHUB_OUTPUT"
echo "EOF" >> "$GITHUB_OUTPUT"
if [ -z "$ROUTES" ]; then
echo "has_routes=false" >> "$GITHUB_OUTPUT"
else
echo "has_routes=true" >> "$GITHUB_OUTPUT"
# Build the scenario filter JSON array
SCENARIO_NAMES=$(pnpm exec tsx -e "
import { getScenariosForRoutes } from './src/screenshots/page-map.ts';
const routes = process.argv.slice(1);
const scenarios = getScenariosForRoutes(routes);
console.log(JSON.stringify(scenarios.map(s => s.name)));
" $ROUTES)
echo "scenarios=$SCENARIO_NAMES" >> "$GITHUB_OUTPUT"
echo "Scenarios: $SCENARIO_NAMES"
fi
- name: Post initial comment
if: steps.changed-files.outputs.has_changes == 'true' && steps.routes.outputs.has_routes == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
AFFECTED_ROUTES: ${{ steps.routes.outputs.routes }}
with:
github-token: ${{ steps.token.outputs.token }}
script: |
const routes = process.env.AFFECTED_ROUTES || '';
const body = `## Visual Review\n\nGenerating screenshots for affected pages...\n\nAffected routes:\n\`\`\`\n${routes}\n\`\`\``;
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const existing = comments.data.find(c => c.body && c.body.includes('## Visual Review'));
if (existing) {
await github.rest.issues.updateComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: existing.id,
body,
});
} else {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body,
});
}
# === Screenshot PR version ===
- name: Build SDK (PR)
if: steps.routes.outputs.has_routes == 'true'
run: pnpm install --frozen-lockfile && pnpm build
working-directory: ./open-api/typescript-sdk
- name: Build web (PR)
if: steps.routes.outputs.has_routes == 'true'
run: pnpm install --frozen-lockfile && pnpm build
working-directory: ./web
- name: Take screenshots (PR)
if: steps.routes.outputs.has_routes == 'true'
env:
PW_EXPERIMENTAL_SERVICE_WORKER_NETWORK_EVENTS: '1'
SCREENSHOT_OUTPUT_DIR: ${{ github.workspace }}/screenshots/pr
SCREENSHOT_SCENARIOS: ${{ steps.routes.outputs.scenarios }}
SCREENSHOT_BASE_URL: http://127.0.0.1:4173
run: |
# Start the preview server in background
cd ../web && pnpm preview --port 4173 --host 127.0.0.1 &
SERVER_PID=$!
# Wait for server to be ready
for i in $(seq 1 30); do
if curl -s http://127.0.0.1:4173 > /dev/null 2>&1; then
echo "Server ready after ${i}s"
break
fi
sleep 1
done
# Run screenshot tests
pnpm exec playwright test --config playwright.screenshot.config.ts || true
# Stop the preview server and all children (pnpm spawns vite as child)
kill $SERVER_PID 2>/dev/null || true
sleep 1
# Ensure port is fully released — kill any lingering vite process
fuser -k 4173/tcp 2>/dev/null || true
sleep 1
# === Screenshot base version ===
# Disable pnpm's verifyDepsBeforeRun for all base steps since the base
# checkout changes package.json files, making them mismatch the lockfile.
- name: Checkout base web directory
if: steps.routes.outputs.has_routes == 'true'
env:
BASE_SHA: ${{ github.event.pull_request.base.sha }}
run: |
# Restore web directory from base branch
git checkout "$BASE_SHA" -- web/ open-api/typescript-sdk/ i18n/ || true
# Clear SvelteKit build cache to avoid stale artifacts from the PR build
rm -rf web/.svelte-kit web/build
working-directory: .
- name: Build SDK (base)
if: steps.routes.outputs.has_routes == 'true'
continue-on-error: true
id: base-sdk
env:
PNPM_VERIFY_DEPS_BEFORE_RUN: 'false'
run: pnpm build
working-directory: ./open-api/typescript-sdk
- name: Build web (base)
if: steps.routes.outputs.has_routes == 'true' && steps.base-sdk.outcome == 'success'
continue-on-error: true
id: base-web
env:
PNPM_VERIFY_DEPS_BEFORE_RUN: 'false'
run: pnpm build
working-directory: ./web
- name: Take screenshots (base)
if: steps.routes.outputs.has_routes == 'true' && steps.base-web.outcome == 'success'
env:
PW_EXPERIMENTAL_SERVICE_WORKER_NETWORK_EVENTS: '1'
SCREENSHOT_OUTPUT_DIR: ${{ github.workspace }}/screenshots/base
SCREENSHOT_SCENARIOS: ${{ steps.routes.outputs.scenarios }}
SCREENSHOT_BASE_URL: http://127.0.0.1:4173
PNPM_VERIFY_DEPS_BEFORE_RUN: 'false'
run: |
# Kill any process still on port 4173 from the PR step
fuser -k 4173/tcp 2>/dev/null || true
sleep 1
# Start the preview server in background
cd ../web && pnpm preview --port 4173 --host 127.0.0.1 &
SERVER_PID=$!
# Wait for server to be ready
for i in $(seq 1 30); do
if curl -s http://127.0.0.1:4173 > /dev/null 2>&1; then
echo "Server ready after ${i}s"
break
fi
sleep 1
done
# Run screenshot tests
pnpm exec playwright test --config playwright.screenshot.config.ts || true
# Stop the preview server
kill $SERVER_PID 2>/dev/null || true
fuser -k 4173/tcp 2>/dev/null || true
- name: Restore PR source
if: steps.routes.outputs.has_routes == 'true'
env:
HEAD_SHA: ${{ github.event.pull_request.head.sha }}
run: |
git checkout "$HEAD_SHA" -- web/ open-api/typescript-sdk/ i18n/ || true
working-directory: .
# === Compare and report ===
- name: Compare screenshots
if: steps.routes.outputs.has_routes == 'true'
env:
WORKSPACE_DIR: ${{ github.workspace }}
run: |
# Ensure directories exist even if base screenshots were skipped
mkdir -p "$WORKSPACE_DIR/screenshots/base" "$WORKSPACE_DIR/screenshots/pr" "$WORKSPACE_DIR/screenshots/diff"
pnpm exec tsx src/screenshots/compare.ts \
"$WORKSPACE_DIR/screenshots/base" \
"$WORKSPACE_DIR/screenshots/pr" \
"$WORKSPACE_DIR/screenshots/diff"
- name: Upload screenshot artifacts
if: steps.routes.outputs.has_routes == 'true'
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: visual-review-screenshots
path: screenshots/
retention-days: 14
- name: Upload HTML report
if: steps.routes.outputs.has_routes == 'true'
id: html-report
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
path: screenshots/diff/visual-review.html
archive: false
retention-days: 14
- name: Post comparison results
if: steps.routes.outputs.has_routes == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
REPORT_URL: ${{ steps.html-report.outputs.artifact-url }}
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
with:
github-token: ${{ steps.token.outputs.token }}
script: |
const fs = require('fs');
const path = require('path');
const reportPath = path.join(process.env.GITHUB_WORKSPACE, 'screenshots', 'diff', 'report.md');
let body;
try {
body = fs.readFileSync(reportPath, 'utf8');
} catch {
body = '## Visual Review\n\nScreenshot comparison failed. Check the workflow artifacts for details.';
}
// Append links to the HTML report artifact and workflow run
const reportUrl = process.env.REPORT_URL;
const runUrl = process.env.RUN_URL;
body += '\n---\n';
if (reportUrl) {
body += `[View full visual comparison](${reportUrl}) | `;
}
body += `[Download all screenshots](${runUrl}#artifacts)\n`;
// Find and update existing comment or create new one
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const botComment = comments.data.find(c =>
c.body && c.body.includes('## Visual Review')
);
if (botComment) {
await github.rest.issues.updateComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: botComment.id,
body,
});
} else {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body,
});
}
- name: No web changes
if: steps.changed-files.outputs.has_changes != 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ steps.token.outputs.token }}
script: |
const body = '## Visual Review\n\nNo web-related file changes detected in this PR. Visual review not needed.';
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const existing = comments.data.find(c => c.body && c.body.includes('## Visual Review'));
if (existing) {
await github.rest.issues.updateComment({
owner: context.repo.owner, repo: context.repo.repo,
comment_id: existing.id, body,
});
} else {
await github.rest.issues.createComment({
owner: context.repo.owner, repo: context.repo.repo,
issue_number: context.issue.number, body,
});
}
- name: No affected routes
if: steps.changed-files.outputs.has_changes == 'true' && steps.routes.outputs.has_routes != 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
github-token: ${{ steps.token.outputs.token }}
script: |
const body = '## Visual Review\n\nChanged files don\'t affect any pages with screenshot scenarios configured.\nTo add coverage, define new scenarios in `e2e/src/screenshots/page-map.ts`.';
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const existing = comments.data.find(c => c.body && c.body.includes('## Visual Review'));
if (existing) {
await github.rest.issues.updateComment({
owner: context.repo.owner, repo: context.repo.repo,
comment_id: existing.id, body,
});
} else {
await github.rest.issues.createComment({
owner: context.repo.owner, repo: context.repo.repo,
issue_number: context.issue.number, body,
});
}
+1
@@ -24,6 +24,7 @@ open-api/typescript-sdk/build
mobile/android/fastlane/report.xml
mobile/ios/fastlane/report.xml
screenshots-output
vite.config.js.timestamp-*
.pnpm-store
.devcontainer/library
-1
@@ -11,7 +11,6 @@ services:
immich-server:
container_name: immich-e2e-server
image: immich-server:latest
shm_size: 128mb
build:
context: ../
dockerfile: server/Dockerfile
+5 -1
@@ -18,7 +18,10 @@
"format:fix": "prettier --write .",
"lint": "eslint \"src/**/*.ts\" --max-warnings 0",
"lint:fix": "pnpm run lint --fix",
"check": "tsc --noEmit"
"check": "tsc --noEmit",
"screenshots": "pnpm exec playwright test --config playwright.screenshot.config.ts",
"screenshots:compare": "pnpm exec tsx src/screenshots/compare.ts",
"screenshots:analyze": "pnpm exec tsx src/screenshots/analyze-deps.ts"
},
"keywords": [],
"author": "",
@@ -51,6 +54,7 @@
"sharp": "^0.34.5",
"socket.io-client": "^4.7.4",
"supertest": "^7.0.0",
"tsx": "^4.21.0",
"typescript": "^5.3.3",
"typescript-eslint": "^8.28.0",
"utimes": "^5.2.1",
+27
@@ -0,0 +1,27 @@
import { defineConfig, devices } from '@playwright/test';
const baseUrl = process.env.SCREENSHOT_BASE_URL ?? 'http://127.0.0.1:4173';
export default defineConfig({
testDir: './src/screenshots',
testMatch: /run-scenarios\.ts/,
fullyParallel: false,
forbidOnly: !!process.env.CI,
retries: 0,
reporter: 'list',
use: {
baseURL: baseUrl,
screenshot: 'off',
trace: 'off',
},
workers: 1,
projects: [
{
name: 'screenshots',
use: {
...devices['Desktop Chrome'],
viewport: { width: 1920, height: 1080 },
},
},
],
});
+249
@@ -0,0 +1,249 @@
/**
* Reverse dependency analyzer for the Immich web app.
*
* Given a list of changed files, traces upward through the import graph
* to find which +page.svelte routes are affected, then maps those to URL paths.
*/
import { readFileSync, readdirSync, statSync } from 'node:fs';
import { dirname, join, relative, resolve } from 'node:path';
const WEB_SRC = resolve(import.meta.dirname, '../../../web/src');
const LIB_ALIAS = resolve(WEB_SRC, 'lib');
/** Collect all .svelte, .ts, .js files under web/src/ */
function collectFiles(dir: string): string[] {
const results: string[] = [];
for (const entry of readdirSync(dir)) {
const full = join(dir, entry);
const stat = statSync(full);
if (stat.isDirectory()) {
if (entry === 'node_modules' || entry === '.svelte-kit') {
continue;
}
results.push(...collectFiles(full));
} else if (/\.(svelte|ts|js)$/.test(entry)) {
results.push(full);
}
}
return results;
}
/** Extract import specifiers from a file's source text. */
function extractImports(source: string): string[] {
const specifiers: string[] = [];
// Match: import ... from '...' / import '...' / export ... from '...'
const importRegex = /(?:import|export)\s+(?:[\s\S]*?\s+from\s+)?['"]([^'"]+)['"]/g;
let match;
while ((match = importRegex.exec(source)) !== null) {
specifiers.push(match[1]);
}
// Match dynamic imports: import('...')
const dynamicRegex = /import\(\s*['"]([^'"]+)['"]\s*\)/g;
while ((match = dynamicRegex.exec(source)) !== null) {
specifiers.push(match[1]);
}
return specifiers;
}
/** Resolve an import specifier to an absolute file path (or null if external). */
function resolveImport(specifier: string, fromFile: string, allFiles: Set<string>): string | null {
// Handle $lib alias
let resolved: string;
if (specifier.startsWith('$lib/') || specifier === '$lib') {
resolved = specifier.replace('$lib', LIB_ALIAS);
} else if (specifier.startsWith('./') || specifier.startsWith('../')) {
resolved = resolve(dirname(fromFile), specifier);
} else {
// External package import — not relevant
return null;
}
// Try exact match, then common extensions
const extensions = ['', '.ts', '.js', '.svelte', '/index.ts', '/index.js', '/index.svelte'];
for (const ext of extensions) {
const candidate = resolved + ext;
if (allFiles.has(candidate)) {
return candidate;
}
}
return null;
}
/** Build the forward dependency graph: file → set of files it imports. */
function buildDependencyGraph(files: string[]): Map<string, Set<string>> {
const fileSet = new Set(files);
const graph = new Map<string, Set<string>>();
for (const file of files) {
const deps = new Set<string>();
graph.set(file, deps);
try {
const source = readFileSync(file, 'utf8');
for (const specifier of extractImports(source)) {
const resolved = resolveImport(specifier, file, fileSet);
if (resolved) {
deps.add(resolved);
}
}
} catch {
// Skip files that can't be read
}
}
return graph;
}
/** Invert the dependency graph: file → set of files that import it. */
function buildReverseDependencyGraph(forwardGraph: Map<string, Set<string>>): Map<string, Set<string>> {
const reverse = new Map<string, Set<string>>();
for (const [file, deps] of forwardGraph) {
for (const dep of deps) {
let importers = reverse.get(dep);
if (!importers) {
importers = new Set();
reverse.set(dep, importers);
}
importers.add(file);
}
}
return reverse;
}
/** BFS from changed files upward through reverse deps to find +page.svelte files. */
function findAffectedPages(changedFiles: string[], reverseGraph: Map<string, Set<string>>): Set<string> {
const visited = new Set<string>();
const pages = new Set<string>();
const queue = [...changedFiles];
while (queue.length > 0) {
const file = queue.shift()!;
if (visited.has(file)) {
continue;
}
visited.add(file);
if (file.endsWith('+page.svelte') || file.endsWith('+layout.svelte')) {
pages.add(file);
// If it's a layout, keep tracing upward because the layout itself
// isn't a page — but the pages under it are affected.
// If it's a +page.svelte, we still want to continue in case
// this page is imported by others.
}
const importers = reverseGraph.get(file);
if (importers) {
for (const importer of importers) {
if (!visited.has(importer)) {
queue.push(importer);
}
}
}
}
// For +layout.svelte hits, also find all +page.svelte under the same directory tree
const layoutDirs: string[] = [];
for (const page of pages) {
if (page.endsWith('+layout.svelte')) {
layoutDirs.push(dirname(page));
pages.delete(page);
}
}
if (layoutDirs.length > 0) {
for (const file of reverseGraph.keys()) {
if (file.endsWith('+page.svelte')) {
for (const layoutDir of layoutDirs) {
if (file.startsWith(layoutDir)) {
pages.add(file);
}
}
}
}
// Also scan the filesystem under each layout dir for +page.svelte files
// the import graph may have missed
for (const layoutDir of layoutDirs) {
const allFiles = collectFiles(layoutDir);
for (const f of allFiles) {
if (f.endsWith('+page.svelte')) {
pages.add(f);
}
}
}
}
return pages;
}
/** Convert a +page.svelte file path to its URL route. */
export function pageFileToRoute(pageFile: string): string {
const routesDir = resolve(WEB_SRC, 'routes');
let rel = relative(routesDir, dirname(pageFile));
// Remove SvelteKit group markers: (user), (list), etc.
rel = rel.replaceAll(/\([^)]+\)\/?/g, '');
// Remove parameter segments: [albumId=id], [[photos=photos]], [[assetId=id]]
rel = rel.replaceAll(/\[\[?[^\]]+\]\]?\/?/g, '');
// Clean up trailing slashes and normalize
rel = rel.replaceAll(/\/+/g, '/').replace(/\/$/, '');
return '/' + rel;
}
export interface AnalysisResult {
affectedPages: string[];
affectedRoutes: string[];
}
/** Main entry: analyze which routes are affected by the given changed files. */
export function analyzeAffectedRoutes(changedFiles: string[]): AnalysisResult {
// Resolve repo-root-relative changed paths to absolute filesystem paths
const webRoot = resolve(WEB_SRC, '..');
const resolvedChanged = changedFiles
.filter((f) => f.startsWith('web/'))
.map((f) => resolve(webRoot, '..', f))
.filter((f) => statSync(f, { throwIfNoEntry: false })?.isFile());
if (resolvedChanged.length === 0) {
return { affectedPages: [], affectedRoutes: [] };
}
const allFiles = collectFiles(WEB_SRC);
const forwardGraph = buildDependencyGraph(allFiles);
const reverseGraph = buildReverseDependencyGraph(forwardGraph);
const pages = findAffectedPages(resolvedChanged, reverseGraph);
const affectedPages = [...pages].toSorted();
const affectedRoutes = [...new Set(affectedPages.map((f) => pageFileToRoute(f)))].toSorted();
return { affectedPages, affectedRoutes };
}
// CLI usage: node --import tsx analyze-deps.ts file1 file2 ...
if (process.argv[1]?.endsWith('analyze-deps.ts') || process.argv[1]?.endsWith('analyze-deps.js')) {
const files = process.argv.slice(2);
if (files.length === 0) {
console.log('Usage: analyze-deps.ts <changed-file1> <changed-file2> ...');
console.log('Files should be relative to the repo root (e.g. web/src/lib/components/Button.svelte)');
throw new Error('No files provided');
}
const result = analyzeAffectedRoutes(files);
console.log('Affected pages:');
for (const page of result.affectedPages) {
console.log(` ${page}`);
}
console.log('\nAffected routes:');
for (const route of result.affectedRoutes) {
console.log(` ${route}`);
}
}
+335
@@ -0,0 +1,335 @@
/**
* Pixel-level comparison of base vs PR screenshots.
*
* Uses a pixelmatch-style pixel comparison to generate diff images and
* calculate change percentages.
*
* Usage:
* npx tsx e2e/src/screenshots/compare.ts <base-dir> <pr-dir> <output-dir>
*/
import { existsSync, mkdirSync, readdirSync, readFileSync, writeFileSync } from 'node:fs';
import { basename, join, resolve } from 'node:path';
import { PNG } from 'pngjs';
// pixelmatch itself is a small library, but to avoid adding a new dependency
// we use a simple inline implementation based on its approach.
// The e2e package already has pngjs.
function pixelMatch(img1Data: Uint8Array, img2Data: Uint8Array, diffData: Uint8Array): number {
let diffCount = 0;
for (let i = 0; i < img1Data.length; i += 4) {
const r1 = img1Data[i];
const g1 = img1Data[i + 1];
const b1 = img1Data[i + 2];
const r2 = img2Data[i];
const g2 = img2Data[i + 1];
const b2 = img2Data[i + 2];
const dr = Math.abs(r1 - r2);
const dg = Math.abs(g1 - g2);
const db = Math.abs(b1 - b2);
// Threshold: if any channel differs by more than 25, mark as different
const isDiff = dr > 25 || dg > 25 || db > 25;
if (isDiff) {
// Red highlight for diff pixels
diffData[i] = 255;
diffData[i + 1] = 0;
diffData[i + 2] = 0;
diffData[i + 3] = 255;
diffCount++;
} else {
// Dimmed original for unchanged pixels
const gray = Math.round(0.299 * r1 + 0.587 * g1 + 0.114 * b1);
diffData[i] = gray;
diffData[i + 1] = gray;
diffData[i + 2] = gray;
diffData[i + 3] = 128;
}
}
return diffCount;
}
export interface ComparisonResult {
name: string;
baseExists: boolean;
prExists: boolean;
diffPixels: number;
totalPixels: number;
changePercent: number;
diffImagePath: string | null;
baseImagePath: string | null;
prImagePath: string | null;
}
export function compareScreenshots(baseDir: string, prDir: string, outputDir: string): ComparisonResult[] {
  mkdirSync(outputDir, { recursive: true });

  // Collect all screenshot names from both directories
  const baseFiles = existsSync(baseDir)
    ? new Set(readdirSync(baseDir).filter((f) => f.endsWith('.png')))
    : new Set<string>();
  const prFiles = existsSync(prDir) ? new Set(readdirSync(prDir).filter((f) => f.endsWith('.png'))) : new Set<string>();
  const allNames = new Set([...baseFiles, ...prFiles]);
  const results: ComparisonResult[] = [];

  for (const fileName of [...allNames].toSorted()) {
    const name = basename(fileName, '.png');
    const basePath = join(baseDir, fileName);
    const prPath = join(prDir, fileName);
    const baseExists = baseFiles.has(fileName);
    const prExists = prFiles.has(fileName);

    if (!baseExists || !prExists) {
      // New or removed page
      results.push({
        name,
        baseExists,
        prExists,
        diffPixels: -1,
        totalPixels: -1,
        changePercent: 100,
        diffImagePath: null,
        baseImagePath: baseExists ? basePath : null,
        prImagePath: prExists ? prPath : null,
      });
      continue;
    }

    // Load both PNGs
    const basePng = PNG.sync.read(readFileSync(basePath));
    const prPng = PNG.sync.read(readFileSync(prPath));

    // Handle size mismatches by comparing over the union of both canvases
    const width = Math.max(basePng.width, prPng.width);
    const height = Math.max(basePng.height, prPng.height);

    // Pad both images to the same dimensions with transparent pixels
    const normalizedBase = normalizeImage(basePng, width, height);
    const normalizedPr = normalizeImage(prPng, width, height);

    const diffPng = new PNG({ width, height });
    const totalPixels = width * height;
    const diffPixels = pixelMatch(normalizedBase, normalizedPr, diffPng.data as unknown as Uint8Array);

    const diffImagePath = join(outputDir, `${name}-diff.png`);
    writeFileSync(diffImagePath, PNG.sync.write(diffPng));

    results.push({
      name,
      baseExists,
      prExists,
      diffPixels,
      totalPixels,
      changePercent: totalPixels > 0 ? (diffPixels / totalPixels) * 100 : 0,
      diffImagePath,
      baseImagePath: basePath,
      prImagePath: prPath,
    });
  }
  return results;
}
function normalizeImage(png: PNG, targetWidth: number, targetHeight: number): Uint8Array {
  if (png.width === targetWidth && png.height === targetHeight) {
    return png.data as unknown as Uint8Array;
  }
  const data = new Uint8Array(targetWidth * targetHeight * 4);
  for (let y = 0; y < targetHeight; y++) {
    for (let x = 0; x < targetWidth; x++) {
      const targetIdx = (y * targetWidth + x) * 4;
      if (x < png.width && y < png.height) {
        const sourceIdx = (y * png.width + x) * 4;
        data[targetIdx] = png.data[sourceIdx];
        data[targetIdx + 1] = png.data[sourceIdx + 1];
        data[targetIdx + 2] = png.data[sourceIdx + 2];
        data[targetIdx + 3] = png.data[sourceIdx + 3];
      } else {
        // Transparent padding
        data[targetIdx + 3] = 0;
      }
    }
  }
  return data;
}
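The padding contract is easy to demonstrate: pixels outside the source dimensions stay fully transparent (alpha 0). The sketch below re-implements the inner copy loop on raw RGBA arrays (`padRgba` is a hypothetical name, not an export of this module):

```typescript
// Hypothetical standalone version of the padding loop in normalizeImage:
// copy source pixels where they exist, leave the rest at zero (transparent).
function padRgba(src: Uint8Array, w: number, h: number, tw: number, th: number): Uint8Array {
  const out = new Uint8Array(tw * th * 4);
  for (let y = 0; y < th; y++) {
    for (let x = 0; x < tw; x++) {
      const t = (y * tw + x) * 4;
      if (x < w && y < h) {
        const s = (y * w + x) * 4;
        out.set(src.subarray(s, s + 4), t);
      }
      // else: out stays zeroed, i.e. transparent padding
    }
  }
  return out;
}

// A 1x1 white pixel padded to 2x1: the second pixel is transparent.
const padded = padRgba(new Uint8Array([255, 255, 255, 255]), 1, 1, 2, 1);
```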
/** Generate a text-only markdown summary for the PR comment. */
export function generateMarkdownReport(results: ComparisonResult[]): string {
  const changed = results.filter((r) => r.changePercent > 0.1);
  const unchanged = results.filter((r) => r.changePercent <= 0.1);
  if (changed.length === 0) {
    return '## Visual Review\n\nNo visual changes detected in the affected pages.';
  }

  let md = '## Visual Review\n\n';
  md += `Found **${changed.length}** page(s) with visual changes`;
  if (unchanged.length > 0) {
    md += ` (${unchanged.length} unchanged)`;
  }
  md += '.\n\n';
  md += '| Page | Status | Change |\n';
  md += '|------|--------|--------|\n';
  for (const result of changed) {
    if (result.baseExists && result.prExists) {
      md += `| ${result.name} | Changed | ${result.changePercent.toFixed(1)}% |\n`;
    } else if (result.prExists) {
      md += `| ${result.name} | New | - |\n`;
    } else {
      md += `| ${result.name} | Removed | - |\n`;
    }
  }
  md += '\n';
  if (unchanged.length > 0) {
    md += '<details>\n<summary>Unchanged pages</summary>\n\n';
    for (const result of unchanged) {
      md += `- ${result.name}\n`;
    }
    md += '\n</details>\n';
  }
  return md;
}
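Both report generators treat anything at or below 0.1% changed pixels as noise. A trimmed-down sketch of that split (the `Result` shape here is a hypothetical subset of `ComparisonResult`):

```typescript
// Minimal sketch of the changed/unchanged split used by the report generators.
type Result = { name: string; changePercent: number };

const results: Result[] = [
  { name: 'photos-timeline', changePercent: 3.2 }, // above the 0.1% threshold
  { name: 'albums-list', changePercent: 0.05 }, // at or below: treated as unchanged
];

const changed = results.filter((r) => r.changePercent > 0.1);
const unchanged = results.filter((r) => r.changePercent <= 0.1);
```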
function imgTag(filePath: string | null, alt: string): string {
  if (!filePath || !existsSync(filePath)) {
    return `<div class="no-image">${alt} not available</div>`;
  }
  const data = readFileSync(filePath);
  return `<img src="data:image/png;base64,${data.toString('base64')}" alt="${alt}" loading="lazy" />`;
}
/** Generate an HTML report with embedded base64 images for the artifact. */
export function generateHtmlReport(results: ComparisonResult[]): string {
  const changed = results.filter((r) => r.changePercent > 0.1);
  const unchanged = results.filter((r) => r.changePercent <= 0.1);
  let html = `<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Visual Review</title>
<style>
* { box-sizing: border-box; margin: 0; padding: 0; }
body { font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Helvetica, Arial, sans-serif;
background: #0d1117; color: #e6edf3; padding: 32px; line-height: 1.5; }
.container { max-width: 1800px; margin: 0 auto; }
h1 { font-size: 24px; border-bottom: 1px solid #30363d; padding-bottom: 12px; margin-bottom: 24px; }
.summary { color: #8b949e; margin-bottom: 32px; font-size: 16px; }
.scenario { margin-bottom: 40px; border: 1px solid #30363d; border-radius: 8px; overflow: hidden; }
.scenario-header { background: #161b22; padding: 12px 16px; display: flex; align-items: center; gap: 12px; }
.scenario-header h2 { font-size: 16px; font-weight: 600; }
.badge { display: inline-block; padding: 2px 10px; border-radius: 12px; font-size: 12px; font-weight: 500; }
.badge-changed { background: #da363380; color: #f85149; }
.badge-new { background: #1f6feb80; color: #58a6ff; }
.badge-removed { background: #6e767e80; color: #8b949e; }
.grid { display: grid; grid-template-columns: 1fr 1fr 1fr; gap: 1px; background: #30363d; }
.grid-cell { background: #0d1117; }
.grid-label { text-align: center; padding: 8px; font-size: 13px; color: #8b949e; font-weight: 600;
background: #161b22; text-transform: uppercase; letter-spacing: 0.5px; }
.grid-cell img { width: 100%; display: block; }
.no-image { padding: 40px; text-align: center; color: #484f58; font-style: italic; }
.unchanged-section { margin-top: 32px; color: #8b949e; }
.unchanged-section summary { cursor: pointer; font-size: 14px; }
.unchanged-section ul { margin-top: 8px; padding-left: 24px; }
.unchanged-section li { font-size: 14px; margin: 4px 0; }
</style>
</head>
<body>
<div class="container">
<h1>Visual Review</h1>
`;
  if (changed.length === 0) {
    html += '<p class="summary">No visual changes detected in the affected pages.</p>';
  } else {
    html += `<p class="summary">Found <strong>${changed.length}</strong> page(s) with visual changes`;
    if (unchanged.length > 0) {
      html += ` (${unchanged.length} unchanged)`;
    }
    html += '.</p>\n';

    for (const result of changed) {
      html += '<div class="scenario">\n<div class="scenario-header">\n';
      html += `<h2>${result.name}</h2>\n`;
      if (!result.baseExists) {
        html += '<span class="badge badge-new">New</span>\n';
        html += '</div>\n';
        html += `<div style="padding: 16px;">${imgTag(result.prImagePath, 'PR')}</div>\n`;
        html += '</div>\n';
        continue;
      }
      if (!result.prExists) {
        html += '<span class="badge badge-removed">Removed</span>\n';
        html += '</div>\n</div>\n';
        continue;
      }
      html += `<span class="badge badge-changed">${result.changePercent.toFixed(1)}% changed</span>\n`;
      html += '</div>\n';
      html += '<div class="grid">\n';
      html += `<div class="grid-cell"><div class="grid-label">Base</div>${imgTag(result.baseImagePath, 'Base')}</div>\n`;
      html += `<div class="grid-cell"><div class="grid-label">PR</div>${imgTag(result.prImagePath, 'PR')}</div>\n`;
      html += `<div class="grid-cell"><div class="grid-label">Diff</div>${imgTag(result.diffImagePath, 'Diff')}</div>\n`;
      html += '</div>\n</div>\n';
    }
  }
  if (unchanged.length > 0) {
    html += '<div class="unchanged-section">\n<details>\n<summary>Unchanged pages</summary>\n<ul>\n';
    for (const result of unchanged) {
      html += `<li>${result.name}</li>\n`;
    }
    html += '</ul>\n</details>\n</div>\n';
  }
  html += '</div>\n</body>\n</html>';
  return html;
}
// CLI usage
if (process.argv[1]?.endsWith('compare.ts') || process.argv[1]?.endsWith('compare.js')) {
  const [baseDir, prDir, outputDir] = process.argv.slice(2);
  if (!baseDir || !prDir || !outputDir) {
    throw new Error('Usage: compare.ts <base-dir> <pr-dir> <output-dir>');
  }
  const resolvedOutputDir = resolve(outputDir);
  const results = compareScreenshots(resolve(baseDir), resolve(prDir), resolvedOutputDir);

  console.log('\nComparison Results:');
  console.log('==================');
  for (const r of results) {
    const status = r.changePercent > 0.1 ? 'CHANGED' : 'unchanged';
    console.log(`  ${r.name}: ${status} (${r.changePercent.toFixed(1)}%)`);
  }

  const report = generateMarkdownReport(results);
  const reportPath = join(resolvedOutputDir, 'report.md');
  writeFileSync(reportPath, report);
  console.log(`\nMarkdown report written to: ${reportPath}`);

  const htmlReport = generateHtmlReport(results);
  const htmlPath = join(resolvedOutputDir, 'visual-review.html');
  writeFileSync(htmlPath, htmlReport);
  console.log(`HTML report written to: ${htmlPath}`);

  const jsonPath = join(resolvedOutputDir, 'results.json');
  writeFileSync(jsonPath, JSON.stringify(results, null, 2));
  console.log(`Results JSON written to: ${jsonPath}`);
}
@@ -0,0 +1,187 @@
/**
* Maps URL routes to screenshot scenario keys.
*
* Routes discovered by the dependency analyzer are matched against this map
* to determine which screenshot scenarios to run. Routes not in this map
* are skipped (they don't have a scenario defined yet).
*/
export interface ScenarioDefinition {
  /** The URL path to navigate to */
  url: string;
  /** Human-readable name for the screenshot file */
  name: string;
  /** Which mock networks this scenario needs */
  mocks: ('base' | 'timeline' | 'memory')[];
  /** Optional: selector to wait for before screenshotting */
  waitForSelector?: string;
  /** Optional: time to wait after page load (ms) for animations to settle */
  settleTime?: number;
}
/**
* Map from route paths (as output by analyze-deps) to scenario definitions.
* A single route might map to multiple scenarios (e.g., different states).
*/
export const PAGE_SCENARIOS: Record<string, ScenarioDefinition[]> = {
  '/photos': [
    {
      url: '/photos',
      name: 'photos-timeline',
      mocks: ['base', 'timeline'],
      waitForSelector: '[data-thumbnail-focus-container]',
      settleTime: 500,
    },
  ],
  '/albums': [
    {
      url: '/albums',
      name: 'albums-list',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/explore': [
    {
      url: '/explore',
      name: 'explore',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/favorites': [
    {
      url: '/favorites',
      name: 'favorites',
      mocks: ['base', 'timeline'],
      waitForSelector: '#asset-grid',
      settleTime: 300,
    },
  ],
  '/archive': [
    {
      url: '/archive',
      name: 'archive',
      mocks: ['base', 'timeline'],
      waitForSelector: '#asset-grid',
      settleTime: 300,
    },
  ],
  '/trash': [
    {
      url: '/trash',
      name: 'trash',
      mocks: ['base', 'timeline'],
      waitForSelector: '#asset-grid',
      settleTime: 300,
    },
  ],
  '/people': [
    {
      url: '/people',
      name: 'people',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/sharing': [
    {
      url: '/sharing',
      name: 'sharing',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/search': [
    {
      url: '/search',
      name: 'search',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/memory': [
    {
      url: '/memory',
      name: 'memory',
      mocks: ['base', 'memory'],
      settleTime: 500,
    },
  ],
  '/user-settings': [
    {
      url: '/user-settings',
      name: 'user-settings',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/map': [
    {
      url: '/map',
      name: 'map',
      mocks: ['base'],
      settleTime: 500,
    },
  ],
  '/admin': [
    {
      url: '/admin',
      name: 'admin-dashboard',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/admin/system-settings': [
    {
      url: '/admin/system-settings',
      name: 'admin-system-settings',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/admin/users': [
    {
      url: '/admin/users',
      name: 'admin-users',
      mocks: ['base'],
      settleTime: 300,
    },
  ],
  '/auth/login': [
    {
      url: '/auth/login',
      name: 'login',
      mocks: [],
      settleTime: 300,
    },
  ],
  '/': [
    {
      url: '/',
      name: 'landing',
      mocks: [],
      settleTime: 300,
    },
  ],
};
/** Given a list of routes from the analyzer, return the matching scenarios. */
export function getScenariosForRoutes(routes: string[]): ScenarioDefinition[] {
  const scenarios: ScenarioDefinition[] = [];
  const seen = new Set<string>();
  for (const route of routes) {
    const defs = PAGE_SCENARIOS[route];
    if (defs) {
      for (const def of defs) {
        if (!seen.has(def.name)) {
          seen.add(def.name);
          scenarios.push(def);
        }
      }
    }
  }
  return scenarios;
}
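Two properties of `getScenariosForRoutes` are worth noting: routes without an entry are silently skipped, and a scenario name is only emitted once even when several routes map to it. A sketch with a hypothetical two-route map (names and paths are illustrative, not from PAGE_SCENARIOS):

```typescript
// Hypothetical map where two routes share a scenario name.
type Scenario = { url: string; name: string };
const map: Record<string, Scenario[]> = {
  '/a': [{ url: '/a', name: 'shared' }],
  '/b': [
    { url: '/b', name: 'shared' },
    { url: '/b', name: 'b-only' },
  ],
};

// Same dedup-by-name walk as getScenariosForRoutes: first route wins,
// unknown routes ('/missing') are skipped.
const seen = new Set<string>();
const picked: Scenario[] = [];
for (const route of ['/a', '/b', '/missing']) {
  for (const def of map[route] ?? []) {
    if (!seen.has(def.name)) {
      seen.add(def.name);
      picked.push(def);
    }
  }
}
```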
@@ -0,0 +1,140 @@
/**
* Playwright script to capture screenshots for visual diff scenarios.
*
* Usage:
* npx playwright test --config e2e/playwright.screenshot.config.ts
*
* Environment variables:
* SCREENSHOT_SCENARIOS - JSON array of scenario names to run (from page-map.ts)
* If not set, runs all scenarios.
* SCREENSHOT_OUTPUT_DIR - Directory to save screenshots to. Defaults to e2e/screenshots-output.
*/
import { faker } from '@faker-js/faker';
import type { MemoryResponseDto } from '@immich/sdk';
import { test } from '@playwright/test';
import { mkdirSync } from 'node:fs';
import { resolve } from 'node:path';
import { generateMemoriesFromTimeline } from 'src/ui/generators/memory';
import {
  createDefaultTimelineConfig,
  generateTimelineData,
  type TimelineAssetConfig,
  type TimelineData,
} from 'src/ui/generators/timeline';
import { setupBaseMockApiRoutes } from 'src/ui/mock-network/base-network';
import { setupMemoryMockApiRoutes } from 'src/ui/mock-network/memory-network';
import { setupTimelineMockApiRoutes, TimelineTestContext } from 'src/ui/mock-network/timeline-network';
import { PAGE_SCENARIOS, type ScenarioDefinition } from './page-map';
const OUTPUT_DIR = process.env.SCREENSHOT_OUTPUT_DIR || resolve(import.meta.dirname, '../../../screenshots-output');
const SCENARIO_FILTER: string[] | null = process.env.SCREENSHOT_SCENARIOS
  ? JSON.parse(process.env.SCREENSHOT_SCENARIOS)
  : null;
// Collect scenarios to run
const allScenarios: ScenarioDefinition[] = [];
for (const defs of Object.values(PAGE_SCENARIOS)) {
  for (const def of defs) {
    if (!SCENARIO_FILTER || SCENARIO_FILTER.includes(def.name)) {
      allScenarios.push(def);
    }
  }
}
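The filter logic above reduces to: parse `SCREENSHOT_SCENARIOS` as a JSON array when set, otherwise run everything. A sketch with a hard-coded value standing in for the environment variable (scenario names are illustrative):

```typescript
// Stand-in for process.env.SCREENSHOT_SCENARIOS; undefined would mean "run all".
const filterEnv: string | undefined = '["photos-timeline","login"]';
const filter: string[] | null = filterEnv ? JSON.parse(filterEnv) : null;

const names = ['photos-timeline', 'albums-list', 'login'];
// With no filter, every name passes; with a filter, only listed names pass.
const selected = names.filter((n) => !filter || filter.includes(n));
```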
// Use a fixed seed so screenshots are deterministic across runs
faker.seed(42);
let adminUserId: string;
let timelineData: TimelineData;
let timelineAssets: TimelineAssetConfig[];
let memories: MemoryResponseDto[];
test.beforeAll(async () => {
  adminUserId = faker.string.uuid();
  timelineData = generateTimelineData({ ...createDefaultTimelineConfig(), ownerId: adminUserId });
  timelineAssets = [];
  for (const timeBucket of timelineData.buckets.values()) {
    timelineAssets.push(...timeBucket);
  }
  memories = generateMemoriesFromTimeline(
    timelineAssets,
    adminUserId,
    [
      { year: 2024, assetCount: 3 },
      { year: 2023, assetCount: 2 },
    ],
    42,
  );
  mkdirSync(OUTPUT_DIR, { recursive: true });
});
for (const scenario of allScenarios) {
  test(`Screenshot: ${scenario.name}`, async ({ context, page }) => {
    // Set up mocks based on scenario requirements
    if (scenario.mocks.includes('base')) {
      await setupBaseMockApiRoutes(context, adminUserId);
    }
    if (scenario.mocks.includes('timeline')) {
      const testContext = new TimelineTestContext();
      testContext.adminId = adminUserId;
      await setupTimelineMockApiRoutes(
        context,
        timelineData,
        {
          albumAdditions: [],
          assetDeletions: [],
          assetArchivals: [],
          assetFavorites: [],
        },
        testContext,
      );
    }
    if (scenario.mocks.includes('memory')) {
      await setupMemoryMockApiRoutes(context, memories, {
        memoryDeletions: [],
        assetRemovals: new Map(),
      });
    }

    // Navigate to the page. Use networkidle so SvelteKit hydrates and API
    // calls complete; if it times out (e.g. a persistent connection the
    // catch-all mock didn't cover), carry on with whatever has rendered.
    try {
      await page.goto(scenario.url, { waitUntil: 'networkidle', timeout: 15_000 });
    } catch {
      console.warn(`networkidle timed out for ${scenario.name}, falling back to current state`);
      // Page has already navigated, just continue with what we have
    }

    // Wait for the scenario-specific selector if one is defined
    if (scenario.waitForSelector) {
      try {
        await page.waitForSelector(scenario.waitForSelector, { timeout: 15_000 });
      } catch {
        console.warn(`Selector ${scenario.waitForSelector} not found for ${scenario.name}, continuing...`);
      }
    }

    // Wait for loading spinners to disappear
    await page
      .waitForFunction(() => document.querySelectorAll('[data-testid="loading-spinner"]').length === 0, {
        timeout: 10_000,
      })
      .catch(() => {});

    // Wait for animations/transitions to settle
    await page.waitForTimeout(scenario.settleTime ?? 500);

    // Take the screenshot
    await page.screenshot({
      path: resolve(OUTPUT_DIR, `${scenario.name}.png`),
      fullPage: false,
    });
  });
}
@@ -10,6 +10,27 @@ export const setupBaseMockApiRoutes = async (context: BrowserContext, adminUserI
      path: '/',
    },
  ]);

  // Block socket.io connections: these are persistent WebSocket connections
  // that prevent networkidle from resolving since there's no real server.
  await context.route('**/api/socket.io**', async (route) => {
    return route.abort('connectionrefused');
  });

  // Catch-all for any /api/ endpoint not explicitly mocked below.
  // Registered FIRST so specific routes (registered after) take priority
  // (Playwright checks routes in reverse registration order).
  // Without this, unmocked API calls hit the static preview server, which
  // either hangs or returns HTML, preventing networkidle and causing timeouts.
  await context.route('**/api/**', async (route) => {
    const method = route.request().method();
    return route.fulfill({
      status: 200,
      contentType: 'application/json',
      json: method === 'GET' ? [] : {},
    });
  });

  await context.route('**/api/users/me', async (route) => {
    return route.fulfill({
      status: 200,
File diff suppressed because one or more lines are too long
@@ -11,6 +11,10 @@ enum AssetType {
enum AssetState { local, remote, merged }
// do not change!
// keep in sync with PlatformAssetPlaybackStyle
enum AssetPlaybackStyle { unknown, image, video, imageAnimated, livePhoto, videoLooping }
sealed class BaseAsset {
  final String name;
  final String? checksum;
@@ -43,6 +47,14 @@ sealed class BaseAsset {
  bool get isMotionPhoto => livePhotoVideoId != null;

  AssetPlaybackStyle get playbackStyle {
    if (isVideo) return AssetPlaybackStyle.video;
    if (isMotionPhoto) return AssetPlaybackStyle.livePhoto;
    if (isImage && durationInSeconds != null && durationInSeconds! > 0) return AssetPlaybackStyle.imageAnimated;
    if (isImage) return AssetPlaybackStyle.image;
    return AssetPlaybackStyle.unknown;
  }

  Duration get duration {
    final durationInSeconds = this.durationInSeconds;
    if (durationInSeconds != null) {
@@ -5,6 +5,8 @@ class LocalAsset extends BaseAsset {
  final String? remoteAssetId;
  final String? cloudId;
  final int orientation;
  @override
  final AssetPlaybackStyle playbackStyle;
  final DateTime? adjustmentTime;
  final double? latitude;
@@ -25,6 +27,7 @@ class LocalAsset extends BaseAsset {
    super.isFavorite = false,
    super.livePhotoVideoId,
    this.orientation = 0,
    required this.playbackStyle,
    this.adjustmentTime,
    this.latitude,
    this.longitude,
@@ -56,6 +59,7 @@ class LocalAsset extends BaseAsset {
width: ${width ?? "<NA>"},
height: ${height ?? "<NA>"},
durationInSeconds: ${durationInSeconds ?? "<NA>"},
playbackStyle: $playbackStyle,
remoteId: ${remoteId ?? "<NA>"},
cloudId: ${cloudId ?? "<NA>"},
checksum: ${checksum ?? "<NA>"},
@@ -76,6 +80,7 @@ class LocalAsset extends BaseAsset {
id == other.id &&
cloudId == other.cloudId &&
orientation == other.orientation &&
playbackStyle == other.playbackStyle &&
adjustmentTime == other.adjustmentTime &&
latitude == other.latitude &&
longitude == other.longitude;
@@ -87,6 +92,7 @@ class LocalAsset extends BaseAsset {
id.hashCode ^
remoteId.hashCode ^
orientation.hashCode ^
playbackStyle.hashCode ^
adjustmentTime.hashCode ^
latitude.hashCode ^
longitude.hashCode;
@@ -105,6 +111,7 @@ class LocalAsset extends BaseAsset {
int? durationInSeconds,
bool? isFavorite,
int? orientation,
AssetPlaybackStyle? playbackStyle,
DateTime? adjustmentTime,
double? latitude,
double? longitude,
@@ -124,6 +131,7 @@ class LocalAsset extends BaseAsset {
durationInSeconds: durationInSeconds ?? this.durationInSeconds,
isFavorite: isFavorite ?? this.isFavorite,
orientation: orientation ?? this.orientation,
playbackStyle: playbackStyle ?? this.playbackStyle,
adjustmentTime: adjustmentTime ?? this.adjustmentTime,
latitude: latitude ?? this.latitude,
longitude: longitude ?? this.longitude,
@@ -435,9 +435,19 @@ extension PlatformToLocalAsset on PlatformAsset {
durationInSeconds: durationInSeconds,
isFavorite: isFavorite,
orientation: orientation,
playbackStyle: _toPlaybackStyle(playbackStyle),
adjustmentTime: tryFromSecondsSinceEpoch(adjustmentTime, isUtc: true),
latitude: latitude,
longitude: longitude,
isEdited: false,
);
}
AssetPlaybackStyle _toPlaybackStyle(PlatformAssetPlaybackStyle style) => switch (style) {
  PlatformAssetPlaybackStyle.unknown => AssetPlaybackStyle.unknown,
  PlatformAssetPlaybackStyle.image => AssetPlaybackStyle.image,
  PlatformAssetPlaybackStyle.video => AssetPlaybackStyle.video,
  PlatformAssetPlaybackStyle.imageAnimated => AssetPlaybackStyle.imageAnimated,
  PlatformAssetPlaybackStyle.livePhoto => AssetPlaybackStyle.livePhoto,
  PlatformAssetPlaybackStyle.videoLooping => AssetPlaybackStyle.videoLooping,
};
@@ -25,6 +25,8 @@ class LocalAssetEntity extends Table with DriftDefaultsMixin, AssetEntityMixin {
RealColumn get longitude => real().nullable()();
IntColumn get playbackStyle => intEnum<AssetPlaybackStyle>().withDefault(const Constant(0))();
@override
Set<Column> get primaryKey => {id};
}
@@ -43,6 +45,7 @@ extension LocalAssetEntityDataDomainExtension on LocalAssetEntityData {
width: width,
remoteId: remoteId,
orientation: orientation,
playbackStyle: playbackStyle,
adjustmentTime: adjustmentTime,
latitude: latitude,
longitude: longitude,
@@ -25,6 +25,7 @@ typedef $$LocalAssetEntityTableCreateCompanionBuilder =
i0.Value<DateTime?> adjustmentTime,
i0.Value<double?> latitude,
i0.Value<double?> longitude,
i0.Value<i2.AssetPlaybackStyle> playbackStyle,
});
typedef $$LocalAssetEntityTableUpdateCompanionBuilder =
i1.LocalAssetEntityCompanion Function({
@@ -43,6 +44,7 @@ typedef $$LocalAssetEntityTableUpdateCompanionBuilder =
i0.Value<DateTime?> adjustmentTime,
i0.Value<double?> latitude,
i0.Value<double?> longitude,
i0.Value<i2.AssetPlaybackStyle> playbackStyle,
});
class $$LocalAssetEntityTableFilterComposer
@@ -129,6 +131,16 @@ class $$LocalAssetEntityTableFilterComposer
column: $table.longitude,
builder: (column) => i0.ColumnFilters(column),
);
i0.ColumnWithTypeConverterFilters<
i2.AssetPlaybackStyle,
i2.AssetPlaybackStyle,
int
>
get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => i0.ColumnWithTypeConverterFilters(column),
);
}
class $$LocalAssetEntityTableOrderingComposer
@@ -214,6 +226,11 @@ class $$LocalAssetEntityTableOrderingComposer
column: $table.longitude,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<int> get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => i0.ColumnOrderings(column),
);
}
class $$LocalAssetEntityTableAnnotationComposer
@@ -277,6 +294,12 @@ class $$LocalAssetEntityTableAnnotationComposer
i0.GeneratedColumn<double> get longitude =>
$composableBuilder(column: $table.longitude, builder: (column) => column);
i0.GeneratedColumnWithTypeConverter<i2.AssetPlaybackStyle, int>
get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => column,
);
}
class $$LocalAssetEntityTableTableManager
@@ -334,6 +357,8 @@ class $$LocalAssetEntityTableTableManager
i0.Value<DateTime?> adjustmentTime = const i0.Value.absent(),
i0.Value<double?> latitude = const i0.Value.absent(),
i0.Value<double?> longitude = const i0.Value.absent(),
i0.Value<i2.AssetPlaybackStyle> playbackStyle =
const i0.Value.absent(),
}) => i1.LocalAssetEntityCompanion(
name: name,
type: type,
@@ -350,6 +375,7 @@ class $$LocalAssetEntityTableTableManager
adjustmentTime: adjustmentTime,
latitude: latitude,
longitude: longitude,
playbackStyle: playbackStyle,
),
createCompanionCallback:
({
@@ -368,6 +394,8 @@ class $$LocalAssetEntityTableTableManager
i0.Value<DateTime?> adjustmentTime = const i0.Value.absent(),
i0.Value<double?> latitude = const i0.Value.absent(),
i0.Value<double?> longitude = const i0.Value.absent(),
i0.Value<i2.AssetPlaybackStyle> playbackStyle =
const i0.Value.absent(),
}) => i1.LocalAssetEntityCompanion.insert(
name: name,
type: type,
@@ -384,6 +412,7 @@ class $$LocalAssetEntityTableTableManager
adjustmentTime: adjustmentTime,
latitude: latitude,
longitude: longitude,
playbackStyle: playbackStyle,
),
withReferenceMapper: (p0) => p0
.map((e) => (e.readTable(table), i0.BaseReferences(db, table, e)))
@@ -596,6 +625,19 @@ class $LocalAssetEntityTable extends i3.LocalAssetEntity
requiredDuringInsert: false,
);
@override
late final i0.GeneratedColumnWithTypeConverter<i2.AssetPlaybackStyle, int>
playbackStyle =
i0.GeneratedColumn<int>(
'playback_style',
aliasedName,
false,
type: i0.DriftSqlType.int,
requiredDuringInsert: false,
defaultValue: const i4.Constant(0),
).withConverter<i2.AssetPlaybackStyle>(
i1.$LocalAssetEntityTable.$converterplaybackStyle,
);
@override
List<i0.GeneratedColumn> get $columns => [
name,
type,
@@ -612,6 +654,7 @@ class $LocalAssetEntityTable extends i3.LocalAssetEntity
adjustmentTime,
latitude,
longitude,
playbackStyle,
];
@override
String get aliasedName => _alias ?? actualTableName;
@@ -793,6 +836,12 @@ class $LocalAssetEntityTable extends i3.LocalAssetEntity
i0.DriftSqlType.double,
data['${effectivePrefix}longitude'],
),
playbackStyle: i1.$LocalAssetEntityTable.$converterplaybackStyle.fromSql(
attachedDatabase.typeMapping.read(
i0.DriftSqlType.int,
data['${effectivePrefix}playback_style'],
)!,
),
);
}
@@ -803,6 +852,10 @@ class $LocalAssetEntityTable extends i3.LocalAssetEntity
static i0.JsonTypeConverter2<i2.AssetType, int, int> $convertertype =
const i0.EnumIndexConverter<i2.AssetType>(i2.AssetType.values);
static i0.JsonTypeConverter2<i2.AssetPlaybackStyle, int, int>
$converterplaybackStyle = const i0.EnumIndexConverter<i2.AssetPlaybackStyle>(
i2.AssetPlaybackStyle.values,
);
@override
bool get withoutRowId => true;
@override
@@ -826,6 +879,7 @@ class LocalAssetEntityData extends i0.DataClass
final DateTime? adjustmentTime;
final double? latitude;
final double? longitude;
final i2.AssetPlaybackStyle playbackStyle;
const LocalAssetEntityData({
required this.name,
required this.type,
@@ -842,6 +896,7 @@ class LocalAssetEntityData extends i0.DataClass
this.adjustmentTime,
this.latitude,
this.longitude,
required this.playbackStyle,
});
@override
Map<String, i0.Expression> toColumns(bool nullToAbsent) {
@@ -881,6 +936,11 @@ class LocalAssetEntityData extends i0.DataClass
if (!nullToAbsent || longitude != null) {
map['longitude'] = i0.Variable<double>(longitude);
}
{
map['playback_style'] = i0.Variable<int>(
i1.$LocalAssetEntityTable.$converterplaybackStyle.toSql(playbackStyle),
);
}
return map;
}
@@ -907,6 +967,9 @@ class LocalAssetEntityData extends i0.DataClass
adjustmentTime: serializer.fromJson<DateTime?>(json['adjustmentTime']),
latitude: serializer.fromJson<double?>(json['latitude']),
longitude: serializer.fromJson<double?>(json['longitude']),
playbackStyle: i1.$LocalAssetEntityTable.$converterplaybackStyle.fromJson(
serializer.fromJson<int>(json['playbackStyle']),
),
);
}
@override
@@ -930,6 +993,9 @@ class LocalAssetEntityData extends i0.DataClass
'adjustmentTime': serializer.toJson<DateTime?>(adjustmentTime),
'latitude': serializer.toJson<double?>(latitude),
'longitude': serializer.toJson<double?>(longitude),
'playbackStyle': serializer.toJson<int>(
i1.$LocalAssetEntityTable.$converterplaybackStyle.toJson(playbackStyle),
),
};
}
@@ -949,6 +1015,7 @@ class LocalAssetEntityData extends i0.DataClass
i0.Value<DateTime?> adjustmentTime = const i0.Value.absent(),
i0.Value<double?> latitude = const i0.Value.absent(),
i0.Value<double?> longitude = const i0.Value.absent(),
i2.AssetPlaybackStyle? playbackStyle,
}) => i1.LocalAssetEntityData(
name: name ?? this.name,
type: type ?? this.type,
@@ -969,6 +1036,7 @@ class LocalAssetEntityData extends i0.DataClass
: this.adjustmentTime,
latitude: latitude.present ? latitude.value : this.latitude,
longitude: longitude.present ? longitude.value : this.longitude,
playbackStyle: playbackStyle ?? this.playbackStyle,
);
LocalAssetEntityData copyWithCompanion(i1.LocalAssetEntityCompanion data) {
return LocalAssetEntityData(
@@ -995,6 +1063,9 @@ class LocalAssetEntityData extends i0.DataClass
: this.adjustmentTime,
latitude: data.latitude.present ? data.latitude.value : this.latitude,
longitude: data.longitude.present ? data.longitude.value : this.longitude,
playbackStyle: data.playbackStyle.present
? data.playbackStyle.value
: this.playbackStyle,
);
}
@@ -1015,7 +1086,8 @@ class LocalAssetEntityData extends i0.DataClass
..write('iCloudId: $iCloudId, ')
..write('adjustmentTime: $adjustmentTime, ')
..write('latitude: $latitude, ')
..write('longitude: $longitude')
..write('longitude: $longitude, ')
..write('playbackStyle: $playbackStyle')
..write(')'))
.toString();
}
@@ -1037,6 +1109,7 @@ class LocalAssetEntityData extends i0.DataClass
adjustmentTime,
latitude,
longitude,
playbackStyle,
);
@override
bool operator ==(Object other) =>
@@ -1056,7 +1129,8 @@ class LocalAssetEntityData extends i0.DataClass
other.iCloudId == this.iCloudId &&
other.adjustmentTime == this.adjustmentTime &&
other.latitude == this.latitude &&
other.longitude == this.longitude);
other.longitude == this.longitude &&
other.playbackStyle == this.playbackStyle);
}
class LocalAssetEntityCompanion
@@ -1076,6 +1150,7 @@ class LocalAssetEntityCompanion
final i0.Value<DateTime?> adjustmentTime;
final i0.Value<double?> latitude;
final i0.Value<double?> longitude;
final i0.Value<i2.AssetPlaybackStyle> playbackStyle;
const LocalAssetEntityCompanion({
this.name = const i0.Value.absent(),
this.type = const i0.Value.absent(),
@@ -1092,6 +1167,7 @@ class LocalAssetEntityCompanion
this.adjustmentTime = const i0.Value.absent(),
this.latitude = const i0.Value.absent(),
this.longitude = const i0.Value.absent(),
this.playbackStyle = const i0.Value.absent(),
});
LocalAssetEntityCompanion.insert({
required String name,
@@ -1109,6 +1185,7 @@ class LocalAssetEntityCompanion
this.adjustmentTime = const i0.Value.absent(),
this.latitude = const i0.Value.absent(),
this.longitude = const i0.Value.absent(),
this.playbackStyle = const i0.Value.absent(),
}) : name = i0.Value(name),
type = i0.Value(type),
id = i0.Value(id);
@@ -1128,6 +1205,7 @@ class LocalAssetEntityCompanion
i0.Expression<DateTime>? adjustmentTime,
i0.Expression<double>? latitude,
i0.Expression<double>? longitude,
i0.Expression<int>? playbackStyle,
}) {
return i0.RawValuesInsertable({
if (name != null) 'name': name,
@@ -1145,6 +1223,7 @@ class LocalAssetEntityCompanion
if (adjustmentTime != null) 'adjustment_time': adjustmentTime,
if (latitude != null) 'latitude': latitude,
if (longitude != null) 'longitude': longitude,
if (playbackStyle != null) 'playback_style': playbackStyle,
});
}
@@ -1164,6 +1243,7 @@ class LocalAssetEntityCompanion
i0.Value<DateTime?>? adjustmentTime,
i0.Value<double?>? latitude,
i0.Value<double?>? longitude,
i0.Value<i2.AssetPlaybackStyle>? playbackStyle,
}) {
return i1.LocalAssetEntityCompanion(
name: name ?? this.name,
@@ -1181,6 +1261,7 @@ class LocalAssetEntityCompanion
adjustmentTime: adjustmentTime ?? this.adjustmentTime,
latitude: latitude ?? this.latitude,
longitude: longitude ?? this.longitude,
playbackStyle: playbackStyle ?? this.playbackStyle,
);
}
@@ -1234,6 +1315,13 @@ class LocalAssetEntityCompanion
if (longitude.present) {
map['longitude'] = i0.Variable<double>(longitude.value);
}
if (playbackStyle.present) {
map['playback_style'] = i0.Variable<int>(
i1.$LocalAssetEntityTable.$converterplaybackStyle.toSql(
playbackStyle.value,
),
);
}
return map;
}
@@ -1254,7 +1342,8 @@ class LocalAssetEntityCompanion
..write('iCloudId: $iCloudId, ')
..write('adjustmentTime: $adjustmentTime, ')
..write('latitude: $latitude, ')
..write('longitude: $longitude')
..write('longitude: $longitude, ')
..write('playbackStyle: $playbackStyle')
..write(')'))
.toString();
}
@@ -26,7 +26,8 @@ SELECT
NULL as latitude,
NULL as longitude,
NULL as adjustmentTime,
rae.is_edited
rae.is_edited,
0 as playback_style
FROM
remote_asset_entity rae
LEFT JOIN
@@ -63,7 +64,8 @@ SELECT
lae.latitude,
lae.longitude,
lae.adjustment_time,
0 as is_edited
0 as is_edited,
lae.playback_style
FROM
local_asset_entity lae
WHERE NOT EXISTS (
@@ -29,7 +29,7 @@ class MergedAssetDrift extends i1.ModularAccessor {
);
$arrayStartIndex += generatedlimit.amountOfVariables;
return customSelect(
-'SELECT rae.id AS remote_id, (SELECT lae.id FROM local_asset_entity AS lae WHERE lae.checksum = rae.checksum LIMIT 1) AS local_id, rae.name, rae.type, rae.created_at AS created_at, rae.updated_at, rae.width, rae.height, rae.duration_in_seconds, rae.is_favorite, rae.thumb_hash, rae.checksum, rae.owner_id, rae.live_photo_video_id, 0 AS orientation, rae.stack_id, NULL AS i_cloud_id, NULL AS latitude, NULL AS longitude, NULL AS adjustmentTime, rae.is_edited FROM remote_asset_entity AS rae LEFT JOIN stack_entity AS se ON rae.stack_id = se.id WHERE rae.deleted_at IS NULL AND rae.visibility = 0 AND rae.owner_id IN ($expandeduserIds) AND(rae.stack_id IS NULL OR rae.id = se.primary_asset_id)UNION ALL SELECT NULL AS remote_id, lae.id AS local_id, lae.name, lae.type, lae.created_at AS created_at, lae.updated_at, lae.width, lae.height, lae.duration_in_seconds, lae.is_favorite, NULL AS thumb_hash, lae.checksum, NULL AS owner_id, NULL AS live_photo_video_id, lae.orientation, NULL AS stack_id, lae.i_cloud_id, lae.latitude, lae.longitude, lae.adjustment_time, 0 AS is_edited FROM local_asset_entity AS lae WHERE NOT EXISTS (SELECT 1 FROM remote_asset_entity AS rae WHERE rae.checksum = lae.checksum AND rae.owner_id IN ($expandeduserIds)) AND EXISTS (SELECT 1 FROM local_album_asset_entity AS laa INNER JOIN local_album_entity AS la ON laa.album_id = la.id WHERE laa.asset_id = lae.id AND la.backup_selection = 0) AND NOT EXISTS (SELECT 1 FROM local_album_asset_entity AS laa INNER JOIN local_album_entity AS la ON laa.album_id = la.id WHERE laa.asset_id = lae.id AND la.backup_selection = 2) ORDER BY created_at DESC ${generatedlimit.sql}',
+'SELECT rae.id AS remote_id, (SELECT lae.id FROM local_asset_entity AS lae WHERE lae.checksum = rae.checksum LIMIT 1) AS local_id, rae.name, rae.type, rae.created_at AS created_at, rae.updated_at, rae.width, rae.height, rae.duration_in_seconds, rae.is_favorite, rae.thumb_hash, rae.checksum, rae.owner_id, rae.live_photo_video_id, 0 AS orientation, rae.stack_id, NULL AS i_cloud_id, NULL AS latitude, NULL AS longitude, NULL AS adjustmentTime, rae.is_edited, 0 AS playback_style FROM remote_asset_entity AS rae LEFT JOIN stack_entity AS se ON rae.stack_id = se.id WHERE rae.deleted_at IS NULL AND rae.visibility = 0 AND rae.owner_id IN ($expandeduserIds) AND(rae.stack_id IS NULL OR rae.id = se.primary_asset_id)UNION ALL SELECT NULL AS remote_id, lae.id AS local_id, lae.name, lae.type, lae.created_at AS created_at, lae.updated_at, lae.width, lae.height, lae.duration_in_seconds, lae.is_favorite, NULL AS thumb_hash, lae.checksum, NULL AS owner_id, NULL AS live_photo_video_id, lae.orientation, NULL AS stack_id, lae.i_cloud_id, lae.latitude, lae.longitude, lae.adjustment_time, 0 AS is_edited, lae.playback_style FROM local_asset_entity AS lae WHERE NOT EXISTS (SELECT 1 FROM remote_asset_entity AS rae WHERE rae.checksum = lae.checksum AND rae.owner_id IN ($expandeduserIds)) AND EXISTS (SELECT 1 FROM local_album_asset_entity AS laa INNER JOIN local_album_entity AS la ON laa.album_id = la.id WHERE laa.asset_id = lae.id AND la.backup_selection = 0) AND NOT EXISTS (SELECT 1 FROM local_album_asset_entity AS laa INNER JOIN local_album_entity AS la ON laa.album_id = la.id WHERE laa.asset_id = lae.id AND la.backup_selection = 2) ORDER BY created_at DESC ${generatedlimit.sql}',
variables: [
for (var $ in userIds) i0.Variable<String>($),
...generatedlimit.introducedVariables,
@@ -67,6 +67,7 @@ class MergedAssetDrift extends i1.ModularAccessor {
longitude: row.readNullable<double>('longitude'),
adjustmentTime: row.readNullable<DateTime>('adjustmentTime'),
isEdited: row.read<bool>('is_edited'),
playbackStyle: row.read<int>('playback_style'),
),
);
}
@@ -139,6 +140,7 @@ class MergedAssetResult {
final double? longitude;
final DateTime? adjustmentTime;
final bool isEdited;
final int playbackStyle;
MergedAssetResult({
this.remoteId,
this.localId,
@@ -161,6 +163,7 @@ class MergedAssetResult {
this.longitude,
this.adjustmentTime,
required this.isEdited,
required this.playbackStyle,
});
}
@@ -28,6 +28,8 @@ class TrashedLocalAssetEntity extends Table with DriftDefaultsMixin, AssetEntity
IntColumn get source => intEnum<TrashOrigin>()();
IntColumn get playbackStyle => intEnum<AssetPlaybackStyle>().withDefault(const Constant(0))();
@override
Set<Column> get primaryKey => {id, albumId};
}
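Both `local_asset_entity` and `trashed_local_asset_entity` declare the new column via drift's `intEnum<AssetPlaybackStyle>()`, which backs the column with an `EnumIndexConverter` that persists the enum's declaration index as an `int`. A minimal round-trip sketch of that mapping (the `AssetPlaybackStyle` members below are hypothetical stand-ins; the real enum lives in the app's domain model and is not part of this diff):

```dart
// Hypothetical stand-in for the app's AssetPlaybackStyle enum; the real
// declaration is elsewhere in the codebase.
enum AssetPlaybackStyle { normal, livePhoto }

// What drift's EnumIndexConverter boils down to: the stored SQL value is
// the enum's declaration index, so `withDefault(const Constant(0))`
// resolves to the first declared member.
int toSql(AssetPlaybackStyle style) => style.index;
AssetPlaybackStyle fromSql(int value) => AssetPlaybackStyle.values[value];

void main() {
  final roundTrip = fromSql(toSql(AssetPlaybackStyle.livePhoto));
  if (roundTrip != AssetPlaybackStyle.livePhoto) {
    throw StateError('round-trip failed');
  }
  // The column default of 0 maps back to the first declared value.
  print(fromSql(0)); // AssetPlaybackStyle.normal
}
```

This is also why the merged-asset query can emit a literal `0 as playback_style` for remote rows: index 0 is the converter's representation of the enum's default member.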
@@ -45,6 +47,7 @@ extension TrashedLocalAssetEntityDataDomainExtension on TrashedLocalAssetEntityD
height: height,
width: width,
orientation: orientation,
playbackStyle: playbackStyle,
isEdited: false,
);
}
@@ -23,6 +23,7 @@ typedef $$TrashedLocalAssetEntityTableCreateCompanionBuilder =
i0.Value<bool> isFavorite,
i0.Value<int> orientation,
required i3.TrashOrigin source,
i0.Value<i2.AssetPlaybackStyle> playbackStyle,
});
typedef $$TrashedLocalAssetEntityTableUpdateCompanionBuilder =
i1.TrashedLocalAssetEntityCompanion Function({
@@ -39,6 +40,7 @@ typedef $$TrashedLocalAssetEntityTableUpdateCompanionBuilder =
i0.Value<bool> isFavorite,
i0.Value<int> orientation,
i0.Value<i3.TrashOrigin> source,
i0.Value<i2.AssetPlaybackStyle> playbackStyle,
});
class $$TrashedLocalAssetEntityTableFilterComposer
@@ -117,6 +119,16 @@ class $$TrashedLocalAssetEntityTableFilterComposer
column: $table.source,
builder: (column) => i0.ColumnWithTypeConverterFilters(column),
);
i0.ColumnWithTypeConverterFilters<
i2.AssetPlaybackStyle,
i2.AssetPlaybackStyle,
int
>
get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => i0.ColumnWithTypeConverterFilters(column),
);
}
class $$TrashedLocalAssetEntityTableOrderingComposer
@@ -193,6 +205,11 @@ class $$TrashedLocalAssetEntityTableOrderingComposer
column: $table.source,
builder: (column) => i0.ColumnOrderings(column),
);
i0.ColumnOrderings<int> get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => i0.ColumnOrderings(column),
);
}
class $$TrashedLocalAssetEntityTableAnnotationComposer
@@ -249,6 +266,12 @@ class $$TrashedLocalAssetEntityTableAnnotationComposer
i0.GeneratedColumnWithTypeConverter<i3.TrashOrigin, int> get source =>
$composableBuilder(column: $table.source, builder: (column) => column);
i0.GeneratedColumnWithTypeConverter<i2.AssetPlaybackStyle, int>
get playbackStyle => $composableBuilder(
column: $table.playbackStyle,
builder: (column) => column,
);
}
class $$TrashedLocalAssetEntityTableTableManager
@@ -310,6 +333,8 @@ class $$TrashedLocalAssetEntityTableTableManager
i0.Value<bool> isFavorite = const i0.Value.absent(),
i0.Value<int> orientation = const i0.Value.absent(),
i0.Value<i3.TrashOrigin> source = const i0.Value.absent(),
i0.Value<i2.AssetPlaybackStyle> playbackStyle =
const i0.Value.absent(),
}) => i1.TrashedLocalAssetEntityCompanion(
name: name,
type: type,
@@ -324,6 +349,7 @@ class $$TrashedLocalAssetEntityTableTableManager
isFavorite: isFavorite,
orientation: orientation,
source: source,
playbackStyle: playbackStyle,
),
createCompanionCallback:
({
@@ -340,6 +366,8 @@ class $$TrashedLocalAssetEntityTableTableManager
i0.Value<bool> isFavorite = const i0.Value.absent(),
i0.Value<int> orientation = const i0.Value.absent(),
required i3.TrashOrigin source,
i0.Value<i2.AssetPlaybackStyle> playbackStyle =
const i0.Value.absent(),
}) => i1.TrashedLocalAssetEntityCompanion.insert(
name: name,
type: type,
@@ -354,6 +382,7 @@ class $$TrashedLocalAssetEntityTableTableManager
isFavorite: isFavorite,
orientation: orientation,
source: source,
playbackStyle: playbackStyle,
),
withReferenceMapper: (p0) => p0
.map((e) => (e.readTable(table), i0.BaseReferences(db, table, e)))
@@ -550,6 +579,19 @@ class $TrashedLocalAssetEntityTable extends i3.TrashedLocalAssetEntity
i1.$TrashedLocalAssetEntityTable.$convertersource,
);
@override
late final i0.GeneratedColumnWithTypeConverter<i2.AssetPlaybackStyle, int>
playbackStyle =
i0.GeneratedColumn<int>(
'playback_style',
aliasedName,
false,
type: i0.DriftSqlType.int,
requiredDuringInsert: false,
defaultValue: const i4.Constant(0),
).withConverter<i2.AssetPlaybackStyle>(
i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle,
);
@override
List<i0.GeneratedColumn> get $columns => [
name,
type,
@@ -564,6 +606,7 @@ class $TrashedLocalAssetEntityTable extends i3.TrashedLocalAssetEntity
isFavorite,
orientation,
source,
playbackStyle,
];
@override
String get aliasedName => _alias ?? actualTableName;
@@ -720,6 +763,13 @@ class $TrashedLocalAssetEntityTable extends i3.TrashedLocalAssetEntity
data['${effectivePrefix}source'],
)!,
),
playbackStyle: i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle
.fromSql(
attachedDatabase.typeMapping.read(
i0.DriftSqlType.int,
data['${effectivePrefix}playback_style'],
)!,
),
);
}
@@ -732,6 +782,10 @@ class $TrashedLocalAssetEntityTable extends i3.TrashedLocalAssetEntity
const i0.EnumIndexConverter<i2.AssetType>(i2.AssetType.values);
static i0.JsonTypeConverter2<i3.TrashOrigin, int, int> $convertersource =
const i0.EnumIndexConverter<i3.TrashOrigin>(i3.TrashOrigin.values);
static i0.JsonTypeConverter2<i2.AssetPlaybackStyle, int, int>
$converterplaybackStyle = const i0.EnumIndexConverter<i2.AssetPlaybackStyle>(
i2.AssetPlaybackStyle.values,
);
@override
bool get withoutRowId => true;
@override
@@ -753,6 +807,7 @@ class TrashedLocalAssetEntityData extends i0.DataClass
final bool isFavorite;
final int orientation;
final i3.TrashOrigin source;
final i2.AssetPlaybackStyle playbackStyle;
const TrashedLocalAssetEntityData({
required this.name,
required this.type,
@@ -767,6 +822,7 @@ class TrashedLocalAssetEntityData extends i0.DataClass
required this.isFavorite,
required this.orientation,
required this.source,
required this.playbackStyle,
});
@override
Map<String, i0.Expression> toColumns(bool nullToAbsent) {
@@ -800,6 +856,13 @@ class TrashedLocalAssetEntityData extends i0.DataClass
i1.$TrashedLocalAssetEntityTable.$convertersource.toSql(source),
);
}
{
map['playback_style'] = i0.Variable<int>(
i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle.toSql(
playbackStyle,
),
);
}
return map;
}
@@ -826,6 +889,8 @@ class TrashedLocalAssetEntityData extends i0.DataClass
source: i1.$TrashedLocalAssetEntityTable.$convertersource.fromJson(
serializer.fromJson<int>(json['source']),
),
playbackStyle: i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle
.fromJson(serializer.fromJson<int>(json['playbackStyle'])),
);
}
@override
@@ -849,6 +914,11 @@ class TrashedLocalAssetEntityData extends i0.DataClass
'source': serializer.toJson<int>(
i1.$TrashedLocalAssetEntityTable.$convertersource.toJson(source),
),
'playbackStyle': serializer.toJson<int>(
i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle.toJson(
playbackStyle,
),
),
};
}
@@ -866,6 +936,7 @@ class TrashedLocalAssetEntityData extends i0.DataClass
bool? isFavorite,
int? orientation,
i3.TrashOrigin? source,
i2.AssetPlaybackStyle? playbackStyle,
}) => i1.TrashedLocalAssetEntityData(
name: name ?? this.name,
type: type ?? this.type,
@@ -882,6 +953,7 @@ class TrashedLocalAssetEntityData extends i0.DataClass
isFavorite: isFavorite ?? this.isFavorite,
orientation: orientation ?? this.orientation,
source: source ?? this.source,
playbackStyle: playbackStyle ?? this.playbackStyle,
);
TrashedLocalAssetEntityData copyWithCompanion(
i1.TrashedLocalAssetEntityCompanion data,
@@ -906,6 +978,9 @@ class TrashedLocalAssetEntityData extends i0.DataClass
? data.orientation.value
: this.orientation,
source: data.source.present ? data.source.value : this.source,
playbackStyle: data.playbackStyle.present
? data.playbackStyle.value
: this.playbackStyle,
);
}
@@ -924,7 +999,8 @@ class TrashedLocalAssetEntityData extends i0.DataClass
..write('checksum: $checksum, ')
..write('isFavorite: $isFavorite, ')
..write('orientation: $orientation, ')
-..write('source: $source')
+..write('source: $source, ')
+..write('playbackStyle: $playbackStyle')
..write(')'))
.toString();
}
@@ -944,6 +1020,7 @@ class TrashedLocalAssetEntityData extends i0.DataClass
isFavorite,
orientation,
source,
playbackStyle,
);
@override
bool operator ==(Object other) =>
@@ -961,7 +1038,8 @@ class TrashedLocalAssetEntityData extends i0.DataClass
other.checksum == this.checksum &&
other.isFavorite == this.isFavorite &&
other.orientation == this.orientation &&
-other.source == this.source);
+other.source == this.source &&
+other.playbackStyle == this.playbackStyle);
}
class TrashedLocalAssetEntityCompanion
@@ -979,6 +1057,7 @@ class TrashedLocalAssetEntityCompanion
final i0.Value<bool> isFavorite;
final i0.Value<int> orientation;
final i0.Value<i3.TrashOrigin> source;
final i0.Value<i2.AssetPlaybackStyle> playbackStyle;
const TrashedLocalAssetEntityCompanion({
this.name = const i0.Value.absent(),
this.type = const i0.Value.absent(),
@@ -993,6 +1072,7 @@ class TrashedLocalAssetEntityCompanion
this.isFavorite = const i0.Value.absent(),
this.orientation = const i0.Value.absent(),
this.source = const i0.Value.absent(),
this.playbackStyle = const i0.Value.absent(),
});
TrashedLocalAssetEntityCompanion.insert({
required String name,
@@ -1008,6 +1088,7 @@ class TrashedLocalAssetEntityCompanion
this.isFavorite = const i0.Value.absent(),
this.orientation = const i0.Value.absent(),
required i3.TrashOrigin source,
this.playbackStyle = const i0.Value.absent(),
}) : name = i0.Value(name),
type = i0.Value(type),
id = i0.Value(id),
@@ -1027,6 +1108,7 @@ class TrashedLocalAssetEntityCompanion
i0.Expression<bool>? isFavorite,
i0.Expression<int>? orientation,
i0.Expression<int>? source,
i0.Expression<int>? playbackStyle,
}) {
return i0.RawValuesInsertable({
if (name != null) 'name': name,
@@ -1042,6 +1124,7 @@ class TrashedLocalAssetEntityCompanion
if (isFavorite != null) 'is_favorite': isFavorite,
if (orientation != null) 'orientation': orientation,
if (source != null) 'source': source,
if (playbackStyle != null) 'playback_style': playbackStyle,
});
}
@@ -1059,6 +1142,7 @@ class TrashedLocalAssetEntityCompanion
i0.Value<bool>? isFavorite,
i0.Value<int>? orientation,
i0.Value<i3.TrashOrigin>? source,
i0.Value<i2.AssetPlaybackStyle>? playbackStyle,
}) {
return i1.TrashedLocalAssetEntityCompanion(
name: name ?? this.name,
@@ -1074,6 +1158,7 @@ class TrashedLocalAssetEntityCompanion
isFavorite: isFavorite ?? this.isFavorite,
orientation: orientation ?? this.orientation,
source: source ?? this.source,
playbackStyle: playbackStyle ?? this.playbackStyle,
);
}
@@ -1123,6 +1208,13 @@ class TrashedLocalAssetEntityCompanion
i1.$TrashedLocalAssetEntityTable.$convertersource.toSql(source.value),
);
}
if (playbackStyle.present) {
map['playback_style'] = i0.Variable<int>(
i1.$TrashedLocalAssetEntityTable.$converterplaybackStyle.toSql(
playbackStyle.value,
),
);
}
return map;
}
@@ -1141,7 +1233,8 @@ class TrashedLocalAssetEntityCompanion
..write('checksum: $checksum, ')
..write('isFavorite: $isFavorite, ')
..write('orientation: $orientation, ')
-..write('source: $source')
+..write('source: $source, ')
+..write('playbackStyle: $playbackStyle')
..write(')'))
.toString();
}
@@ -97,7 +97,7 @@ class Drift extends $Drift implements IDatabaseRepository {
}
@override
-int get schemaVersion => 20;
+int get schemaVersion => 21;
@override
MigrationStrategy get migration => MigrationStrategy(
@@ -230,6 +230,10 @@ class Drift extends $Drift implements IDatabaseRepository {
await m.addColumn(v20.assetFaceEntity, v20.assetFaceEntity.isVisible);
await m.addColumn(v20.assetFaceEntity, v20.assetFaceEntity.deletedAt);
},
from20To21: (m, v21) async {
await m.addColumn(v21.localAssetEntity, v21.localAssetEntity.playbackStyle);
await m.addColumn(v21.trashedLocalAssetEntity, v21.trashedLocalAssetEntity.playbackStyle);
},
),
);
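The schema bump from 20 to 21 only adds the two `playback_style` columns; because each is declared with a default of `Constant(0)`, `m.addColumn` backfills existing rows with the enum's first value. A hedged sketch of the equivalent DDL the migrator would issue for this step (SQLite syntax assumed; exact quoting and type affinity are up to drift):

```dart
// Hedged sketch: the statements `m.addColumn` effectively runs for the
// from20To21 step. Table and column names are taken from this diff.
const migrationSql = [
  'ALTER TABLE local_asset_entity '
      'ADD COLUMN playback_style INTEGER NOT NULL DEFAULT 0;',
  'ALTER TABLE trashed_local_asset_entity '
      'ADD COLUMN playback_style INTEGER NOT NULL DEFAULT 0;',
];

void main() {
  // Existing rows pick up the DEFAULT, so reads through the
  // EnumIndexConverter resolve to the enum's first member after upgrade.
  migrationSql.forEach(print);
}
```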
@@ -8904,6 +8904,591 @@ i1.GeneratedColumn<bool> _column_102(String aliasedName) =>
),
defaultValue: const CustomExpression('1'),
);
final class Schema21 extends i0.VersionedSchema {
Schema21({required super.database}) : super(version: 21);
@override
late final List<i1.DatabaseSchemaEntity> entities = [
userEntity,
remoteAssetEntity,
stackEntity,
localAssetEntity,
remoteAlbumEntity,
localAlbumEntity,
localAlbumAssetEntity,
idxLocalAlbumAssetAlbumAsset,
idxRemoteAlbumOwnerId,
idxLocalAssetChecksum,
idxLocalAssetCloudId,
idxStackPrimaryAssetId,
idxRemoteAssetOwnerChecksum,
uQRemoteAssetsOwnerChecksum,
uQRemoteAssetsOwnerLibraryChecksum,
idxRemoteAssetChecksum,
idxRemoteAssetStackId,
idxRemoteAssetLocalDateTimeDay,
idxRemoteAssetLocalDateTimeMonth,
authUserEntity,
userMetadataEntity,
partnerEntity,
remoteExifEntity,
remoteAlbumAssetEntity,
remoteAlbumUserEntity,
remoteAssetCloudIdEntity,
memoryEntity,
memoryAssetEntity,
personEntity,
assetFaceEntity,
storeEntity,
trashedLocalAssetEntity,
idxPartnerSharedWithId,
idxLatLng,
idxRemoteAlbumAssetAlbumAsset,
idxRemoteAssetCloudId,
idxPersonOwnerId,
idxAssetFacePersonId,
idxAssetFaceAssetId,
idxTrashedLocalAssetChecksum,
idxTrashedLocalAssetAlbum,
];
late final Shape20 userEntity = Shape20(
source: i0.VersionedTable(
entityName: 'user_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_1,
_column_3,
_column_84,
_column_85,
_column_91,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape28 remoteAssetEntity = Shape28(
source: i0.VersionedTable(
entityName: 'remote_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_1,
_column_8,
_column_9,
_column_5,
_column_10,
_column_11,
_column_12,
_column_0,
_column_13,
_column_14,
_column_15,
_column_16,
_column_17,
_column_18,
_column_19,
_column_20,
_column_21,
_column_86,
_column_101,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape3 stackEntity = Shape3(
source: i0.VersionedTable(
entityName: 'stack_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [_column_0, _column_9, _column_5, _column_15, _column_75],
attachedDatabase: database,
),
alias: null,
);
late final Shape30 localAssetEntity = Shape30(
source: i0.VersionedTable(
entityName: 'local_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_1,
_column_8,
_column_9,
_column_5,
_column_10,
_column_11,
_column_12,
_column_0,
_column_22,
_column_14,
_column_23,
_column_98,
_column_96,
_column_46,
_column_47,
_column_103,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape9 remoteAlbumEntity = Shape9(
source: i0.VersionedTable(
entityName: 'remote_album_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_1,
_column_56,
_column_9,
_column_5,
_column_15,
_column_57,
_column_58,
_column_59,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape19 localAlbumEntity = Shape19(
source: i0.VersionedTable(
entityName: 'local_album_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_1,
_column_5,
_column_31,
_column_32,
_column_90,
_column_33,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape22 localAlbumAssetEntity = Shape22(
source: i0.VersionedTable(
entityName: 'local_album_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(asset_id, album_id)'],
columns: [_column_34, _column_35, _column_33],
attachedDatabase: database,
),
alias: null,
);
final i1.Index idxLocalAlbumAssetAlbumAsset = i1.Index(
'idx_local_album_asset_album_asset',
'CREATE INDEX IF NOT EXISTS idx_local_album_asset_album_asset ON local_album_asset_entity (album_id, asset_id)',
);
final i1.Index idxRemoteAlbumOwnerId = i1.Index(
'idx_remote_album_owner_id',
'CREATE INDEX IF NOT EXISTS idx_remote_album_owner_id ON remote_album_entity (owner_id)',
);
final i1.Index idxLocalAssetChecksum = i1.Index(
'idx_local_asset_checksum',
'CREATE INDEX IF NOT EXISTS idx_local_asset_checksum ON local_asset_entity (checksum)',
);
final i1.Index idxLocalAssetCloudId = i1.Index(
'idx_local_asset_cloud_id',
'CREATE INDEX IF NOT EXISTS idx_local_asset_cloud_id ON local_asset_entity (i_cloud_id)',
);
final i1.Index idxStackPrimaryAssetId = i1.Index(
'idx_stack_primary_asset_id',
'CREATE INDEX IF NOT EXISTS idx_stack_primary_asset_id ON stack_entity (primary_asset_id)',
);
final i1.Index idxRemoteAssetOwnerChecksum = i1.Index(
'idx_remote_asset_owner_checksum',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_owner_checksum ON remote_asset_entity (owner_id, checksum)',
);
final i1.Index uQRemoteAssetsOwnerChecksum = i1.Index(
'UQ_remote_assets_owner_checksum',
'CREATE UNIQUE INDEX IF NOT EXISTS UQ_remote_assets_owner_checksum ON remote_asset_entity (owner_id, checksum) WHERE(library_id IS NULL)',
);
final i1.Index uQRemoteAssetsOwnerLibraryChecksum = i1.Index(
'UQ_remote_assets_owner_library_checksum',
'CREATE UNIQUE INDEX IF NOT EXISTS UQ_remote_assets_owner_library_checksum ON remote_asset_entity (owner_id, library_id, checksum) WHERE(library_id IS NOT NULL)',
);
final i1.Index idxRemoteAssetChecksum = i1.Index(
'idx_remote_asset_checksum',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_checksum ON remote_asset_entity (checksum)',
);
final i1.Index idxRemoteAssetStackId = i1.Index(
'idx_remote_asset_stack_id',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_stack_id ON remote_asset_entity (stack_id)',
);
final i1.Index idxRemoteAssetLocalDateTimeDay = i1.Index(
'idx_remote_asset_local_date_time_day',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_local_date_time_day ON remote_asset_entity (STRFTIME(\'%Y-%m-%d\', local_date_time))',
);
final i1.Index idxRemoteAssetLocalDateTimeMonth = i1.Index(
'idx_remote_asset_local_date_time_month',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_local_date_time_month ON remote_asset_entity (STRFTIME(\'%Y-%m\', local_date_time))',
);
late final Shape21 authUserEntity = Shape21(
source: i0.VersionedTable(
entityName: 'auth_user_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_1,
_column_3,
_column_2,
_column_84,
_column_85,
_column_92,
_column_93,
_column_7,
_column_94,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape4 userMetadataEntity = Shape4(
source: i0.VersionedTable(
entityName: 'user_metadata_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(user_id, "key")'],
columns: [_column_25, _column_26, _column_27],
attachedDatabase: database,
),
alias: null,
);
late final Shape5 partnerEntity = Shape5(
source: i0.VersionedTable(
entityName: 'partner_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(shared_by_id, shared_with_id)'],
columns: [_column_28, _column_29, _column_30],
attachedDatabase: database,
),
alias: null,
);
late final Shape8 remoteExifEntity = Shape8(
source: i0.VersionedTable(
entityName: 'remote_exif_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(asset_id)'],
columns: [
_column_36,
_column_37,
_column_38,
_column_39,
_column_40,
_column_41,
_column_11,
_column_10,
_column_42,
_column_43,
_column_44,
_column_45,
_column_46,
_column_47,
_column_48,
_column_49,
_column_50,
_column_51,
_column_52,
_column_53,
_column_54,
_column_55,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape7 remoteAlbumAssetEntity = Shape7(
source: i0.VersionedTable(
entityName: 'remote_album_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(asset_id, album_id)'],
columns: [_column_36, _column_60],
attachedDatabase: database,
),
alias: null,
);
late final Shape10 remoteAlbumUserEntity = Shape10(
source: i0.VersionedTable(
entityName: 'remote_album_user_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(album_id, user_id)'],
columns: [_column_60, _column_25, _column_61],
attachedDatabase: database,
),
alias: null,
);
late final Shape27 remoteAssetCloudIdEntity = Shape27(
source: i0.VersionedTable(
entityName: 'remote_asset_cloud_id_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(asset_id)'],
columns: [
_column_36,
_column_99,
_column_100,
_column_96,
_column_46,
_column_47,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape11 memoryEntity = Shape11(
source: i0.VersionedTable(
entityName: 'memory_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_9,
_column_5,
_column_18,
_column_15,
_column_8,
_column_62,
_column_63,
_column_64,
_column_65,
_column_66,
_column_67,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape12 memoryAssetEntity = Shape12(
source: i0.VersionedTable(
entityName: 'memory_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(asset_id, memory_id)'],
columns: [_column_36, _column_68],
attachedDatabase: database,
),
alias: null,
);
late final Shape14 personEntity = Shape14(
source: i0.VersionedTable(
entityName: 'person_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_9,
_column_5,
_column_15,
_column_1,
_column_69,
_column_71,
_column_72,
_column_73,
_column_74,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape29 assetFaceEntity = Shape29(
source: i0.VersionedTable(
entityName: 'asset_face_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [
_column_0,
_column_36,
_column_76,
_column_77,
_column_78,
_column_79,
_column_80,
_column_81,
_column_82,
_column_83,
_column_102,
_column_18,
],
attachedDatabase: database,
),
alias: null,
);
late final Shape18 storeEntity = Shape18(
source: i0.VersionedTable(
entityName: 'store_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id)'],
columns: [_column_87, _column_88, _column_89],
attachedDatabase: database,
),
alias: null,
);
late final Shape31 trashedLocalAssetEntity = Shape31(
source: i0.VersionedTable(
entityName: 'trashed_local_asset_entity',
withoutRowId: true,
isStrict: true,
tableConstraints: ['PRIMARY KEY(id, album_id)'],
columns: [
_column_1,
_column_8,
_column_9,
_column_5,
_column_10,
_column_11,
_column_12,
_column_0,
_column_95,
_column_22,
_column_14,
_column_23,
_column_97,
_column_103,
],
attachedDatabase: database,
),
alias: null,
);
final i1.Index idxPartnerSharedWithId = i1.Index(
'idx_partner_shared_with_id',
'CREATE INDEX IF NOT EXISTS idx_partner_shared_with_id ON partner_entity (shared_with_id)',
);
final i1.Index idxLatLng = i1.Index(
'idx_lat_lng',
'CREATE INDEX IF NOT EXISTS idx_lat_lng ON remote_exif_entity (latitude, longitude)',
);
final i1.Index idxRemoteAlbumAssetAlbumAsset = i1.Index(
'idx_remote_album_asset_album_asset',
'CREATE INDEX IF NOT EXISTS idx_remote_album_asset_album_asset ON remote_album_asset_entity (album_id, asset_id)',
);
final i1.Index idxRemoteAssetCloudId = i1.Index(
'idx_remote_asset_cloud_id',
'CREATE INDEX IF NOT EXISTS idx_remote_asset_cloud_id ON remote_asset_cloud_id_entity (cloud_id)',
);
final i1.Index idxPersonOwnerId = i1.Index(
'idx_person_owner_id',
'CREATE INDEX IF NOT EXISTS idx_person_owner_id ON person_entity (owner_id)',
);
final i1.Index idxAssetFacePersonId = i1.Index(
'idx_asset_face_person_id',
'CREATE INDEX IF NOT EXISTS idx_asset_face_person_id ON asset_face_entity (person_id)',
);
final i1.Index idxAssetFaceAssetId = i1.Index(
'idx_asset_face_asset_id',
'CREATE INDEX IF NOT EXISTS idx_asset_face_asset_id ON asset_face_entity (asset_id)',
);
final i1.Index idxTrashedLocalAssetChecksum = i1.Index(
'idx_trashed_local_asset_checksum',
'CREATE INDEX IF NOT EXISTS idx_trashed_local_asset_checksum ON trashed_local_asset_entity (checksum)',
);
final i1.Index idxTrashedLocalAssetAlbum = i1.Index(
'idx_trashed_local_asset_album',
'CREATE INDEX IF NOT EXISTS idx_trashed_local_asset_album ON trashed_local_asset_entity (album_id)',
);
}
class Shape30 extends i0.VersionedTable {
Shape30({required super.source, required super.alias}) : super.aliased();
i1.GeneratedColumn<String> get name =>
columnsByName['name']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<int> get type =>
columnsByName['type']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<DateTime> get createdAt =>
columnsByName['created_at']! as i1.GeneratedColumn<DateTime>;
i1.GeneratedColumn<DateTime> get updatedAt =>
columnsByName['updated_at']! as i1.GeneratedColumn<DateTime>;
i1.GeneratedColumn<int> get width =>
columnsByName['width']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get height =>
columnsByName['height']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get durationInSeconds =>
columnsByName['duration_in_seconds']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<String> get id =>
columnsByName['id']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<String> get checksum =>
columnsByName['checksum']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<bool> get isFavorite =>
columnsByName['is_favorite']! as i1.GeneratedColumn<bool>;
i1.GeneratedColumn<int> get orientation =>
columnsByName['orientation']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<String> get iCloudId =>
columnsByName['i_cloud_id']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<DateTime> get adjustmentTime =>
columnsByName['adjustment_time']! as i1.GeneratedColumn<DateTime>;
i1.GeneratedColumn<double> get latitude =>
columnsByName['latitude']! as i1.GeneratedColumn<double>;
i1.GeneratedColumn<double> get longitude =>
columnsByName['longitude']! as i1.GeneratedColumn<double>;
i1.GeneratedColumn<int> get playbackStyle =>
columnsByName['playback_style']! as i1.GeneratedColumn<int>;
}
i1.GeneratedColumn<int> _column_103(String aliasedName) =>
i1.GeneratedColumn<int>(
'playback_style',
aliasedName,
false,
type: i1.DriftSqlType.int,
defaultValue: const CustomExpression('0'),
);
class Shape31 extends i0.VersionedTable {
Shape31({required super.source, required super.alias}) : super.aliased();
i1.GeneratedColumn<String> get name =>
columnsByName['name']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<int> get type =>
columnsByName['type']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<DateTime> get createdAt =>
columnsByName['created_at']! as i1.GeneratedColumn<DateTime>;
i1.GeneratedColumn<DateTime> get updatedAt =>
columnsByName['updated_at']! as i1.GeneratedColumn<DateTime>;
i1.GeneratedColumn<int> get width =>
columnsByName['width']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get height =>
columnsByName['height']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get durationInSeconds =>
columnsByName['duration_in_seconds']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<String> get id =>
columnsByName['id']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<String> get albumId =>
columnsByName['album_id']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<String> get checksum =>
columnsByName['checksum']! as i1.GeneratedColumn<String>;
i1.GeneratedColumn<bool> get isFavorite =>
columnsByName['is_favorite']! as i1.GeneratedColumn<bool>;
i1.GeneratedColumn<int> get orientation =>
columnsByName['orientation']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get source =>
columnsByName['source']! as i1.GeneratedColumn<int>;
i1.GeneratedColumn<int> get playbackStyle =>
columnsByName['playback_style']! as i1.GeneratedColumn<int>;
}
i0.MigrationStepWithVersion migrationSteps({
required Future<void> Function(i1.Migrator m, Schema2 schema) from1To2,
required Future<void> Function(i1.Migrator m, Schema3 schema) from2To3,
@@ -8924,6 +9509,7 @@ i0.MigrationStepWithVersion migrationSteps({
required Future<void> Function(i1.Migrator m, Schema18 schema) from17To18,
required Future<void> Function(i1.Migrator m, Schema19 schema) from18To19,
required Future<void> Function(i1.Migrator m, Schema20 schema) from19To20,
required Future<void> Function(i1.Migrator m, Schema21 schema) from20To21,
}) {
return (currentVersion, database) async {
switch (currentVersion) {
@@ -9022,6 +9608,11 @@ i0.MigrationStepWithVersion migrationSteps({
final migrator = i1.Migrator(database, schema);
await from19To20(migrator, schema);
return 20;
case 20:
final schema = Schema21(database: database);
final migrator = i1.Migrator(database, schema);
await from20To21(migrator, schema);
return 21;
default:
throw ArgumentError.value('Unknown migration from $currentVersion');
}
@@ -9048,6 +9639,7 @@ i1.OnUpgrade stepByStep({
required Future<void> Function(i1.Migrator m, Schema18 schema) from17To18,
required Future<void> Function(i1.Migrator m, Schema19 schema) from18To19,
required Future<void> Function(i1.Migrator m, Schema20 schema) from19To20,
required Future<void> Function(i1.Migrator m, Schema21 schema) from20To21,
}) => i0.VersionedSchema.stepByStepHelper(
step: migrationSteps(
from1To2: from1To2,
@@ -9069,5 +9661,6 @@ i1.OnUpgrade stepByStep({
from17To18: from17To18,
from18To19: from18To19,
from19To20: from19To20,
from20To21: from20To21,
),
);
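The generated `migrationSteps`/`stepByStep` helpers above dispatch on the current schema version and run exactly one upgrade function per version until the target is reached. A minimal TypeScript sketch of that dispatch pattern (names like `from19To20` are stand-ins, not drift's actual API):

```typescript
// Records which steps ran, so the behavior is easy to verify.
const ran: number[] = [];

// Each entry upgrades from version N to N + 1 (stand-ins for from19To20 etc.).
const steps: Record<number, () => void> = {
  19: () => ran.push(19),
  20: () => ran.push(20),
};

function migrate(currentVersion: number, targetVersion: number): number {
  let version = currentVersion;
  while (version < targetVersion) {
    const step = steps[version];
    if (step === undefined) {
      throw new Error(`Unknown migration from ${version}`);
    }
    step();
    version += 1; // each step advances exactly one schema version
  }
  return version;
}
```

Adding schema 21 therefore only requires registering one new `from20To21` entry; every older install still walks through each intermediate version in order.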
@@ -301,6 +301,7 @@ class DriftLocalAlbumRepository extends DriftDatabaseRepository {
id: asset.id,
orientation: Value(asset.orientation),
isFavorite: Value(asset.isFavorite),
playbackStyle: Value(asset.playbackStyle),
latitude: Value(asset.latitude),
longitude: Value(asset.longitude),
adjustmentTime: Value(asset.adjustmentTime),
@@ -333,6 +334,7 @@ class DriftLocalAlbumRepository extends DriftDatabaseRepository {
checksum: const Value(null),
orientation: Value(asset.orientation),
isFavorite: Value(asset.isFavorite),
playbackStyle: Value(asset.playbackStyle),
);
batch.insert<$LocalAssetEntityTable, LocalAssetEntityData>(
_db.localAssetEntity,
@@ -101,6 +101,7 @@ class DriftTimelineRepository extends DriftDatabaseRepository {
isFavorite: row.isFavorite,
durationInSeconds: row.durationInSeconds,
orientation: row.orientation,
playbackStyle: AssetPlaybackStyle.values[row.playbackStyle],
cloudId: row.iCloudId,
latitude: row.latitude,
longitude: row.longitude,
@@ -85,6 +85,7 @@ class DriftTrashedLocalAssetRepository extends DriftDatabaseRepository {
durationInSeconds: Value(item.asset.durationInSeconds),
isFavorite: Value(item.asset.isFavorite),
orientation: Value(item.asset.orientation),
playbackStyle: Value(item.asset.playbackStyle),
source: TrashOrigin.localSync,
);
@@ -147,6 +148,7 @@ class DriftTrashedLocalAssetRepository extends DriftDatabaseRepository {
durationInSeconds: Value(asset.durationInSeconds),
isFavorite: Value(asset.isFavorite),
orientation: Value(asset.orientation),
playbackStyle: Value(asset.playbackStyle),
createdAt: Value(asset.createdAt),
updatedAt: Value(asset.updatedAt),
source: const Value(TrashOrigin.remoteSync),
@@ -195,6 +197,7 @@ class DriftTrashedLocalAssetRepository extends DriftDatabaseRepository {
checksum: Value(e.checksum),
isFavorite: Value(e.isFavorite),
orientation: Value(e.orientation),
playbackStyle: Value(e.playbackStyle),
);
});
@@ -245,6 +248,7 @@ class DriftTrashedLocalAssetRepository extends DriftDatabaseRepository {
checksum: Value(e.asset.checksum),
isFavorite: Value(e.asset.isFavorite),
orientation: Value(e.asset.orientation),
playbackStyle: Value(e.asset.playbackStyle),
source: TrashOrigin.localUser,
albumId: e.albumId,
);
@@ -25,6 +25,7 @@ class FileMediaRepository {
type: AssetType.image,
createdAt: entity.createDateTime,
updatedAt: entity.modifiedDateTime,
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
}
@@ -5,6 +5,7 @@ import 'dart:io';
import 'package:collection/collection.dart';
import 'package:drift/drift.dart';
import 'package:immich_mobile/domain/models/album/local_album.model.dart';
import 'package:immich_mobile/domain/models/asset/base_asset.model.dart';
import 'package:immich_mobile/domain/models/store.model.dart';
import 'package:immich_mobile/entities/album.entity.dart';
import 'package:immich_mobile/entities/android_device_asset.entity.dart';
@@ -17,6 +18,7 @@ import 'package:immich_mobile/infrastructure/entities/device_asset.entity.dart';
import 'package:immich_mobile/infrastructure/entities/exif.entity.dart';
import 'package:immich_mobile/infrastructure/entities/local_album.entity.drift.dart';
import 'package:immich_mobile/infrastructure/entities/local_asset.entity.drift.dart';
import 'package:immich_mobile/infrastructure/entities/trashed_local_asset.entity.drift.dart';
import 'package:immich_mobile/infrastructure/entities/store.entity.dart';
import 'package:immich_mobile/infrastructure/entities/store.entity.drift.dart';
import 'package:immich_mobile/infrastructure/entities/user.entity.dart';
@@ -33,7 +35,7 @@ import 'package:isar/isar.dart';
// ignore: import_rule_photo_manager
import 'package:photo_manager/photo_manager.dart';
const int targetVersion = 22;
const int targetVersion = 23;
Future<void> migrateDatabaseIfNeeded(Isar db, Drift drift) async {
final hasVersion = Store.tryGet(StoreKey.version) != null;
@@ -99,6 +101,10 @@ Future<void> migrateDatabaseIfNeeded(Isar db, Drift drift) async {
}
}
if (version < 23 && Store.isBetaTimelineEnabled) {
await _populateLocalAssetPlaybackStyle(drift);
}
if (version < 22 && !Store.isBetaTimelineEnabled) {
await Store.put(StoreKey.needBetaMigration, true);
}
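The guard above runs the `playbackStyle` backfill only when upgrading from a pre-23 store version with the beta timeline enabled, so it executes at most once per install. A sketch of that one-shot, version-gated pattern (the `Store` shape and names here are illustrative, not the app's actual API):

```typescript
// One-shot, version-gated backfill guard. Names are illustrative.
interface Store {
  version: number;
  betaTimelineEnabled: boolean;
}

const targetVersion = 23;
const executed: string[] = [];

function migrateIfNeeded(store: Store): void {
  if (store.version < 23 && store.betaTimelineEnabled) {
    executed.push('populatePlaybackStyle'); // the backfill runs here
  }
  store.version = targetVersion; // bump so the gate never fires again
}
```

Bumping the stored version at the end is what makes the migration idempotent: a second launch sees `version === 23` and skips the backfill entirely.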
@@ -392,6 +398,52 @@ Future<void> migrateStoreToIsar(Isar db, Drift drift) async {
}
}
Future<void> _populateLocalAssetPlaybackStyle(Drift db) async {
try {
final nativeApi = NativeSyncApi();
final albums = await nativeApi.getAlbums();
for (final album in albums) {
final assets = await nativeApi.getAssetsForAlbum(album.id);
await db.batch((batch) {
for (final asset in assets) {
batch.update(
db.localAssetEntity,
LocalAssetEntityCompanion(playbackStyle: Value(_toPlaybackStyle(asset.playbackStyle))),
where: (t) => t.id.equals(asset.id),
);
}
});
}
final trashedAssetMap = await nativeApi.getTrashedAssets();
for (final assets in trashedAssetMap.values) {
await db.batch((batch) {
for (final asset in assets) {
batch.update(
db.trashedLocalAssetEntity,
TrashedLocalAssetEntityCompanion(playbackStyle: Value(_toPlaybackStyle(asset.playbackStyle))),
where: (t) => t.id.equals(asset.id),
);
}
});
}
dPrint(() => "[MIGRATION] Successfully populated playbackStyle for local and trashed assets");
} catch (error) {
dPrint(() => "[MIGRATION] Error while populating playbackStyle: $error");
}
}
AssetPlaybackStyle _toPlaybackStyle(PlatformAssetPlaybackStyle style) => switch (style) {
PlatformAssetPlaybackStyle.unknown => AssetPlaybackStyle.unknown,
PlatformAssetPlaybackStyle.image => AssetPlaybackStyle.image,
PlatformAssetPlaybackStyle.video => AssetPlaybackStyle.video,
PlatformAssetPlaybackStyle.imageAnimated => AssetPlaybackStyle.imageAnimated,
PlatformAssetPlaybackStyle.livePhoto => AssetPlaybackStyle.livePhoto,
PlatformAssetPlaybackStyle.videoLooping => AssetPlaybackStyle.videoLooping,
};
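`_toPlaybackStyle` above is an exhaustive switch from the platform enum to the domain enum. The same pattern in TypeScript, where a `never`-typed default catches any unmapped platform value at compile time (the enums here are illustrative mirrors, not the app's code):

```typescript
// Exhaustive platform-to-domain enum mapping (illustrative TypeScript
// mirror of the Dart switch above).
type PlatformPlaybackStyle =
  | 'unknown' | 'image' | 'video'
  | 'imageAnimated' | 'livePhoto' | 'videoLooping';

enum AssetPlaybackStyle { Unknown, Image, Video, ImageAnimated, LivePhoto, VideoLooping }

function toPlaybackStyle(style: PlatformPlaybackStyle): AssetPlaybackStyle {
  switch (style) {
    case 'unknown': return AssetPlaybackStyle.Unknown;
    case 'image': return AssetPlaybackStyle.Image;
    case 'video': return AssetPlaybackStyle.Video;
    case 'imageAnimated': return AssetPlaybackStyle.ImageAnimated;
    case 'livePhoto': return AssetPlaybackStyle.LivePhoto;
    case 'videoLooping': return AssetPlaybackStyle.VideoLooping;
    default: {
      // Compile-time exhaustiveness check: a platform value added
      // without a mapping makes this assignment a type error.
      const unreachable: never = style;
      throw new Error(`Unhandled style: ${unreachable}`);
    }
  }
}
```

Dart's expression `switch` gives the same guarantee natively: the compiler rejects the switch if any `PlatformAssetPlaybackStyle` case is missing.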
class _DeviceAsset {
final String assetId;
final List<int>? hash;
@@ -23,6 +23,7 @@ import 'schema_v17.dart' as v17;
import 'schema_v18.dart' as v18;
import 'schema_v19.dart' as v19;
import 'schema_v20.dart' as v20;
import 'schema_v21.dart' as v21;
class GeneratedHelper implements SchemaInstantiationHelper {
@override
@@ -68,6 +69,8 @@ class GeneratedHelper implements SchemaInstantiationHelper {
return v19.DatabaseAtV19(db);
case 20:
return v20.DatabaseAtV20(db);
case 21:
return v21.DatabaseAtV21(db);
default:
throw MissingSchemaException(version, versions);
}
@@ -94,5 +97,6 @@ class GeneratedHelper implements SchemaInstantiationHelper {
18,
19,
20,
21,
];
}
File diff suppressed because it is too large.
@@ -64,6 +64,7 @@ abstract final class LocalAssetStub {
type: AssetType.image,
createdAt: DateTime(2025),
updatedAt: DateTime(2025, 2),
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
@@ -73,6 +74,7 @@ abstract final class LocalAssetStub {
type: AssetType.image,
createdAt: DateTime(2000),
updatedAt: DateTime(2021),
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
}
@@ -194,6 +194,7 @@ void main() {
latitude: 37.7749,
longitude: -122.4194,
adjustmentTime: DateTime(2026, 1, 2),
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
@@ -243,6 +244,7 @@ void main() {
cloudId: 'cloud-id-123',
latitude: 37.7749,
longitude: -122.4194,
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
@@ -281,6 +283,7 @@ void main() {
createdAt: DateTime(2025, 1, 1),
updatedAt: DateTime(2025, 1, 2),
cloudId: null, // No cloudId
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
@@ -323,6 +326,7 @@ void main() {
cloudId: 'cloud-id-livephoto',
latitude: 37.7749,
longitude: -122.4194,
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
@@ -155,6 +155,7 @@ abstract final class TestUtils {
width: width,
height: height,
orientation: orientation,
playbackStyle: domain.AssetPlaybackStyle.image,
isEdited: false,
);
}
@@ -27,6 +27,7 @@ class MediumFactory {
type: type ?? AssetType.image,
createdAt: createdAt ?? DateTime.fromMillisecondsSinceEpoch(random.nextInt(1000000000)),
updatedAt: updatedAt ?? DateTime.fromMillisecondsSinceEpoch(random.nextInt(1000000000)),
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
}
@@ -23,6 +23,7 @@ LocalAsset createLocalAsset({
createdAt: createdAt ?? DateTime.now(),
updatedAt: updatedAt ?? DateTime.now(),
isFavorite: isFavorite,
playbackStyle: AssetPlaybackStyle.image,
isEdited: false,
);
}
@@ -276,6 +276,9 @@ importers:
supertest:
specifier: ^7.0.0
version: 7.2.2
tsx:
specifier: ^4.21.0
version: 4.21.0
typescript:
specifier: ^5.3.3
version: 5.9.3
@@ -409,12 +412,9 @@ importers:
'@react-email/render':
specifier: ^1.1.2
version: 1.4.0(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
'@socket.io/postgres-adapter':
specifier: ^0.5.0
version: 0.5.0(socket.io-adapter@2.5.6)
'@types/pg':
specifier: ^8.16.0
version: 8.16.0
'@socket.io/redis-adapter':
specifier: ^8.3.0
version: 8.3.0(socket.io-adapter@2.5.6)
ajv:
specifier: ^8.17.1
version: 8.18.0
@@ -568,9 +568,6 @@ importers:
socket.io:
specifier: ^4.8.1
version: 4.8.3
socket.io-adapter:
specifier: ^2.5.6
version: 2.5.6
tailwindcss-preset-email:
specifier: ^1.4.0
version: 1.4.1(tailwindcss@3.4.19(tsx@4.21.0)(yaml@2.8.2))
@@ -3389,10 +3386,6 @@ packages:
'@microsoft/tsdoc@0.16.0':
resolution: {integrity: sha512-xgAyonlVVS+q7Vc7qLW0UrJU7rSFcETRWsqdXZtjzRU8dF+6CkozTK4V4y1LwOX7j8r/vHphjDeMeGI4tNGeGA==}
'@msgpack/msgpack@2.8.0':
resolution: {integrity: sha512-h9u4u/jiIRKbq25PM+zymTyW6bhTzELvOoUd+AvYriWOAKpLGnIamaET3pnHYoI5iYphAHBI4ayx0MehR+VVPQ==}
engines: {node: '>= 10'}
'@msgpackr-extract/msgpackr-extract-darwin-arm64@3.0.3':
resolution: {integrity: sha512-QZHtlVgbAdy2zAqNA9Gu1UpIuI8Xvsd1v8ic6B2pZmeFnFcMWiPLfWXh7TVw4eGEZ/C9TH281KwhVoeQUKbyjw==}
cpu: [arm64]
@@ -4301,9 +4294,9 @@ packages:
'@socket.io/component-emitter@3.1.2':
resolution: {integrity: sha512-9BCxFwvbGg/RsZK9tjXd8s4UcwR0MWeFQ1XEKIQVVvAGJyINdrqKMcTRyLoK8Rse1GjzLV9cwjWV1olXRWEXVA==}
'@socket.io/postgres-adapter@0.5.0':
resolution: {integrity: sha512-s1vFsatB4lS429ZbeAi8ju+mZMgtgdSmi9UsZsdcEG++vVtX5z10yDEt4TV8saePscvvGjs6uXvJfMCxz8+M2Q==}
engines: {node: '>=12.0.0'}
'@socket.io/redis-adapter@8.3.0':
resolution: {integrity: sha512-ly0cra+48hDmChxmIpnESKrc94LjRL80TEmZVscuQ/WWkRP81nNj8W8cCGMqbI4L6NCuAaPRSzZF1a9GlAxxnA==}
engines: {node: '>=10.0.0'}
peerDependencies:
socket.io-adapter: ^2.5.4
@@ -9257,6 +9250,9 @@ packages:
not@0.1.0:
resolution: {integrity: sha512-5PDmaAsVfnWUgTUbJ3ERwn7u79Z0dYxN9ErxCpVJJqe2RK0PJ3z+iFUxuqjwtlDDegXvtWoxD/3Fzxox7tFGWA==}
notepack.io@3.0.1:
resolution: {integrity: sha512-TKC/8zH5pXIAMVQio2TvVDTtPRX+DJPHDqjRbxogtFiByHyzKmy96RA0JtCQJ+WouyyL4A10xomQzgbUT+1jCg==}
npm-run-path@4.0.1:
resolution: {integrity: sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==}
engines: {node: '>=8'}
@@ -11561,6 +11557,10 @@ packages:
engines: {node: '>=0.8.0'}
hasBin: true
uid2@1.0.0:
resolution: {integrity: sha512-+I6aJUv63YAcY9n4mQreLUt0d4lvwkkopDNmpomkAUz0fAkEMV9pRWxN0EjhW1YfRhcuyHg2v3mwddCDW1+LFQ==}
engines: {node: '>= 4.0.0'}
uid@2.0.2:
resolution: {integrity: sha512-u3xV3X7uzvi5b1MncmZo3i2Aw222Zk1keqLA1YkHldREkAhAqi65wuPfe7lHx8H/Wzy+8CE7S7uS3jekIM5s8g==}
engines: {node: '>=8'}
@@ -15318,8 +15318,6 @@ snapshots:
'@microsoft/tsdoc@0.16.0': {}
'@msgpack/msgpack@2.8.0': {}
'@msgpackr-extract/msgpackr-extract-darwin-arm64@3.0.3':
optional: true
@@ -16198,15 +16196,13 @@ snapshots:
'@socket.io/component-emitter@3.1.2': {}
'@socket.io/postgres-adapter@0.5.0(socket.io-adapter@2.5.6)':
'@socket.io/redis-adapter@8.3.0(socket.io-adapter@2.5.6)':
dependencies:
'@msgpack/msgpack': 2.8.0
'@types/pg': 8.16.0
debug: 4.3.7
pg: 8.18.0
notepack.io: 3.0.1
socket.io-adapter: 2.5.6
uid2: 1.0.0
transitivePeerDependencies:
- pg-native
- supports-color
'@sphinxxxx/color-conversion@2.2.2': {}
@@ -22106,6 +22102,8 @@ snapshots:
not@0.1.0: {}
notepack.io@3.0.1: {}
npm-run-path@4.0.1:
dependencies:
path-key: 3.1.1
@@ -24833,6 +24831,8 @@ snapshots:
uglify-js@3.19.3:
optional: true
uid2@1.0.0: {}
uid@2.0.2:
dependencies:
'@lukeed/csprng': 1.1.0
@@ -57,8 +57,7 @@
"@opentelemetry/semantic-conventions": "^1.34.0",
"@react-email/components": "^0.5.0",
"@react-email/render": "^1.1.2",
"@socket.io/postgres-adapter": "^0.5.0",
"@types/pg": "^8.16.0",
"@socket.io/redis-adapter": "^8.3.0",
"ajv": "^8.17.1",
"archiver": "^7.0.0",
"async-lock": "^1.4.0",
@@ -110,7 +109,6 @@
"sharp": "^0.34.5",
"sirv": "^3.0.0",
"socket.io": "^4.8.1",
"socket.io-adapter": "^2.5.6",
"tailwindcss-preset-email": "^1.4.0",
"thumbhash": "^0.1.1",
"transformation-matrix": "^3.1.0",
@@ -5,9 +5,8 @@ import cookieParser from 'cookie-parser';
import { existsSync } from 'node:fs';
import sirv from 'sirv';
import { excludePaths, serverVersion } from 'src/constants';
import { SocketIoAdapter } from 'src/enum';
import { MaintenanceWorkerService } from 'src/maintenance/maintenance-worker.service';
import { createWebSocketAdapter } from 'src/middleware/websocket.adapter';
import { WebSocketAdapter } from 'src/middleware/websocket.adapter';
import { ConfigRepository } from 'src/repositories/config.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
import { bootstrapTelemetry } from 'src/repositories/telemetry.repository';
@@ -26,7 +25,6 @@ export async function configureExpress(
{
permitSwaggerWrite = true,
ssr,
socketIoAdapter,
}: {
/**
* Whether to allow swagger module to write to the specs.json
@@ -38,10 +36,6 @@ export async function configureExpress(
* Service to use for server-side rendering
*/
ssr: typeof ApiService | typeof MaintenanceWorkerService;
/**
* Override the Socket.IO adapter. If not specified, uses the adapter from config.
*/
socketIoAdapter?: SocketIoAdapter;
},
) {
const configRepository = app.get(ConfigRepository);
@@ -61,7 +55,7 @@ export async function configureExpress(
}
app.setGlobalPrefix('api', { exclude: excludePaths });
app.useWebSocketAdapter(await createWebSocketAdapter(app, socketIoAdapter));
app.useWebSocketAdapter(new WebSocketAdapter(app));
useSwagger(app, { write: configRepository.isDev() && permitSwaggerWrite });
@@ -10,7 +10,6 @@ import { DatabaseBackupController } from 'src/controllers/database-backup.contro
import { DownloadController } from 'src/controllers/download.controller';
import { DuplicateController } from 'src/controllers/duplicate.controller';
import { FaceController } from 'src/controllers/face.controller';
import { InternalController } from 'src/controllers/internal.controller';
import { JobController } from 'src/controllers/job.controller';
import { LibraryController } from 'src/controllers/library.controller';
import { MaintenanceController } from 'src/controllers/maintenance.controller';
@@ -52,7 +51,6 @@ export const controllers = [
DownloadController,
DuplicateController,
FaceController,
InternalController,
JobController,
LibraryController,
MaintenanceController,
@@ -1,22 +0,0 @@
import { Body, Controller, NotFoundException, Post, Req } from '@nestjs/common';
import { ApiExcludeController } from '@nestjs/swagger';
import { Request } from 'express';
import { AppRestartEvent, EventRepository } from 'src/repositories/event.repository';
const LOCALHOST_ADDRESSES = new Set(['127.0.0.1', '::1', '::ffff:127.0.0.1']);
@ApiExcludeController()
@Controller('internal')
export class InternalController {
constructor(private eventRepository: EventRepository) {}
@Post('restart')
async restart(@Req() req: Request, @Body() dto: AppRestartEvent): Promise<void> {
const remoteAddress = req.socket.remoteAddress;
if (!remoteAddress || !LOCALHOST_ADDRESSES.has(remoteAddress)) {
throw new NotFoundException();
}
await this.eventRepository.emit('AppRestart', dto);
}
}
@@ -1,6 +1,6 @@
import { Transform, Type } from 'class-transformer';
import { IsEnum, IsInt, IsString, Matches } from 'class-validator';
import { ImmichEnvironment, LogFormat, LogLevel, SocketIoAdapter } from 'src/enum';
import { ImmichEnvironment, LogFormat, LogLevel } from 'src/enum';
import { IsIPRange, Optional, ValidateBoolean } from 'src/validation';
// TODO import from sql-tools once the swagger plugin supports external enums
@@ -149,11 +149,6 @@ export class EnvDto {
@Optional()
IMMICH_WORKERS_EXCLUDE?: string;
@IsEnum(SocketIoAdapter)
@Optional()
@Transform(({ value }) => (value ? String(value).toLowerCase().trim() : value))
IMMICH_SOCKETIO_ADAPTER?: SocketIoAdapter;
@IsString()
@Optional()
DB_DATABASE_NAME?: string;
@@ -518,11 +518,6 @@ export enum ImmichTelemetry {
Job = 'job',
}
export enum SocketIoAdapter {
BroadcastChannel = 'broadcastchannel',
Postgres = 'postgres',
}
export enum ExifOrientation {
Horizontal = 1,
MirrorHorizontal = 2,
@@ -1,5 +1,6 @@
import { Kysely, sql } from 'kysely';
import { CommandFactory } from 'nest-commander';
import { ChildProcess, fork } from 'node:child_process';
import { dirname, join } from 'node:path';
import { Worker } from 'node:worker_threads';
import { PostgresError } from 'postgres';
@@ -17,7 +18,7 @@ class Workers {
/**
* Currently running workers
*/
workers: Partial<Record<ImmichWorker, { kill: () => Promise<void> | void }>> = {};
workers: Partial<Record<ImmichWorker, { kill: (signal: NodeJS.Signals) => Promise<void> | void }>> = {};
/**
* Fail-safe in case anything dies during restart
@@ -100,23 +101,25 @@ class Workers {
const basePath = dirname(__filename);
const workerFile = join(basePath, 'workers', `${name}.js`);
const inspectArg = process.execArgv.find((arg) => arg.startsWith('--inspect'));
const workerData: { inspectorPort?: number } = {};
let anyWorker: Worker | ChildProcess;
let kill: (signal?: NodeJS.Signals) => Promise<void> | void;
if (inspectArg) {
const inspectorPorts: Record<ImmichWorker, number> = {
[ImmichWorker.Api]: 9230,
[ImmichWorker.Microservices]: 9231,
[ImmichWorker.Maintenance]: 9232,
};
workerData.inspectorPort = inspectorPorts[name];
if (name === ImmichWorker.Api) {
const worker = fork(workerFile, [], {
execArgv: process.execArgv.map((arg) => (arg.startsWith('--inspect') ? '--inspect=0.0.0.0:9231' : arg)),
});
kill = (signal) => void worker.kill(signal);
anyWorker = worker;
} else {
const worker = new Worker(workerFile);
kill = async () => void (await worker.terminate());
anyWorker = worker;
}
const worker = new Worker(workerFile, { workerData });
const kill = async () => void (await worker.terminate());
worker.on('error', (error) => this.onError(name, error));
worker.on('exit', (exitCode) => this.onExit(name, exitCode));
anyWorker.on('error', (error) => this.onError(name, error));
anyWorker.on('exit', (exitCode) => this.onExit(name, exitCode));
this.workers[name] = { kill };
}
@@ -149,8 +152,8 @@ class Workers {
console.error(`${name} worker exited with code ${exitCode}`);
if (this.workers[ImmichWorker.Api] && name !== ImmichWorker.Api) {
console.error('Terminating api worker');
void this.workers[ImmichWorker.Api].kill();
console.error('Killing api process');
void this.workers[ImmichWorker.Api].kill('SIGTERM');
}
}
@@ -4,7 +4,6 @@ import {
Delete,
Get,
Next,
NotFoundException,
Param,
Post,
Req,
@@ -26,15 +25,12 @@ import { ImmichCookie } from 'src/enum';
import { MaintenanceRoute } from 'src/maintenance/maintenance-auth.guard';
import { MaintenanceWorkerService } from 'src/maintenance/maintenance-worker.service';
import { GetLoginDetails } from 'src/middleware/auth.guard';
import { AppRestartEvent } from 'src/repositories/event.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
import { LoginDetails } from 'src/services/auth.service';
import { sendFile } from 'src/utils/file';
import { respondWithCookie } from 'src/utils/response';
import { FilenameParamDto } from 'src/validation';
const LOCALHOST_ADDRESSES = new Set(['127.0.0.1', '::1', '::ffff:127.0.0.1']);
import type { DatabaseBackupController as _DatabaseBackupController } from 'src/controllers/database-backup.controller';
import type { ServerController as _ServerController } from 'src/controllers/server.controller';
import { DatabaseBackupDeleteDto, DatabaseBackupListResponseDto } from 'src/dtos/database-backup.dto';
@@ -135,14 +131,4 @@ export class MaintenanceWorkerController {
setMaintenanceMode(@Body() dto: SetMaintenanceModeDto): void {
void this.service.setAction(dto);
}
@Post('internal/restart')
internalRestart(@Req() req: Request, @Body() dto: AppRestartEvent): void {
const remoteAddress = req.socket.remoteAddress;
if (!remoteAddress || !LOCALHOST_ADDRESSES.has(remoteAddress)) {
throw new NotFoundException();
}
this.service.handleInternalRestart(dto);
}
}
@@ -19,7 +19,6 @@ import { MaintenanceWebsocketRepository } from 'src/maintenance/maintenance-webs
import { AppRepository } from 'src/repositories/app.repository';
import { ConfigRepository } from 'src/repositories/config.repository';
import { DatabaseRepository } from 'src/repositories/database.repository';
import { AppRestartEvent } from 'src/repositories/event.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
import { ProcessRepository } from 'src/repositories/process.repository';
import { StorageRepository } from 'src/repositories/storage.repository';
@@ -291,9 +290,6 @@ export class MaintenanceWorkerService {
const lock = await this.databaseRepository.tryLock(DatabaseLock.MaintenanceOperation);
if (!lock) {
// Another maintenance worker has the lock - poll until maintenance mode ends
this.logger.log('Another worker has the maintenance lock, polling for maintenance mode changes...');
await this.pollForMaintenanceEnd();
return;
}
@@ -355,25 +351,4 @@ export class MaintenanceWorkerService {
this.maintenanceWebsocketRepository.serverSend('AppRestart', state);
this.appRepository.exitApp();
}
handleInternalRestart(state: AppRestartEvent): void {
this.maintenanceWebsocketRepository.clientBroadcast('AppRestartV1', state);
this.maintenanceWebsocketRepository.serverSend('AppRestart', state);
this.appRepository.exitApp();
}
private async pollForMaintenanceEnd(): Promise<void> {
const pollIntervalMs = 5000;
while (true) {
await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
const state = await this.systemMetadataRepository.get(SystemMetadataKey.MaintenanceMode);
if (!state?.isMaintenanceMode) {
this.logger.log('Maintenance mode ended, restarting...');
this.appRepository.exitApp();
return;
}
}
}
}
@@ -1,80 +0,0 @@
import {
ClusterAdapterWithHeartbeat,
type ClusterAdapterOptions,
type ClusterMessage,
type ClusterResponse,
type ServerId,
} from 'socket.io-adapter';
const BC_CHANNEL_NAME = 'immich:socketio';
interface BroadcastChannelPayload {
type: 'message' | 'response';
sourceUid: string;
targetUid?: string;
data: unknown;
}
/**
* Socket.IO adapter using Node.js BroadcastChannel
*
* Relays messages between worker_threads within a single OS process.
* Zero external dependencies. Does NOT work across containers — use
* the Postgres adapter for multi-replica deployments.
*/
class BroadcastChannelAdapter extends ClusterAdapterWithHeartbeat {
private readonly channel: BroadcastChannel;
constructor(nsp: any, opts?: Partial<ClusterAdapterOptions>) {
super(nsp, opts ?? {});
this.channel = new BroadcastChannel(BC_CHANNEL_NAME);
this.channel.addEventListener('message', (event: MessageEvent<BroadcastChannelPayload>) => {
const msg = event.data;
if (msg.sourceUid === this.uid) {
return;
}
if (msg.type === 'message') {
this.onMessage(msg.data as ClusterMessage);
} else if (msg.type === 'response' && msg.targetUid === this.uid) {
this.onResponse(msg.data as ClusterResponse);
}
});
this.init();
}
override doPublish(message: ClusterMessage): Promise<string> {
this.channel.postMessage({
type: 'message',
sourceUid: this.uid,
data: message,
});
return Promise.resolve('');
}
override doPublishResponse(requesterUid: ServerId, response: ClusterResponse): Promise<void> {
this.channel.postMessage({
type: 'response',
sourceUid: this.uid,
targetUid: requesterUid,
data: response,
});
return Promise.resolve();
}
override close(): void {
super.close();
this.channel.close();
}
}
export function createBroadcastChannelAdapter(opts?: Partial<ClusterAdapterOptions>) {
const options: Partial<ClusterAdapterOptions> = {
...opts,
};
return function (nsp: any) {
return new BroadcastChannelAdapter(nsp, options);
};
}
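The removed `BroadcastChannelAdapter` above relays Socket.IO cluster traffic between worker threads: it drops its own messages, handles every broadcast `message`, and handles a `response` only when it is the addressed target. That routing rule in isolation, with the payload shape copied from the adapter above (a sketch, not the adapter itself):

```typescript
// Routing rule used by the BroadcastChannel adapter above:
// drop own traffic, accept broadcasts, accept responses only if targeted.
interface BroadcastChannelPayload {
  type: 'message' | 'response';
  sourceUid: string;
  targetUid?: string;
  data: unknown;
}

function route(msg: BroadcastChannelPayload, selfUid: string): 'ignore' | 'message' | 'response' {
  if (msg.sourceUid === selfUid) {
    return 'ignore'; // never echo our own messages back to ourselves
  }
  if (msg.type === 'message') {
    return 'message'; // broadcasts go to every other node
  }
  if (msg.type === 'response' && msg.targetUid === selfUid) {
    return 'response'; // a reply addressed specifically to us
  }
  return 'ignore'; // a response addressed to some other node
}
```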
@@ -1,103 +1,21 @@
import { INestApplication, Logger } from '@nestjs/common';
import { INestApplicationContext } from '@nestjs/common';
import { IoAdapter } from '@nestjs/platform-socket.io';
import { Pool, PoolConfig } from 'pg';
import type { ServerOptions } from 'socket.io';
import { SocketIoAdapter } from 'src/enum';
import { createBroadcastChannelAdapter } from 'src/middleware/broadcast-channel.adapter';
import { createAdapter } from '@socket.io/redis-adapter';
import { Redis } from 'ioredis';
import { ServerOptions } from 'socket.io';
import { ConfigRepository } from 'src/repositories/config.repository';
import { asPostgresConnectionConfig } from 'src/utils/database';
export type Ssl = 'require' | 'allow' | 'prefer' | 'verify-full' | boolean | object;
export function asPgPoolSsl(ssl?: Ssl): PoolConfig['ssl'] {
if (ssl === undefined || ssl === false || ssl === 'allow') {
return false;
}
if (ssl === true || ssl === 'prefer' || ssl === 'require') {
return { rejectUnauthorized: false };
}
if (ssl === 'verify-full') {
return { rejectUnauthorized: true };
}
return ssl;
}
class BroadcastChannelSocketAdapter extends IoAdapter {
private adapterConstructor: ReturnType<typeof createBroadcastChannelAdapter>;
constructor(app: INestApplication) {
export class WebSocketAdapter extends IoAdapter {
constructor(private app: INestApplicationContext) {
super(app);
this.adapterConstructor = createBroadcastChannelAdapter();
}
createIOServer(port: number, options?: ServerOptions): any {
const { redis } = this.app.get(ConfigRepository).getEnv();
const server = super.createIOServer(port, options);
server.adapter(this.adapterConstructor);
const pubClient = new Redis(redis);
const subClient = pubClient.duplicate();
server.adapter(createAdapter(pubClient, subClient));
return server;
}
}
class PostgresSocketAdapter extends IoAdapter {
private adapterConstructor: any;
constructor(app: INestApplication, adapterConstructor: any) {
super(app);
this.adapterConstructor = adapterConstructor;
}
createIOServer(port: number, options?: ServerOptions): any {
const server = super.createIOServer(port, options);
server.adapter(this.adapterConstructor);
return server;
}
}
export async function createWebSocketAdapter(
app: INestApplication,
adapterOverride?: SocketIoAdapter,
): Promise<IoAdapter> {
const logger = new Logger('WebSocketAdapter');
const config = new ConfigRepository();
const { database, socketIo } = config.getEnv();
const adapter = adapterOverride ?? socketIo.adapter;
switch (adapter) {
case SocketIoAdapter.Postgres: {
logger.log('Using Postgres Socket.IO adapter');
const { createAdapter } = await import('@socket.io/postgres-adapter');
const config = asPostgresConnectionConfig(database.config);
const pool = new Pool({
host: config.host,
port: config.port,
user: config.username,
password: config.password,
database: config.database,
ssl: asPgPoolSsl(config.ssl),
max: 2,
});
await pool.query(`
CREATE TABLE IF NOT EXISTS socket_io_attachments (
id bigserial UNIQUE,
created_at timestamptz DEFAULT NOW(),
payload bytea
);
`);
pool.on('error', (error) => {
logger.error('Postgres pool error', error);
});
const adapterConstructor = createAdapter(pool);
return new PostgresSocketAdapter(app, adapterConstructor);
}
case SocketIoAdapter.BroadcastChannel: {
logger.log('Using BroadcastChannel Socket.IO adapter');
return new BroadcastChannelSocketAdapter(app);
}
}
}
@@ -102,22 +102,30 @@ order by
"shared_link"."createdAt" desc
-- SharedLinkRepository.getAll
select distinct
on ("shared_link"."createdAt") "shared_link".*,
"assets"."assets",
select
"shared_link".*,
(
select
coalesce(json_agg(agg), '[]')
from
(
select
"asset".*
from
"shared_link_asset"
inner join "asset" on "asset"."id" = "shared_link_asset"."assetId"
where
"shared_link"."id" = "shared_link_asset"."sharedLinkId"
and "asset"."deletedAt" is null
order by
"asset"."fileCreatedAt" asc
limit
$1
) as agg
) as "assets",
to_json("album") as "album"
from
"shared_link"
left join "shared_link_asset" on "shared_link_asset"."sharedLinkId" = "shared_link"."id"
left join lateral (
select
json_agg("asset") as "assets"
from
"asset"
where
"asset"."id" = "shared_link_asset"."assetId"
and "asset"."deletedAt" is null
) as "assets" on true
left join lateral (
select
"album".*,
@@ -152,12 +160,12 @@ from
and "album"."deletedAt" is null
) as "album" on true
where
"shared_link"."userId" = $1
"shared_link"."userId" = $2
and (
"shared_link"."type" = $2
"shared_link"."type" = $3
or "album"."id" is not null
)
and "shared_link"."albumId" = $3
and "shared_link"."albumId" = $4
order by
"shared_link"."createdAt" desc
@@ -1,4 +1,7 @@
import { Injectable } from '@nestjs/common';
import { createAdapter } from '@socket.io/redis-adapter';
import Redis from 'ioredis';
import { Server as SocketIO } from 'socket.io';
import { ExitCode } from 'src/enum';
import { ConfigRepository } from 'src/repositories/config.repository';
import { AppRestartEvent } from 'src/repositories/event.repository';
@@ -21,17 +24,24 @@ export class AppRepository {
}
async sendOneShotAppRestart(state: AppRestartEvent): Promise<void> {
const { port } = new ConfigRepository().getEnv();
const url = `http://127.0.0.1:${port}/api/internal/restart`;
const server = new SocketIO();
const { redis } = new ConfigRepository().getEnv();
const pubClient = new Redis({ ...redis, lazyConnect: true });
const subClient = pubClient.duplicate();
const response = await fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(state),
await Promise.all([pubClient.connect(), subClient.connect()]);
server.adapter(createAdapter(pubClient, subClient));
// => corresponds to notification.service.ts#onAppRestart
server.emit('AppRestartV1', state, async () => {
const responses = await server.serverSideEmitWithAck('AppRestart', state);
if (responses.some((response) => response !== 'ok')) {
throw new Error("One or more node(s) returned a non-'ok' response to our restart request!");
}
pubClient.disconnect();
subClient.disconnect();
});
if (!response.ok) {
throw new Error(`Failed to trigger app restart: ${response.status} ${response.statusText}`);
}
}
}
@@ -21,7 +21,6 @@ import {
LogFormat,
LogLevel,
QueueName,
SocketIoAdapter,
} from 'src/enum';
import { VectorExtension } from 'src/types';
import { setDifference } from 'src/utils/set';
@@ -118,10 +117,6 @@ export interface EnvData {
};
};
socketIo: {
adapter: SocketIoAdapter;
};
noColor: boolean;
nodeVersion?: string;
}
@@ -352,10 +347,6 @@ const getEnv = (): EnvData => {
},
},
socketIo: {
adapter: dto.IMMICH_SOCKETIO_ADAPTER ?? SocketIoAdapter.Postgres,
},
noColor: !!dto.NO_COLOR,
};
};
@@ -1,6 +1,6 @@
import { Injectable } from '@nestjs/common';
import { Insertable, Kysely, NotNull, sql, Updateable } from 'kysely';
import { jsonObjectFrom } from 'kysely/helpers/postgres';
import { Insertable, Kysely, sql, Updateable } from 'kysely';
import { jsonArrayFrom, jsonObjectFrom } from 'kysely/helpers/postgres';
import _ from 'lodash';
import { InjectKysely } from 'nestjs-kysely';
import { Album, columns } from 'src/database';
@@ -124,19 +124,20 @@ export class SharedLinkRepository {
.selectFrom('shared_link')
.selectAll('shared_link')
.where('shared_link.userId', '=', userId)
.leftJoin('shared_link_asset', 'shared_link_asset.sharedLinkId', 'shared_link.id')
.leftJoinLateral(
  (eb) =>
    eb
      .selectFrom('asset')
      .select((eb) => eb.fn.jsonAgg('asset').as('assets'))
      .whereRef('asset.id', '=', 'shared_link_asset.assetId')
      .where('asset.deletedAt', 'is', null)
      .as('assets'),
  (join) => join.onTrue(),
)
.select('assets.assets')
.$narrowType<{ assets: NotNull }>()
.select((eb) =>
  jsonArrayFrom(
    eb
      .selectFrom('shared_link_asset')
      .whereRef('shared_link.id', '=', 'shared_link_asset.sharedLinkId')
      .innerJoin('asset', 'asset.id', 'shared_link_asset.assetId')
      .where('asset.deletedAt', 'is', null)
      .selectAll('asset')
      .orderBy('asset.fileCreatedAt', 'asc')
      .limit(1),
  )
    .$castTo<MapAsset[]>()
    .as('assets'),
)
.leftJoinLateral(
(eb) =>
eb
@@ -179,7 +180,6 @@ export class SharedLinkRepository {
.$if(!!albumId, (eb) => eb.where('shared_link.albumId', '=', albumId!))
.$if(!!id, (eb) => eb.where('shared_link.id', '=', id!))
.orderBy('shared_link.createdAt', 'desc')
.distinctOn(['shared_link.createdAt'])
.execute();
}
@@ -1,11 +1,60 @@
import { createAdapter } from '@socket.io/redis-adapter';
import Redis from 'ioredis';
import { SignJWT } from 'jose';
import { randomBytes } from 'node:crypto';
import { join } from 'node:path';
import { Server as SocketIO } from 'socket.io';
import { StorageCore } from 'src/cores/storage.core';
import { MaintenanceAuthDto, MaintenanceDetectInstallResponseDto } from 'src/dtos/maintenance.dto';
import { StorageFolder } from 'src/enum';
import { ConfigRepository } from 'src/repositories/config.repository';
import { AppRestartEvent } from 'src/repositories/event.repository';
import { StorageRepository } from 'src/repositories/storage.repository';
export function sendOneShotAppRestart(state: AppRestartEvent): void {
const server = new SocketIO();
const { redis } = new ConfigRepository().getEnv();
const pubClient = new Redis(redis);
const subClient = pubClient.duplicate();
server.adapter(createAdapter(pubClient, subClient));
/**
* Keep trying until we manage to stop Immich
*
* Sometimes there appear to be communication
* issues with the other servers.
*
* This issue only occurs with this method.
*/
async function tryTerminate() {
while (true) {
try {
const responses = await server.serverSideEmitWithAck('AppRestart', state);
if (responses.length > 0) {
return;
}
} catch (error) {
console.error(error);
console.error('Encountered an error while telling Immich to stop.');
}
console.info(
"\nIt doesn't appear that Immich stopped, trying again in a moment.\nIf Immich is already not running, you can ignore this error.",
);
await new Promise((r) => setTimeout(r, 1e3));
}
}
// => corresponds to notification.service.ts#onAppRestart
server.emit('AppRestartV1', state, () => {
void tryTerminate().finally(() => {
pubClient.disconnect();
subClient.disconnect();
});
});
}
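The keep-trying loop in `tryTerminate` above can be factored into a generic helper. A minimal sketch under the same semantics (`retryUntil` is a hypothetical name, not part of the codebase): retry the attempt until it yields a defined value, log and swallow errors, and pause between attempts.

```typescript
// Generic form of the keep-trying pattern used by tryTerminate above:
// retry `attempt` until it resolves to a defined value, logging errors
// and pausing `delayMs` between attempts. (Hypothetical helper.)
async function retryUntil<T>(attempt: () => Promise<T | undefined>, delayMs = 1000): Promise<T> {
  while (true) {
    try {
      const result = await attempt();
      if (result !== undefined) {
        return result;
      }
    } catch (error) {
      console.error(error);
    }
    // back off before the next attempt
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

The real loop treats a non-empty ack array as success; the sketch generalizes that to "any defined value".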
export async function createMaintenanceLoginUrl(
baseUrl: string,
auth: MaintenanceAuthDto,
@@ -1,21 +1,14 @@
import { NestFactory } from '@nestjs/core';
import { NestExpressApplication } from '@nestjs/platform-express';
import inspector from 'node:inspector';
import { isMainThread, workerData } from 'node:worker_threads';
import { configureExpress, configureTelemetry } from 'src/app.common';
import { ApiModule } from 'src/app.module';
import { AppRepository } from 'src/repositories/app.repository';
import { ApiService } from 'src/services/api.service';
import { isStartUpError } from 'src/utils/misc';
export async function bootstrap() {
async function bootstrap() {
process.title = 'immich-api';
const { inspectorPort } = workerData ?? {};
if (inspectorPort) {
inspector.open(inspectorPort, '0.0.0.0', false);
}
configureTelemetry();
const app = await NestFactory.create<NestExpressApplication>(ApiModule, { bufferLogs: true });
@@ -26,12 +19,10 @@ export async function bootstrap() {
});
}
if (!isMainThread || process.send) {
bootstrap().catch((error) => {
if (!isStartUpError(error)) {
console.error(error);
}
process.exit(1);
});
}
bootstrap().catch((error) => {
if (!isStartUpError(error)) {
console.error(error);
}
// eslint-disable-next-line unicorn/no-process-exit
process.exit(1);
});
@@ -1,22 +1,13 @@
import { NestFactory } from '@nestjs/core';
import { NestExpressApplication } from '@nestjs/platform-express';
import inspector from 'node:inspector';
import { isMainThread, workerData } from 'node:worker_threads';
import { configureExpress, configureTelemetry } from 'src/app.common';
import { MaintenanceModule } from 'src/app.module';
import { SocketIoAdapter } from 'src/enum';
import { MaintenanceWorkerService } from 'src/maintenance/maintenance-worker.service';
import { AppRepository } from 'src/repositories/app.repository';
import { isStartUpError } from 'src/utils/misc';
export async function bootstrap() {
async function bootstrap() {
process.title = 'immich-maintenance';
const { inspectorPort } = workerData ?? {};
if (inspectorPort) {
inspector.open(inspectorPort, '0.0.0.0', false);
}
configureTelemetry();
const app = await NestFactory.create<NestExpressApplication>(MaintenanceModule, { bufferLogs: true });
@@ -25,18 +16,13 @@ export async function bootstrap() {
void configureExpress(app, {
permitSwaggerWrite: false,
ssr: MaintenanceWorkerService,
// Use BroadcastChannel instead of Postgres adapter to avoid crash when
// pg_terminate_backend() kills all database connections during restore
socketIoAdapter: SocketIoAdapter.BroadcastChannel,
});
}
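The comment above opts the maintenance worker into the BroadcastChannel adapter. A minimal sketch of the fan-out primitive that adapter builds on, assuming Node's built-in `BroadcastChannel` from `node:worker_threads` (`echoOverChannel` and the channel name are hypothetical, for illustration only):

```typescript
import { BroadcastChannel } from 'node:worker_threads';

// Post one message on a named channel and resolve with what a peer
// channel instance receives. (Hypothetical helper for illustration.)
function echoOverChannel(name: string, message: unknown): Promise<unknown> {
  const sender = new BroadcastChannel(name);
  const receiver = new BroadcastChannel(name);
  return new Promise((resolve) => {
    receiver.onmessage = (event: any) => {
      // a BroadcastChannel never re-delivers to the instance that posted,
      // so only `receiver` observes this message
      sender.close();
      receiver.close();
      resolve(event.data);
    };
    sender.postMessage(message);
  });
}
```

Because delivery happens entirely in-process, it keeps working even when `pg_terminate_backend()` drops every database connection, which is what the Postgres adapter cannot survive.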
if (!isMainThread) {
bootstrap().catch((error) => {
if (!isStartUpError(error)) {
console.error(error);
}
process.exit(1);
});
}
bootstrap().catch((error) => {
if (!isStartUpError(error)) {
console.error(error);
}
// eslint-disable-next-line unicorn/no-process-exit
process.exit(1);
});
@@ -1,9 +1,8 @@
import { NestFactory } from '@nestjs/core';
import inspector from 'node:inspector';
import { isMainThread, workerData } from 'node:worker_threads';
import { isMainThread } from 'node:worker_threads';
import { MicroservicesModule } from 'src/app.module';
import { serverVersion } from 'src/constants';
import { createWebSocketAdapter } from 'src/middleware/websocket.adapter';
import { WebSocketAdapter } from 'src/middleware/websocket.adapter';
import { AppRepository } from 'src/repositories/app.repository';
import { ConfigRepository } from 'src/repositories/config.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
@@ -11,11 +10,6 @@ import { bootstrapTelemetry } from 'src/repositories/telemetry.repository';
import { isStartUpError } from 'src/utils/misc';
export async function bootstrap() {
const { inspectorPort } = workerData ?? {};
if (inspectorPort) {
inspector.open(inspectorPort, '0.0.0.0', false);
}
const { telemetry } = new ConfigRepository().getEnv();
if (telemetry.metrics.size > 0) {
bootstrapTelemetry(telemetry.microservicesPort);
@@ -30,7 +24,7 @@ export async function bootstrap() {
logger.setContext('Bootstrap');
app.useLogger(logger);
app.useWebSocketAdapter(await createWebSocketAdapter(app));
app.useWebSocketAdapter(new WebSocketAdapter(app));
await (host ? app.listen(0, host) : app.listen(0));
@@ -233,6 +233,14 @@ export class MediumTestContext<S extends BaseService = BaseService> {
return { albumUser: { albumId, userId, role }, result };
}
async softDeleteAsset(assetId: string) {
await this.database.updateTable('asset').set({ deletedAt: new Date() }).where('id', '=', assetId).execute();
}
async softDeleteAlbum(albumId: string) {
await this.database.updateTable('album').set({ deletedAt: new Date() }).where('id', '=', albumId).execute();
}
async newJobStatus(dto: Partial<Insertable<AssetJobStatusTable>> & { assetId: string }) {
const jobStatus = mediumFactory.assetJobStatusInsert({ assetId: dto.assetId });
const result = await this.get(AssetRepository).upsertJobStatus(jobStatus);
@@ -1,276 +0,0 @@
import { ClusterMessage, ClusterResponse } from 'socket.io-adapter';
import { createBroadcastChannelAdapter } from 'src/middleware/broadcast-channel.adapter';
import { vi } from 'vitest';
const createMockNamespace = () => ({
name: '/',
sockets: new Map(),
adapter: null,
server: {
encoder: {
encode: vi.fn().mockReturnValue([]),
},
_opts: {},
sockets: {
sockets: new Map(),
},
},
});
describe('BroadcastChannelAdapter', () => {
describe('createBroadcastChannelAdapter', () => {
it('should return a factory function', () => {
const factory = createBroadcastChannelAdapter();
expect(typeof factory).toBe('function');
});
it('should create adapter instance when factory is called', () => {
const mockNamespace = createMockNamespace();
const factory = createBroadcastChannelAdapter();
const adapter = factory(mockNamespace);
expect(adapter).toBeDefined();
expect(adapter.doPublish).toBeDefined();
expect(adapter.doPublishResponse).toBeDefined();
adapter.close();
});
});
describe('BroadcastChannelAdapter message passing', () => {
it('should actually send and receive messages between two adapters', async () => {
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedMessages: ClusterMessage[] = [];
const messageReceived = new Promise<void>((resolve) => {
const originalOnMessage = adapter2.onMessage.bind(adapter2);
adapter2.onMessage = (message: ClusterMessage) => {
receivedMessages.push(message);
resolve();
return originalOnMessage(message);
};
});
const testMessage = {
type: 2,
data: {
opts: { rooms: new Set(['room1']) },
rooms: ['room1'],
},
nsp: '/',
};
void adapter1.doPublish(testMessage as any);
await Promise.race([messageReceived, new Promise((resolve) => setTimeout(resolve, 500))]);
expect(receivedMessages.length).toBeGreaterThan(0);
adapter1.close();
adapter2.close();
});
it('should send ConfigUpdate-style event and receive it on another adapter', async () => {
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedMessages: ClusterMessage[] = [];
const messageReceived = new Promise<void>((resolve) => {
const originalOnMessage = adapter2.onMessage.bind(adapter2);
adapter2.onMessage = (message: ClusterMessage) => {
receivedMessages.push(message);
if ((message as any)?.data?.event === 'ConfigUpdate') {
resolve();
}
return originalOnMessage(message);
};
});
const configUpdateMessage = {
type: 2,
data: {
event: 'ConfigUpdate',
payload: { newConfig: { ffmpeg: { crf: 23 } }, oldConfig: { ffmpeg: { crf: 20 } } },
opts: { rooms: new Set() },
rooms: [],
},
nsp: '/',
};
void adapter1.doPublish(configUpdateMessage as any);
await Promise.race([messageReceived, new Promise((resolve) => setTimeout(resolve, 500))]);
const configMessages = receivedMessages.filter((m) => (m as any)?.data?.event === 'ConfigUpdate');
expect(configMessages.length).toBeGreaterThan(0);
expect((configMessages[0] as any).data.payload.newConfig.ffmpeg.crf).toBe(23);
adapter1.close();
adapter2.close();
});
it('should send AppRestart-style event and receive it on another adapter', async () => {
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedMessages: ClusterMessage[] = [];
const messageReceived = new Promise<void>((resolve) => {
const originalOnMessage = adapter2.onMessage.bind(adapter2);
adapter2.onMessage = (message: ClusterMessage) => {
receivedMessages.push(message);
if ((message as any)?.data?.event === 'AppRestart') {
resolve();
}
return originalOnMessage(message);
};
});
const appRestartMessage = {
type: 2,
data: {
event: 'AppRestart',
payload: { isMaintenanceMode: true },
opts: { rooms: new Set() },
rooms: [],
},
nsp: '/',
};
void adapter1.doPublish(appRestartMessage as any);
await Promise.race([messageReceived, new Promise((resolve) => setTimeout(resolve, 500))]);
const restartMessages = receivedMessages.filter((m) => (m as any)?.data?.event === 'AppRestart');
expect(restartMessages.length).toBeGreaterThan(0);
expect((restartMessages[0] as any).data.payload.isMaintenanceMode).toBe(true);
adapter1.close();
adapter2.close();
});
it('should not receive its own messages (echo prevention)', async () => {
const factory = createBroadcastChannelAdapter();
const namespace = createMockNamespace();
const adapter = factory(namespace);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedOwnMessages: ClusterMessage[] = [];
const uniqueMarker = `test-${Date.now()}-${Math.random()}`;
const originalOnMessage = adapter.onMessage.bind(adapter);
adapter.onMessage = (message: ClusterMessage) => {
if ((message as any)?.data?.marker === uniqueMarker) {
receivedOwnMessages.push(message);
}
return originalOnMessage(message);
};
const testMessage = {
type: 2,
data: {
marker: uniqueMarker,
opts: { rooms: new Set() },
rooms: [],
},
nsp: '/',
};
void adapter.doPublish(testMessage as any);
await new Promise((resolve) => setTimeout(resolve, 200));
expect(receivedOwnMessages.length).toBe(0);
adapter.close();
});
it('should send and receive response messages between adapters', async () => {
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedResponses: ClusterResponse[] = [];
const responseReceived = new Promise<void>((resolve) => {
const originalOnResponse = adapter1.onResponse.bind(adapter1);
adapter1.onResponse = (response: ClusterResponse) => {
receivedResponses.push(response);
resolve();
return originalOnResponse(response);
};
});
const responseMessage = {
type: 3,
data: { result: 'success', count: 42 },
};
void adapter2.doPublishResponse((adapter1 as any).uid, responseMessage as any);
await Promise.race([responseReceived, new Promise((resolve) => setTimeout(resolve, 500))]);
expect(receivedResponses.length).toBeGreaterThan(0);
adapter1.close();
adapter2.close();
});
});
describe('BroadcastChannelAdapter lifecycle', () => {
it('should close cleanly without errors', () => {
const factory = createBroadcastChannelAdapter();
const namespace = createMockNamespace();
const adapter = factory(namespace);
expect(() => adapter.close()).not.toThrow();
});
it('should handle multiple adapters closing in sequence', () => {
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const factory3 = createBroadcastChannelAdapter();
const adapter1 = factory1(createMockNamespace());
const adapter2 = factory2(createMockNamespace());
const adapter3 = factory3(createMockNamespace());
expect(() => {
adapter1.close();
adapter2.close();
adapter3.close();
}).not.toThrow();
});
});
});
@@ -1,159 +0,0 @@
import { Server } from 'socket.io';
import { createBroadcastChannelAdapter } from 'src/middleware/broadcast-channel.adapter';
import { EventRepository } from 'src/repositories/event.repository';
import { LoggingRepository } from 'src/repositories/logging.repository';
import { WebsocketRepository } from 'src/repositories/websocket.repository';
import { automock } from 'test/utils';
import { vi } from 'vitest';
describe('WebSocket Integration - serverSend with adapters', () => {
describe('BroadcastChannel adapter', () => {
it('should broadcast ConfigUpdate event through BroadcastChannel adapter', async () => {
const createMockNamespace = () => ({
name: '/',
sockets: new Map(),
adapter: null,
server: {
encoder: { encode: vi.fn().mockReturnValue([]) },
_opts: {},
sockets: { sockets: new Map() },
},
});
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedMessages: any[] = [];
vi.spyOn(adapter2, 'onMessage').mockImplementation((message: any) => {
receivedMessages.push(message);
});
const configUpdatePayload = {
type: 5,
data: {
event: 'ConfigUpdate',
args: [{ newConfig: { ffmpeg: { crf: 23 } }, oldConfig: { ffmpeg: { crf: 20 } } }],
},
nsp: '/',
};
void adapter1.doPublish(configUpdatePayload as any);
await new Promise((resolve) => setTimeout(resolve, 100));
const configMessages = receivedMessages.filter((m) => m?.data?.event === 'ConfigUpdate');
expect(configMessages.length).toBeGreaterThan(0);
adapter1.close();
adapter2.close();
});
it('should broadcast AppRestart event through BroadcastChannel adapter', async () => {
const createMockNamespace = () => ({
name: '/',
sockets: new Map(),
adapter: null,
server: {
encoder: { encode: vi.fn().mockReturnValue([]) },
_opts: {},
sockets: { sockets: new Map() },
},
});
const factory1 = createBroadcastChannelAdapter();
const factory2 = createBroadcastChannelAdapter();
const namespace1 = createMockNamespace();
const namespace2 = createMockNamespace();
const adapter1 = factory1(namespace1);
const adapter2 = factory2(namespace2);
await new Promise((resolve) => setTimeout(resolve, 100));
const receivedMessages: any[] = [];
vi.spyOn(adapter2, 'onMessage').mockImplementation((message: any) => {
receivedMessages.push(message);
});
const appRestartPayload = {
type: 5,
data: {
event: 'AppRestart',
args: [{ isMaintenanceMode: true }],
},
nsp: '/',
};
void adapter1.doPublish(appRestartPayload as any);
await new Promise((resolve) => setTimeout(resolve, 100));
const restartMessages = receivedMessages.filter((m) => m?.data?.event === 'AppRestart');
expect(restartMessages.length).toBeGreaterThan(0);
adapter1.close();
adapter2.close();
});
});
describe('WebsocketRepository with adapter', () => {
it('should call serverSideEmit when serverSend is called', () => {
const mockServer = {
serverSideEmit: vi.fn(),
on: vi.fn(),
} as unknown as Server;
const eventRepository = automock(EventRepository, {
args: [undefined, undefined, { setContext: () => {} }],
});
const loggingRepository = automock(LoggingRepository, {
args: [undefined, { getEnv: () => ({ noColor: false }) }],
strict: false,
});
const websocketRepository = new WebsocketRepository(eventRepository, loggingRepository);
(websocketRepository as any).server = mockServer;
websocketRepository.serverSend('ConfigUpdate', {
newConfig: { ffmpeg: { crf: 23 } } as any,
oldConfig: { ffmpeg: { crf: 20 } } as any,
});
expect(mockServer.serverSideEmit).toHaveBeenCalledWith('ConfigUpdate', {
newConfig: { ffmpeg: { crf: 23 } },
oldConfig: { ffmpeg: { crf: 20 } },
});
});
it('should call serverSideEmit for AppRestart event', () => {
const mockServer = {
serverSideEmit: vi.fn(),
on: vi.fn(),
} as unknown as Server;
const eventRepository = automock(EventRepository, {
args: [undefined, undefined, { setContext: () => {} }],
});
const loggingRepository = automock(LoggingRepository, {
args: [undefined, { getEnv: () => ({ noColor: false }) }],
strict: false,
});
const websocketRepository = new WebsocketRepository(eventRepository, loggingRepository);
(websocketRepository as any).server = mockServer;
websocketRepository.serverSend('AppRestart', { isMaintenanceMode: true });
expect(mockServer.serverSideEmit).toHaveBeenCalledWith('AppRestart', { isMaintenanceMode: true });
});
});
});
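The two `WebsocketRepository` tests above pin down a thin delegation: `serverSend` forwards the event name and payload straight to socket.io's `serverSideEmit`. A minimal sketch of that shape (`MiniWebsocketRepository` and `ServerLike` are illustrative names, not the real classes):

```typescript
// Sketch of the delegation the tests above verify: serverSend simply
// forwards the event and payload to the underlying server's serverSideEmit.
type ServerLike = { serverSideEmit: (event: string, payload: unknown) => void };

class MiniWebsocketRepository {
  constructor(private server: ServerLike) {}

  serverSend(event: string, payload: unknown): void {
    this.server.serverSideEmit(event, payload);
  }
}
```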
@@ -1,70 +0,0 @@
import { INestApplication } from '@nestjs/common';
import { IoAdapter } from '@nestjs/platform-socket.io';
import { SocketIoAdapter } from 'src/enum';
import { asPgPoolSsl, createWebSocketAdapter } from 'src/middleware/websocket.adapter';
import { Mocked, vi } from 'vitest';
describe('asPgPoolSsl', () => {
it('should return false for undefined ssl', () => {
expect(asPgPoolSsl()).toBe(false);
});
it('should return false for ssl = false', () => {
expect(asPgPoolSsl(false)).toBe(false);
});
it('should return false for ssl = "allow"', () => {
expect(asPgPoolSsl('allow')).toBe(false);
});
it('should return { rejectUnauthorized: false } for ssl = true', () => {
expect(asPgPoolSsl(true)).toEqual({ rejectUnauthorized: false });
});
it('should return { rejectUnauthorized: false } for ssl = "prefer"', () => {
expect(asPgPoolSsl('prefer')).toEqual({ rejectUnauthorized: false });
});
it('should return { rejectUnauthorized: false } for ssl = "require"', () => {
expect(asPgPoolSsl('require')).toEqual({ rejectUnauthorized: false });
});
it('should return { rejectUnauthorized: true } for ssl = "verify-full"', () => {
expect(asPgPoolSsl('verify-full')).toEqual({ rejectUnauthorized: true });
});
it('should pass through object ssl config unchanged', () => {
const sslConfig = { ca: 'certificate', rejectUnauthorized: true };
expect(asPgPoolSsl(sslConfig)).toBe(sslConfig);
});
});
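The mapping these tests pin down can be sketched directly from the expectations: `pg-pool` accepts either `false` or a TLS options object, so each libpq-style mode folds into one of those. This is a reconstruction from the test cases above, not the actual implementation:

```typescript
// Sketch of the ssl-mode mapping described by the tests above.
// (Reconstructed from the test expectations, not the real source.)
type PgSslOption = boolean | 'allow' | 'prefer' | 'require' | 'verify-full' | Record<string, unknown>;

function asPgPoolSsl(ssl?: PgSslOption): false | Record<string, unknown> {
  if (ssl === undefined || ssl === false || ssl === 'allow') {
    return false; // no TLS
  }
  if (ssl === true || ssl === 'prefer' || ssl === 'require') {
    return { rejectUnauthorized: false }; // encrypt, but skip certificate checks
  }
  if (ssl === 'verify-full') {
    return { rejectUnauthorized: true }; // encrypt and verify the certificate
  }
  return ssl; // explicit object config passes through unchanged
}
```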
describe('createWebSocketAdapter', () => {
let mockApp: Mocked<INestApplication>;
beforeEach(() => {
vi.clearAllMocks();
mockApp = {
getHttpServer: vi.fn().mockReturnValue({}),
} as unknown as Mocked<INestApplication>;
});
describe('BroadcastChannel adapter', () => {
it('should create BroadcastChannel adapter when configured', async () => {
const adapter = await createWebSocketAdapter(mockApp, SocketIoAdapter.BroadcastChannel);
expect(adapter).toBeDefined();
expect(adapter).toBeInstanceOf(IoAdapter);
});
});
describe('Postgres adapter', () => {
it('should create Postgres adapter when configured', async () => {
const adapter = await createWebSocketAdapter(mockApp, SocketIoAdapter.Postgres);
expect(adapter).toBeDefined();
expect(adapter).toBeInstanceOf(IoAdapter);
});
});
});
@@ -95,6 +95,469 @@ describe(SharedLinkService.name, () => {
});
});
describe('getAll', () => {
it('should return all shared links even when they share the same createdAt', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const sameTimestamp = '2024-01-01T00:00:00.000Z';
const link1 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
createdAt: sameTimestamp,
});
const link2 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
createdAt: sameTimestamp,
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(2);
const ids = result.map((r) => r.id);
expect(ids).toContain(link1.id);
expect(ids).toContain(link2.id);
});
it('should return shared links sorted by createdAt in descending order', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const link1 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
createdAt: '2021-01-01T00:00:00.000Z',
});
const link2 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
createdAt: '2023-01-01T00:00:00.000Z',
});
const link3 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
createdAt: '2022-01-01T00:00:00.000Z',
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(3);
expect(result.map((r) => r.id)).toEqual([link2.id, link3.id, link1.id]);
});
it('should not return shared links belonging to other users', async () => {
const { sut, ctx } = setup();
const { user: userA } = await ctx.newUser();
const { user: userB } = await ctx.newUser();
const authA = factory.auth({ user: userA });
const authB = factory.auth({ user: userB });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const linkA = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: userA.id,
allowUpload: false,
type: SharedLinkType.Individual,
});
await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: userB.id,
allowUpload: false,
type: SharedLinkType.Individual,
});
const resultA = await sut.getAll(authA, {});
expect(resultA).toHaveLength(1);
expect(resultA[0].id).toBe(linkA.id);
const resultB = await sut.getAll(authB, {});
expect(resultB).toHaveLength(1);
expect(resultB[0].id).not.toBe(linkA.id);
});
it('should filter by albumId', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { album: album1 } = await ctx.newAlbum({ ownerId: user.id });
const { album: album2 } = await ctx.newAlbum({ ownerId: user.id });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const link1 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album1.id,
allowUpload: false,
type: SharedLinkType.Album,
});
await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album2.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const result = await sut.getAll(auth, { albumId: album1.id });
expect(result).toHaveLength(1);
expect(result[0].id).toBe(link1.id);
});
it('should return album shared links with album data', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { album } = await ctx.newAlbum({ ownerId: user.id });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(1);
expect(result[0].album).toBeDefined();
expect(result[0].album!.id).toBe(album.id);
});
it('should return multiple album shared links without sql error from json group by', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { album: album1 } = await ctx.newAlbum({ ownerId: user.id });
const { album: album2 } = await ctx.newAlbum({ ownerId: user.id });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const link1 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album1.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const link2 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album2.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(2);
const ids = result.map((r) => r.id);
expect(ids).toContain(link1.id);
expect(ids).toContain(link2.id);
expect(result[0].album).toBeDefined();
expect(result[1].album).toBeDefined();
});
it('should return mixed album and individual shared links together', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { album } = await ctx.newAlbum({ ownerId: user.id });
const { asset } = await ctx.newAsset({ ownerId: user.id });
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const albumLink = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const albumLink2 = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
albumId: album.id,
allowUpload: false,
type: SharedLinkType.Album,
});
const individualLink = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
assetIds: [asset.id],
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(3);
const ids = result.map((r) => r.id);
expect(ids).toContain(albumLink.id);
expect(ids).toContain(albumLink2.id);
expect(ids).toContain(individualLink.id);
});
it('should return only the first asset as cover for an individual shared link', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const assets = await Promise.all([
ctx.newAsset({ ownerId: user.id, fileCreatedAt: '2021-01-01T00:00:00.000Z' }),
ctx.newAsset({ ownerId: user.id, fileCreatedAt: '2023-01-01T00:00:00.000Z' }),
ctx.newAsset({ ownerId: user.id, fileCreatedAt: '2022-01-01T00:00:00.000Z' }),
]);
const sharedLinkRepo = ctx.get(SharedLinkRepository);
await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
assetIds: assets.map(({ asset }) => asset.id),
});
const result = await sut.getAll(auth, {});
expect(result).toHaveLength(1);
expect(result[0].assets).toHaveLength(1);
expect(result[0].assets[0].id).toBe(assets[0].asset.id);
});
});
describe('get', () => {
it('should not return trashed assets for an individual shared link', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { asset: visibleAsset } = await ctx.newAsset({ ownerId: user.id });
await ctx.newExif({ assetId: visibleAsset.id, make: 'Canon' });
const { asset: trashedAsset } = await ctx.newAsset({ ownerId: user.id });
await ctx.newExif({ assetId: trashedAsset.id, make: 'Canon' });
await ctx.softDeleteAsset(trashedAsset.id);
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const sharedLink = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
assetIds: [visibleAsset.id, trashedAsset.id],
});
const result = await sut.get(auth, sharedLink.id);
expect(result).toBeDefined();
expect(result!.assets).toHaveLength(1);
expect(result!.assets[0].id).toBe(visibleAsset.id);
});
it('should return empty assets when all individually shared assets are trashed', async () => {
const { sut, ctx } = setup();
const { user } = await ctx.newUser();
const auth = factory.auth({ user });
const { asset } = await ctx.newAsset({ ownerId: user.id });
await ctx.newExif({ assetId: asset.id, make: 'Canon' });
await ctx.softDeleteAsset(asset.id);
const sharedLinkRepo = ctx.get(SharedLinkRepository);
const sharedLink = await sharedLinkRepo.create({
key: randomBytes(16),
id: factory.uuid(),
userId: user.id,
allowUpload: false,
type: SharedLinkType.Individual,
assetIds: [asset.id],
});
await expect(sut.get(auth, sharedLink.id)).resolves.toMatchObject({
assets: [],
});
});
    it('should not return trashed assets in a shared album', async () => {
      const { sut, ctx } = setup();
      const { user } = await ctx.newUser();
      const auth = factory.auth({ user });
      const { album } = await ctx.newAlbum({ ownerId: user.id });
      const { asset: visibleAsset } = await ctx.newAsset({ ownerId: user.id });
      await ctx.newExif({ assetId: visibleAsset.id, make: 'Canon' });
      await ctx.newAlbumAsset({ albumId: album.id, assetId: visibleAsset.id });
      const { asset: trashedAsset } = await ctx.newAsset({ ownerId: user.id });
      await ctx.newExif({ assetId: trashedAsset.id, make: 'Canon' });
      await ctx.newAlbumAsset({ albumId: album.id, assetId: trashedAsset.id });
      await ctx.softDeleteAsset(trashedAsset.id);

      const sharedLinkRepo = ctx.get(SharedLinkRepository);
      const sharedLink = await sharedLinkRepo.create({
        key: randomBytes(16),
        id: factory.uuid(),
        userId: user.id,
        albumId: album.id,
        allowUpload: true,
        type: SharedLinkType.Album,
      });

      await expect(sut.get(auth, sharedLink.id)).resolves.toMatchObject({
        album: expect.objectContaining({ assetCount: 1 }),
      });
    });
    it('should return an empty asset count when all album assets are trashed', async () => {
      const { sut, ctx } = setup();
      const { user } = await ctx.newUser();
      const auth = factory.auth({ user });
      const { album } = await ctx.newAlbum({ ownerId: user.id });
      const { asset } = await ctx.newAsset({ ownerId: user.id });
      await ctx.newExif({ assetId: asset.id, make: 'Canon' });
      await ctx.newAlbumAsset({ albumId: album.id, assetId: asset.id });
      await ctx.softDeleteAsset(asset.id);

      const sharedLinkRepo = ctx.get(SharedLinkRepository);
      const sharedLink = await sharedLinkRepo.create({
        key: randomBytes(16),
        id: factory.uuid(),
        userId: user.id,
        albumId: album.id,
        allowUpload: false,
        type: SharedLinkType.Album,
      });

      await expect(sut.get(auth, sharedLink.id)).resolves.toMatchObject({
        album: expect.objectContaining({ assetCount: 0 }),
      });
    });
    it('should not return an album shared link when the album is trashed', async () => {
      const { sut, ctx } = setup();
      const { user } = await ctx.newUser();
      const auth = factory.auth({ user });
      const { album } = await ctx.newAlbum({ ownerId: user.id });

      const sharedLinkRepo = ctx.get(SharedLinkRepository);
      const sharedLink = await sharedLinkRepo.create({
        key: randomBytes(16),
        id: factory.uuid(),
        userId: user.id,
        albumId: album.id,
        allowUpload: false,
        type: SharedLinkType.Album,
      });

      await ctx.softDeleteAlbum(album.id);

      await expect(sut.get(auth, sharedLink.id)).rejects.toThrow('Shared link not found');
    });
  });
  describe('getAll', () => {
    it('should not return trashed assets as cover for an individual shared link', async () => {
      const { sut, ctx } = setup();
      const { user } = await ctx.newUser();
      const auth = factory.auth({ user });
      const { asset: trashedAsset } = await ctx.newAsset({
        ownerId: user.id,
        fileCreatedAt: '2020-01-01T00:00:00.000Z',
      });
      await ctx.softDeleteAsset(trashedAsset.id);
      const { asset: visibleAsset } = await ctx.newAsset({
        ownerId: user.id,
        fileCreatedAt: '2021-01-01T00:00:00.000Z',
      });

      const sharedLinkRepo = ctx.get(SharedLinkRepository);
      await sharedLinkRepo.create({
        key: randomBytes(16),
        id: factory.uuid(),
        userId: user.id,
        allowUpload: false,
        type: SharedLinkType.Individual,
        assetIds: [trashedAsset.id, visibleAsset.id],
      });

      const result = await sut.getAll(auth, {});
      expect(result).toHaveLength(1);
      expect(result[0].assets).toHaveLength(1);
      expect(result[0].assets[0].id).toBe(visibleAsset.id);
    });
    it('should not return an album shared link when the album is trashed', async () => {
      const { sut, ctx } = setup();
      const { user } = await ctx.newUser();
      const auth = factory.auth({ user });
      const { album } = await ctx.newAlbum({ ownerId: user.id });

      const sharedLinkRepo = ctx.get(SharedLinkRepository);
      await sharedLinkRepo.create({
        key: randomBytes(16),
        id: factory.uuid(),
        userId: user.id,
        albumId: album.id,
        allowUpload: false,
        type: SharedLinkType.Album,
      });

      await ctx.softDeleteAlbum(album.id);

      const result = await sut.getAll(auth, {});
      expect(result).toHaveLength(0);
    });
  });
  it('should remove individually shared asset', async () => {
    const { sut, ctx } = setup();
@@ -1,4 +1,4 @@
-import { DatabaseExtension, ImmichEnvironment, ImmichWorker, LogFormat, SocketIoAdapter } from 'src/enum';
+import { DatabaseExtension, ImmichEnvironment, ImmichWorker, LogFormat } from 'src/enum';
 import { ConfigRepository, EnvData } from 'src/repositories/config.repository';
 import { RepositoryInterface } from 'src/types';
 import { Mocked, vitest } from 'vitest';
@@ -99,10 +99,6 @@ const envData: EnvData = {
     },
   },
-  socketIo: {
-    adapter: SocketIoAdapter.Postgres,
-  },
   noColor: false,
 };
@@ -50,7 +50,10 @@
 <svelte:window bind:innerWidth />
-<nav id="dashboard-navbar" class="max-md:h-(--navbar-height-md) h-(--navbar-height) w-dvw text-sm">
+<nav
+  id="dashboard-navbar"
+  class="max-md:h-(--navbar-height-md) h-(--navbar-height) w-dvw text-sm bg-red-50 dark:bg-red-950"
+>
   <SkipLink text={$t('skip_to_content')} />
   <div
     class="grid h-full grid-cols-[--spacing(32)_auto] items-center py-2 sidebar:grid-cols-[--spacing(64)_auto] {noBorder
@@ -80,6 +83,7 @@
       <a data-sveltekit-preload-data="hover" href={Route.photos()}>
         <Logo variant={mediaQueryManager.isFullSidebar ? 'inline' : 'icon'} class="max-md:h-12" />
       </a>
+      <span class="text-xs font-bold text-red-500 ms-2">[VISUAL TEST]</span>
     </div>
     <div class="flex justify-between gap-4 lg:gap-8 pe-6">
       <div class="hidden w-full max-w-5xl flex-1 tall:ps-0 sm:block">