Merge branch 'main' of https://github.com/immich-app/immich into feat/offline-files-job
@@ -131,6 +131,10 @@ If you feel like this is the right cause and the app is something you are seeing

## Star History

<a href="https://star-history.com/#immich-app/immich">
<img src="https://api.star-history.com/svg?repos=immich-app/immich&type=Date" alt="Star History Chart" width="100%" />
<a href="https://star-history.com/#immich-app/immich&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=immich-app/immich&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=immich-app/immich&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=immich-app/immich&type=Date" width="100%" />
</picture>
</a>
@@ -10,8 +10,8 @@ Hello everyone, it is my pleasure to deliver the new release of Immich to you. T

Some notable features are:

- [OAuth integration](#livephoto-ios-support-)
- [LivePhoto support on iOS](#oauth-integration-)
- OAuth integration
- LivePhoto support on iOS
- User config system

<!--truncate-->
@@ -288,7 +288,11 @@ Immich components are typically deployed using docker. To see logs for deployed

### How can I run Immich as a non-root user?

You can change the user in the container by setting the `user` argument in `docker-compose.yml` for each service.
You may need to add an additional volume to `immich-microservices` that mounts internally to `/usr/src/app/.reverse-geocoding-dump`.
You may need to add mount points or docker volumes for the following internal container paths:

- `immich-machine-learning:/.config`
- `immich-machine-learning:/.cache`
- `redis:/data`

The non-root user/group needs read/write access to the volume mounts, including `UPLOAD_LOCATION`.
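As a rough sketch, the overrides might look like the following. This is an assumption-based example rather than the exact configuration: the service names follow the default `docker-compose.yml`, while the `1000:1000` UID/GID and the host paths are placeholders you should replace with your own.

```yaml title="docker-compose.yml (excerpt, illustrative)"
services:
  immich-server:
    user: "1000:1000"

  immich-microservices:
    user: "1000:1000"

  immich-machine-learning:
    user: "1000:1000"
    volumes:
      - /path/to/ml-config:/.config # writable config dir for the non-root user
      - /path/to/ml-cache:/.cache # writable cache dir for the non-root user

  redis:
    user: "1000:1000"
    volumes:
      - /path/to/redis-data:/data # Redis needs write access for persistence
```
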
BIN docs/docs/administration/img/repair-page-1.png (new file, 68 KiB)
BIN docs/docs/administration/img/repair-page.png (new file, 54 KiB)
BIN docs/docs/administration/img/server-stats.png (new file, 60 KiB)
@@ -1,9 +1,13 @@

# Jobs

Several Immich functionalities are implemented as jobs, which run in the background. To view the status of a job navigate to the Administration Screen, and then the `Jobs` page.
The `immich-server` responds to API requests for data and files for the web and mobile app. To do this quickly and reliably, it offloads most other work to `immich-microservices` in the form of _jobs_. Simply put, a job is a request to process data in the background. Jobs are picked up automatically by microservices containers.



When a new asset is uploaded, it kicks off a series of jobs, including metadata extraction, thumbnail generation, machine learning tasks, and storage template migration, if enabled. To view the status of a job, navigate to the Administration -> Jobs page.

Additionally, some jobs run on a schedule, which is every night at midnight. This schedule, with the exception of [External Libraries](/docs/features/libraries) scanning, cannot be changed.

:::info
The Storage Migration job can be run after changing the [Storage Template](/docs/administration/storage-template.mdx), in order to apply the change to the existing library.
:::

<img src={require('./img/admin-jobs.png').default} width="80%" title="Admin jobs" />
@@ -1,32 +0,0 @@

# Password Login

An overview of password login and related settings for Immich.

## Enable/Disable

Immich supports password login, which is enabled by default. The preferred way to disable it is via the [Administration Page](#administration-page), although it can also be changed via a [Server Command](#server-command).

### Administration Page

To toggle the password login setting via the web, navigate to "Administration", expand "Password Authentication", toggle the "Enabled" switch, and press "Save".

![Password Login Settings](./img/password-login-settings.png)

### Server Command

There are two [Server Commands](/docs/administration/server-commands.md) for password login:

1. `enable-password-login`
2. `disable-password-login`

See [Server Commands](/docs/administration/server-commands.md) for more details about how to run them.

## Password Reset

### Admin

To reset the administrator password, use the `reset-admin-password` [Server Command](/docs/administration/server-commands.md).

### User

Immich does not currently support self-service password reset. However, the administrator can reset passwords for other users. See [User Management: Password Reset](/docs/administration/user-management.mdx#password-reset) for more information about how to do this.
docs/docs/administration/repair-page.md (new file, 31 lines)
@@ -0,0 +1,31 @@

# Repair Page

The repair page gives the system administrator information about files that are not tracked by the application, as well as offline paths.

## Natural State

In this state, everything is in its place and there is nothing the system administrator needs to act on.

<img src={require('./img/repair-page.png').default} title="repair page" />

## Any Other Situation

:::note RAM Usage
Several users have reported that this page fails to load. If that happens, try allocating more RAM to Immich; if the problem persists, bypass the reverse proxy while loading the page.
:::

In any other situation, three kinds of entries can appear:

- MATCHES - These files are matched by their checksums.

- OFFLINE PATHS - These files are the result of files being deleted manually from the upload library, or of a failed file move in the past (the application lost track of the file).

:::tip
To get rid of offline paths, you can follow this [guide](/docs/guides/remove-offline-files.md).
:::

- UNTRACKED FILES - These files are not tracked by the application. They can be the result of failed moves or interrupted uploads, or they may have been left behind due to a bug.

In addition, you can download the information from the page, select everything in order to re-check the hashes, and repair the problem when a checksum match is found.

<img src={require('./img/repair-page-1.png').default} title="repair page" />
docs/docs/administration/server-stats.md (new file, 13 lines)
@@ -0,0 +1,13 @@

# Server Stats

Server statistics show the total number of videos and photos, and the storage usage per user.

:::info
If a storage quota has been defined for a user, their usage is displayed as a percentage of the total storage quota allocated to them.
:::

:::info External library
External library assets are not counted against the storage quota.
:::

<img src={require('./img/server-stats.png').default} title="server statistics" />
docs/docs/administration/system-settings.md (new file, 173 lines)
@@ -0,0 +1,173 @@

# System Settings

On the system settings page, the administrator can manage global settings for the Immich instance.

:::note
Viewing and modifying the system settings is restricted to the administrator.
:::

:::tip
You can always return to the default settings by clicking the `Reset to default` button.
:::

## Job Settings

Using these settings, you can determine how much work runs concurrently for each job type in the microservices container. Some jobs can be set to higher values on computers with powerful hardware and storage with good I/O capabilities.

With higher concurrency, the host works on more assets in parallel.
This improves throughput, not latency: for example, it makes Smart Search jobs finish sooner, but it won't make searching itself faster.

Keep in mind that jobs like Smart Search, Face Detection, Facial Recognition, and Transcode Videos require a **lot** of processing power, so do not raise their concurrency too far or you will likely overload the server.

:::info Facial Recognition Concurrency
The Facial Recognition Concurrency value cannot be changed because
[DBSCAN](https://www.youtube.com/watch?v=RDZUdRSDOok) is traditionally a sequential algorithm; parallel implementations exist, but Immich's implementation is not parallel.
:::

## External Library

### Library watching (EXPERIMENTAL)

External libraries can automatically import changed files without a full rescan: a file is imported whenever the operating system reports a change to it. This does not work if your photos are mounted over the network.

### Periodic Scanning

You can define a custom interval for triggering an external library rescan under Administration -> Settings -> Library.
You can set the scanning interval using one of the presets or a cron expression. For more information, see, for example, [Crontab Guru](https://crontab.guru/).
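For illustration, the presets cover common intervals, while a custom interval can be written as a standard five-field cron expression (minute, hour, day of month, month, day of week):

```
# rescan every day at 03:00
0 3 * * *

# rescan every Monday at 02:00
0 2 * * 1
```
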
## Logging

By default, logs are recorded at the "log" level. The administrator can choose a more or less verbose level, either at their own discretion or as requested by the Immich support team.

Here you can [learn about the different log levels](https://sematext.com/blog/logging-levels/).

## Machine Learning Settings

Through these settings, you can manage everything related to machine learning in Immich, from remote machine learning to the models and their parameters.
You can choose to disable a certain type of machine learning, for example smart search or facial recognition.

### Smart Search

The smart search settings control the [CLIP](https://openai.com/research/clip) models used by the search tool, which [can be changed](/docs/FAQ#can-i-use-a-custom-clip-model). Different models can give better results but may consume more processing power. When changing a model, you must re-run the
Smart Search job on all images to fully apply the change.

:::info Internet connection
Changing models requires an Internet connection to download the model.
After the download, Immich does not need to connect to the network again,
unless version checking has been enabled in the settings.
:::

### Facial Recognition

Under these settings, you can adjust how facial recognition behaves.
Editable settings:

- **Facial Recognition Model -** Models are listed in descending order of size. Larger models are slower and use more memory, but produce better results. Note that you must re-run the Face Detection job for all images upon changing a model.
- **Min Detection Score -** Minimum confidence score for a face to be detected from 0-1. Lower values will detect more faces but may result in false positives.
- **Max Recognition Distance -** Maximum distance between two faces to be considered the same person, ranging from 0-2. Lowering this can prevent labeling two people as the same person, while raising it can prevent labeling the same person as two different people. Note that it is easier to merge two people than to split one person in two, so err on the side of a lower threshold when possible.
- **Min Recognized Faces -** The minimum number of recognized faces for a person to be created (AKA: core face). Increasing this makes Facial Recognition more precise at the cost of increasing the chance that a face is not assigned to a person.

:::info
After changing Min Detection Score, Max Recognition Distance, or Min Recognized Faces,
you only have to re-run the FACIAL RECOGNITION - ALL job.

If you replace the Facial Recognition Model, you will have to re-run the FACE DETECTION - ALL job.
:::

:::tip identical twins
If you have twins, you might want to lower the Max Recognition Distance value; decreasing it a **bit** can help distinguish between them.
:::
## Map & GPS Settings

### Map Settings

In these settings, you can change the appearance of the map in night and day modes according to your personal preference and the supported options.
The map style can be adjusted via [OpenMapTiles](https://openmaptiles.org/styles/), for example.

### Reverse Geocoding Settings

Immich supports [Reverse Geocoding](/docs/features/reverse-geocoding) using data from the [GeoNames](https://www.geonames.org/) geographical database.

## OAuth Authentication

Immich supports OAuth Authentication. Read more about this feature and its configuration [here](/docs/administration/oauth).

## Password Authentication

The administrator can choose to disable login with username and password for the entire instance. This means that **no one**, including the system administrator, will be able to log in using this method. If [OAuth Authentication](/docs/administration/oauth) is also disabled, no users will be able to log in using **any** method. Changing this setting does not affect existing sessions, just new login attempts.

:::tip
You can always use the [Server CLI](/docs/administration/server-commands) to re-enable password login.
:::

## Server Settings

### External Domain

When set, this domain overrides the domain name used when viewing and copying a shared link.

### Welcome Message

The administrator can set a custom message on the login screen (the message will be displayed to all users).

## Storage Template

Immich supports a custom [Storage Template](/docs/administration/storage-template). Learn more about this feature and its configuration [here](/docs/administration/storage-template).

## Theme Settings

You can write custom CSS that will get loaded in the web application for all users. This enables administrators to change fonts, colors, and other styles.

For example:

```css title='Custom CSS'
p {
  color: green;
}
```

## Thumbnail Settings

By default, Immich creates three thumbnails for each asset:
Blurred (thumbhash), Small (webp), and Large (jpeg). Using these settings, you can change the quality of the thumbnail files that are created.

**Small thumbnail resolution**
Used when viewing groups of photos (main timeline, album view, etc.). Higher resolutions can preserve more detail but take longer to encode, have larger file sizes, and can reduce app responsiveness.

**Large thumbnail resolution**
Used when viewing a single photo and for machine learning. Higher resolutions can preserve more detail but take longer to encode, have larger file sizes, and can reduce app responsiveness.

**Quality**
Thumbnail quality from 1-100. Higher is better for quality but produces larger files.

**Prefer wide gamut**
Use Display P3 for thumbnails. This better preserves the vibrance of images with wide color spaces, but images may appear differently on old devices with an old browser version. sRGB images are kept as sRGB to avoid color shifts.

:::tip
The default resolution for Large thumbnails can be lowered from 1440p (default) to 1080p or 720p to save storage space.
:::

## Trash Settings

The system administrator can enable a trash for deleted files; files remain in the trash until the deletion date, which is 30 days by default or as defined by the system administrator.

The trash can be disabled; however, this is not recommended, as any files deleted from then on will be permanently deleted right away.

:::tip Keyboard shortcut for permanent deletion
You can select assets and press Ctrl + Del in the timeline to permanently delete them right away, bypassing the trash.
:::

## User Settings

### Delete delay

The system administrator can delete users through the administration panel, either immediately or after a delay (7 days by default). This action permanently deletes a user's account and assets. The user deletion job runs at midnight to check for users that are ready for deletion. Changes to this setting are evaluated at the next execution.

## Version Check

When this option is enabled, the `immich-server` will periodically make requests to GitHub to check for new releases.

## Video Transcoding Settings

The system administrator can define the parameters according to which video files will be converted to different formats (depending on the settings). The settings can be tuned in depth; to learn more about the terminology used here, refer to the FFmpeg documentation for the [H.264](https://trac.ffmpeg.org/wiki/Encode/H.264) codec, the [HEVC](https://trac.ffmpeg.org/wiki/Encode/H.265) codec, and the [VP9](https://trac.ffmpeg.org/wiki/Encode/VP9) codec.
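For orientation only, here is a plain FFmpeg command that uses the same notions (codec, preset, CRF) these settings expose. It illustrates the FFmpeg terminology linked above and is not the exact command Immich runs internally:

```bash
# Transcode a video to H.264 at CRF 23 with the "slow" preset and AAC audio
ffmpeg -i input.mov -c:v libx264 -preset slow -crf 23 -c:a aac -b:a 192k output.mp4
```
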
@@ -1,6 +1,6 @@

# The Immich CLI

Immich has a CLI that allows you to perform certain actions from the command line. This CLI replaces the [legacy CLI](https://github.com/immich-app/CLI) that was previously available. The CLI is hosted in the [cli folder of the the main Immich github repository](https://github.com/immich-app/immich/tree/main/cli).
Immich has a command line interface (CLI) that allows you to perform certain actions from the command line.

## Features
@@ -54,16 +54,19 @@ Usage: immich [options] [command]

Command line interface for Immich

Options:
  -V, --version                       output the version number
  -d, --config                        Configuration directory (env: IMMICH_CONFIG_DIR)
  -h, --help                          display help for command
  -V, --version                       output the version number
  -d, --config-directory <directory>  Configuration directory where auth.yml will be stored (default: "~/.config/immich/", env:
                                      IMMICH_CONFIG_DIR)
  -u, --url [url]                     Immich server URL (env: IMMICH_INSTANCE_URL)
  -k, --key [key]                     Immich API key (env: IMMICH_API_KEY)
  -h, --help                          display help for command

Commands:
  upload [options] [paths...]  Upload assets
  server-info                  Display server information
  login [url] [key]            Login using an API key
  logout                       Remove stored credentials
  help [command]               display help for command
  login|login-key <url> <key>  Login using an API key
  logout                       Remove stored credentials
  server-info                  Display server information
  upload [options] [paths...]  Upload assets
  help [command]               display help for command
```

## Commands
@@ -71,23 +74,24 @@ Commands:

The upload command supports the following options:

```
Usage: immich upload [options] [paths...]
Usage: immich upload [paths...] [options]

Upload assets

Arguments:
  paths                       One or more paths to assets to be uploaded
  paths                       One or more paths to assets to be uploaded

Options:
  -r, --recursive             Recursive (default: false, env: IMMICH_RECURSIVE)
  -i, --ignore [paths...]     Paths to ignore (env: IMMICH_IGNORE_PATHS)
  -h, --skip-hash             Don't hash files before upload (default: false, env: IMMICH_SKIP_HASH)
  -H, --include-hidden        Include hidden folders (default: false, env: IMMICH_INCLUDE_HIDDEN)
  -a, --album                 Automatically create albums based on folder name (default: false, env: IMMICH_AUTO_CREATE_ALBUM)
  -A, --album-name <name>     Add all assets to specified album (env: IMMICH_ALBUM_NAME)
  -n, --dry-run               Don't perform any actions, just show what will be done (default: false, env: IMMICH_DRY_RUN)
  --delete                    Delete local assets after upload (env: IMMICH_DELETE_ASSETS)
  --help                      display help for command
  -r, --recursive             Recursive (default: false, env: IMMICH_RECURSIVE)
  -i, --ignore [paths...]     Paths to ignore (default: [], env: IMMICH_IGNORE_PATHS)
  -h, --skip-hash             Don't hash files before upload (default: false, env: IMMICH_SKIP_HASH)
  -H, --include-hidden        Include hidden folders (default: false, env: IMMICH_INCLUDE_HIDDEN)
  -a, --album                 Automatically create albums based on folder name (default: false, env: IMMICH_AUTO_CREATE_ALBUM)
  -A, --album-name <name>     Add all assets to specified album (env: IMMICH_ALBUM_NAME)
  -n, --dry-run               Don't perform any actions, just show what will be done (default: false, env: IMMICH_DRY_RUN)
  -c, --concurrency <number>  Number of assets to upload at the same time (default: 4, env: IMMICH_UPLOAD_CONCURRENCY)
  --delete                    Delete local assets after upload (env: IMMICH_DELETE_ASSETS)
  --help                      display help for command
```

Note that the above options can read from environment variables as well.
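As a quick sketch of a typical session using the commands and flags listed above (the server URL and API key are placeholders):

```bash
# Log in once; the credentials are stored in the configuration directory
immich login https://your-immich-instance/api YOUR_API_KEY

# Preview what would be uploaded, then upload recursively into an album
immich upload --dry-run --recursive ~/Pictures
immich upload --recursive --album-name "Camera Roll" ~/Pictures
```
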
@@ -6,7 +6,7 @@ Immich can ingest XMP sidecars on file upload (via the CLI) as well as detect ne

XMP sidecars are external XML files that contain metadata related to media files. Many applications read and write these files either exclusively or in addition to the metadata written to image files. They can be a powerful tool for editing and storing metadata of a media file without modifying the media file itself. When Immich receives or detects an XMP sidecar for a media file, it will attempt to extract the metadata from both the sidecar as well as the media file. It will prioritize the metadata for fields in the sidecar but will fall back and use the metadata in the media file if necessary.

When importing files via the CLI bulk uploader, Immich will automatically detect XMP sidecar files as files that exist next to the original media file and have the exact same name with an additional `.xmp` file extension (i.e., `PXL_20230401_203352928.MP.jpg` and `PXL_20230401_203352928.MP.jpg.xmp`).
When importing files via the CLI bulk uploader or parsing photo metadata for external libraries, Immich will automatically detect XMP sidecar files as files that exist next to the original media file. Immich will look for files that have the same name as the photo, but with the `.xmp` file extension. The name can either include or omit the photo's file extension. For example, for a photo named `PXL_20230401_203352928.MP.jpg`, Immich will look for an XMP file named either `PXL_20230401_203352928.MP.jpg.xmp` or `PXL_20230401_203352928.MP.xmp`. If both `PXL_20230401_203352928.MP.jpg.xmp` and `PXL_20230401_203352928.MP.xmp` are present, Immich will prefer `PXL_20230401_203352928.MP.jpg.xmp`.

There are 2 administrator jobs associated with sidecar files: `SYNC` and `DISCOVER`. The sync job will re-scan all media with existing sidecar files and queue them for a metadata refresh. This is a great use case when third-party applications are used to modify the metadata of media. The discover job will attempt to scan the filesystem for new sidecar files for all media that does not currently have a sidecar file associated with it.
@@ -21,7 +21,7 @@ cd ./immich-app

Download [`docker-compose.yml`][compose-file] and [`example.env`][env-file], either by running the following commands:

```bash title="Get docker-compose.yml file"
wget https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml
wget -O docker-compose.yml https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml
```

```bash title="Get .env file"
@@ -29,11 +29,11 @@ wget -O .env https://github.com/immich-app/immich/releases/latest/download/examp
```

```bash title="(Optional) Get hwaccel.transcoding.yml file"
wget https://github.com/immich-app/immich/releases/latest/download/hwaccel.transcoding.yml
wget -O hwaccel.transcoding.yml https://github.com/immich-app/immich/releases/latest/download/hwaccel.transcoding.yml
```

```bash title="(Optional) Get hwaccel.ml.yml file"
wget https://github.com/immich-app/immich/releases/latest/download/hwaccel.ml.yml
wget -O hwaccel.ml.yml https://github.com/immich-app/immich/releases/latest/download/hwaccel.ml.yml
```

or by downloading from your browser and moving the files to the directory that you created.
@@ -1,7 +1,7 @@

Immich allows the admin user to set the uploaded filename pattern. Both at the directory and filename level.

:::note new version
On new machines running version 1.92.0 storage template engine is off by default, for [more info](https://github.com/immich-app/immich/releases#:~:text=the%20partner%E2%80%99s%20assets.-,Hardening%20storage%20template,-We%20have%20further).
On new machines running version 1.92.0 storage template engine is off by default, for [more info](https://github.com/immich-app/immich/releases/tag/v1.92.0#:~:text=the%20partner%E2%80%99s%20assets.-,Hardening%20storage%20template,-We%20have%20further).
:::

:::tip
docs/static/_redirects (vendored, 1 changed line)
@@ -24,3 +24,4 @@

/docs/features/user-management /docs/administration/user-management 301
/docs/developer/contributing /docs/developer/pr-checklist 301
/docs/guides/machine-learning /docs/guides/remote-machine-learning 301
/docs/administration/password-login /docs/administration/system-settings 301
@@ -53,7 +53,7 @@ COPY --from=web /usr/src/app/build ./www

COPY server/resources resources
COPY server/package.json server/package-lock.json ./
COPY server/start*.sh ./
RUN npm link && npm cache clean --force
RUN npm link && npm install -g @immich/cli && npm cache clean --force
COPY LICENSE /licenses/LICENSE.txt
COPY LICENSE /LICENSE
ENV PATH="${PATH}:/usr/src/app/bin"

@@ -1,3 +0,0 @@

#!/usr/bin/env bash

node /usr/src/app/node_modules/.bin/immich "$@"
33
server/package-lock.json
generated
@ -10,7 +10,6 @@
|
||||
"license": "GNU Affero General Public License version 3",
|
||||
"dependencies": {
|
||||
"@babel/runtime": "^7.22.11",
|
||||
"@immich/cli": "^2.0.7",
|
||||
"@nestjs/bullmq": "^10.0.1",
|
||||
"@nestjs/common": "^10.2.2",
|
||||
"@nestjs/config": "^3.0.0",
|
||||
@ -1717,20 +1716,6 @@
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@immich/cli": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/@immich/cli/-/cli-2.1.0.tgz",
|
||||
"integrity": "sha512-3s/8+Js1dAwibzgaRtZ+bsAL9nOtvoEX/qMlOTgbgLf/lT96M88QScqhb+YrU2l3WBugtts6xW76XQTrWGXcmw==",
|
||||
"dependencies": {
|
||||
"lodash-es": "^4.17.21"
|
||||
},
|
||||
"bin": {
|
||||
"immich": "dist/index.js"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=20.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@ioredis/commands": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@ioredis/commands/-/commands-1.2.0.tgz",
|
||||
@ -10151,11 +10136,6 @@
|
||||
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
|
||||
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
|
||||
},
|
||||
"node_modules/lodash-es": {
|
||||
"version": "4.17.21",
|
||||
"resolved": "https://registry.npmjs.org/lodash-es/-/lodash-es-4.17.21.tgz",
|
||||
"integrity": "sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw=="
|
||||
},
|
||||
"node_modules/lodash.camelcase": {
|
||||
"version": "4.3.0",
|
||||
"resolved": "https://registry.npmjs.org/lodash.camelcase/-/lodash.camelcase-4.3.0.tgz",
|
||||
@ -15590,14 +15570,6 @@
|
||||
"integrity": "sha512-E4magOks77DK47FwHUIGH0RYWSgRBfGdK56kIHSVeB9uIS4pPFr4N2kIVsXdQQo4LzOsENKV5KAhRlRL7eMAdg==",
|
||||
"optional": true
|
||||
},
|
||||
"@immich/cli": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/@immich/cli/-/cli-2.1.0.tgz",
|
||||
"integrity": "sha512-3s/8+Js1dAwibzgaRtZ+bsAL9nOtvoEX/qMlOTgbgLf/lT96M88QScqhb+YrU2l3WBugtts6xW76XQTrWGXcmw==",
|
||||
"requires": {
|
||||
"lodash-es": "^4.17.21"
|
||||
}
|
||||
},
|
||||
"@ioredis/commands": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@ioredis/commands/-/commands-1.2.0.tgz",
|
||||
@ -21863,11 +21835,6 @@
|
||||
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
|
||||
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
|
||||
},
|
||||
"lodash-es": {
|
||||
"version": "4.17.21",
|
||||
"resolved": "https://registry.npmjs.org/lodash-es/-/lodash-es-4.17.21.tgz",
|
||||
"integrity": "sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw=="
|
||||
},
|
||||
"lodash.camelcase": {
|
||||
"version": "4.3.0",
|
||||
"resolved": "https://registry.npmjs.org/lodash.camelcase/-/lodash.camelcase-4.3.0.tgz",
|
||||
|
@ -34,7 +34,6 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"@babel/runtime": "^7.22.11",
|
||||
"@immich/cli": "^2.0.7",
|
||||
"@nestjs/bullmq": "^10.0.1",
|
||||
"@nestjs/common": "^10.2.2",
|
||||
"@nestjs/config": "^3.0.0",
|
||||
|
@ -43,14 +43,15 @@ import { IAssetStackRepository } from 'src/interfaces/asset-stack.interface';
|
||||
import { IAssetRepositoryV1 } from 'src/interfaces/asset-v1.interface';
|
||||
import { IAssetRepository } from 'src/interfaces/asset.interface';
|
||||
import { IAuditRepository } from 'src/interfaces/audit.interface';
|
||||
import { ICommunicationRepository } from 'src/interfaces/communication.interface';
|
||||
import { ICryptoRepository } from 'src/interfaces/crypto.interface';
|
||||
import { IDatabaseRepository } from 'src/interfaces/database.interface';
|
||||
import { IEventRepository } from 'src/interfaces/event.interface';
|
||||
import { IJobRepository } from 'src/interfaces/job.interface';
|
||||
import { ILibraryRepository } from 'src/interfaces/library.interface';
|
||||
import { IMachineLearningRepository } from 'src/interfaces/machine-learning.interface';
|
||||
import { IMediaRepository } from 'src/interfaces/media.interface';
|
||||
import { IMetadataRepository } from 'src/interfaces/metadata.interface';
|
||||
import { IMetricRepository } from 'src/interfaces/metric.interface';
|
||||
import { IMoveRepository } from 'src/interfaces/move.interface';
|
||||
import { IPartnerRepository } from 'src/interfaces/partner.interface';
|
||||
import { IPersonRepository } from 'src/interfaces/person.interface';
|
||||
@ -74,21 +75,22 @@ import { AssetStackRepository } from 'src/repositories/asset-stack.repository';
|
||||
import { AssetRepositoryV1 } from 'src/repositories/asset-v1.repository';
|
||||
import { AssetRepository } from 'src/repositories/asset.repository';
|
||||
import { AuditRepository } from 'src/repositories/audit.repository';
|
||||
import { CommunicationRepository } from 'src/repositories/communication.repository';
|
||||
import { CryptoRepository } from 'src/repositories/crypto.repository';
|
||||
import { DatabaseRepository } from 'src/repositories/database.repository';
|
||||
import { FilesystemProvider } from 'src/repositories/filesystem.provider';
|
||||
import { EventRepository } from 'src/repositories/event.repository';
|
||||
import { JobRepository } from 'src/repositories/job.repository';
|
||||
import { LibraryRepository } from 'src/repositories/library.repository';
|
||||
import { MachineLearningRepository } from 'src/repositories/machine-learning.repository';
|
||||
import { MediaRepository } from 'src/repositories/media.repository';
|
||||
import { MetadataRepository } from 'src/repositories/metadata.repository';
|
||||
import { MetricRepository } from 'src/repositories/metric.repository';
|
||||
import { MoveRepository } from 'src/repositories/move.repository';
|
||||
import { PartnerRepository } from 'src/repositories/partner.repository';
|
||||
import { PersonRepository } from 'src/repositories/person.repository';
|
||||
import { SearchRepository } from 'src/repositories/search.repository';
|
||||
import { ServerInfoRepository } from 'src/repositories/server-info.repository';
|
||||
import { SharedLinkRepository } from 'src/repositories/shared-link.repository';
|
||||
import { StorageRepository } from 'src/repositories/storage.repository';
|
||||
import { SystemConfigRepository } from 'src/repositories/system-config.repository';
|
||||
import { SystemMetadataRepository } from 'src/repositories/system-metadata.repository';
|
||||
import { TagRepository } from 'src/repositories/tag.repository';
|
||||
@ -163,7 +165,6 @@ const controllers = [
|
||||
const services: Provider[] = [
|
||||
ApiService,
|
||||
MicroservicesService,
|
||||
|
||||
APIKeyService,
|
||||
ActivityService,
|
||||
AlbumService,
|
||||
@ -200,21 +201,22 @@ const repositories: Provider[] = [
|
||||
{ provide: IAssetRepositoryV1, useClass: AssetRepositoryV1 },
|
||||
{ provide: IAssetStackRepository, useClass: AssetStackRepository },
|
||||
{ provide: IAuditRepository, useClass: AuditRepository },
|
||||
{ provide: ICommunicationRepository, useClass: CommunicationRepository },
|
||||
{ provide: ICryptoRepository, useClass: CryptoRepository },
|
||||
{ provide: IDatabaseRepository, useClass: DatabaseRepository },
|
||||
{ provide: IEventRepository, useClass: EventRepository },
|
||||
{ provide: IJobRepository, useClass: JobRepository },
|
||||
{ provide: ILibraryRepository, useClass: LibraryRepository },
|
||||
{ provide: IKeyRepository, useClass: ApiKeyRepository },
|
||||
{ provide: IMachineLearningRepository, useClass: MachineLearningRepository },
|
||||
{ provide: IMetadataRepository, useClass: MetadataRepository },
|
||||
{ provide: IMetricRepository, useClass: MetricRepository },
|
||||
{ provide: IMoveRepository, useClass: MoveRepository },
|
||||
{ provide: IPartnerRepository, useClass: PartnerRepository },
|
||||
{ provide: IPersonRepository, useClass: PersonRepository },
|
||||
{ provide: IServerInfoRepository, useClass: ServerInfoRepository },
|
||||
{ provide: ISharedLinkRepository, useClass: SharedLinkRepository },
|
||||
{ provide: ISearchRepository, useClass: SearchRepository },
|
||||
{ provide: IStorageRepository, useClass: FilesystemProvider },
|
||||
{ provide: IStorageRepository, useClass: StorageRepository },
|
||||
{ provide: ISystemConfigRepository, useClass: SystemConfigRepository },
|
||||
{ provide: ISystemMetadataRepository, useClass: SystemMetadataRepository },
|
||||
{ provide: ITagRepository, useClass: TagRepository },
|
||||
@ -277,6 +279,7 @@ export class ImmichAdminModule {}
|
||||
EventEmitterModule.forRoot(),
|
||||
TypeOrmModule.forRoot(databaseConfig),
|
||||
TypeOrmModule.forFeature(databaseEntities),
|
||||
OpenTelemetryModule.forRoot(otelConfig),
|
||||
],
|
||||
controllers: [...controllers],
|
||||
providers: [...services, ...repositories, ...middleware, SchedulerRegistry],
|
||||
|
@ -3,6 +3,8 @@ import { readFileSync } from 'node:fs';
|
||||
import { join } from 'node:path';
|
||||
import { Version } from 'src/utils/version';
|
||||
|
||||
export const SALT_ROUNDS = 10;
|
||||
|
||||
const { version } = JSON.parse(readFileSync('./package.json', 'utf8'));
|
||||
export const serverVersion = Version.fromString(version);
|
||||
|
||||
|
@ -1,5 +1,6 @@
|
||||
import { BadRequestException, ForbiddenException } from '@nestjs/common';
|
||||
import sanitize from 'sanitize-filename';
|
||||
import { SALT_ROUNDS } from 'src/constants';
|
||||
import { UserResponseDto } from 'src/dtos/user.dto';
|
||||
import { LibraryType } from 'src/entities/library.entity';
|
||||
import { UserEntity } from 'src/entities/user.entity';
|
||||
@ -7,8 +8,6 @@ import { ICryptoRepository } from 'src/interfaces/crypto.interface';
|
||||
import { ILibraryRepository } from 'src/interfaces/library.interface';
|
||||
import { IUserRepository } from 'src/interfaces/user.interface';
|
||||
|
||||
const SALT_ROUNDS = 10;
|
||||
|
||||
let instance: UserCore | null;
|
||||
|
||||
export class UserCore {
|
||||
|
@ -1,7 +1,8 @@
|
||||
import { SetMetadata } from '@nestjs/common';
|
||||
import { OnEvent, OnEventType } from '@nestjs/event-emitter';
|
||||
import { OnEvent } from '@nestjs/event-emitter';
|
||||
import { OnEventOptions } from '@nestjs/event-emitter/dist/interfaces';
|
||||
import _ from 'lodash';
|
||||
import { ServerAsyncEvent, ServerEvent } from 'src/interfaces/event.interface';
|
||||
import { setUnion } from 'src/utils/set';
|
||||
|
||||
// PostgreSQL uses a 16-bit integer to indicate the number of bound parameters. This means that the
|
||||
@ -125,5 +126,5 @@ export interface GenerateSqlQueries {
|
||||
/** Decorator to enable versioning/tracking of generated Sql */
|
||||
export const GenerateSql = (...options: GenerateSqlQueries[]) => SetMetadata(GENERATE_SQL_KEY, options);
|
||||
|
||||
export const OnEventInternal = (event: OnEventType, options?: OnEventOptions) =>
|
||||
export const OnServerEvent = (event: ServerEvent | ServerAsyncEvent, options?: OnEventOptions) =>
|
||||
OnEvent(event, { suppressErrors: false, ...options });
|
||||
|
@ -140,7 +140,6 @@ export const IAssetRepository = 'IAssetRepository';
|
||||
|
||||
export interface IAssetRepository {
|
||||
create(asset: AssetCreate): Promise<AssetEntity>;
|
||||
getByDate(ownerId: string, date: Date): Promise<AssetEntity[]>;
|
||||
getByIds(
|
||||
ids: string[],
|
||||
relations?: FindOptionsRelations<AssetEntity>,
|
||||
|
@ -8,4 +8,5 @@ export interface ICryptoRepository {
|
||||
hashSha1(data: string | Buffer): Buffer;
|
||||
hashBcrypt(data: string | Buffer, saltOrRounds: string | number): Promise<string>;
|
||||
compareBcrypt(data: string | Buffer, encrypted: string): boolean;
|
||||
newPassword(bytes: number): string;
|
||||
}
|
||||
|
@ -2,7 +2,7 @@ import { AssetResponseDto } from 'src/dtos/asset-response.dto';
|
||||
import { ReleaseNotification, ServerVersionResponseDto } from 'src/dtos/server-info.dto';
|
||||
import { SystemConfig } from 'src/entities/system-config.entity';
|
||||
|
||||
export const ICommunicationRepository = 'ICommunicationRepository';
|
||||
export const IEventRepository = 'IEventRepository';
|
||||
|
||||
export enum ClientEvent {
|
||||
UPLOAD_SUCCESS = 'on_upload_success',
|
||||
@ -19,18 +19,6 @@ export enum ClientEvent {
|
||||
NEW_RELEASE = 'on_new_release',
|
||||
}
|
||||
|
||||
export enum ServerEvent {
|
||||
CONFIG_UPDATE = 'config:update',
|
||||
}
|
||||
|
||||
export enum InternalEvent {
|
||||
VALIDATE_CONFIG = 'validate_config',
|
||||
}
|
||||
|
||||
export interface InternalEventMap {
|
||||
[InternalEvent.VALIDATE_CONFIG]: { newConfig: SystemConfig; oldConfig: SystemConfig };
|
||||
}
|
||||
|
||||
export interface ClientEventMap {
|
||||
[ClientEvent.UPLOAD_SUCCESS]: AssetResponseDto;
|
||||
[ClientEvent.USER_DELETE]: string;
|
||||
@ -46,15 +34,39 @@ export interface ClientEventMap {
|
||||
[ClientEvent.NEW_RELEASE]: ReleaseNotification;
|
||||
}
|
||||
|
||||
export type OnConnectCallback = (userId: string) => void | Promise<void>;
|
||||
export type OnServerEventCallback = () => Promise<void>;
|
||||
|
||||
export interface ICommunicationRepository {
|
||||
send<E extends keyof ClientEventMap>(event: E, userId: string, data: ClientEventMap[E]): void;
|
||||
broadcast<E extends keyof ClientEventMap>(event: E, data: ClientEventMap[E]): void;
|
||||
on(event: 'connect', callback: OnConnectCallback): void;
|
||||
on(event: ServerEvent, callback: OnServerEventCallback): void;
|
||||
sendServerEvent(event: ServerEvent): void;
|
||||
emit<E extends keyof InternalEventMap>(event: E, data: InternalEventMap[E]): boolean;
|
||||
emitAsync<E extends keyof InternalEventMap>(event: E, data: InternalEventMap[E]): Promise<any>;
|
||||
export enum ServerEvent {
|
||||
CONFIG_UPDATE = 'config.update',
|
||||
WEBSOCKET_CONNECT = 'websocket.connect',
|
||||
}
|
||||
|
||||
export interface ServerEventMap {
|
||||
[ServerEvent.CONFIG_UPDATE]: null;
|
||||
[ServerEvent.WEBSOCKET_CONNECT]: { userId: string };
|
||||
}
|
||||
|
||||
export enum ServerAsyncEvent {
|
||||
CONFIG_VALIDATE = 'config.validate',
|
||||
}
|
||||
|
||||
export interface ServerAsyncEventMap {
|
||||
[ServerAsyncEvent.CONFIG_VALIDATE]: { newConfig: SystemConfig; oldConfig: SystemConfig };
|
||||
}
|
||||
|
||||
export interface IEventRepository {
|
||||
/**
|
||||
* Send to connected clients for a specific user
|
||||
*/
|
||||
clientSend<E extends keyof ClientEventMap>(event: E, userId: string, data: ClientEventMap[E]): void;
|
||||
/**
|
||||
* Send to all connected clients
|
||||
*/
|
||||
clientBroadcast<E extends keyof ClientEventMap>(event: E, data: ClientEventMap[E]): void;
|
||||
/**
|
||||
* Notify listeners in this and connected processes. Subscribe to an event with `@OnServerEvent`
|
||||
*/
|
||||
serverSend<E extends keyof ServerEventMap>(event: E, data: ServerEventMap[E]): boolean;
|
||||
/**
|
||||
* Notify and wait for responses from listeners in this process. Subscribe to an event with `@OnServerEvent`
|
||||
*/
|
||||
serverSendAsync<E extends keyof ServerAsyncEventMap>(event: E, data: ServerAsyncEventMap[E]): Promise<any>;
|
||||
}
|
13
server/src/interfaces/metric.interface.ts
Normal file
@ -0,0 +1,13 @@
|
||||
import { MetricOptions } from '@opentelemetry/api';
|
||||
|
||||
export interface CustomMetricOptions extends MetricOptions {
|
||||
enabled?: boolean;
|
||||
}
|
||||
|
||||
export const IMetricRepository = 'IMetricRepository';
|
||||
|
||||
export interface IMetricRepository {
|
||||
addToCounter(name: string, value: number, options?: CustomMetricOptions): void;
|
||||
updateGauge(name: string, value: number, options?: CustomMetricOptions): void;
|
||||
updateHistogram(name: string, value: number, options?: CustomMetricOptions): void;
|
||||
}
|
@ -1,7 +1,6 @@
|
||||
import { AssetFaceEntity } from 'src/entities/asset-face.entity';
|
||||
import { AssetEntity, AssetType } from 'src/entities/asset.entity';
|
||||
import { GeodataPlacesEntity } from 'src/entities/geodata-places.entity';
|
||||
import { SmartInfoEntity } from 'src/entities/smart-info.entity';
|
||||
import { Paginated } from 'src/utils/pagination';
|
||||
|
||||
export const ISearchRepository = 'ISearchRepository';
|
||||
@ -188,7 +187,7 @@ export interface ISearchRepository {
|
||||
searchMetadata(pagination: SearchPaginationOptions, options: AssetSearchOptions): Paginated<AssetEntity>;
|
||||
searchSmart(pagination: SearchPaginationOptions, options: SmartSearchOptions): Paginated<AssetEntity>;
|
||||
searchFaces(search: FaceEmbeddingSearch): Promise<FaceSearchResult[]>;
|
||||
upsert(smartInfo: Partial<SmartInfoEntity>, embedding?: Embedding): Promise<void>;
|
||||
upsert(assetId: string, embedding: number[]): Promise<void>;
|
||||
searchPlaces(placeName: string): Promise<GeodataPlacesEntity[]>;
|
||||
getAssetsByCity(userIds: string[]): Promise<AssetEntity[]>;
|
||||
deleteAllSearchEmbeddings(): Promise<void>;
|
||||
|
@ -1,81 +1,5 @@
|
||||
-- NOTE: This file is auto generated by ./sql-generator
|
||||
|
||||
-- AssetRepository.getByDate
|
||||
SELECT
|
||||
"AssetEntity"."id" AS "AssetEntity_id",
|
||||
"AssetEntity"."deviceAssetId" AS "AssetEntity_deviceAssetId",
|
||||
"AssetEntity"."ownerId" AS "AssetEntity_ownerId",
|
||||
"AssetEntity"."libraryId" AS "AssetEntity_libraryId",
|
||||
"AssetEntity"."deviceId" AS "AssetEntity_deviceId",
|
||||
"AssetEntity"."type" AS "AssetEntity_type",
|
||||
"AssetEntity"."originalPath" AS "AssetEntity_originalPath",
|
||||
"AssetEntity"."resizePath" AS "AssetEntity_resizePath",
|
||||
"AssetEntity"."webpPath" AS "AssetEntity_webpPath",
|
||||
"AssetEntity"."thumbhash" AS "AssetEntity_thumbhash",
|
||||
"AssetEntity"."encodedVideoPath" AS "AssetEntity_encodedVideoPath",
|
||||
"AssetEntity"."createdAt" AS "AssetEntity_createdAt",
|
||||
"AssetEntity"."updatedAt" AS "AssetEntity_updatedAt",
|
||||
"AssetEntity"."deletedAt" AS "AssetEntity_deletedAt",
|
||||
"AssetEntity"."fileCreatedAt" AS "AssetEntity_fileCreatedAt",
|
||||
"AssetEntity"."localDateTime" AS "AssetEntity_localDateTime",
|
||||
"AssetEntity"."fileModifiedAt" AS "AssetEntity_fileModifiedAt",
|
||||
"AssetEntity"."isFavorite" AS "AssetEntity_isFavorite",
|
||||
"AssetEntity"."isArchived" AS "AssetEntity_isArchived",
|
||||
"AssetEntity"."isExternal" AS "AssetEntity_isExternal",
|
||||
"AssetEntity"."isReadOnly" AS "AssetEntity_isReadOnly",
|
||||
"AssetEntity"."isOffline" AS "AssetEntity_isOffline",
|
||||
"AssetEntity"."checksum" AS "AssetEntity_checksum",
|
||||
"AssetEntity"."duration" AS "AssetEntity_duration",
|
||||
"AssetEntity"."isVisible" AS "AssetEntity_isVisible",
|
||||
"AssetEntity"."livePhotoVideoId" AS "AssetEntity_livePhotoVideoId",
|
||||
"AssetEntity"."originalFileName" AS "AssetEntity_originalFileName",
|
||||
"AssetEntity"."sidecarPath" AS "AssetEntity_sidecarPath",
|
||||
"AssetEntity"."stackId" AS "AssetEntity_stackId",
|
||||
"AssetEntity__AssetEntity_exifInfo"."assetId" AS "AssetEntity__AssetEntity_exifInfo_assetId",
|
||||
"AssetEntity__AssetEntity_exifInfo"."description" AS "AssetEntity__AssetEntity_exifInfo_description",
|
||||
"AssetEntity__AssetEntity_exifInfo"."exifImageWidth" AS "AssetEntity__AssetEntity_exifInfo_exifImageWidth",
|
||||
"AssetEntity__AssetEntity_exifInfo"."exifImageHeight" AS "AssetEntity__AssetEntity_exifInfo_exifImageHeight",
|
||||
"AssetEntity__AssetEntity_exifInfo"."fileSizeInByte" AS "AssetEntity__AssetEntity_exifInfo_fileSizeInByte",
|
||||
"AssetEntity__AssetEntity_exifInfo"."orientation" AS "AssetEntity__AssetEntity_exifInfo_orientation",
|
||||
"AssetEntity__AssetEntity_exifInfo"."dateTimeOriginal" AS "AssetEntity__AssetEntity_exifInfo_dateTimeOriginal",
|
||||
"AssetEntity__AssetEntity_exifInfo"."modifyDate" AS "AssetEntity__AssetEntity_exifInfo_modifyDate",
|
||||
"AssetEntity__AssetEntity_exifInfo"."timeZone" AS "AssetEntity__AssetEntity_exifInfo_timeZone",
|
||||
"AssetEntity__AssetEntity_exifInfo"."latitude" AS "AssetEntity__AssetEntity_exifInfo_latitude",
|
||||
"AssetEntity__AssetEntity_exifInfo"."longitude" AS "AssetEntity__AssetEntity_exifInfo_longitude",
|
||||
"AssetEntity__AssetEntity_exifInfo"."projectionType" AS "AssetEntity__AssetEntity_exifInfo_projectionType",
|
||||
"AssetEntity__AssetEntity_exifInfo"."city" AS "AssetEntity__AssetEntity_exifInfo_city",
|
||||
"AssetEntity__AssetEntity_exifInfo"."livePhotoCID" AS "AssetEntity__AssetEntity_exifInfo_livePhotoCID",
|
||||
"AssetEntity__AssetEntity_exifInfo"."autoStackId" AS "AssetEntity__AssetEntity_exifInfo_autoStackId",
|
||||
"AssetEntity__AssetEntity_exifInfo"."state" AS "AssetEntity__AssetEntity_exifInfo_state",
|
||||
"AssetEntity__AssetEntity_exifInfo"."country" AS "AssetEntity__AssetEntity_exifInfo_country",
|
||||
"AssetEntity__AssetEntity_exifInfo"."make" AS "AssetEntity__AssetEntity_exifInfo_make",
|
||||
"AssetEntity__AssetEntity_exifInfo"."model" AS "AssetEntity__AssetEntity_exifInfo_model",
|
||||
"AssetEntity__AssetEntity_exifInfo"."lensModel" AS "AssetEntity__AssetEntity_exifInfo_lensModel",
|
||||
"AssetEntity__AssetEntity_exifInfo"."fNumber" AS "AssetEntity__AssetEntity_exifInfo_fNumber",
|
||||
"AssetEntity__AssetEntity_exifInfo"."focalLength" AS "AssetEntity__AssetEntity_exifInfo_focalLength",
|
||||
"AssetEntity__AssetEntity_exifInfo"."iso" AS "AssetEntity__AssetEntity_exifInfo_iso",
|
||||
"AssetEntity__AssetEntity_exifInfo"."exposureTime" AS "AssetEntity__AssetEntity_exifInfo_exposureTime",
|
||||
"AssetEntity__AssetEntity_exifInfo"."profileDescription" AS "AssetEntity__AssetEntity_exifInfo_profileDescription",
|
||||
"AssetEntity__AssetEntity_exifInfo"."colorspace" AS "AssetEntity__AssetEntity_exifInfo_colorspace",
|
||||
"AssetEntity__AssetEntity_exifInfo"."bitsPerSample" AS "AssetEntity__AssetEntity_exifInfo_bitsPerSample",
|
||||
"AssetEntity__AssetEntity_exifInfo"."fps" AS "AssetEntity__AssetEntity_exifInfo_fps"
|
||||
FROM
|
||||
"assets" "AssetEntity"
|
||||
LEFT JOIN "exif" "AssetEntity__AssetEntity_exifInfo" ON "AssetEntity__AssetEntity_exifInfo"."assetId" = "AssetEntity"."id"
|
||||
WHERE
|
||||
(
|
||||
(
|
||||
("AssetEntity"."ownerId" = $1)
|
||||
AND ("AssetEntity"."isVisible" = $2)
|
||||
AND ("AssetEntity"."isArchived" = $3)
|
||||
AND (NOT ("AssetEntity"."resizePath" IS NULL))
|
||||
AND ("AssetEntity"."fileCreatedAt" BETWEEN $4 AND $5)
|
||||
)
|
||||
)
|
||||
AND ("AssetEntity"."deletedAt" IS NULL)
|
||||
ORDER BY
|
||||
"AssetEntity"."fileCreatedAt" DESC
|
||||
|
||||
-- AssetRepository.getByIds
|
||||
SELECT
|
||||
"AssetEntity"."id" AS "AssetEntity_id",
|
||||
|
@ -278,7 +278,7 @@ WITH RECURSIVE
|
||||
exif
|
||||
INNER JOIN assets ON exif."assetId" = assets.id
|
||||
WHERE
|
||||
"ownerId" IN ($1)
|
||||
"ownerId" = ANY ('$1'::uuid [])
|
||||
AND "isVisible" = $2
|
||||
AND "isArchived" = $3
|
||||
AND type = $4
|
||||
@ -302,7 +302,7 @@ WITH RECURSIVE
|
||||
INNER JOIN assets ON exif."assetId" = assets.id
|
||||
WHERE
|
||||
city > c.city
|
||||
AND "ownerId" IN ($1)
|
||||
AND "ownerId" = ANY ('$1'::uuid [])
|
||||
AND "isVisible" = $2
|
||||
AND "isArchived" = $3
|
||||
AND type = $4
|
||||
|
@ -1,6 +1,5 @@
|
||||
import { Injectable } from '@nestjs/common';
|
||||
import { InjectRepository } from '@nestjs/typeorm';
|
||||
import { DateTime } from 'luxon';
|
||||
import path from 'node:path';
|
||||
import { Chunked, ChunkedArray, DummyValue, GenerateSql } from 'src/decorators';
|
||||
import { AssetOrder } from 'src/entities/album.entity';
|
||||
@ -76,41 +75,6 @@ export class AssetRepository implements IAssetRepository {
|
||||
return this.repository.save(asset);
|
||||
}
|
||||
|
||||
@GenerateSql({ params: [DummyValue.UUID, DummyValue.DATE] })
|
||||
getByDate(ownerId: string, date: Date): Promise<AssetEntity[]> {
|
||||
// For reference of a correct approach although slower
|
||||
|
||||
// let builder = this.repository
|
||||
// .createQueryBuilder('asset')
|
||||
// .leftJoin('asset.exifInfo', 'exifInfo')
|
||||
// .where('asset.ownerId = :ownerId', { ownerId })
|
||||
// .andWhere(
|
||||
// `coalesce(date_trunc('day', asset."fileCreatedAt", "exifInfo"."timeZone") at TIME ZONE "exifInfo"."timeZone", date_trunc('day', asset."fileCreatedAt")) IN (:date)`,
|
||||
// { date },
|
||||
// )
|
||||
// .andWhere('asset.isVisible = true')
|
||||
// .andWhere('asset.isArchived = false')
|
||||
// .orderBy('asset.fileCreatedAt', 'DESC');
|
||||
|
||||
// return builder.getMany();
|
||||
|
||||
return this.repository.find({
|
||||
where: {
|
||||
ownerId,
|
||||
isVisible: true,
|
||||
isArchived: false,
|
||||
resizePath: Not(IsNull()),
|
||||
fileCreatedAt: OptionalBetween(date, DateTime.fromJSDate(date).plus({ day: 1 }).toJSDate()),
|
||||
},
|
||||
relations: {
|
||||
exifInfo: true,
|
||||
},
|
||||
order: {
|
||||
fileCreatedAt: 'DESC',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
@GenerateSql({ params: [DummyValue.UUID, { day: 1, month: 1 }] })
|
||||
getByDayOfYear(ownerIds: string[], { day, month }: MonthDay): Promise<AssetEntity[]> {
|
||||
return this.repository
|
||||
|
@ -41,4 +41,8 @@ export class CryptoRepository implements ICryptoRepository {
|
||||
stream.on('end', () => resolve(hash.digest()));
|
||||
});
|
||||
}
|
||||
|
||||
newPassword(bytes: number) {
|
||||
return randomBytes(bytes).toString('base64').replaceAll(/\W/g, '');
|
||||
}
|
||||
}
|
||||
|
@ -8,13 +8,12 @@ import {
|
||||
} from '@nestjs/websockets';
|
||||
import { Server, Socket } from 'socket.io';
|
||||
import {
|
||||
ClientEvent,
|
||||
ICommunicationRepository,
|
||||
InternalEventMap,
|
||||
OnConnectCallback,
|
||||
OnServerEventCallback,
|
||||
ClientEventMap,
|
||||
IEventRepository,
|
||||
ServerAsyncEventMap,
|
||||
ServerEvent,
|
||||
} from 'src/interfaces/communication.interface';
|
||||
ServerEventMap,
|
||||
} from 'src/interfaces/event.interface';
|
||||
import { AuthService } from 'src/services/auth.service';
|
||||
import { Instrumentation } from 'src/utils/instrumentation';
|
||||
import { ImmichLogger } from 'src/utils/logger';
|
||||
@ -25,14 +24,8 @@ import { ImmichLogger } from 'src/utils/logger';
|
||||
path: '/api/socket.io',
|
transports: ['websocket'],
})
export class CommunicationRepository
implements OnGatewayConnection, OnGatewayDisconnect, OnGatewayInit, ICommunicationRepository
{
private logger = new ImmichLogger(CommunicationRepository.name);
private onConnectCallbacks: OnConnectCallback[] = [];
private onServerEventCallbacks: Record<ServerEvent, OnServerEventCallback[]> = {
[ServerEvent.CONFIG_UPDATE]: [],
};
export class EventRepository implements OnGatewayConnection, OnGatewayDisconnect, OnGatewayInit, IEventRepository {
private logger = new ImmichLogger(EventRepository.name);

@WebSocketServer()
private server?: Server;
@ -46,38 +39,23 @@ export class CommunicationRepository
this.logger.log('Initialized websocket server');

for (const event of Object.values(ServerEvent)) {
server.on(event, async () => {
if (event === ServerEvent.WEBSOCKET_CONNECT) {
continue;
}

server.on(event, (data: unknown) => {
this.logger.debug(`Server event: ${event} (receive)`);
const callbacks = this.onServerEventCallbacks[event];
for (const callback of callbacks) {
await callback();
}
this.eventEmitter.emit(event, data);
});
}
}

on(event: 'connect' | ServerEvent, callback: OnConnectCallback | OnServerEventCallback) {
switch (event) {
case 'connect': {
this.onConnectCallbacks.push(callback);
break;
}

default: {
this.onServerEventCallbacks[event].push(callback as OnServerEventCallback);
break;
}
}
}

async handleConnection(client: Socket) {
try {
this.logger.log(`Websocket Connect: ${client.id}`);
const auth = await this.authService.validate(client.request.headers, {});
await client.join(auth.user.id);
for (const callback of this.onConnectCallbacks) {
await callback(auth.user.id);
}
this.serverSend(ServerEvent.WEBSOCKET_CONNECT, { userId: auth.user.id });
} catch (error: Error | any) {
this.logger.error(`Websocket connection error: ${error}`, error?.stack);
client.emit('error', 'unauthorized');
@ -90,24 +68,21 @@ export class CommunicationRepository
await client.leave(client.nsp.name);
}

send(event: ClientEvent, userId: string, data: any) {
clientSend<E extends keyof ClientEventMap>(event: E, userId: string, data: ClientEventMap[E]) {
this.server?.to(userId).emit(event, data);
}

broadcast(event: ClientEvent, data: any) {
clientBroadcast<E extends keyof ClientEventMap>(event: E, data: ClientEventMap[E]) {
this.server?.emit(event, data);
}

sendServerEvent(event: ServerEvent) {
serverSend<E extends keyof ServerEventMap>(event: E, data: ServerEventMap[E]) {
this.logger.debug(`Server event: ${event} (send)`);
this.server?.serverSideEmit(event);
}

emit<E extends keyof InternalEventMap>(event: E, data: InternalEventMap[E]): boolean {
this.server?.serverSideEmit(event, data);
return this.eventEmitter.emit(event, data);
}

emitAsync<E extends keyof InternalEventMap, R = any[]>(event: E, data: InternalEventMap[E]): Promise<R> {
serverSendAsync<E extends keyof ServerAsyncEventMap, R = any[]>(event: E, data: ServerAsyncEventMap[E]): Promise<R> {
return this.eventEmitter.emitAsync(event, data) as Promise<R>;
}
}
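
A minimal sketch of the typed event-map pattern these new `clientSend`/`clientBroadcast` signatures rely on. The event names and payload types below are illustrative only, not the full `ClientEventMap` from `event.interface.ts`:

```typescript
// Illustrative only: keying the generic on the event name lets the compiler
// check that the payload matches the event being sent.
type ClientEventMapSketch = {
  config_update: Record<string, never>;
  asset_delete: string; // a single asset id
  asset_trash: string[]; // a list of asset ids
};

declare function clientSend<E extends keyof ClientEventMapSketch>(
  event: E,
  userId: string,
  data: ClientEventMapSketch[E],
): void;

// clientSend('asset_trash', 'user-1', ['asset-1']);   // ok
// clientSend('asset_delete', 'user-1', ['asset-1']);  // compile error: payload must be a string
```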
server/src/repositories/metric.repository.ts (new file, 31 lines)
@ -0,0 +1,31 @@
import { Inject } from '@nestjs/common';
import { MetricService } from 'nestjs-otel';
import { CustomMetricOptions, IMetricRepository } from 'src/interfaces/metric.interface';

export class MetricRepository implements IMetricRepository {
  constructor(@Inject(MetricService) private readonly metricService: MetricService) {}

  addToCounter(name: string, value: number, options?: CustomMetricOptions): void {
    if (options?.enabled === false) {
      return;
    }

    this.metricService.getCounter(name, options).add(value);
  }

  updateGauge(name: string, value: number, options?: CustomMetricOptions): void {
    if (options?.enabled === false) {
      return;
    }

    this.metricService.getUpDownCounter(name, options).add(value);
  }

  updateHistogram(name: string, value: number, options?: CustomMetricOptions): void {
    if (options?.enabled === false) {
      return;
    }

    this.metricService.getHistogram(name, options).record(value);
  }
}
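
A minimal sketch of how a service might record metrics through this interface. The metric names and the `ExampleService` wrapper are hypothetical; only the `IMetricRepository` methods mirror the file above:

```typescript
import { Inject, Injectable } from '@nestjs/common';
import { IMetricRepository } from 'src/interfaces/metric.interface';

@Injectable()
export class ExampleService {
  constructor(@Inject(IMetricRepository) private metricRepository: IMetricRepository) {}

  doWork() {
    // Counters only accumulate; gauges move up and down with the value argument.
    this.metricRepository.addToCounter('immich.example.work_done', 1, { enabled: true });
    this.metricRepository.updateGauge('immich.example.active', 1);
    // ... do the work ...
    this.metricRepository.updateGauge('immich.example.active', -1);
  }
}
```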
@ -10,7 +10,6 @@ import { SmartSearchEntity } from 'src/entities/smart-search.entity';
import { DatabaseExtension } from 'src/interfaces/database.interface';
import {
AssetSearchOptions,
Embedding,
FaceEmbeddingSearch,
FaceSearchResult,
ISearchRepository,
@ -226,9 +225,9 @@ export class SearchRepository implements ISearchRepository {
.getMany();
}

@GenerateSql({ params: [[DummyValue.UUID]] })
@GenerateSql({ params: [[DummyValue.UUID, DummyValue.UUID]] })
async getAssetsByCity(userIds: string[]): Promise<AssetEntity[]> {
const parameters = [userIds.join(', '), true, false, AssetType.IMAGE];
const parameters = [userIds, true, false, AssetType.IMAGE];
const rawRes = await this.repository.query(this.assetsByCityQuery, parameters);

const items: AssetEntity[] = [];
@ -247,16 +246,7 @@ export class SearchRepository implements ISearchRepository {
return items;
}

async upsert(smartInfo: Partial<SmartInfoEntity>, embedding?: Embedding): Promise<void> {
await this.repository.upsert(smartInfo, { conflictPaths: ['assetId'] });
if (!smartInfo.assetId || !embedding) {
return;
}

await this.upsertEmbedding(smartInfo.assetId, embedding);
}

private async upsertEmbedding(assetId: string, embedding: number[]): Promise<void> {
async upsert(assetId: string, embedding: number[]): Promise<void> {
await this.smartSearchRepository.upsert(
{ assetId, embedding: () => asVector(embedding, true) },
{ conflictPaths: ['assetId'] },
@ -325,7 +315,7 @@ WITH RECURSIVE cte AS (
SELECT city, "assetId"
FROM exif
INNER JOIN assets ON exif."assetId" = assets.id
WHERE "ownerId" IN ($1) AND "isVisible" = $2 AND "isArchived" = $3 AND type = $4
WHERE "ownerId" = ANY('$1'::uuid[]) AND "isVisible" = $2 AND "isArchived" = $3 AND type = $4
ORDER BY city
LIMIT 1
)
@ -338,7 +328,7 @@ WITH RECURSIVE cte AS (
SELECT city, "assetId"
FROM exif
INNER JOIN assets ON exif."assetId" = assets.id
WHERE city > c.city AND "ownerId" IN ($1) AND "isVisible" = $2 AND "isArchived" = $3 AND type = $4
WHERE city > c.city AND "ownerId" = ANY('$1'::uuid[]) AND "isVisible" = $2 AND "isArchived" = $3 AND type = $4
ORDER BY city
LIMIT 1
) l
@ -1,6 +1,6 @@
import mockfs from 'mock-fs';
import { CrawlOptionsDto } from 'src/dtos/library.dto';
import { FilesystemProvider } from 'src/repositories/filesystem.provider';
import { StorageRepository } from 'src/repositories/storage.repository';

interface Test {
test: string;
@ -179,11 +179,11 @@ const tests: Test[] = [
},
];

describe(FilesystemProvider.name, () => {
let sut: FilesystemProvider;
describe(StorageRepository.name, () => {
let sut: StorageRepository;

beforeEach(() => {
sut = new FilesystemProvider();
sut = new StorageRepository();
});

afterEach(() => {
@ -18,8 +18,8 @@ import { ImmichLogger } from 'src/utils/logger';
import { mimeTypes } from 'src/utils/mime-types';

@Instrumentation()
export class FilesystemProvider implements IStorageRepository {
private logger = new ImmichLogger(FilesystemProvider.name);
export class StorageRepository implements IStorageRepository {
private logger = new ImmichLogger(StorageRepository.name);

readdir(folder: string): Promise<string[]> {
return fs.readdir(folder);
@ -27,7 +27,7 @@ describe(APIKeyService.name, () => {
name: 'Test Key',
userId: authStub.admin.user.id,
});
expect(cryptoMock.randomBytes).toHaveBeenCalled();
expect(cryptoMock.newPassword).toHaveBeenCalled();
expect(cryptoMock.hashSha256).toHaveBeenCalled();
});

@ -41,7 +41,7 @@ describe(APIKeyService.name, () => {
name: 'API Key',
userId: authStub.admin.user.id,
});
expect(cryptoMock.randomBytes).toHaveBeenCalled();
expect(cryptoMock.newPassword).toHaveBeenCalled();
expect(cryptoMock.hashSha256).toHaveBeenCalled();
});
});
@ -13,7 +13,7 @@ export class APIKeyService {
) {}

async create(auth: AuthDto, dto: APIKeyCreateDto): Promise<APIKeyCreateResponseDto> {
const secret = this.crypto.randomBytes(32).toString('base64').replaceAll(/\W/g, '');
const secret = this.crypto.newPassword(32);
const entity = await this.repository.create({
key: this.crypto.hashSha256(secret),
name: dto.name || 'API Key',
@ -5,7 +5,7 @@ import { AssetJobName, AssetStatsResponseDto, UploadFieldName } from 'src/dtos/a
import { AssetEntity, AssetType } from 'src/entities/asset.entity';
import { IAssetStackRepository } from 'src/interfaces/asset-stack.interface';
import { AssetStats, IAssetRepository, TimeBucketSize } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import { IJobRepository, JobItem, JobName } from 'src/interfaces/job.interface';
import { IPartnerRepository } from 'src/interfaces/partner.interface';
import { IStorageRepository } from 'src/interfaces/storage.interface';
@ -20,7 +20,7 @@ import { userStub } from 'test/fixtures/user.stub';
import { IAccessRepositoryMock, newAccessRepositoryMock } from 'test/repositories/access.repository.mock';
import { newAssetStackRepositoryMock } from 'test/repositories/asset-stack.repository.mock';
import { newAssetRepositoryMock } from 'test/repositories/asset.repository.mock';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newJobRepositoryMock } from 'test/repositories/job.repository.mock';
import { newPartnerRepositoryMock } from 'test/repositories/partner.repository.mock';
import { newStorageRepositoryMock } from 'test/repositories/storage.repository.mock';
@ -152,7 +152,7 @@ describe(AssetService.name, () => {
let jobMock: jest.Mocked<IJobRepository>;
let storageMock: jest.Mocked<IStorageRepository>;
let userMock: jest.Mocked<IUserRepository>;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;
let configMock: jest.Mocked<ISystemConfigRepository>;
let partnerMock: jest.Mocked<IPartnerRepository>;
let assetStackMock: jest.Mocked<IAssetStackRepository>;
@ -164,7 +164,7 @@ describe(AssetService.name, () => {
beforeEach(() => {
accessMock = newAccessRepositoryMock();
assetMock = newAssetRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
eventMock = newEventRepositoryMock();
jobMock = newJobRepositoryMock();
storageMock = newStorageRepositoryMock();
userMock = newUserRepositoryMock();
@ -179,7 +179,7 @@ describe(AssetService.name, () => {
configMock,
storageMock,
userMock,
communicationMock,
eventMock,
partnerMock,
assetStackMock,
);
@ -704,7 +704,7 @@ describe(AssetService.name, () => {
stackParentId: 'parent',
});

expect(communicationMock.send).toHaveBeenCalledWith(ClientEvent.ASSET_STACK_UPDATE, authStub.user1.user.id, [
expect(eventMock.clientSend).toHaveBeenCalledWith(ClientEvent.ASSET_STACK_UPDATE, authStub.user1.user.id, [
'asset-1',
'parent',
]);
@ -31,7 +31,7 @@ import { LibraryType } from 'src/entities/library.entity';
import { IAccessRepository } from 'src/interfaces/access.interface';
import { IAssetStackRepository } from 'src/interfaces/asset-stack.interface';
import { IAssetRepository, TimeBucketOptions } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import {
IAssetDeletionJob,
IJobRepository,
@ -75,7 +75,7 @@ export class AssetService {
@Inject(ISystemConfigRepository) configRepository: ISystemConfigRepository,
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
@Inject(IUserRepository) private userRepository: IUserRepository,
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
@Inject(IPartnerRepository) private partnerRepository: IPartnerRepository,
@Inject(IAssetStackRepository) private assetStackRepository: IAssetStackRepository,
) {
@ -395,7 +395,7 @@ export class AssetService {
.flatMap((stack) => (stack ? [stack] : []))
.filter((stack) => stack.assets.length < 2);
await Promise.all(stacksToDelete.map((as) => this.assetStackRepository.delete(as.id)));
this.communicationRepository.send(ClientEvent.ASSET_STACK_UPDATE, auth.user.id, ids);
this.eventRepository.clientSend(ClientEvent.ASSET_STACK_UPDATE, auth.user.id, ids);
}

async handleAssetDeletionCheck(): Promise<JobStatus> {
@ -454,7 +454,7 @@ export class AssetService {

await this.assetRepository.remove(asset);
await this.userRepository.updateUsage(asset.ownerId, -(asset.exifInfo?.fileSizeInByte || 0));
this.communicationRepository.send(ClientEvent.ASSET_DELETE, asset.ownerId, id);
this.eventRepository.clientSend(ClientEvent.ASSET_DELETE, asset.ownerId, id);

// TODO refactor this to use cascades
if (asset.livePhotoVideoId) {
@ -482,7 +482,7 @@ export class AssetService {
await this.jobRepository.queueAll(ids.map((id) => ({ name: JobName.ASSET_DELETION, data: { id } })));
} else {
await this.assetRepository.softDeleteAll(ids);
this.communicationRepository.send(ClientEvent.ASSET_TRASH, auth.user.id, ids);
this.eventRepository.clientSend(ClientEvent.ASSET_TRASH, auth.user.id, ids);
}
}

@ -513,7 +513,7 @@ export class AssetService {
primaryAssetId: newParentId,
});

this.communicationRepository.send(ClientEvent.ASSET_STACK_UPDATE, auth.user.id, [
this.eventRepository.clientSend(ClientEvent.ASSET_STACK_UPDATE, auth.user.id, [
...childIds,
newParentId,
oldParentId,
@ -146,7 +146,6 @@ export class AuthService {

async adminSignUp(dto: SignUpDto): Promise<UserResponseDto> {
const adminUser = await this.userRepository.getAdmin();

if (adminUser) {
throw new BadRequestException('The server already has an admin');
}
@ -427,7 +426,7 @@ export class AuthService {
}

private async createLoginResponse(user: UserEntity, authType: AuthType, loginDetails: LoginDetails) {
const key = this.cryptoRepository.randomBytes(32).toString('base64').replaceAll(/\W/g, '');
const key = this.cryptoRepository.newPassword(32);
const token = this.cryptoRepository.hashSha256(key);

await this.userTokenRepository.create({
@ -2,7 +2,7 @@ import { BadRequestException } from '@nestjs/common';
import { FeatureFlag, SystemConfigCore } from 'src/cores/system-config.core';
import { SystemConfig, SystemConfigKey } from 'src/entities/system-config.entity';
import { IAssetRepository } from 'src/interfaces/asset.interface';
import { ICommunicationRepository } from 'src/interfaces/communication.interface';
import { IEventRepository } from 'src/interfaces/event.interface';
import {
IJobRepository,
JobCommand,
@ -12,13 +12,15 @@ import {
JobStatus,
QueueName,
} from 'src/interfaces/job.interface';
import { IMetricRepository } from 'src/interfaces/metric.interface';
import { IPersonRepository } from 'src/interfaces/person.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
import { JobService } from 'src/services/job.service';
import { assetStub } from 'test/fixtures/asset.stub';
import { newAssetRepositoryMock } from 'test/repositories/asset.repository.mock';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newJobRepositoryMock } from 'test/repositories/job.repository.mock';
import { newMetricRepositoryMock } from 'test/repositories/metric.repository.mock';
import { newPersonRepositoryMock } from 'test/repositories/person.repository.mock';
import { newSystemConfigRepositoryMock } from 'test/repositories/system-config.repository.mock';

@ -34,17 +36,19 @@ describe(JobService.name, () => {
let sut: JobService;
let assetMock: jest.Mocked<IAssetRepository>;
let configMock: jest.Mocked<ISystemConfigRepository>;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;
let jobMock: jest.Mocked<IJobRepository>;
let personMock: jest.Mocked<IPersonRepository>;
let metricMock: jest.Mocked<IMetricRepository>;

beforeEach(() => {
assetMock = newAssetRepositoryMock();
configMock = newSystemConfigRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
eventMock = newEventRepositoryMock();
jobMock = newJobRepositoryMock();
personMock = newPersonRepositoryMock();
sut = new JobService(assetMock, communicationMock, jobMock, configMock, personMock);
metricMock = newMetricRepositoryMock();
sut = new JobService(assetMock, eventMock, jobMock, configMock, personMock, metricMock);
});

it('should work', () => {
@ -1,10 +1,11 @@
import { BadRequestException, Inject, Injectable } from '@nestjs/common';
import { snakeCase } from 'lodash';
import { FeatureFlag, SystemConfigCore } from 'src/cores/system-config.core';
import { mapAsset } from 'src/dtos/asset-response.dto';
import { AllJobStatusResponseDto, JobCommandDto, JobStatusDto } from 'src/dtos/job.dto';
import { AssetType } from 'src/entities/asset.entity';
import { IAssetRepository } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import {
ConcurrentQueueName,
IJobRepository,
@ -16,8 +17,10 @@ import {
QueueCleanType,
QueueName,
} from 'src/interfaces/job.interface';
import { IMetricRepository } from 'src/interfaces/metric.interface';
import { IPersonRepository } from 'src/interfaces/person.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
import { jobMetrics } from 'src/utils/instrumentation';
import { ImmichLogger } from 'src/utils/logger';

@Injectable()
@ -27,10 +30,11 @@ export class JobService {

constructor(
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
@Inject(IJobRepository) private jobRepository: IJobRepository,
@Inject(ISystemConfigRepository) configRepository: ISystemConfigRepository,
@Inject(IPersonRepository) private personRepository: IPersonRepository,
@Inject(IMetricRepository) private metricRepository: IMetricRepository,
) {
this.configCore = SystemConfigCore.create(configRepository);
}
@ -92,6 +96,8 @@ export class JobService {
throw new BadRequestException(`Job is already running`);
}

this.metricRepository.addToCounter(`immich.queues.${snakeCase(name)}.started`, 1, { enabled: jobMetrics });

switch (name) {
case QueueName.VIDEO_CONVERSION: {
return this.jobRepository.queue({ name: JobName.QUEUE_VIDEO_CONVERSION, data: { force } });
@ -156,14 +162,21 @@ export class JobService {
this.jobRepository.addHandler(queueName, concurrency, async (item: JobItem): Promise<void> => {
const { name, data } = item;

const queueMetric = `immich.queues.${snakeCase(queueName)}.active`;
this.metricRepository.updateGauge(queueMetric, 1, { enabled: jobMetrics });

try {
const handler = jobHandlers[name];
const status = await handler(data);
const jobMetric = `immich.jobs.${name.replaceAll('-', '_')}.${status}`;
this.metricRepository.addToCounter(jobMetric, 1, { enabled: jobMetrics });
if (status === JobStatus.SUCCESS || status == JobStatus.SKIPPED) {
await this.onDone(item);
}
} catch (error: Error | any) {
this.logger.error(`Unable to run job handler (${queueName}/${name}): ${error}`, error?.stack, data);
} finally {
this.metricRepository.updateGauge(queueMetric, -1, { enabled: jobMetrics });
}
});
}
@ -219,7 +232,7 @@ export class JobService {
if (item.data.source === 'sidecar-write') {
const [asset] = await this.assetRepository.getByIdsWithAllRelations([item.data.id]);
if (asset) {
this.communicationRepository.send(ClientEvent.ASSET_UPDATE, asset.ownerId, mapAsset(asset));
this.eventRepository.clientSend(ClientEvent.ASSET_UPDATE, asset.ownerId, mapAsset(asset));
}
}
await this.jobRepository.queue({ name: JobName.LINK_LIVE_PHOTOS, data: item.data });
@ -242,7 +255,7 @@ export class JobService {
const { id } = item.data;
const person = await this.personRepository.getById(id);
if (person) {
this.communicationRepository.send(ClientEvent.PERSON_THUMBNAIL, person.ownerId, person.id);
this.eventRepository.clientSend(ClientEvent.PERSON_THUMBNAIL, person.ownerId, person.id);
}
break;
}
@ -279,13 +292,13 @@ export class JobService {

// Only live-photo motion part will be marked as not visible immediately on upload. Skip notifying clients
if (asset && asset.isVisible) {
this.communicationRepository.send(ClientEvent.UPLOAD_SUCCESS, asset.ownerId, mapAsset(asset));
this.eventRepository.clientSend(ClientEvent.UPLOAD_SUCCESS, asset.ownerId, mapAsset(asset));
}
break;
}

case JobName.USER_DELETION: {
this.communicationRepository.broadcast(ClientEvent.USER_DELETE, item.data.id);
this.eventRepository.clientBroadcast(ClientEvent.USER_DELETE, item.data.id);
break;
}
}
@ -128,10 +128,10 @@ describe(LibraryService.name, () => {
});
});

describe('validateConfig', () => {
describe('onValidateConfig', () => {
it('should allow a valid cron expression', () => {
expect(() =>
sut.validateConfig({
sut.onValidateConfig({
newConfig: { library: { scan: { cronExpression: '0 0 * * *' } } } as SystemConfig,
oldConfig: {} as SystemConfig,
}),
@ -140,7 +140,7 @@ describe(LibraryService.name, () => {

it('should fail for an invalid cron expression', () => {
expect(() =>
sut.validateConfig({
sut.onValidateConfig({
newConfig: { library: { scan: { cronExpression: 'foo' } } } as SystemConfig,
oldConfig: {} as SystemConfig,
}),
@ -6,8 +6,7 @@ import path, { basename, parse } from 'node:path';
import picomatch from 'picomatch';
import { StorageCore } from 'src/cores/storage.core';
import { SystemConfigCore } from 'src/cores/system-config.core';
import { OnEventInternal } from 'src/decorators';
import { ILibraryOfflineJob } from 'src/domain/job/job.interface';
import { OnServerEvent } from 'src/decorators';
import {
CreateLibraryDto,
LibraryResponseDto,
@ -23,9 +22,9 @@ import {
import { AssetEntity, AssetType } from 'src/entities/asset.entity';
import { LibraryEntity, LibraryType } from 'src/entities/library.entity';
import { IAssetRepository, WithProperty } from 'src/interfaces/asset.interface';
import { InternalEvent, InternalEventMap } from 'src/interfaces/communication.interface';
import { ICryptoRepository } from 'src/interfaces/crypto.interface';
import { DatabaseLock, IDatabaseRepository } from 'src/interfaces/database.interface';
import { ServerAsyncEvent, ServerAsyncEventMap } from 'src/interfaces/event.interface';
import {
IBaseJob,
IEntityJob,
@ -105,8 +104,8 @@ export class LibraryService extends EventEmitter {
});
}

@OnEventInternal(InternalEvent.VALIDATE_CONFIG)
validateConfig({ newConfig }: InternalEventMap[InternalEvent.VALIDATE_CONFIG]) {
@OnServerEvent(ServerAsyncEvent.CONFIG_VALIDATE)
onValidateConfig({ newConfig }: ServerAsyncEventMap[ServerAsyncEvent.CONFIG_VALIDATE]) {
const { scan } = newConfig.library;
if (!validateCronExpression(scan.cronExpression)) {
throw new Error(`Invalid cron expression ${scan.cronExpression}`);
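
The handlers above are registered by decorating a method with `@OnServerEvent(...)`. A hypothetical sketch of how such a decorator could be wired (this is not the project's actual implementation; the metadata key and the `bindServerEventHandlers` helper are invented for illustration, while `SetMetadata` and `EventEmitter2` are the real NestJS APIs):

```typescript
import 'reflect-metadata';
import { SetMetadata } from '@nestjs/common';
import { EventEmitter2 } from '@nestjs/event-emitter';

// Mark the handler with the event name it should receive.
export const ON_SERVER_EVENT = 'on_server_event';
export const OnServerEventSketch = (event: string) => SetMetadata(ON_SERVER_EVENT, event);

// At bootstrap, bind every marked method of a service instance to the emitter.
export function bindServerEventHandlers(emitter: EventEmitter2, instance: object) {
  const prototype = Object.getPrototypeOf(instance);
  for (const name of Object.getOwnPropertyNames(prototype)) {
    const handler = (instance as Record<string, unknown>)[name];
    if (typeof handler !== 'function') {
      continue;
    }
    const event = Reflect.getMetadata(ON_SERVER_EVENT, handler);
    if (event) {
      emitter.on(event, (payload: unknown) => handler.call(instance, payload));
    }
  }
}
```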
@ -8,9 +8,9 @@ import { ExifEntity } from 'src/entities/exif.entity';
import { SystemConfigKey } from 'src/entities/system-config.entity';
import { IAlbumRepository } from 'src/interfaces/album.interface';
import { IAssetRepository, WithoutProperty } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ICryptoRepository } from 'src/interfaces/crypto.interface';
import { IDatabaseRepository } from 'src/interfaces/database.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import { IJobRepository, JobName, JobStatus } from 'src/interfaces/job.interface';
import { IMediaRepository } from 'src/interfaces/media.interface';
import { IMetadataRepository, ImmichTags } from 'src/interfaces/metadata.interface';
@ -24,9 +24,9 @@ import { fileStub } from 'test/fixtures/file.stub';
import { probeStub } from 'test/fixtures/media.stub';
import { newAlbumRepositoryMock } from 'test/repositories/album.repository.mock';
import { newAssetRepositoryMock } from 'test/repositories/asset.repository.mock';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newCryptoRepositoryMock } from 'test/repositories/crypto.repository.mock';
import { newDatabaseRepositoryMock } from 'test/repositories/database.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newJobRepositoryMock } from 'test/repositories/job.repository.mock';
import { newMediaRepositoryMock } from 'test/repositories/media.repository.mock';
import { newMetadataRepositoryMock } from 'test/repositories/metadata.repository.mock';
@ -46,7 +46,7 @@ describe(MetadataService.name, () => {
let mediaMock: jest.Mocked<IMediaRepository>;
let personMock: jest.Mocked<IPersonRepository>;
let storageMock: jest.Mocked<IStorageRepository>;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;
let databaseMock: jest.Mocked<IDatabaseRepository>;
let sut: MetadataService;

@ -59,7 +59,7 @@ describe(MetadataService.name, () => {
metadataMock = newMetadataRepositoryMock();
moveMock = newMoveRepositoryMock();
personMock = newPersonRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
eventMock = newEventRepositoryMock();
storageMock = newStorageRepositoryMock();
mediaMock = newMediaRepositoryMock();
databaseMock = newDatabaseRepositoryMock();
@ -67,7 +67,7 @@ describe(MetadataService.name, () => {
sut = new MetadataService(
albumMock,
assetMock,
communicationMock,
eventMock,
cryptoRepository,
databaseMock,
jobMock,
@ -195,7 +195,7 @@ describe(MetadataService.name, () => {
await expect(sut.handleLivePhotoLinking({ id: assetStub.livePhotoStillAsset.id })).resolves.toBe(
JobStatus.SUCCESS,
);
expect(communicationMock.send).toHaveBeenCalledWith(
expect(eventMock.clientSend).toHaveBeenCalledWith(
ClientEvent.ASSET_HIDDEN,
assetStub.livePhotoMotionAsset.ownerId,
assetStub.livePhotoMotionAsset.id,
@ -12,9 +12,9 @@ import { AssetEntity, AssetType } from 'src/entities/asset.entity';
import { ExifEntity } from 'src/entities/exif.entity';
import { IAlbumRepository } from 'src/interfaces/album.interface';
import { IAssetRepository, WithoutProperty } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ICryptoRepository } from 'src/interfaces/crypto.interface';
import { DatabaseLock, IDatabaseRepository } from 'src/interfaces/database.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import {
IBaseJob,
IEntityJob,
@ -105,7 +105,7 @@ export class MetadataService {
constructor(
@Inject(IAlbumRepository) private albumRepository: IAlbumRepository,
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
@Inject(ICryptoRepository) private cryptoRepository: ICryptoRepository,
@Inject(IDatabaseRepository) private databaseRepository: IDatabaseRepository,
@Inject(IJobRepository) private jobRepository: IJobRepository,
@ -185,7 +185,7 @@ export class MetadataService {
await this.albumRepository.removeAsset(motionAsset.id);

// Notify clients to hide the linked live photo asset
this.communicationRepository.send(ClientEvent.ASSET_HIDDEN, motionAsset.ownerId, motionAsset.id);
this.eventRepository.clientSend(ClientEvent.ASSET_HIDDEN, motionAsset.ownerId, motionAsset.id);

return JobStatus.SUCCESS;
}
@ -1,13 +1,13 @@
import { serverVersion } from 'src/constants';
import { SystemMetadataKey } from 'src/entities/system-metadata.entity';
import { ICommunicationRepository } from 'src/interfaces/communication.interface';
import { IEventRepository } from 'src/interfaces/event.interface';
import { IServerInfoRepository } from 'src/interfaces/server-info.interface';
import { IStorageRepository } from 'src/interfaces/storage.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
import { ISystemMetadataRepository } from 'src/interfaces/system-metadata.interface';
import { IUserRepository } from 'src/interfaces/user.interface';
import { ServerInfoService } from 'src/services/server-info.service';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newStorageRepositoryMock } from 'test/repositories/storage.repository.mock';
import { newSystemConfigRepositoryMock } from 'test/repositories/system-config.repository.mock';
import { newServerInfoRepositoryMock } from 'test/repositories/system-info.repository.mock';
@ -16,7 +16,7 @@ import { newUserRepositoryMock } from 'test/repositories/user.repository.mock';

describe(ServerInfoService.name, () => {
let sut: ServerInfoService;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;
let configMock: jest.Mocked<ISystemConfigRepository>;
let serverInfoMock: jest.Mocked<IServerInfoRepository>;
let storageMock: jest.Mocked<IStorageRepository>;
@ -25,20 +25,13 @@ describe(ServerInfoService.name, () => {

beforeEach(() => {
configMock = newSystemConfigRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
eventMock = newEventRepositoryMock();
serverInfoMock = newServerInfoRepositoryMock();
storageMock = newStorageRepositoryMock();
userMock = newUserRepositoryMock();
systemMetadataMock = newSystemMetadataRepositoryMock();

sut = new ServerInfoService(
communicationMock,
configMock,
userMock,
serverInfoMock,
storageMock,
systemMetadataMock,
);
sut = new ServerInfoService(eventMock, configMock, userMock, serverInfoMock, storageMock, systemMetadataMock);
});

it('should work', () => {
@ -3,6 +3,7 @@ import { DateTime } from 'luxon';
import { isDev, serverVersion } from 'src/constants';
import { StorageCore, StorageFolder } from 'src/cores/storage.core';
import { SystemConfigCore } from 'src/cores/system-config.core';
import { OnServerEvent } from 'src/decorators';
import {
ServerConfigDto,
ServerFeaturesDto,
@ -13,7 +14,7 @@ import {
UsageByUserDto,
} from 'src/dtos/server-info.dto';
import { SystemMetadataKey } from 'src/entities/system-metadata.entity';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository, ServerEvent, ServerEventMap } from 'src/interfaces/event.interface';
import { IServerInfoRepository } from 'src/interfaces/server-info.interface';
import { IStorageRepository } from 'src/interfaces/storage.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
@ -32,7 +33,7 @@ export class ServerInfoService {
private releaseVersionCheckedAt: DateTime | null = null;

constructor(
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
@Inject(ISystemConfigRepository) configRepository: ISystemConfigRepository,
@Inject(IUserRepository) private userRepository: IUserRepository,
@Inject(IServerInfoRepository) private repository: IServerInfoRepository,
@ -40,9 +41,10 @@ export class ServerInfoService {
@Inject(ISystemMetadataRepository) private readonly systemMetadataRepository: ISystemMetadataRepository,
) {
this.configCore = SystemConfigCore.create(configRepository);
this.communicationRepository.on('connect', (userId) => this.handleConnect(userId));
}

onConnect() {}

async init(): Promise<void> {
await this.handleVersionCheck();

@ -169,8 +171,9 @@ export class ServerInfoService {
return true;
}

private handleConnect(userId: string) {
this.communicationRepository.send(ClientEvent.SERVER_VERSION, userId, serverVersion);
@OnServerEvent(ServerEvent.WEBSOCKET_CONNECT)
onWebsocketConnection({ userId }: ServerEventMap[ServerEvent.WEBSOCKET_CONNECT]) {
this.eventRepository.clientSend(ClientEvent.SERVER_VERSION, userId, serverVersion);
this.newReleaseNotification(userId);
}

@ -184,7 +187,7 @@ export class ServerInfoService {
};

userId
? this.communicationRepository.send(event, userId, payload)
: this.communicationRepository.broadcast(event, payload);
? this.eventRepository.clientSend(event, userId, payload)
: this.eventRepository.clientBroadcast(event, payload);
}
}
@ -104,7 +104,6 @@ describe(SmartInfoService.name, () => {
});

it('should save the returned objects', async () => {
searchMock.upsert.mockResolvedValue();
machineMock.encodeImage.mockResolvedValue([0.01, 0.02, 0.03]);

await sut.handleEncodeClip({ id: asset.id });
@ -114,12 +113,7 @@ describe(SmartInfoService.name, () => {
{ imagePath: 'path/to/resize.ext' },
{ enabled: true, modelName: 'ViT-B-32__openai' },
);
expect(searchMock.upsert).toHaveBeenCalledWith(
{
assetId: 'asset-1',
},
[0.01, 0.02, 0.03],
);
expect(searchMock.upsert).toHaveBeenCalledWith('asset-1', [0.01, 0.02, 0.03]);
});
});
@ -98,7 +98,7 @@ export class SmartInfoService {
await this.databaseRepository.wait(DatabaseLock.CLIPDimSize);
}

await this.repository.upsert({ assetId: asset.id }, clipEmbedding);
await this.repository.upsert(asset.id, clipEmbedding);

return JobStatus.SUCCESS;
}
@ -70,10 +70,10 @@ describe(StorageTemplateService.name, () => {
SystemConfigCore.create(configMock).config$.next(defaults);
});

describe('validate', () => {
describe('onValidateConfig', () => {
it('should allow valid templates', () => {
expect(() =>
sut.validate({
sut.onValidateConfig({
newConfig: {
storageTemplate: {
template:
@ -87,7 +87,7 @@ describe(StorageTemplateService.name, () => {

it('should fail for an invalid template', () => {
expect(() =>
sut.validate({
sut.onValidateConfig({
newConfig: {
storageTemplate: {
template: '{{foo}}',
@ -14,15 +14,15 @@ import {
} from 'src/constants';
import { StorageCore, StorageFolder } from 'src/cores/storage.core';
import { SystemConfigCore } from 'src/cores/system-config.core';
import { OnEventInternal } from 'src/decorators';
import { OnServerEvent } from 'src/decorators';
import { AssetEntity, AssetType } from 'src/entities/asset.entity';
import { AssetPathType } from 'src/entities/move.entity';
import { SystemConfig } from 'src/entities/system-config.entity';
import { IAlbumRepository } from 'src/interfaces/album.interface';
import { IAssetRepository } from 'src/interfaces/asset.interface';
import { InternalEvent, InternalEventMap } from 'src/interfaces/communication.interface';
import { ICryptoRepository } from 'src/interfaces/crypto.interface';
import { DatabaseLock, IDatabaseRepository } from 'src/interfaces/database.interface';
import { ServerAsyncEvent, ServerAsyncEventMap } from 'src/interfaces/event.interface';
import { IEntityJob, JOBS_ASSET_PAGINATION_SIZE, JobStatus } from 'src/interfaces/job.interface';
import { IMoveRepository } from 'src/interfaces/move.interface';
import { IPersonRepository } from 'src/interfaces/person.interface';
@ -86,8 +86,8 @@ export class StorageTemplateService {
);
}

@OnEventInternal(InternalEvent.VALIDATE_CONFIG)
validate({ newConfig }: InternalEventMap[InternalEvent.VALIDATE_CONFIG]) {
@OnServerEvent(ServerAsyncEvent.CONFIG_VALIDATE)
onValidateConfig({ newConfig }: ServerAsyncEventMap[ServerAsyncEvent.CONFIG_VALIDATE]) {
try {
const { compiled } = this.compile(newConfig.storageTemplate.template);
this.render(compiled, {
@ -13,13 +13,13 @@ import {
TranscodePolicy,
VideoCodec,
} from 'src/entities/system-config.entity';
import { ICommunicationRepository, ServerEvent } from 'src/interfaces/communication.interface';
import { IEventRepository, ServerEvent } from 'src/interfaces/event.interface';
import { QueueName } from 'src/interfaces/job.interface';
import { ISearchRepository } from 'src/interfaces/search.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
import { SystemConfigService } from 'src/services/system-config.service';
import { ImmichLogger } from 'src/utils/logger';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newSystemConfigRepositoryMock } from 'test/repositories/system-config.repository.mock';

const updates: SystemConfigEntity[] = [
@ -152,14 +152,14 @@ const updatedConfig = Object.freeze<SystemConfig>({
describe(SystemConfigService.name, () => {
let sut: SystemConfigService;
let configMock: jest.Mocked<ISystemConfigRepository>;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;
let smartInfoMock: jest.Mocked<ISearchRepository>;

beforeEach(() => {
delete process.env.IMMICH_CONFIG_FILE;
configMock = newSystemConfigRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
sut = new SystemConfigService(configMock, communicationMock, smartInfoMock);
eventMock = newEventRepositoryMock();
sut = new SystemConfigService(configMock, eventMock, smartInfoMock);
});

it('should work', () => {
@ -330,8 +330,8 @@ describe(SystemConfigService.name, () => {

await expect(sut.updateConfig(updatedConfig)).resolves.toEqual(updatedConfig);

expect(communicationMock.broadcast).toHaveBeenCalled();
expect(communicationMock.sendServerEvent).toHaveBeenCalledWith(ServerEvent.CONFIG_UPDATE);
expect(eventMock.clientBroadcast).toHaveBeenCalled();
expect(eventMock.serverSend).toHaveBeenCalledWith(ServerEvent.CONFIG_UPDATE, null);
expect(configMock.saveAll).toHaveBeenCalledWith(updates);
});
@ -12,16 +12,16 @@ import {
supportedYearTokens,
} from 'src/constants';
import { SystemConfigCore } from 'src/cores/system-config.core';
import { OnEventInternal } from 'src/decorators';
import { OnServerEvent } from 'src/decorators';
import { SystemConfigDto, SystemConfigTemplateStorageOptionDto, mapConfig } from 'src/dtos/system-config.dto';
import { LogLevel, SystemConfig } from 'src/entities/system-config.entity';
import {
ClientEvent,
ICommunicationRepository,
InternalEvent,
InternalEventMap,
IEventRepository,
ServerAsyncEvent,
ServerAsyncEventMap,
ServerEvent,
} from 'src/interfaces/communication.interface';
} from 'src/interfaces/event.interface';
import { ISearchRepository } from 'src/interfaces/search.interface';
import { ISystemConfigRepository } from 'src/interfaces/system-config.interface';
import { ImmichLogger } from 'src/utils/logger';
@ -33,11 +33,10 @@ export class SystemConfigService {

constructor(
@Inject(ISystemConfigRepository) private repository: ISystemConfigRepository,
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
@Inject(ISearchRepository) private smartInfoRepository: ISearchRepository,
) {
this.core = SystemConfigCore.create(repository);
this.communicationRepository.on(ServerEvent.CONFIG_UPDATE, () => this.handleConfigUpdate());
this.core.config$.subscribe((config) => this.setLogLevel(config));
}

@ -60,8 +59,8 @@ export class SystemConfigService {
return mapConfig(config);
}

@OnEventInternal(InternalEvent.VALIDATE_CONFIG)
validateConfig({ newConfig, oldConfig }: InternalEventMap[InternalEvent.VALIDATE_CONFIG]) {
@OnServerEvent(ServerAsyncEvent.CONFIG_VALIDATE)
onValidateConfig({ newConfig, oldConfig }: ServerAsyncEventMap[ServerAsyncEvent.CONFIG_VALIDATE]) {
if (!_.isEqual(instanceToPlain(newConfig.logging), oldConfig.logging) && this.getEnvLogLevel()) {
throw new Error('Logging cannot be changed while the environment variable LOG_LEVEL is set.');
}
@ -71,7 +70,10 @@ export class SystemConfigService {
const oldConfig = await this.core.getConfig();

try {
await this.communicationRepository.emitAsync(InternalEvent.VALIDATE_CONFIG, { newConfig: dto, oldConfig });
await this.eventRepository.serverSendAsync(ServerAsyncEvent.CONFIG_VALIDATE, {
newConfig: dto,
oldConfig,
});
} catch (error) {
this.logger.warn(`Unable to save system config due to a validation error: ${error}`);
throw new BadRequestException(error instanceof Error ? error.message : error);
@ -79,8 +81,8 @@ export class SystemConfigService {

const newConfig = await this.core.updateConfig(dto);

this.communicationRepository.broadcast(ClientEvent.CONFIG_UPDATE, {});
this.communicationRepository.sendServerEvent(ServerEvent.CONFIG_UPDATE);
this.eventRepository.clientBroadcast(ClientEvent.CONFIG_UPDATE, {});
this.eventRepository.serverSend(ServerEvent.CONFIG_UPDATE, null);

if (oldConfig.machineLearning.clip.modelName !== newConfig.machineLearning.clip.modelName) {
await this.smartInfoRepository.init(newConfig.machineLearning.clip.modelName);
@ -90,7 +92,7 @@ export class SystemConfigService {

// this is only used by the cli on config change, and it's not actually needed anymore
async refreshConfig() {
this.communicationRepository.sendServerEvent(ServerEvent.CONFIG_UPDATE);
this.eventRepository.serverSend(ServerEvent.CONFIG_UPDATE, null);
await this.core.refreshConfig();
return true;
}
@ -126,7 +128,8 @@ export class SystemConfigService {
return theme.customCss;
}

private async handleConfigUpdate() {
@OnServerEvent(ServerEvent.CONFIG_UPDATE)
async onConfigUpdate() {
await this.core.refreshConfig();
}
@ -1,13 +1,13 @@
import { BadRequestException } from '@nestjs/common';
import { IAssetRepository } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import { IJobRepository, JobName } from 'src/interfaces/job.interface';
import { TrashService } from 'src/services/trash.service';
import { assetStub } from 'test/fixtures/asset.stub';
import { authStub } from 'test/fixtures/auth.stub';
import { IAccessRepositoryMock, newAccessRepositoryMock } from 'test/repositories/access.repository.mock';
import { newAssetRepositoryMock } from 'test/repositories/asset.repository.mock';
import { newCommunicationRepositoryMock } from 'test/repositories/communication.repository.mock';
import { newEventRepositoryMock } from 'test/repositories/event.repository.mock';
import { newJobRepositoryMock } from 'test/repositories/job.repository.mock';

describe(TrashService.name, () => {
@ -15,7 +15,7 @@ describe(TrashService.name, () => {
let accessMock: IAccessRepositoryMock;
let assetMock: jest.Mocked<IAssetRepository>;
let jobMock: jest.Mocked<IJobRepository>;
let communicationMock: jest.Mocked<ICommunicationRepository>;
let eventMock: jest.Mocked<IEventRepository>;

it('should work', () => {
expect(sut).toBeDefined();
@ -24,10 +24,10 @@ describe(TrashService.name, () => {
beforeEach(() => {
accessMock = newAccessRepositoryMock();
assetMock = newAssetRepositoryMock();
communicationMock = newCommunicationRepositoryMock();
eventMock = newEventRepositoryMock();
jobMock = newJobRepositoryMock();

sut = new TrashService(accessMock, assetMock, jobMock, communicationMock);
sut = new TrashService(accessMock, assetMock, jobMock, eventMock);
});

describe('restoreAssets', () => {
@ -54,14 +54,14 @@ describe(TrashService.name, () => {
assetMock.getByUserId.mockResolvedValue({ items: [], hasNextPage: false });
await expect(sut.restore(authStub.user1)).resolves.toBeUndefined();
expect(assetMock.restoreAll).not.toHaveBeenCalled();
expect(communicationMock.send).not.toHaveBeenCalled();
expect(eventMock.clientSend).not.toHaveBeenCalled();
});

it('should restore and notify', async () => {
assetMock.getByUserId.mockResolvedValue({ items: [assetStub.image], hasNextPage: false });
await expect(sut.restore(authStub.user1)).resolves.toBeUndefined();
expect(assetMock.restoreAll).toHaveBeenCalledWith([assetStub.image.id]);
expect(communicationMock.send).toHaveBeenCalledWith(ClientEvent.ASSET_RESTORE, authStub.user1.user.id, [
expect(eventMock.clientSend).toHaveBeenCalledWith(ClientEvent.ASSET_RESTORE, authStub.user1.user.id, [
assetStub.image.id,
]);
});
@ -5,7 +5,7 @@ import { BulkIdsDto } from 'src/dtos/asset-ids.response.dto';
import { AuthDto } from 'src/dtos/auth.dto';
import { IAccessRepository } from 'src/interfaces/access.interface';
import { IAssetRepository } from 'src/interfaces/asset.interface';
import { ClientEvent, ICommunicationRepository } from 'src/interfaces/communication.interface';
import { ClientEvent, IEventRepository } from 'src/interfaces/event.interface';
import { IJobRepository, JOBS_ASSET_PAGINATION_SIZE, JobName } from 'src/interfaces/job.interface';
import { usePagination } from 'src/utils/pagination';

@ -16,7 +16,7 @@ export class TrashService {
@Inject(IAccessRepository) accessRepository: IAccessRepository,
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
@Inject(IJobRepository) private jobRepository: IJobRepository,
@Inject(ICommunicationRepository) private communicationRepository: ICommunicationRepository,
@Inject(IEventRepository) private eventRepository: IEventRepository,
) {
this.access = AccessCore.create(accessRepository);
}
@ -60,6 +60,6 @@ export class TrashService {
}

await this.assetRepository.restoreAll(ids);
this.communicationRepository.send(ClientEvent.ASSET_RESTORE, auth.user.id, ids);
this.eventRepository.clientSend(ClientEvent.ASSET_RESTORE, auth.user.id, ids);
}
}
@ -1,6 +1,5 @@
import { BadRequestException, ForbiddenException, Inject, Injectable, NotFoundException } from '@nestjs/common';
import { DateTime } from 'luxon';
import { randomBytes } from 'node:crypto';
import { StorageCore, StorageFolder } from 'src/cores/storage.core';
import { SystemConfigCore } from 'src/cores/system-config.core';
import { UserCore } from 'src/cores/user.core';
@ -26,7 +25,7 @@ export class UserService {

constructor(
@Inject(IAlbumRepository) private albumRepository: IAlbumRepository,
@Inject(ICryptoRepository) cryptoRepository: ICryptoRepository,
@Inject(ICryptoRepository) private cryptoRepository: ICryptoRepository,
@Inject(IJobRepository) private jobRepository: IJobRepository,
@Inject(ILibraryRepository) libraryRepository: ILibraryRepository,
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
@ -132,7 +131,7 @@ export class UserService {
}

const providedPassword = await ask(mapUser(admin));
const password = providedPassword || randomBytes(24).toString('base64').replaceAll(/\W/g, '');
const password = providedPassword || this.cryptoRepository.newPassword(24);

await this.userCore.updateUser(admin, admin.id, { password });
@ -17,12 +17,16 @@ import { excludePaths, serverVersion } from 'src/constants';
import { DecorateAll } from 'src/decorators';

let metricsEnabled = process.env.IMMICH_METRICS === 'true';
const hostMetrics =
export const hostMetrics =
process.env.IMMICH_HOST_METRICS == null ? metricsEnabled : process.env.IMMICH_HOST_METRICS === 'true';
const apiMetrics = process.env.IMMICH_API_METRICS == null ? metricsEnabled : process.env.IMMICH_API_METRICS === 'true';
const repoMetrics = process.env.IMMICH_IO_METRICS == null ? metricsEnabled : process.env.IMMICH_IO_METRICS === 'true';
export const apiMetrics =
process.env.IMMICH_API_METRICS == null ? metricsEnabled : process.env.IMMICH_API_METRICS === 'true';
export const repoMetrics =
process.env.IMMICH_IO_METRICS == null ? metricsEnabled : process.env.IMMICH_IO_METRICS === 'true';
export const jobMetrics =
process.env.IMMICH_JOB_METRICS == null ? metricsEnabled : process.env.IMMICH_JOB_METRICS === 'true';

metricsEnabled ||= hostMetrics || apiMetrics || repoMetrics;
metricsEnabled ||= hostMetrics || apiMetrics || repoMetrics || jobMetrics;
if (!metricsEnabled && process.env.OTEL_SDK_DISABLED === undefined) {
process.env.OTEL_SDK_DISABLED = 'true';
}
@ -5,7 +5,6 @@ export const newAssetRepositoryMock = (): jest.Mocked<IAssetRepository> => {
create: jest.fn(),
upsertExif: jest.fn(),
upsertJobStatus: jest.fn(),
getByDate: jest.fn(),
getByDayOfYear: jest.fn(),
getByIds: jest.fn().mockResolvedValue([]),
getByIdsWithAllRelations: jest.fn().mockResolvedValue([]),
@ -1,12 +0,0 @@
import { ICommunicationRepository } from 'src/interfaces/communication.interface';

export const newCommunicationRepositoryMock = (): jest.Mocked<ICommunicationRepository> => {
return {
send: jest.fn(),
broadcast: jest.fn(),
on: jest.fn(),
sendServerEvent: jest.fn(),
emit: jest.fn(),
emitAsync: jest.fn(),
};
};
@ -9,5 +9,6 @@ export const newCryptoRepositoryMock = (): jest.Mocked<ICryptoRepository> => {
hashSha256: jest.fn().mockImplementation((input) => `${input} (hashed)`),
hashSha1: jest.fn().mockImplementation((input) => Buffer.from(`${input.toString()} (hashed)`)),
hashFile: jest.fn().mockImplementation((input) => `${input} (file-hashed)`),
newPassword: jest.fn().mockReturnValue(Buffer.from('random-bytes').toString('base64')),
};
};
server/test/repositories/event.repository.mock.ts (new file, 10 lines)
@ -0,0 +1,10 @@
import { IEventRepository } from 'src/interfaces/event.interface';

export const newEventRepositoryMock = (): jest.Mocked<IEventRepository> => {
  return {
    clientSend: jest.fn(),
    clientBroadcast: jest.fn(),
    serverSend: jest.fn(),
    serverSendAsync: jest.fn(),
  };
};
server/test/repositories/metric.repository.mock.ts (new file, 9 lines)
@ -0,0 +1,9 @@
import { IMetricRepository } from 'src/interfaces/metric.interface';

export const newMetricRepositoryMock = (): jest.Mocked<IMetricRepository> => {
  return {
    addToCounter: jest.fn(),
    updateGauge: jest.fn(),
    updateHistogram: jest.fn(),
  };
};
@ -21,7 +21,7 @@
<div class="ml-4 mt-4">
<SettingSwitch
title="ENABLED"
subtitle="Enable period requests to GitHub to check for new releases"
subtitle="Enable periodic requests to GitHub to check for new releases"
bind:checked={config.newVersionCheck.enabled}
{disabled}
/>
@ -2,7 +2,7 @@
import LinkButton from '$lib/components/elements/buttons/link-button.svelte';
import Dropdown from '$lib/components/elements/dropdown.svelte';
import Icon from '$lib/components/elements/icon.svelte';
import { AlbumViewMode, albumViewSettings } from '$lib/stores/preferences.store';
import { AlbumFilter, AlbumViewMode, albumViewSettings } from '$lib/stores/preferences.store';
import {
mdiArrowDownThin,
mdiArrowUpThin,
@ -12,6 +12,7 @@
} from '@mdi/js';
import { sortByOptions, type Sort, handleCreateAlbum } from '$lib/components/album-page/albums-list.svelte';
import SearchBar from '$lib/components/elements/search-bar.svelte';
import GroupTab from '$lib/components/elements/group-tab.svelte';

export let searchAlbum: string;

@ -25,13 +26,20 @@
};
</script>

<div class="hidden lg:block lg:w-40 xl:w-60 2xl:w-80 h-10">
<div class="hidden xl:block">
<GroupTab
filters={Object.keys(AlbumFilter)}
selected={$albumViewSettings.filter}
onSelect={(selected) => ($albumViewSettings.filter = selected)}
/>
</div>
<div class="hidden xl:block xl:w-60 2xl:w-80 h-10">
<SearchBar placeholder="Search albums" bind:name={searchAlbum} isSearching={false} />
</div>
<LinkButton on:click={handleCreateAlbum}>
<div class="flex place-items-center gap-2 text-sm">
<Icon path={mdiPlusBoxOutline} size="18" />
Create album
<p class="hidden md:block">Create album</p>
</div>
</LinkButton>
@ -1,5 +1,5 @@
|
||||
<script lang="ts" context="module">
|
||||
import { AlbumViewMode, albumViewSettings } from '$lib/stores/preferences.store';
|
||||
import { AlbumFilter, AlbumViewMode, albumViewSettings } from '$lib/stores/preferences.store';
|
||||
import { goto } from '$app/navigation';
|
||||
import { AppRoute } from '$lib/constants';
|
||||
import { createAlbum, deleteAlbum, type AlbumResponseDto } from '@immich/sdk';
|
||||
@@ -118,10 +118,14 @@
 import AlbumsTable from '$lib/components/album-page/albums-table.svelte';
 import { handleError } from '$lib/utils/handle-error';
 import type { ContextMenuPosition } from '$lib/utils/context-menu';
+import GroupTab from '$lib/components/elements/group-tab.svelte';
+import SearchBar from '$lib/components/elements/search-bar.svelte';

-export let albums: AlbumResponseDto[];
+export let ownedAlbums: AlbumResponseDto[];
+export let sharedAlbums: AlbumResponseDto[];
 export let searchAlbum: string;

+let albums: AlbumResponseDto[] = [];
 let shouldShowEditAlbumForm = false;
 let selectedAlbum: AlbumResponseDto;
 let albumToDelete: AlbumResponseDto | null;
@@ -131,13 +135,45 @@
 $: {
   for (const key of sortByOptions) {
     if (key.title === $albumViewSettings.sortBy) {
-      albums = key.sortFn(key.sortDesc, albums);
-      $albumViewSettings.sortDesc = key.sortDesc; // "Save" sortDesc
+      switch ($albumViewSettings.filter) {
+        case AlbumFilter.All: {
+          albums = key.sortFn(
+            key.sortDesc,
+            [...sharedAlbums, ...ownedAlbums].filter(
+              (album, index, self) => index === self.findIndex((item) => album.id === item.id),
+            ),
+          );
+          break;
+        }
+
+        case AlbumFilter.Owned: {
+          albums = key.sortFn(key.sortDesc, ownedAlbums);
+          break;
+        }
+
+        case AlbumFilter.Shared: {
+          albums = key.sortFn(key.sortDesc, sharedAlbums);
+          break;
+        }
+
+        default: {
+          albums = key.sortFn(
+            key.sortDesc,
+            [...sharedAlbums, ...ownedAlbums].filter(
+              (album, index, self) => index === self.findIndex((item) => album.id === item.id),
+            ),
+          );
+          break;
+        }
+      }
+
+      $albumViewSettings.sortDesc = key.sortDesc;
+      $albumViewSettings.sortBy = key.title;
       break;
     }
   }
 }

 $: isShowContextMenu = !!contextMenuTargetAlbum;
 $: albumsFiltered = albums.filter((album) => album.albumName.toLowerCase().includes(searchAlbum.toLowerCase()));

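The `All` and `default` branches above merge the owned and shared lists and drop duplicates by album id, presumably because an album can appear in both inputs. The same dedup step, pulled out as a standalone helper purely for illustration (the helper name is not in the diff):

```ts
// Keep only the first occurrence of each id in a merged list.
const dedupeById = <T extends { id: string }>(items: T[]): T[] =>
  items.filter((item, index, self) => index === self.findIndex((other) => other.id === item.id));

// e.g. dedupeById([...sharedAlbums, ...ownedAlbums]) before sorting with key.sortFn.
```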
@@ -159,7 +195,7 @@

 async function handleDeleteAlbum(albumToDelete: AlbumResponseDto): Promise<void> {
   await deleteAlbum({ id: albumToDelete.id });
-  albums = albums.filter(({ id }) => id !== albumToDelete.id);
+  ownedAlbums = ownedAlbums.filter(({ id }) => id !== albumToDelete.id);
 }

 const chooseAlbumToDelete = (album: AlbumResponseDto) => {
@@ -194,7 +230,7 @@
 };

 const removeAlbumsIfEmpty = async () => {
-  for (const album of albums) {
+  for (const album of ownedAlbums) {
     if (album.assetCount == 0 && album.albumName == '') {
       try {
         await handleDeleteAlbum(album);
@@ -211,7 +247,7 @@
     message: 'Album infos updated',
     type: NotificationType.Info,
   });
-  albums[albums.findIndex((x) => x.id === selectedAlbum.id)] = selectedAlbum;
+  ownedAlbums[ownedAlbums.findIndex((x) => x.id === selectedAlbum.id)] = selectedAlbum;
 };
 </script>

@@ -227,6 +263,18 @@

 {#if albums.length > 0}
   <!-- Album Card -->
+  <div class=" block xl:hidden">
+    <div class="w-fit dark:text-immich-dark-fg py-2">
+      <GroupTab
+        filters={Object.keys(AlbumFilter)}
+        selected={$albumViewSettings.filter}
+        onSelect={(selected) => ($albumViewSettings.filter = selected)}
+      />
+    </div>
+    <div class="w-60">
+      <SearchBar placeholder="Search albums" bind:name={searchAlbum} isSearching={false} />
+    </div>
+  </div>
   {#if $albumViewSettings.view === AlbumViewMode.Cover}
     <div class="grid grid-cols-[repeat(auto-fill,minmax(14rem,1fr))] mt-4 gap-y-4">
       {#each albumsFiltered as album, index (album.id)}
@@ -8,6 +8,7 @@
 import type { Sort } from '$lib/components/album-page/albums-list.svelte';
 import { locale } from '$lib/stores/preferences.store';
 import { dateFormats } from '$lib/constants';
+import { user } from '$lib/stores/user.store';

 export let albumsFiltered: AlbumResponseDto[];
 export let sortByOptions: Sort[];
@@ -66,18 +67,20 @@
   >
 </a>
 <td class="text-md hidden text-ellipsis text-center 2xl:block xl:w-[15%] 2xl:w-[12%]">
-  <button
-    on:click|stopPropagation={() => onAlbumToEdit(album)}
-    class="rounded-full z-1 bg-immich-primary p-3 text-gray-100 transition-all duration-150 hover:bg-immich-primary/75 dark:bg-immich-dark-primary dark:text-gray-700"
-  >
-    <Icon path={mdiPencilOutline} size="16" />
-  </button>
-  <button
-    on:click|stopPropagation={() => onChooseAlbumToDelete(album)}
-    class="rounded-full z-1 bg-immich-primary p-3 text-gray-100 transition-all duration-150 hover:bg-immich-primary/75 dark:bg-immich-dark-primary dark:text-gray-700"
-  >
-    <Icon path={mdiTrashCanOutline} size="16" />
-  </button>
+  {#if $user.id === album.ownerId}
+    <button
+      on:click|stopPropagation={() => onAlbumToEdit(album)}
+      class="rounded-full z-1 bg-immich-primary p-3 text-gray-100 transition-all duration-150 hover:bg-immich-primary/75 dark:bg-immich-dark-primary dark:text-gray-700"
+    >
+      <Icon path={mdiPencilOutline} size="16" />
+    </button>
+    <button
+      on:click|stopPropagation={() => onChooseAlbumToDelete(album)}
+      class="rounded-full z-1 bg-immich-primary p-3 text-gray-100 transition-all duration-150 hover:bg-immich-primary/75 dark:bg-immich-dark-primary dark:text-gray-700"
+    >
+      <Icon path={mdiTrashCanOutline} size="16" />
+    </button>
+  {/if}
 </td>
 </tr>
 {/each}
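The only behavioral change in the hunk above is the new ownership guard around the edit and delete buttons. Written out as a plain predicate for clarity (the helper name and the minimal structural types are hypothetical; the real DTOs come from `@immich/sdk`):

```ts
// Minimal structural types for illustration only.
type HasId = { id: string };
type HasOwner = { ownerId: string };

// Only the album's owner should see the edit/delete actions in the table view.
const canManageAlbum = (user: HasId, album: HasOwner): boolean => user.id === album.ownerId;
```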
web/src/lib/components/elements/group-tab.svelte (new file, 20 lines)
@@ -0,0 +1,20 @@
+<script lang="ts">
+  export let filters: string[];
+  export let selected: string;
+  export let onSelect: (selected: string) => void;
+</script>
+
+<div class=" flex bg-gray-100 dark:bg-gray-700 rounded-lg">
+  {#each filters as filter, index}
+    <button
+      class="flex py-2 px-4 {filter === selected
+        ? 'dark:bg-gray-900 bg-gray-300'
+        : 'dark:hover:bg-gray-800 hover:bg-gray-200'} {index === 0 ? 'rounded-l-lg' : ''} {index === filters.length - 1
+        ? 'rounded-r-lg'
+        : ''}"
+      on:click={() => onSelect(filter)}
+    >
+      {filter}
+    </button>
+  {/each}
+</div>
@@ -70,6 +70,7 @@ export interface AlbumViewSettings
   sortBy: string;
   sortDesc: boolean;
   view: string;
+  filter: string;
 }

 export interface SidebarSettings {
@@ -87,10 +88,17 @@ export enum AlbumViewMode
   List = 'List',
 }

+export enum AlbumFilter {
+  All = 'All',
+  Owned = 'Owned',
+  Shared = 'Shared',
+}
+
 export const albumViewSettings = persisted<AlbumViewSettings>('album-view-settings', {
   sortBy: 'Most recent photo',
   sortDesc: true,
   view: AlbumViewMode.Cover,
+  filter: AlbumFilter.All,
 });

 export const showDeleteModal = persisted<boolean>('delete-confirm-dialog', true, {});
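Because `AlbumFilter` is a string enum whose values equal their keys, `Object.keys(AlbumFilter)` yields exactly the tab labels passed to `GroupTab`, and the persisted `filter: string` compares directly against the enum members in the switch shown earlier. A small self-contained illustration (not part of the diff):

```ts
enum AlbumFilter {
  All = 'All',
  Owned = 'Owned',
  Shared = 'Shared',
}

// String enums have no reverse mapping, so Object.keys returns only the member names.
console.log(Object.keys(AlbumFilter)); // ['All', 'Owned', 'Shared']

// A value restored from persisted settings is a plain string, but it still
// matches the corresponding enum member by simple equality.
const persistedFilter: string = 'Owned';
console.log(persistedFilter === AlbumFilter.Owned); // true
```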
@@ -13,5 +13,5 @@
 <div class="flex place-items-center gap-2" slot="buttons">
   <AlbumsControls bind:searchAlbum />
 </div>
-<Albums albums={data.albums} {searchAlbum} />
+<Albums ownedAlbums={data.albums} {searchAlbum} sharedAlbums={data.sharedAlbums} />
 </UserPageLayout>
@@ -4,10 +4,12 @@ import type { PageLoad } from './$types';

 export const load = (async () => {
   await authenticate();
+  const sharedAlbums = await getAllAlbums({ shared: true });
   const albums = await getAllAlbums({});

   return {
     albums,
+    sharedAlbums,
     meta: {
       title: 'Albums',
     },
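Aside, not part of the diff: the two `getAllAlbums` calls in the load function above are independent, so a variant could issue them concurrently. A minimal sketch assuming the same SDK helper (the wrapper function name is hypothetical):

```ts
import { getAllAlbums } from '@immich/sdk';

// Fetch the owned and shared album lists in parallel rather than sequentially.
const loadAlbumLists = async () => {
  const [albums, sharedAlbums] = await Promise.all([getAllAlbums({}), getAllAlbums({ shared: true })]);
  return { albums, sharedAlbums };
};
```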
[Seven binary image assets were also updated in this diff; only before/after sizes are recoverable: 5.0→4.9 KiB, 635→618 B, 1.2→1.2 KiB, 1.7→1.7 KiB, 3.4→3.4 KiB, 1.1→15 KiB, 6.2→17 KiB.]