mirror of
https://github.com/Kareadita/Kavita.git
synced 2025-07-09 03:04:19 -04:00
v0.4.8 Release (#720)
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Workflow updates (#658)
  - Added: Automatic character-limit handling for the Discord notifier. If the PR body is over a certain character limit, it is trimmed and an appropriate link to the full changelog is added (Release for Stable, PR for Dev).
  - Removed: The Sentry map task from the workflow, since Sentry is no longer used.
* Bump versions by dotnet-bump-version.
* Misc Updates (#665): Do not allow non-admins to change their passwords when authentication is disabled; cleaned up the login page so that input field text is black; cleaned up some resizing when typing a password with a lot of users; changed LastActive for a user to update not just on login, but also when they open an already authenticated session.
* Bump versions by dotnet-bump-version.
* Logging Cleanup (#668): Includes the #665 changes above, plus: removed some verbose debugging statements and moved some debug messages to information level so they are more prevalent in logs for default installs; In Progress now sends progress information on the Series; added the ability to add cards to Recently Added when new series are added in the backend; implemented the ability to click the glasses icon to turn off incognito mode from within the reader so you can start tracking progress; don't warn the user about authentication when they don't touch that control.
* Bump versions by dotnet-bump-version.
* Changed the stats that are sent back to the stat server from the installed server.
* Revert "Changed the stats that are sent back to stat server from installed server." This reverts commit 644cb6d1f67de9531ea1a1dfd3853709e0329ce7.
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Bulk Add to Collection (#674): Fixed the typeahead not having the same size input box as other inputs; implemented the ability to add multiple series to a collection through the bulk operations flow; updated the book parser to handle the "@import url('...');" syntax as well as "@import '...';"; implemented the ability to create a new collection tag via the bulk operations flow.
* Bump versions by dotnet-bump-version.
* Bulk Operations for In Progress and Recently Added (#677): Don't log a message about a bad match if the file is a cover image; enable bulk operations for In Progress and Recently Added; fixed a bad logic case.
* Bump versions by dotnet-bump-version.
* Regression Fix (#680): Ensure we mount the backups directory for Docker users; fixed a huge logic bug that deleted files in users' libraries.
* Bump versions by dotnet-bump-version.
* Changed the chunk size to a fixed 50 to validate whether it is causing issues with refresh; added some try/catches to see if exceptions are causing issues. (#681)
* Bump versions by dotnet-bump-version.
* Fixed a bug where searching on localized name would fail to show in the search; fixed a bug where extra spaces would cause the search results not to show properly. (#682)
* Bump versions by dotnet-bump-version.
* When we have a special marker, ensure we fall back to folder parsing to try to group correctly to the actual series before just accepting what we parsed (#684); fixed a missed parsing case where comic special parsing wasn't being called on comic libraries.
* Bump versions by dotnet-bump-version.
* iOS Admin page dropdown fix (#686)
  - Fixed: An issue where the dropdown on the admin server page would not work on Safari or other iOS browsers.
* When the DB fails to save, log all the series the user should look into for constraint issues and push a message to the admins connected to the web UI. (#687)
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Stat upload will now schedule itself between midnight and 6am server time. (#688)
* Bump versions by dotnet-bump-version.
* EPUB CSS Parsing Issues (#690): WIP — rewrote some of the regex to better support CSS escaping; we now escape background-image, border-image, and list-style-image within CSS files; added position: relative to help with positioning on books that are just absolutely positioned elements; when there is absolute positioning, like in some EPUB-based comics, suppress the bottom action bar since it won't render in the correct location; fixed tests; commented out tests.
* Bump versions by dotnet-bump-version.
* More EPUB Scoping Fixes (#691): Added better handling around importing CSS files that are empty; moved comment removal on CSS files to before some CSS whitespace cleanup to get better matches; some enhancements to the checks for whether we need the bottom action bar in the reader — we no longer query the DOM and have something that works more reliably.
* Bump versions by dotnet-bump-version.
* Fixed an issue where Docker users were not properly backing up the database; removed an empty file for when covers/ had nothing in it. (#692)
* Bump versions by dotnet-bump-version.
* Fallback to Folder Parsing Issue (#694): Fixed a bug in the scanner where we fall back to parsing from folders for poorly named files — the code was exiting early if a chapter or volume could be parsed out; fixed a unit test by tweaking a regex for the fallback.
* Bump versions by dotnet-bump-version.
* KavitaStats Cleanup (#695): Refactored the Stats code to be much cleaner and use better naming; cleaned up the actual HTTP code to use Flurl and to return whether the upload was successful so we can delete the file where appropriate; more refactoring of the stats code to clean it up and keep it consistent with our standards; removed a confusing log statement; added support for the old API key header from the original stat server; use the correct endpoint, not the new one; code smell.
* Bump versions by dotnet-bump-version.
* Bulk Deletion (#697): Implemented bulk deletion of series; don't show an unauthorized exception on the UI, just redirect to the login page.
* Bump versions by dotnet-bump-version.
* Cover Image Picking + Forwarding Headers with EPUBs (#700): Ensure Kavita knows about forwarded headers (fixes an issue with EPUB URLs not going through https with a reverse proxy); fixed a case where cover image selection preferred nested folders over files in the root directory; fixed a broken unit test; added the bug I fixed to the unit tests.
* Cover Image Picking + Forwarding Headers with EPUBs (#702)
* Updating GA Bump version temporarily for fix (#703)
* Bump versions by dotnet-bump-version.
* Cover Image Picking + Forwarding Headers with EPUBs (GA Fix) (#704)
* Bump versions by dotnet-bump-version.
* Vacation Fixes (#709): Ignore system and hidden folders when performing a directory scan; fixed the comic parser tests not using Comic mode for parsing; accept all forwarded headers and use them; ignore some changes from another branch.
* Bump versions by dotnet-bump-version.
* Breaking Changes: Docker Parity (#698): Refactored all of Kavita's config files to be loaded from config/ — this lets Docker mount just one folder and makes the Update functionality trivial; cleaned up documentation around the new update method; updated Docker files to support the config directory; removed the entrypoint, no longer needed; updated appsettings to point to the config directory for logs; updated the message for Docker users who are upgrading; ensure that Docker users who have not updated their mount points after upgrading cannot start the server; code smells; more cleanup; added an entrypoint to fix bind mount issues; updated the README with the new folder structure; fixed the build system for the new setup; updated the string path if the user is on Docker; updated the migration flow for Docker to work properly and fixed LogFile configuration updating; migrating Docker images is now working 100%; fixed config from bad code; code cleanup. Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
* Bump versions by dotnet-bump-version.
* Feature/docker parity (#714): The same changes as #698 above, plus fixed monorepo-build.sh. Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
* Breaking Changes: Docker Parity (#715): Fixed a bug in the copy-directory-to-directory step of the migration; somehow GetFiles lost its static modifier.
* Bump versions by dotnet-bump-version.
* Build issue (#716): Fixed a bug in the copy-directory-to-directory step of the migration; somehow GetFiles lost its static modifier; please work.
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Shakeout Changes (#717): Make the appsettings public on Configuration and change how we detect when to migrate for non-Docker users; fixed up the non-Docker copy command and removed a duplicate check on the source directory for a copy; don't delete files unless we know we were successful.
* Bump versions by dotnet-bump-version.
* Fixed a migration issue on Docker happening too many times or throwing an exception when the source wasn't there. (#719)
* Bump versions by dotnet-bump-version.
* Version bump for release (#718)
* Bump versions by dotnet-bump-version.

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: YEGCSharpDev <89283498+YEGCSharpDev@users.noreply.github.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
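Several of the entries above (#700, #704, and the Vacation Fixes in #709) concern honoring forwarded headers so EPUB asset URLs keep the https scheme behind a reverse proxy. As a rough illustration only — not necessarily the code Kavita ships — forwarded-header support in ASP.NET Core is typically enabled like this:

    // Illustrative ASP.NET Core forwarded-header setup; an assumption, not copied from Kavita's Startup.
    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.HttpOverrides;

    public static class ForwardedHeadersSetup
    {
        public static void Apply(IApplicationBuilder app)
        {
            var options = new ForwardedHeadersOptions
            {
                // "Accept all forwarded headers and use them" (X-Forwarded-For, X-Forwarded-Proto, X-Forwarded-Host).
                ForwardedHeaders = ForwardedHeaders.All
            };
            // Trust headers from any proxy; by default, headers from unknown proxies are ignored.
            options.KnownNetworks.Clear();
            options.KnownProxies.Clear();
            app.UseForwardedHeaders(options);
        }
    }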
This commit is contained in:
parent cb9fa0dda8
commit 33db123e81
.github/ISSUE_TEMPLATE/bug_report.md (vendored, 4 changed lines)

@@ -2,7 +2,7 @@
 name: Bug report
 about: Create a report to help us improve
 title: ''
-labels: bug
+labels: needs-triage
 assignees: ''

 ---

@@ -24,7 +24,7 @@ A clear and concise description of what you expected to happen.
 If applicable, add screenshots to help explain your problem.

 **Desktop (please complete the following information):**
-- OS: [e.g. iOS]
+- OS: [e.g. iOS, Docker]
 - Browser [e.g. chrome, safari]
 - Version [e.g. 22] (can be found on Server Settings -> System tab)
.github/workflows/sentry-map.yml (vendored, 63 lines deleted)

@@ -1,63 +0,0 @@
-name: Sentry Map Release
-on:
-  workflow_dispatch:
-    inputs:
-      version:
-        description: "version to update package.json"
-        required: true
-        # No default
-
-jobs:
-  build:
-    name: Setup Sentry CLI
-    runs-on: ubuntu-latest
-    steps:
-      - uses: mathieu-bour/setup-sentry-cli@1.2.0
-        with:
-          version: latest
-          token: ${{ secrets.SENTRY_TOKEN }}
-          organization: kavita-7n
-          project: angular
-
-      - name: Check out repository
-        uses: actions/checkout@v2
-
-      - name: Parse Version
-        run: |
-          version='${{ github.event.inputs.version }}'
-          newVersion=${version%.*}
-          echo $newVersion
-          echo "::set-output name=VERSION::$newVersion"
-        id: parse-version
-
-      - name: NodeJS to Compile WebUI
-        uses: actions/setup-node@v2.1.5
-        with:
-          node-version: '14'
-
-      - run: |
-          cd UI/Web || exit
-          echo 'Installing web dependencies'
-          npm install
-
-          npm version --allow-same-version "${{ steps.parse-version.outputs.VERSION }}"
-
-          echo 'Building UI'
-          npm run prod
-
-      - name: Cache dependencies
-        uses: actions/cache@v2
-        with:
-          path: ~/.npm
-          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
-          restore-keys: |
-            ${{ runner.os }}-node-
-
-      - name: Create Release
-        run: sentry-cli releases new ${{ steps.parse-version.outputs.VERSION }}
-
-      - name: Upload Source Maps
-        run: sentry-cli releases files ${{ steps.parse-version.outputs.VERSION }} upload-sourcemaps UI/Web/dist
-
-      - name: Finalize Release
-        run: sentry-cli releases finalize ${{ steps.parse-version.outputs.VERSION }}
.github/workflows/sonar-scan.yml (vendored, 30 changed lines)

@@ -115,7 +115,7 @@ jobs:
       run: dotnet build --configuration Release --no-restore

     - name: Bump versions
-      uses: SiqiLu/dotnet-bump-version@master
+      uses: ThomasEg/dotnet-bump-version@patch-1
       with:
         version_files: Kavita.Common/Kavita.Common.csproj
         github_token: ${{ secrets.REPO_GHA_PAT }}

@@ -136,6 +136,13 @@ jobs:
       id: parse-body
       run: |
         body="${{ steps.findPr.outputs.body }}"
+        if [[ ${#body} -gt 1870 ]] ; then
+          body=${body:0:1870}
+          body="${body}...and much more.
+
+        Read full changelog: https://github.com/Kareadita/Kavita/pull/${{ steps.findPr.outputs.pr }}"
+        fi
+
         body=${body//\'/}
         body=${body//'%'/'%25'}
         body=${body//$'\n'/'%0A'}

@@ -180,13 +187,6 @@ jobs:
         dotnet-version: '5.0.x'
     - run: ./monorepo-build.sh

-    - name: Trigger Sentry workflow
-      uses: benc-uk/workflow-dispatch@v1
-      with:
-        workflow: Sentry Map Release
-        token: ${{ secrets.REPO_GHA_PAT }}
-        inputs: '{ "version": "${{steps.get-version.outputs.assembly-version}}" }'
-
     - name: Login to Docker Hub
       uses: docker/login-action@v1
       with:

@@ -238,6 +238,13 @@ jobs:
       id: parse-body
       run: |
         body="${{ steps.findPr.outputs.body }}"
+        if [[ ${#body} -gt 1870 ]] ; then
+          body=${body:0:1870}
+          body="${body}...and much more.
+
+        Read full changelog: https://github.com/Kareadita/Kavita/releases/latest"
+        fi
+
         body=${body//\'/}
         body=${body//'%'/'%25'}
         body=${body//$'\n'/'%0A'}

@@ -291,13 +298,6 @@ jobs:
         dotnet-version: '5.0.x'
     - run: ./monorepo-build.sh

-    - name: Trigger Sentry workflow
-      uses: benc-uk/workflow-dispatch@v1
-      with:
-        workflow: Sentry Map Release
-        token: ${{ secrets.REPO_GHA_PAT }}
-        inputs: '{ "version": "${{steps.get-version.outputs.assembly-version}}" }'
-
     - name: Login to Docker Hub
       uses: docker/login-action@v1
       with:
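The bash added to both release jobs above trims an over-long changelog before it is posted to Discord. A C# restatement of that trimming rule, included only to make the logic explicit (the 1870-character cutoff and link text come from the workflow; the method name is illustrative and does not exist in the repository):

    // Illustrative restatement of the workflow's changelog trimming; not code from the repository.
    static string TrimChangelog(string body, string changelogUrl)
    {
        const int limit = 1870; // same cutoff the workflow uses
        if (body.Length <= limit) return body;
        // Keep the first 1870 characters and point readers at the full changelog.
        return body.Substring(0, limit) + "...and much more.\n\nRead full changelog: " + changelogUrl;
    }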
.gitignore (vendored, 20 changed lines)

@@ -500,4 +500,22 @@ _output/
 API/stats/
 UI/Web/dist/
 /API.Tests/Extensions/Test Data/modified on run.txt
-/API/covers/
+# All config files/folders in config except appsettings.json
+/API/config/covers/
+/API/config/logs/
+/API/config/backups/
+/API/config/cache/
+/API/config/temp/
+/API/config/stats/
+/API/config/kavita.db
+/API/config/kavita.db-shm
+/API/config/kavita.db-wal
+/API/config/Hangfire.db
+/API/config/Hangfire-log.db
+API/config/covers/
+API/config/*.db
+API/config/stats/*
+API/config/stats/app_stats.json
+
+UI/Web/.vscode/settings.json
@@ -17,5 +17,24 @@ namespace API.Tests.Parser
         {
             Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename));
         }
+
+        // [Theory]
+        // [InlineData("@font-face{font-family:'syyskuu_repaleinen';src:url(data:font/opentype;base64,AAEAAAA", "@font-face{font-family:'syyskuu_repaleinen';src:url(data:font/opentype;base64,AAEAAAA")]
+        // [InlineData("@font-face{font-family:'syyskuu_repaleinen';src:url('fonts/font.css')", "@font-face{font-family:'syyskuu_repaleinen';src:url('TEST/fonts/font.css')")]
+        // public void ReplaceFontSrcUrl(string input, string expected)
+        // {
+        //     var apiBase = "TEST/";
+        //     var actual = API.Parser.Parser.FontSrcUrlRegex.Replace(input, "$1" + apiBase + "$2" + "$3");
+        //     Assert.Equal(expected, actual);
+        // }
+        //
+        // [Theory]
+        // [InlineData("@import url('font.css');", "@import url('TEST/font.css');")]
+        // public void ReplaceImportSrcUrl(string input, string expected)
+        // {
+        //     var apiBase = "TEST/";
+        //     var actual = API.Parser.Parser.CssImportUrlRegex.Replace(input, "$1" + apiBase + "$2" + "$3");
+        //     Assert.Equal(expected, actual);
+        // }
     }
 }
@@ -56,6 +56,8 @@ namespace API.Tests.Parser
         [InlineData("Batgirl V2000 #57", "Batgirl")]
         [InlineData("Fables 021 (2004) (Digital) (Nahga-Empire)", "Fables")]
         [InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "2000 AD")]
+        [InlineData("Daredevil - v6 - 10 - (2019)", "Daredevil")]
+        [InlineData("Batman - The Man Who Laughs #1 (2005)", "Batman - The Man Who Laughs")]
         public void ParseComicSeriesTest(string filename, string expected)
         {
             Assert.Equal(expected, API.Parser.Parser.ParseComicSeries(filename));

@@ -93,6 +95,7 @@ namespace API.Tests.Parser
         [InlineData("Fables 021 (2004) (Digital) (Nahga-Empire).cbr", "0")]
         [InlineData("Cyberpunk 2077 - Trauma Team 04.cbz", "0")]
         [InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "0")]
+        [InlineData("Daredevil - v6 - 10 - (2019)", "6")]
         public void ParseComicVolumeTest(string filename, string expected)
         {
             Assert.Equal(expected, API.Parser.Parser.ParseComicVolume(filename));

@@ -134,6 +137,7 @@ namespace API.Tests.Parser
         [InlineData("Fables 021 (2004) (Digital) (Nahga-Empire).cbr", "21")]
         [InlineData("Cyberpunk 2077 - Trauma Team #04.cbz", "4")]
         [InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "366")]
+        [InlineData("Daredevil - v6 - 10 - (2019)", "10")]
         public void ParseComicChapterTest(string filename, string expected)
         {
             Assert.Equal(expected, API.Parser.Parser.ParseComicChapter(filename));

@@ -172,10 +176,26 @@ namespace API.Tests.Parser
                 FullFilePath = filepath, IsSpecial = false
             });
+
+            filepath = @"E:\Comics\Comics\Publisher\Batman the Detective (2021)\Batman the Detective - v6 - 11 - (2021).cbr";
+            expected.Add(filepath, new ParserInfo
+            {
+                Series = "Batman the Detective", Volumes = "6", Edition = "",
+                Chapters = "11", Filename = "Batman the Detective - v6 - 11 - (2021).cbr", Format = MangaFormat.Archive,
+                FullFilePath = filepath, IsSpecial = false
+            });
+
+            filepath = @"E:\Comics\Comics\Batman - The Man Who Laughs #1 (2005)\Batman - The Man Who Laughs #1 (2005).cbr";
+            expected.Add(filepath, new ParserInfo
+            {
+                Series = "Batman - The Man Who Laughs", Volumes = "0", Edition = "",
+                Chapters = "1", Filename = "Batman - The Man Who Laughs #1 (2005).cbr", Format = MangaFormat.Archive,
+                FullFilePath = filepath, IsSpecial = false
+            });
+
             foreach (var file in expected.Keys)
             {
                 var expectedInfo = expected[file];
-                var actual = API.Parser.Parser.Parse(file, rootPath);
+                var actual = API.Parser.Parser.Parse(file, rootPath, LibraryType.Comic);
                 if (expectedInfo == null)
                 {
                     Assert.Null(actual);
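The new InlineData rows above pin the expectation that "Daredevil - v6 - 10 - (2019)" parses to series "Daredevil", volume "6", and chapter "10". A simplified sketch of matching that file-name shape — an illustration only, not Kavita's actual parser regex:

    // Simplified illustration of parsing "<Series> - v<volume> - <chapter> - (<year>)"; not the real parser.
    using System.Text.RegularExpressions;

    static class ComicNameSketch
    {
        private static readonly Regex VolumeChapter =
            new Regex(@"^(?<series>.+?)\s-\sv(?<volume>\d+)\s-\s(?<chapter>\d+)\s-\s\((?<year>\d{4})\)$");

        public static (string Series, string Volume, string Chapter) Parse(string name)
        {
            var match = VolumeChapter.Match(name);
            // Unmatched names fall back to "0", mirroring the "0" expectations in the tests above.
            if (!match.Success) return (name, "0", "0");
            return (match.Groups["series"].Value, match.Groups["volume"].Value, match.Groups["chapter"].Value);
        }
    }
    // ComicNameSketch.Parse("Daredevil - v6 - 10 - (2019)") -> ("Daredevil", "6", "10")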
@@ -297,6 +297,7 @@ namespace API.Tests.Parser
         [Theory]
         [InlineData("/manga/Btooom!/Vol.1/Chapter 1/1.cbz", "Btooom!~1~1")]
         [InlineData("/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Btooom!~1~2")]
+        [InlineData("/manga/Monster #8/Ch. 001-016 [MangaPlus] [Digital] [amit34521]/Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]/13.jpg", "Monster #8~0~1")]
         public void ParseFromFallbackFoldersTest(string inputFile, string expectedParseInfo)
         {
             const string rootDirectory = "/manga/";

@@ -438,6 +439,22 @@ namespace API.Tests.Parser
             filepath = @"E:\Manga\Seraph of the End\cover.png";
             expected.Add(filepath, null);
+
+            filepath = @"E:\Manga\The Beginning After the End\Chapter 001.cbz";
+            expected.Add(filepath, new ParserInfo
+            {
+                Series = "The Beginning After the End", Volumes = "0", Edition = "",
+                Chapters = "1", Filename = "Chapter 001.cbz", Format = MangaFormat.Archive,
+                FullFilePath = filepath, IsSpecial = false
+            });
+
+            filepath = @"E:\Manga\Monster #8\Ch. 001-016 [MangaPlus] [Digital] [amit34521]\Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]\13.jpg";
+            expected.Add(filepath, new ParserInfo
+            {
+                Series = "Monster #8", Volumes = "0", Edition = "",
+                Chapters = "1", Filename = "13.jpg", Format = MangaFormat.Archive,
+                FullFilePath = filepath, IsSpecial = false
+            });
+
             foreach (var file in expected.Keys)
             {
@@ -140,9 +140,10 @@ namespace API.Tests.Services
         [InlineData(new [] {"page 2.jpg", "page 10.jpg"}, "page 2.jpg")]
         [InlineData(new [] {"__MACOSX/cover.jpg", "vol1/page 01.jpg"}, "vol1/page 01.jpg")]
         [InlineData(new [] {"Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c060 (v10) - p200 [Digital] [LuCaZ].jpg", "folder.jpg"}, "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg")]
+        [InlineData(new [] {"001.jpg", "001 - chapter 1/001.jpg"}, "001.jpg")]
         public void FindFirstEntry(string[] files, string expected)
         {
-            var foundFile = _archiveService.FirstFileEntry(files);
+            var foundFile = ArchiveService.FirstFileEntry(files, string.Empty);
             Assert.Equal(expected, string.IsNullOrEmpty(foundFile) ? "" : foundFile);
         }
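The added row {"001.jpg", "001 - chapter 1/001.jpg"} captures the cover-picking fix from the changelog: an entry in the archive root should beat one nested in a folder. A rough sketch of that root-first preference only (it deliberately ignores the natural-sort ordering the other rows rely on, and it is not the actual FirstFileEntry implementation):

    // Illustrative root-first entry picking; a sketch, not ArchiveService.FirstFileEntry.
    using System;
    using System.Linq;

    static class FirstEntrySketch
    {
        public static string Pick(string[] entries)
        {
            return entries
                .Where(e => !e.Contains("__MACOSX", StringComparison.OrdinalIgnoreCase)) // skip macOS junk folders
                .OrderBy(e => e.Count(c => c == '/'))                                    // fewer separators = closer to root
                .ThenBy(e => e, StringComparer.OrdinalIgnoreCase)
                .FirstOrDefault() ?? string.Empty;
        }
    }
    // Pick(new[] {"001.jpg", "001 - chapter 1/001.jpg"}) -> "001.jpg"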
@@ -36,7 +36,7 @@ namespace API.Tests.Services
         public void GetFiles_WithCustomRegex_ShouldPass_Test()
         {
             var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/regex");
-            var files = _directoryService.GetFiles(testDirectory, @"file\d*.txt");
+            var files = DirectoryService.GetFiles(testDirectory, @"file\d*.txt");
             Assert.Equal(2, files.Count());
         }

@@ -44,7 +44,7 @@ namespace API.Tests.Services
         public void GetFiles_TopLevel_ShouldBeEmpty_Test()
         {
             var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService");
-            var files = _directoryService.GetFiles(testDirectory);
+            var files = DirectoryService.GetFiles(testDirectory);
             Assert.Empty(files);
         }

@@ -52,7 +52,7 @@ namespace API.Tests.Services
         public void GetFilesWithExtensions_ShouldBeEmpty_Test()
         {
             var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/extensions");
-            var files = _directoryService.GetFiles(testDirectory, "*.txt");
+            var files = DirectoryService.GetFiles(testDirectory, "*.txt");
             Assert.Empty(files);
         }

@@ -60,7 +60,7 @@ namespace API.Tests.Services
         public void GetFilesWithExtensions_Test()
         {
             var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/extension");
-            var files = _directoryService.GetFiles(testDirectory, ".cbz|.rar");
+            var files = DirectoryService.GetFiles(testDirectory, ".cbz|.rar");
             Assert.Equal(3, files.Count());
         }

@@ -68,7 +68,7 @@ namespace API.Tests.Services
         public void GetFilesWithExtensions_BadDirectory_ShouldBeEmpty_Test()
        {
             var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/doesntexist");
-            var files = _directoryService.GetFiles(testDirectory, ".cbz|.rar");
+            var files = DirectoryService.GetFiles(testDirectory, ".cbz|.rar");
             Assert.Empty(files);
         }
@@ -6,6 +6,10 @@ using static System.String;

 namespace API.Comparators
 {
+    /// <summary>
+    /// Attempts to emulate Windows explorer sorting
+    /// </summary>
+    /// <remarks>This is not thread-safe</remarks>
     public sealed class NaturalSortComparer : IComparer<string>, IDisposable
     {
         private readonly bool _isAscending;

@@ -23,7 +27,6 @@ namespace API.Comparators
         {
             if (x == y) return 0;

-            // Should be fixed: Operations that change non-concurrent collections must have exclusive access. A concurrent update was performed on this collection and corrupted its state. The collection's state is no longer correct.
             if (!_table.TryGetValue(x ?? Empty, out var x1))
             {
                 x1 = Regex.Split(x ?? Empty, "([0-9]+)");

@@ -33,7 +36,6 @@ namespace API.Comparators
             if (!_table.TryGetValue(y ?? Empty, out var y1))
             {
                 y1 = Regex.Split(y ?? Empty, "([0-9]+)");
-                // Should be fixed: EXCEPTION: An item with the same key has already been added. Key: M:\Girls of the Wild's\Girls of the Wild's - Ep. 083 (Season 1) [LINE Webtoon].cbz
                 _table.Add(y ?? Empty, y1);
             }

@@ -59,6 +61,7 @@ namespace API.Comparators
                 returnVal = 0;
             }
+
             return _isAscending ? returnVal : -returnVal;
         }
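The new doc comment describes NaturalSortComparer as emulating Windows Explorer ordering, i.e. digit runs compare numerically so "page 2.jpg" sorts before "page 10.jpg". A self-contained sketch of that idea, written as a simplified stand-in rather than the class above:

    // Simplified natural-sort illustration: split on digit runs and compare numeric pieces as numbers.
    using System;
    using System.Collections.Generic;
    using System.Text.RegularExpressions;

    class NaturalSortSketch : IComparer<string>
    {
        public int Compare(string x, string y)
        {
            var xs = Regex.Split(x ?? string.Empty, "([0-9]+)");
            var ys = Regex.Split(y ?? string.Empty, "([0-9]+)");
            for (var i = 0; i < Math.Min(xs.Length, ys.Length); i++)
            {
                if (xs[i] == ys[i]) continue;
                // Numeric runs compare as numbers; everything else falls back to text comparison.
                if (long.TryParse(xs[i], out var xn) && long.TryParse(ys[i], out var yn)) return xn.CompareTo(yn);
                return string.Compare(xs[i], ys[i], StringComparison.OrdinalIgnoreCase);
            }
            return xs.Length.CompareTo(ys.Length);
        }
    }
    // new[] {"page 10.jpg", "page 2.jpg"} sorted with NaturalSortSketch -> "page 2.jpg", "page 10.jpg"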
@@ -259,7 +259,10 @@ namespace API.Controllers
                 }

                 var styleContent = await _bookService.ScopeStyles(await book.Content.Css[key].ReadContentAsync(), apiBase, book.Content.Css[key].FileName, book);
-                body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
+                if (styleContent != null)
+                {
+                    body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
+                }
             }
         }
@@ -2,7 +2,9 @@
 using System.Collections.Generic;
 using System.Linq;
 using System.Threading.Tasks;
+using API.Data;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.Entities;
 using API.Extensions;
 using API.Interfaces;

@@ -90,6 +92,40 @@ namespace API.Controllers
             return BadRequest("Something went wrong, please try again");
         }

+        /// <summary>
+        /// Adds a collection tag onto multiple Series. If tag id is 0, this will create a new tag.
+        /// </summary>
+        /// <param name="dto"></param>
+        /// <returns></returns>
+        [HttpPost("update-for-series")]
+        public async Task<ActionResult> AddToMultipleSeries(CollectionTagBulkAddDto dto)
+        {
+            var tag = await _unitOfWork.CollectionTagRepository.GetFullTagAsync(dto.CollectionTagId);
+            if (tag == null)
+            {
+                tag = DbFactory.CollectionTag(0, dto.CollectionTagTitle, String.Empty, false);
+                _unitOfWork.CollectionTagRepository.Add(tag);
+            }
+
+            var seriesMetadatas = await _unitOfWork.SeriesRepository.GetSeriesMetadataForIdsAsync(dto.SeriesIds);
+            foreach (var metadata in seriesMetadatas)
+            {
+                if (!metadata.CollectionTags.Any(t => t.Title.Equals(tag.Title, StringComparison.InvariantCulture)))
+                {
+                    metadata.CollectionTags.Add(tag);
+                    _unitOfWork.SeriesMetadataRepository.Update(metadata);
+                }
+            }
+
+            if (!_unitOfWork.HasChanges()) return Ok();
+            if (await _unitOfWork.CommitAsync())
+            {
+                return Ok();
+            }
+            return BadRequest("There was an issue updating series with collection tag");
+        }
+
         /// <summary>
         /// For a given tag, update the summary if summary has changed and remove a set of series from the tag.
         /// </summary>
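The new update-for-series endpoint takes the CollectionTagBulkAddDto added later in this diff. A hedged client-side sketch of calling it; the api/collection prefix and the anonymous payload shape are assumptions for the example and are not shown in the diff:

    // Illustrative client call for bulk add-to-collection; route prefix and auth handling are assumed.
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    static class BulkCollectionExample
    {
        public static async Task AddSeriesToTag(HttpClient client)
        {
            // CollectionTagId = 0 means "create a new tag named CollectionTagTitle", per the DTO remarks.
            var payload = new
            {
                CollectionTagId = 0,
                CollectionTagTitle = "Read Later",
                SeriesIds = new[] { 12, 15, 31 }
            };
            var response = await client.PostAsJsonAsync("api/collection/update-for-series", payload);
            response.EnsureSuccessStatusCode();
        }
    }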
@@ -164,7 +164,7 @@ namespace API.Controllers
                 case MangaFormat.Archive:
                 case MangaFormat.Pdf:
                     _cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
-                    var originalFiles = _directoryService.GetFilesWithExtension(chapterExtractPath,
+                    var originalFiles = DirectoryService.GetFilesWithExtension(chapterExtractPath,
                         Parser.Parser.ImageFileExtensions);
                     _directoryService.CopyFilesToDirectory(originalFiles, chapterExtractPath, $"{chapterId}_");
                     DirectoryService.DeleteFiles(originalFiles);

@@ -175,7 +175,7 @@ namespace API.Controllers
                     return BadRequest("Series is not in a valid format. Please rescan series and try again.");
             }

-            var files = _directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
+            var files = DirectoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
             // Filter out images that aren't in bookmarks
             Array.Sort(files, _numericComparer);
             totalFilePaths.AddRange(files.Where((_, i) => chapterPages.Contains(i)));

@@ -226,7 +226,7 @@ namespace API.Controllers
         [HttpGet("search")]
         public async Task<ActionResult<IEnumerable<SearchResultDto>>> Search(string queryString)
         {
-            queryString = queryString.Trim().Replace(@"%", "");
+            queryString = Uri.UnescapeDataString(queryString).Trim().Replace(@"%", string.Empty);

             var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
             // Get libraries user has access to
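The search change above unescapes the query before trimming, which is what lets titles containing spaces or '#' (such as the new "Monster #8" test data) match. A small example of the difference between the old and new handling of an encoded query value (the encoded input is assumed for the example):

    // Uri.UnescapeDataString turns an encoded query back into the text the user typed.
    using System;

    static class SearchQueryExample
    {
        public static void Show()
        {
            const string raw = "Monster%20%238";                        // an encoded query value
            Console.WriteLine(Uri.UnescapeDataString(raw).Trim());      // new behavior: "Monster #8"
            Console.WriteLine(raw.Trim().Replace("%", string.Empty));   // old behavior: "Monster20238"
        }
    }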
@@ -6,6 +6,7 @@ using System.Threading.Tasks;
 using System.Xml.Serialization;
 using API.Comparators;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.DTOs.Filtering;
 using API.DTOs.OPDS;
 using API.Entities;

@@ -738,7 +739,7 @@ namespace API.Controllers
         [HttpGet("{apiKey}/favicon")]
         public async Task<ActionResult> GetFavicon(string apiKey)
         {
-            var files = _directoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
+            var files = DirectoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
             if (files.Length == 0) return BadRequest("Cannot find icon");
             var path = files[0];
             var content = await _directoryService.ReadFileAsync(path);
@@ -78,8 +78,9 @@ namespace API.Controllers
         public async Task<ActionResult<bool>> DeleteSeries(int seriesId)
         {
             var username = User.GetUsername();
-            var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
             _logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", seriesId, username);
+
+            var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
             var result = await _unitOfWork.SeriesRepository.DeleteSeriesAsync(seriesId);

             if (result)

@@ -92,6 +93,34 @@ namespace API.Controllers
             return Ok(result);
         }

+        [Authorize(Policy = "RequireAdminRole")]
+        [HttpPost("delete-multiple")]
+        public async Task<ActionResult> DeleteMultipleSeries(DeleteSeriesDto dto)
+        {
+            var username = User.GetUsername();
+            _logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", dto.SeriesIds, username);
+
+            var chapterMappings =
+                await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(dto.SeriesIds.ToArray());
+
+            var allChapterIds = new List<int>();
+            foreach (var mapping in chapterMappings)
+            {
+                allChapterIds.AddRange(mapping.Value);
+            }
+
+            var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(dto.SeriesIds);
+            _unitOfWork.SeriesRepository.Remove(series);
+
+            if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
+            {
+                await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
+                await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
+                _taskScheduler.CleanupChapters(allChapterIds.ToArray());
+            }
+            return Ok();
+        }
+
         /// <summary>
         /// Returns All volumes for a series with progress information and Chapters
         /// </summary>

@@ -212,6 +241,8 @@ namespace API.Controllers
                 .Take(userParams.PageSize).ToList();
             var pagedList = new PagedList<SeriesDto>(listResults, listResults.Count, userParams.PageNumber, userParams.PageSize);
+
+            await _unitOfWork.SeriesRepository.AddSeriesModifiers(userId, pagedList);

             Response.AddPaginationHeader(pagedList.CurrentPage, pagedList.PageSize, pagedList.TotalCount, pagedList.TotalPages);

             return Ok(pagedList);
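DeleteMultipleSeries above is admin-only and accepts the DeleteSeriesDto added later in this diff. A hedged sketch of invoking it; the api/series prefix is assumed from the controller's conventional route and is not shown here:

    // Illustrative admin client call for bulk deletion; route prefix and auth token handling are assumed.
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    static class BulkDeleteExample
    {
        public static async Task DeleteSeries(HttpClient client)
        {
            var payload = new { SeriesIds = new[] { 4, 9, 27 } };
            var response = await client.PostAsJsonAsync("api/series/delete-multiple", payload);
            response.EnsureSuccessStatusCode(); // the endpoint returns 200 even when nothing changed
        }
    }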
@@ -71,10 +71,10 @@ namespace API.Controllers
         /// </summary>
         /// <returns></returns>
         [HttpPost("backup-db")]
-        public ActionResult BackupDatabase()
+        public async Task<ActionResult> BackupDatabase()
         {
             _logger.LogInformation("{UserName} is backing up database of server from admin dashboard", User.GetUsername());
-            _backupService.BackupDatabase();
+            await _backupService.BackupDatabase();

             return Ok();
         }

@@ -140,7 +140,7 @@ namespace API.Controllers
             }
         }

-        if (!_unitOfWork.HasChanges()) return Ok("Nothing was updated");
+        if (!_unitOfWork.HasChanges()) return Ok(updateSettingsDto);

         try
         {

@@ -25,7 +25,7 @@ namespace API.Controllers
         {
             try
             {
-                await _statsService.PathData(clientInfoDto);
+                await _statsService.RecordClientInfo(clientInfoDto);

                 return Ok();
             }
API/DTOs/CollectionTags/CollectionTagBulkAddDto.cs (new file, 18 lines)

using System.Collections.Generic;

namespace API.DTOs.CollectionTags
{
    public class CollectionTagBulkAddDto
    {
        /// <summary>
        /// Collection Tag Id
        /// </summary>
        /// <remarks>Can be 0 which then will use Title to create a tag</remarks>
        public int CollectionTagId { get; init; }
        public string CollectionTagTitle { get; init; }
        /// <summary>
        /// Series Ids to add onto Collection Tag
        /// </summary>
        public IEnumerable<int> SeriesIds { get; init; }
    }
}
@@ -1,4 +1,4 @@
-namespace API.DTOs
+namespace API.DTOs.CollectionTags
 {
     public class CollectionTagDto
     {

@@ -1,10 +1,10 @@
 using System.Collections.Generic;

-namespace API.DTOs
+namespace API.DTOs.CollectionTags
 {
     public class UpdateSeriesForTagDto
     {
         public CollectionTagDto Tag { get; init; }
-        public ICollection<int> SeriesIdsToRemove { get; init; }
+        public IEnumerable<int> SeriesIdsToRemove { get; init; }
     }
 }
API/DTOs/DeleteSeriesDto.cs (new file, 9 lines)

using System.Collections.Generic;

namespace API.DTOs
{
    public class DeleteSeriesDto
    {
        public IList<int> SeriesIds { get; set; }
    }
}
@@ -1,4 +1,5 @@
 using System.Collections.Generic;
+using API.DTOs.CollectionTags;
 using API.Entities;

 namespace API.DTOs

@@ -1,4 +1,5 @@
 using System.Collections.Generic;
+using API.DTOs.CollectionTags;

 namespace API.DTOs
 {
API/Data/MigrateConfigFiles.cs (new file, 166 lines)

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Services;
using Kavita.Common;

namespace API.Data
{
    public static class MigrateConfigFiles
    {
        private static readonly List<string> LooseLeafFiles = new List<string>()
        {
            "appsettings.json",
            "appsettings.Development.json",
            "kavita.db",
        };

        private static readonly List<string> AppFolders = new List<string>()
        {
            "covers",
            "stats",
            "logs",
            "backups",
            "cache",
            "temp"
        };

        private static readonly string ConfigDirectory = Path.Join(Directory.GetCurrentDirectory(), "config");


        /// <summary>
        /// In v0.4.8 we moved all config files to config/ to match with how docker was setup. This will move all config files from current directory
        /// to config/
        /// </summary>
        public static void Migrate(bool isDocker)
        {
            Console.WriteLine("Checking if migration to config/ is needed");

            if (isDocker)
            {
                if (Configuration.LogPath.Contains("config"))
                {
                    Console.WriteLine("Migration to config/ not needed");
                    return;
                }

                Console.WriteLine(
                    "Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");

                CopyAppFolders();
                DeleteAppFolders();

                UpdateConfiguration();

                Console.WriteLine("Migration complete. All config files are now in config/ directory");
                return;
            }

            if (new FileInfo(Configuration.AppSettingsFilename).Exists)
            {
                Console.WriteLine("Migration to config/ not needed");
                return;
            }

            Console.WriteLine(
                "Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");

            Console.WriteLine($"Creating {ConfigDirectory}");
            DirectoryService.ExistOrCreate(ConfigDirectory);

            try
            {
                CopyLooseLeafFiles();

                CopyAppFolders();

                // Then we need to update the config file to point to the new DB file
                UpdateConfiguration();
            }
            catch (Exception)
            {
                Console.WriteLine("There was an exception during migration. Please move everything manually.");
                return;
            }

            // Finally delete everything in the source directory
            Console.WriteLine("Removing old files");
            DeleteLooseFiles();
            DeleteAppFolders();
            Console.WriteLine("Removing old files...DONE");

            Console.WriteLine("Migration complete. All config files are now in config/ directory");
        }

        private static void DeleteAppFolders()
        {
            foreach (var folderToDelete in AppFolders)
            {
                if (!new DirectoryInfo(Path.Join(Directory.GetCurrentDirectory(), folderToDelete)).Exists) continue;

                DirectoryService.ClearAndDeleteDirectory(Path.Join(Directory.GetCurrentDirectory(), folderToDelete));
            }
        }

        private static void DeleteLooseFiles()
        {
            var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
                .Where(f => f.Exists);
            DirectoryService.DeleteFiles(configFiles.Select(f => f.FullName));
        }

        private static void CopyAppFolders()
        {
            Console.WriteLine("Moving folders to config");

            foreach (var folderToMove in AppFolders)
            {
                if (new DirectoryInfo(Path.Join(ConfigDirectory, folderToMove)).Exists) continue;

                try
                {
                    DirectoryService.CopyDirectoryToDirectory(
                        Path.Join(Directory.GetCurrentDirectory(), folderToMove),
                        Path.Join(ConfigDirectory, folderToMove));
                }
                catch (Exception)
                {
                    /* Swallow Exception */
                }
            }


            Console.WriteLine("Moving folders to config...DONE");
        }

        private static void CopyLooseLeafFiles()
        {
            var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
                .Where(f => f.Exists);
            // First step is to move all the files
            Console.WriteLine("Moving files to config/");
            foreach (var fileInfo in configFiles)
            {
                try
                {
                    fileInfo.CopyTo(Path.Join(ConfigDirectory, fileInfo.Name));
                }
                catch (Exception)
                {
                    /* Swallow exception when already exists */
                }
            }

            Console.WriteLine("Moving files to config...DONE");
        }

        private static void UpdateConfiguration()
        {
            Console.WriteLine("Updating appsettings.json to new paths");
            Configuration.DatabasePath = "config//kavita.db";
            Configuration.LogPath = "config//logs/kavita.log";
            Console.WriteLine("Updating appsettings.json to new paths...DONE");
        }
    }
}
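MigrateConfigFiles.Migrate is a one-shot, idempotent move of loose config files and app folders into config/. The diff does not show where it is invoked; presumably it runs early in startup, before anything opens the database or log files. A sketch of such a call site — the docker detection shown here is an assumption, not Kavita's actual check:

    // Hypothetical call site for the config/ migration; how docker is detected is assumed here.
    using System;
    using API.Data;

    public static class StartupMigrationSketch
    {
        public static void Run()
        {
            var isDocker = Environment.GetEnvironmentVariable("KAVITA_DOCKER") == "true"; // assumed detection
            MigrateConfigFiles.Migrate(isDocker);
        }
    }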
@@ -3,6 +3,7 @@ using System.IO;
 using System.Linq;
 using System.Threading.Tasks;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.Entities;
 using API.Interfaces.Repositories;
 using AutoMapper;

@@ -22,6 +23,11 @@ namespace API.Data.Repositories
             _mapper = mapper;
         }

+        public void Add(CollectionTag tag)
+        {
+            _context.CollectionTag.Add(tag);
+        }
+
         public void Remove(CollectionTag tag)
         {
             _context.CollectionTag.Remove(tag);
API/Data/Repositories/SeriesMetadataRepository.cs (new file, 20 lines)

using API.Entities;
using API.Interfaces.Repositories;

namespace API.Data.Repositories
{
    public class SeriesMetadataRepository : ISeriesMetadataRepository
    {
        private readonly DataContext _context;

        public SeriesMetadataRepository(DataContext context)
        {
            _context = context;
        }

        public void Update(SeriesMetadata seriesMetadata)
        {
            _context.SeriesMetadata.Update(seriesMetadata);
        }
    }
}
@@ -4,6 +4,7 @@ using System.Linq;
 using System.Threading.Tasks;
 using API.Data.Scanner;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.DTOs.Filtering;
 using API.Entities;
 using API.Extensions;

@@ -41,6 +42,11 @@ namespace API.Data.Repositories
             _context.Series.Remove(series);
         }

+        public void Remove(IEnumerable<Series> series)
+        {
+            _context.Series.RemoveRange(series);
+        }
+
         public async Task<bool> DoesSeriesNameExistInLibrary(string name)
         {
             var libraries = _context.Series

@@ -171,6 +177,21 @@ namespace API.Data.Repositories
                 .SingleOrDefaultAsync();
         }

+        /// <summary>
+        /// Returns Volumes, Metadata, and Collection Tags
+        /// </summary>
+        /// <param name="seriesIds"></param>
+        /// <returns></returns>
+        public async Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds)
+        {
+            return await _context.Series
+                .Include(s => s.Volumes)
+                .Include(s => s.Metadata)
+                .ThenInclude(m => m.CollectionTags)
+                .Where(s => seriesIds.Contains(s.Id))
+                .ToListAsync();
+        }
+
         public async Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds)
         {
             var volumes = await _context.Volume

@@ -454,15 +475,15 @@ namespace API.Data.Repositories
             // TODO: Think about making this bigger depending on number of files a user has in said library
             // and number of cores and amount of memory. We can then make an optimal choice
             var totalSeries = await GetSeriesCount(libraryId);
-            var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
-
-            if (totalSeries < procCount * 2 || totalSeries < 50)
-            {
-                return new Tuple<int, int>(totalSeries, totalSeries);
-            }
-
-            return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
+            // var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
+            //
+            // if (totalSeries < procCount * 2 || totalSeries < 50)
+            // {
+            //     return new Tuple<int, int>(totalSeries, totalSeries);
+            // }
+            //
+            // return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
+            return new Tuple<int, int>(totalSeries, 50);
         }

         public async Task<Chunk> GetChunkInfo(int libraryId = 0)

@@ -485,5 +506,13 @@ namespace API.Data.Repositories
                 TotalChunks = totalChunks
             };
         }
+
+        public async Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds)
+        {
+            return await _context.SeriesMetadata
+                .Where(sm => seriesIds.Contains(sm.SeriesId))
+                .Include(sm => sm.CollectionTags)
+                .ToListAsync();
+        }
     }
 }
@@ -35,15 +35,6 @@ namespace API.Data.Repositories
 return _mapper.Map<ServerSettingDto>(settings);
 }

-public ServerSettingDto GetSettingsDto()
-{
-var settings = _context.ServerSetting
-.Select(x => x)
-.AsNoTracking()
-.ToList();
-return _mapper.Map<ServerSettingDto>(settings);
-}
-
 public Task<ServerSetting> GetSettingAsync(ServerSettingKey key)
 {
 return _context.ServerSetting.SingleOrDefaultAsync(x => x.Key == key);
@@ -41,11 +41,11 @@ namespace API.Data

 IList<ServerSetting> defaultSettings = new List<ServerSetting>()
 {
-new() {Key = ServerSettingKey.CacheDirectory, Value = CacheService.CacheDirectory},
+new() {Key = ServerSettingKey.CacheDirectory, Value = DirectoryService.CacheDirectory},
 new () {Key = ServerSettingKey.TaskScan, Value = "daily"},
 new () {Key = ServerSettingKey.LoggingLevel, Value = "Information"}, // Not used from DB, but DB is sync with appSettings.json
 new () {Key = ServerSettingKey.TaskBackup, Value = "weekly"},
-new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "backups/"))},
+new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(DirectoryService.BackupDirectory)},
 new () {Key = ServerSettingKey.Port, Value = "5000"}, // Not used from DB, but DB is sync with appSettings.json
 new () {Key = ServerSettingKey.AllowStatCollection, Value = "true"},
 new () {Key = ServerSettingKey.EnableOpds, Value = "false"},
@@ -69,6 +69,8 @@ namespace API.Data
 Configuration.Port + string.Empty;
 context.ServerSetting.First(s => s.Key == ServerSettingKey.LoggingLevel).Value =
 Configuration.LogLevel + string.Empty;
+context.ServerSetting.First(s => s.Key == ServerSettingKey.CacheDirectory).Value =
+DirectoryService.CacheDirectory + string.Empty;

 await context.SaveChangesAsync();

@@ -34,6 +34,7 @@ namespace API.Data
 public IFileRepository FileRepository => new FileRepository(_context);
 public IChapterRepository ChapterRepository => new ChapterRepository(_context, _mapper);
 public IReadingListRepository ReadingListRepository => new ReadingListRepository(_context, _mapper);
+public ISeriesMetadataRepository SeriesMetadataRepository => new SeriesMetadataRepository(_context);

 /// <summary>
 /// Commits changes to the DB. Completes the open transaction.
@@ -8,6 +8,9 @@ namespace API.Entities
 {
 [Key]
 public ServerSettingKey Key { get; set; }
+/// <summary>
+/// The value of the Setting. Converter knows how to convert to the correct type
+/// </summary>
 public string Value { get; set; }

 /// <inheritdoc />
@@ -1,24 +0,0 @@
-using API.Interfaces.Services;
-using API.Services.Clients;
-using Microsoft.Extensions.Configuration;
-using Microsoft.Extensions.DependencyInjection;
-
-namespace API.Extensions
-{
-public static class ServiceCollectionExtensions
-{
-public static IServiceCollection AddStartupTask<T>(this IServiceCollection services)
-where T : class, IStartupTask
-=> services.AddTransient<IStartupTask, T>();
-
-public static IServiceCollection AddStatsClient(this IServiceCollection services, IConfiguration configuration)
-{
-services.AddHttpClient<StatsApiClient>(client =>
-{
-client.DefaultRequestHeaders.Add("api-key", "MsnvA2DfQqxSK5jh");
-});
-
-return services;
-}
-}
-}
@@ -1,6 +1,7 @@
 using System.Collections.Generic;
 using System.Linq;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.DTOs.Reader;
 using API.DTOs.ReadingLists;
 using API.DTOs.Settings;
@@ -15,6 +15,7 @@ namespace API.Interfaces
 IFileRepository FileRepository { get; }
 IChapterRepository ChapterRepository { get; }
 IReadingListRepository ReadingListRepository { get; }
+ISeriesMetadataRepository SeriesMetadataRepository { get; }
 bool Commit();
 Task<bool> CommitAsync();
 bool HasChanges();
@@ -1,12 +1,14 @@
 using System.Collections.Generic;
 using System.Threading.Tasks;
 using API.DTOs;
+using API.DTOs.CollectionTags;
 using API.Entities;

 namespace API.Interfaces.Repositories
 {
 public interface ICollectionTagRepository
 {
+void Add(CollectionTag tag);
 void Remove(CollectionTag tag);
 Task<IEnumerable<CollectionTagDto>> GetAllTagDtosAsync();
 Task<IEnumerable<CollectionTagDto>> SearchTagDtosAsync(string searchQuery);
API/Interfaces/Repositories/ISeriesMetadataRepository.cs (new file)
@@ -0,0 +1,9 @@
+using API.Entities;
+
+namespace API.Interfaces.Repositories
+{
+public interface ISeriesMetadataRepository
+{
+void Update(SeriesMetadata seriesMetadata);
+}
+}
@@ -13,6 +13,7 @@ namespace API.Interfaces.Repositories
 void Attach(Series series);
 void Update(Series series);
 void Remove(Series series);
+void Remove(IEnumerable<Series> series);
 Task<bool> DoesSeriesNameExistInLibrary(string name);
 /// <summary>
 /// Adds user information like progress, ratings, etc
@@ -33,6 +34,7 @@ namespace API.Interfaces.Repositories
 Task<SeriesDto> GetSeriesDtoByIdAsync(int seriesId, int userId);
 Task<bool> DeleteSeriesAsync(int seriesId);
 Task<Series> GetSeriesByIdAsync(int seriesId);
+Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds);
 Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds);
 Task<IDictionary<int, IList<int>>> GetChapterIdWithSeriesIdForSeriesAsync(int[] seriesIds);
 /// <summary>
@@ -54,5 +56,6 @@ namespace API.Interfaces.Repositories
 Task<PagedList<Series>> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams);
 Task<Series> GetFullSeriesForSeriesIdAsync(int seriesId);
 Task<Chunk> GetChunkInfo(int libraryId = 0);
+Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds);
 }
 }
@@ -10,7 +10,6 @@ namespace API.Interfaces.Repositories
 {
 void Update(ServerSetting settings);
 Task<ServerSettingDto> GetSettingsDtoAsync();
-ServerSettingDto GetSettingsDto();
 Task<ServerSetting> GetSettingAsync(ServerSettingKey key);
 Task<IEnumerable<ServerSetting>> GetSettingsAsync();

@@ -12,21 +12,9 @@ namespace API.Interfaces.Services
 /// <param name="rootPath">Absolute path of directory to scan.</param>
 /// <returns>List of folder names</returns>
 IEnumerable<string> ListDirectory(string rootPath);
-/// <summary>
-/// Gets files in a directory. If searchPatternExpression is passed, will match the regex against for filtering.
-/// </summary>
-/// <param name="path"></param>
-/// <param name="searchPatternExpression"></param>
-/// <returns></returns>
-string[] GetFilesWithExtension(string path, string searchPatternExpression = "");
 Task<byte[]> ReadFileAsync(string path);
 bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "");
 bool Exists(string directory);
-
-IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
-SearchOption searchOption = SearchOption.TopDirectoryOnly);
-
 void CopyFileToDirectory(string fullFilePath, string targetDirectory);
-public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*");
 }
 }
@@ -5,7 +5,7 @@ namespace API.Interfaces.Services
 {
 public interface IStatsService
 {
-Task PathData(ClientInfoDto clientInfoDto);
-Task CollectAndSendStatsData();
+Task RecordClientInfo(ClientInfoDto clientInfoDto);
+Task Send();
 }
 }
@@ -24,11 +24,25 @@ namespace API.Parser
 private const RegexOptions MatchOptions =
 RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant;

-public static readonly Regex FontSrcUrlRegex = new Regex(@"(src:url\(.{1})" + "([^\"']*)" + @"(.{1}\))",
+/// <summary>
+/// Matches against font-family css syntax. Does not match if url import has data: starting, as that is binary data
+/// </summary>
+/// <remarks>See here for some examples https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face</remarks>
+public static readonly Regex FontSrcUrlRegex = new Regex(@"(?<Start>(src:\s?)?url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(?<End>.{1}\))",
 MatchOptions, RegexTimeout);
-public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s[\"|'])(?<Filename>[\\w\\d/\\._-]+)([\"|'];?)",
+/// <summary>
+/// https://developer.mozilla.org/en-US/docs/Web/CSS/@import
+/// </summary>
+public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s([\"|']|url\\([\"|']))(?<Filename>[^'\"]+)([\"|']\\)?);",
+MatchOptions | RegexOptions.Multiline, RegexTimeout);
+/// <summary>
+/// Misc css image references, like background-image: url(), border-image, or list-style-image
+/// </summary>
+/// Original prepend: (background|border|list-style)-image:\s?)?
+public static readonly Regex CssImageUrlRegex = new Regex(@"(url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(.\))",
 MatchOptions, RegexTimeout);


 private static readonly string XmlRegexExtensions = @"\.xml";
 private static readonly Regex ImageRegex = new Regex(ImageFileExtensions,
 MatchOptions, RegexTimeout);
@@ -212,7 +226,7 @@ namespace API.Parser
 MatchOptions, RegexTimeout),
 // Baketeriya ch01-05.zip, Akiiro Bousou Biyori - 01.jpg, Beelzebub_172_RHS.zip, Cynthia the Mission 29.rar, A Compendium of Ghosts - 031 - The Third Story_ Part 12 (Digital) (Cobalt001)
 new Regex(
-@"^(?!Vol\.?)(?<Series>.+?)( |_|-)(?<!-)(ch)?\d+-?\d*",
+@"^(?!Vol\.?)(?!Chapter)(?<Series>.+?)(\s|_|-)(?<!-)(ch|chapter)?\.?\d+-?\d*",
 MatchOptions, RegexTimeout),
 // [BAA]_Darker_than_Black_c1 (This is very greedy, make sure it's close to last)
 new Regex(
@@ -533,14 +547,16 @@ namespace API.Parser
 ret.Edition = edition;
 }

-var isSpecial = ParseMangaSpecial(fileName);
+var isSpecial = type == LibraryType.Comic ? ParseComicSpecial(fileName) : ParseMangaSpecial(fileName);
 // We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that
 // could cause a problem as Omake is a special term, but there is valid volume/chapter information.
 if (ret.Chapters == DefaultChapter && ret.Volumes == DefaultVolume && !string.IsNullOrEmpty(isSpecial))
 {
 ret.IsSpecial = true;
+ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
 }

+// If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name
 if (HasSpecialMarker(fileName))
 {
 ret.IsSpecial = true;
@@ -549,8 +565,6 @@ namespace API.Parser

 ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
 }
-// here is the issue. If we are a special with marker, we need to ensure we use the correct series name.
-// we can do this by falling back

 if (string.IsNullOrEmpty(ret.Series))
 {
@@ -594,8 +608,6 @@ namespace API.Parser
 {
 ret.Chapters = parsedChapter;
 }
-
-continue;
 }

 var series = ParseSeries(folder);
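To make the intent of the reworked CSS regexes easier to see, here is a small standalone sketch (not part of the PR) that exercises the new CssImageUrlRegex. The pattern string is copied from the hunk above; the options, timeout, and sample CSS inputs are stand-ins chosen for illustration.

```csharp
using System;
using System.Text.RegularExpressions;

public static class CssImageRegexDemo
{
    // Pattern copied from the CssImageUrlRegex line in the hunk above (MatchOptions/RegexTimeout inlined for a standalone sketch).
    private static readonly Regex CssImageUrlRegex = new Regex(
        @"(url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(.\))",
        RegexOptions.IgnoreCase, TimeSpan.FromMilliseconds(500));

    public static void Main()
    {
        // A quoted file reference is captured via the Filename group, so it can later be rewritten to an API-served path.
        Console.WriteLine(CssImageUrlRegex.Match("background-image: url('images/cover.jpg');").Groups["Filename"].Value);

        // Inlined binary data is deliberately skipped by the (?!data:) lookaheads, so this prints False.
        Console.WriteLine(CssImageUrlRegex.IsMatch("background-image: url(data:image/png;base64,AAAA);"));
    }
}
```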
API/Program.cs
@@ -1,98 +1,127 @@
 using System;
+using System.Collections.Generic;
 using System.IO;
+using System.Linq;
 using System.Security.Cryptography;
 using System.Threading.Tasks;
 using API.Data;
 using API.Entities;
 using API.Services;
 using Kavita.Common;
+using Kavita.Common.EnvironmentInfo;
 using Microsoft.AspNetCore.Hosting;
 using Microsoft.AspNetCore.Identity;
 using Microsoft.AspNetCore.Server.Kestrel.Core;
 using Microsoft.EntityFrameworkCore;
+using Microsoft.Extensions.Configuration;
 using Microsoft.Extensions.DependencyInjection;
 using Microsoft.Extensions.Hosting;
 using Microsoft.Extensions.Logging;

 namespace API
 {
 public class Program
 {
 private static readonly int HttpPort = Configuration.Port;

 protected Program()
 {
 }

 public static async Task Main(string[] args)
 {
 Console.OutputEncoding = System.Text.Encoding.UTF8;
-// Before anything, check if JWT has been generated properly or if user still has default
-if (!Configuration.CheckIfJwtTokenSet() &&
-Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
-{
-Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
-var rBytes = new byte[128];
-using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
-Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
-}
+var isDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker;
+
+MigrateConfigFiles.Migrate(isDocker);

-var host = CreateHostBuilder(args).Build();
+// Before anything, check if JWT has been generated properly or if user still has default
+if (!Configuration.CheckIfJwtTokenSet() &&
+Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
+{
+Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
+var rBytes = new byte[128];
+using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
+Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
+}
+
-using var scope = host.Services.CreateScope();
-var services = scope.ServiceProvider;
+var host = CreateHostBuilder(args).Build();

-try
-{
-var context = services.GetRequiredService<DataContext>();
-var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
+using var scope = host.Services.CreateScope();
+var services = scope.ServiceProvider;

-var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
 try
 {
-// If this is a new install, tables wont exist yet
+var context = services.GetRequiredService<DataContext>();
+var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
+
+if (isDocker && new FileInfo("data/appsettings.json").Exists)
+{
+var logger = services.GetRequiredService<ILogger<Startup>>();
+logger.LogCritical("WARNING! Mount point is incorrect, nothing here will persist. Please change your container mount from /kavita/data to /kavita/config");
+return;
+}
+
+var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
+try
+{
+// If this is a new install, tables wont exist yet
+if (requiresCoverImageMigration)
+{
+MigrateCoverImages.ExtractToImages(context);
+}
+}
+catch (Exception)
+{
+requiresCoverImageMigration = false;
+}
+
+// Apply all migrations on startup
+await context.Database.MigrateAsync();
+
 if (requiresCoverImageMigration)
 {
-MigrateCoverImages.ExtractToImages(context);
+await MigrateCoverImages.UpdateDatabaseWithImages(context);
 }

+await Seed.SeedRoles(roleManager);
+await Seed.SeedSettings(context);
+await Seed.SeedUserApiKeys(context);
 }
-catch (Exception )
+catch (Exception ex)
 {
-requiresCoverImageMigration = false;
+var logger = services.GetRequiredService<ILogger<Program>>();
+logger.LogError(ex, "An error occurred during migration");
 }

-// Apply all migrations on startup
-await context.Database.MigrateAsync();
+await host.RunAsync();
+}

-if (requiresCoverImageMigration)
-{
-await MigrateCoverImages.UpdateDatabaseWithImages(context);
-}
-await Seed.SeedRoles(roleManager);
-await Seed.SeedSettings(context);
-await Seed.SeedUserApiKeys(context);
-}
-catch (Exception ex)
-{
-var logger = services.GetRequiredService<ILogger<Program>>();
-logger.LogError(ex, "An error occurred during migration");
-}
+private static IHostBuilder CreateHostBuilder(string[] args) =>
+Host.CreateDefaultBuilder(args)
+.ConfigureAppConfiguration((hostingContext, config) =>
+{
+config.Sources.Clear();
+
+var env = hostingContext.HostingEnvironment;

-await host.RunAsync();
-}
+config.AddJsonFile("config/appsettings.json", optional: true, reloadOnChange: false)
+.AddJsonFile($"config/appsettings.{env.EnvironmentName}.json",
+optional: true, reloadOnChange: false);
+})
+.ConfigureWebHostDefaults(webBuilder =>
+{
+webBuilder.UseKestrel((opts) =>
+{
+opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
+});
+
-private static IHostBuilder CreateHostBuilder(string[] args) =>
-Host.CreateDefaultBuilder(args)
-.ConfigureWebHostDefaults(webBuilder =>
-{
-webBuilder.UseKestrel((opts) =>
-{
-opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
-});
-
-webBuilder.UseStartup<Startup>();
-});
-}
+webBuilder.UseStartup<Startup>();
+});
+}
 }
 }
@@ -123,12 +123,24 @@ namespace API.Services
 /// </summary>
 /// <param name="entryFullNames"></param>
 /// <returns>Entry name of match, null if no match</returns>
-public string FirstFileEntry(IEnumerable<string> entryFullNames)
+public static string FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
 {
-var result = entryFullNames.OrderBy(Path.GetFileName, new NaturalSortComparer())
-.FirstOrDefault(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
-&& Parser.Parser.IsImage(x)
-&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
+// First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
+// because NaturalSortComparer does not work with paths and doesn't seem 001.jpg as before chapter 1/001.jpg.
+var fullNames = entryFullNames.Where(x =>!Parser.Parser.HasBlacklistedFolderInPath(x)
+&& Parser.Parser.IsImage(x)
+&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)).ToList();
+if (fullNames.Count == 0) return null;
+
+var nonNestedFile = fullNames.Where(entry => (Path.GetDirectoryName(entry) ?? string.Empty).Equals(archiveName))
+.OrderBy(Path.GetFullPath, new NaturalSortComparer())
+.FirstOrDefault();
+
+if (!string.IsNullOrEmpty(nonNestedFile)) return nonNestedFile;
+
+var result = fullNames
+.OrderBy(Path.GetFileName, new NaturalSortComparer())
+.FirstOrDefault();
+
 return string.IsNullOrEmpty(result) ? null : result;
 }
@@ -158,7 +170,7 @@ namespace API.Services
 using var archive = ZipFile.OpenRead(archivePath);
 var entryNames = archive.Entries.Select(e => e.FullName).ToArray();

-var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
+var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
 var entry = archive.Entries.Single(e => e.FullName == entryName);
 using var stream = entry.Open();

@@ -169,7 +181,7 @@ namespace API.Services
 using var archive = ArchiveFactory.Open(archivePath);
 var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();

-var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
+var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
 var entry = archive.Entries.Single(e => e.Key == entryName);

 using var stream = entry.OpenEntryStream();
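The hunk above prefers archive entries that are not nested in a subfolder before falling back to a plain natural sort of file names. A rough, self-contained sketch of why that matters is below; it uses an ordinal sort and a simple root-level check rather than the real NaturalSortComparer and archive-name comparison, so the entry paths and logic here are illustrative only.

```csharp
using System;
using System.IO;
using System.Linq;

public static class FirstEntryDemo
{
    public static void Main()
    {
        // Hypothetical entry list: one root-level page plus pages nested one folder deep.
        var entries = new[] { "Special/001.jpg", "001.jpg", "Special/000.jpg" };

        // Sorting by file name alone can pick a page buried in a subfolder ("Special/000.jpg" here),
        // which is the behavior the change works around by checking non-nested entries first.
        var byFileNameOnly = entries.OrderBy(Path.GetFileName, StringComparer.OrdinalIgnoreCase).First();
        var rootFirst = entries.Where(e => string.IsNullOrEmpty(Path.GetDirectoryName(e)))
            .DefaultIfEmpty(byFileNameOnly)
            .First();

        Console.WriteLine($"file-name sort picks: {byFileNameOnly}, root-first picks: {rootFirst}");
    }
}
```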
@@ -140,15 +140,22 @@ namespace API.Services
 }

 stylesheetHtml = stylesheetHtml.Insert(0, importBuilder.ToString());
-stylesheetHtml =
-Parser.Parser.CssImportUrlRegex.Replace(stylesheetHtml, "$1" + apiBase + prepend + "$2" + "$3");
+var importMatches = Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml);
+foreach (Match match in importMatches)
+{
+if (!match.Success) continue;
+var importFile = match.Groups["Filename"].Value;
+stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
+}
+
+// Check if there are any background images and rewrite those urls
+EscapeCssImageReferences(ref stylesheetHtml, apiBase, book);
+
 var styleContent = RemoveWhiteSpaceFromStylesheets(stylesheetHtml);
-styleContent =
-Parser.Parser.FontSrcUrlRegex.Replace(styleContent, "$1" + apiBase + "$2" + "$3");
-
 styleContent = styleContent.Replace("body", ".reading-section");

+if (string.IsNullOrEmpty(styleContent)) return string.Empty;
+
 var stylesheet = await _cssParser.ParseAsync(styleContent);
 foreach (var styleRule in stylesheet.StyleRules)
 {
@@ -165,6 +172,21 @@ namespace API.Services
 return RemoveWhiteSpaceFromStylesheets(stylesheet.ToCss());
 }
+
+private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book)
+{
+var matches = Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml);
+foreach (Match match in matches)
+{
+if (!match.Success) continue;
+
+var importFile = match.Groups["Filename"].Value;
+var key = CleanContentKeys(importFile);
+if (!book.Content.AllFiles.ContainsKey(key)) continue;
+
+stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + key);
+}
+}
+
 public ComicInfo GetComicInfo(string filePath)
 {
 if (!IsValidFile(filePath) || Parser.Parser.IsPdf(filePath)) return null;
@@ -488,15 +510,29 @@ namespace API.Services

 private static string RemoveWhiteSpaceFromStylesheets(string body)
 {
+if (string.IsNullOrEmpty(body))
+{
+return string.Empty;
+}
+
+// Remove comments from CSS
+body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
+
 body = Regex.Replace(body, @"[a-zA-Z]+#", "#");
 body = Regex.Replace(body, @"[\n\r]+\s*", string.Empty);
 body = Regex.Replace(body, @"\s+", " ");
 body = Regex.Replace(body, @"\s?([:,;{}])\s?", "$1");
-body = body.Replace(";}", "}");
+try
+{
+body = body.Replace(";}", "}");
+}
+catch (Exception)
+{
+/* Swallow exception. Some css doesn't have style rules ending in ; */
+}
+
 body = Regex.Replace(body, @"([\s:]0)(px|pt|%|em)", "$1");

-// Remove comments from CSS
-body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
-
 return body;
 }
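The book service now rewrites each matched @import filename individually instead of doing a single Regex.Replace. A minimal standalone sketch of that per-match rewrite is below; the regex string is the one from the Parser hunk, while the apiBase value, sample stylesheet, and surrounding class are hypothetical and only stand in for the real EPUB resource path handling.

```csharp
using System;
using System.Text.RegularExpressions;

public static class ImportRewriteDemo
{
    public static void Main()
    {
        const string apiBase = "/api/book/1/book-resources?file="; // hypothetical base path, for illustration only
        var stylesheet = "@import 'chapter.css';\nbody { margin: 0; }";

        // Same pattern as CssImportUrlRegex in the Parser hunk; options inlined for a standalone sketch.
        var importRegex = new Regex("(@import\\s([\"|']|url\\([\"|']))(?<Filename>[^'\"]+)([\"|']\\)?);", RegexOptions.Multiline);
        foreach (Match match in importRegex.Matches(stylesheet))
        {
            // Replace the captured filename with a server-served path, one match at a time.
            var importFile = match.Groups["Filename"].Value;
            stylesheet = stylesheet.Replace(importFile, apiBase + importFile);
        }

        Console.WriteLine(stylesheet); // @import '/api/book/1/book-resources?file=chapter.css'; ...
    }
}
```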
@@ -21,7 +21,6 @@ namespace API.Services
 private readonly IDirectoryService _directoryService;
 private readonly IBookService _bookService;
 private readonly NumericComparer _numericComparer;
-public static readonly string CacheDirectory = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "cache/"));

 public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork, IArchiveService archiveService,
 IDirectoryService directoryService, IBookService bookService)
@@ -38,7 +37,7 @@ namespace API.Services
 {
 if (!DirectoryService.ExistOrCreate(DirectoryService.CacheDirectory))
 {
-_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", CacheDirectory);
+_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", DirectoryService.CacheDirectory);
 }
 }

@@ -102,7 +101,7 @@ namespace API.Services
 }
 else
 {
-_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
+DirectoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
 Parser.Parser.ImageFileExtensions);
 }

@@ -147,7 +146,7 @@ namespace API.Services

 try
 {
-DirectoryService.ClearDirectory(CacheDirectory);
+DirectoryService.ClearDirectory(DirectoryService.CacheDirectory);
 }
 catch (Exception ex)
 {
@@ -198,7 +197,7 @@ namespace API.Services
 if (page <= (mangaFile.Pages + pagesSoFar))
 {
 var path = GetCachePath(chapter.Id);
-var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
+var files = DirectoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
 Array.Sort(files, _numericComparer);

 if (files.Length == 0)
@@ -1,55 +0,0 @@
-using System;
-using System.Net.Http;
-using System.Net.Http.Json;
-using System.Threading.Tasks;
-using API.DTOs.Stats;
-using Microsoft.Extensions.Logging;
-
-namespace API.Services.Clients
-{
-public class StatsApiClient
-{
-private readonly HttpClient _client;
-private readonly ILogger<StatsApiClient> _logger;
-#pragma warning disable S1075
-private const string ApiUrl = "http://stats.kavitareader.com";
-#pragma warning restore S1075
-
-public StatsApiClient(HttpClient client, ILogger<StatsApiClient> logger)
-{
-_client = client;
-_logger = logger;
-_client.Timeout = TimeSpan.FromSeconds(30);
-}
-
-public async Task SendDataToStatsServer(UsageStatisticsDto data)
-{
-var responseContent = string.Empty;
-
-try
-{
-using var response = await _client.PostAsJsonAsync(ApiUrl + "/api/InstallationStats", data);
-
-responseContent = await response.Content.ReadAsStringAsync();
-
-response.EnsureSuccessStatusCode();
-}
-catch (HttpRequestException e)
-{
-var info = new
-{
-dataSent = data,
-response = responseContent
-};
-
-_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
-throw;
-}
-catch (Exception e)
-{
-_logger.LogError(e, "An error happened during the request to KavitaStats");
-throw;
-}
-}
-}
-}
@@ -16,10 +16,12 @@ namespace API.Services
 private static readonly Regex ExcludeDirectories = new Regex(
 @"@eaDir|\.DS_Store",
 RegexOptions.Compiled | RegexOptions.IgnoreCase);
-public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
-public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
-public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "cache");
-public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "covers");
+public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "temp");
+public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "logs");
+public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "cache");
+public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "covers");
+public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
+public static readonly string StatsDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "stats");

 public DirectoryService(ILogger<DirectoryService> logger)
 {
@@ -95,7 +97,7 @@ namespace API.Services
 return di.Exists;
 }

-public IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
+public static IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
 SearchOption searchOption = SearchOption.TopDirectoryOnly)
 {
 if (searchPatternExpression != string.Empty)
@@ -134,13 +136,10 @@ namespace API.Services
 /// <param name="searchPattern">Defaults to *, meaning all files</param>
 /// <returns></returns>
 /// <exception cref="DirectoryNotFoundException"></exception>
-public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*")
+public static bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "")
 {
 if (string.IsNullOrEmpty(sourceDirName)) return false;

-var di = new DirectoryInfo(sourceDirName);
-if (!di.Exists) return false;
-
 // Get the subdirectories for the specified directory.
 var dir = new DirectoryInfo(sourceDirName);

@@ -154,7 +153,7 @@ namespace API.Services
 var dirs = dir.GetDirectories();

 // If the destination directory doesn't exist, create it.
-Directory.CreateDirectory(destDirName);
+ExistOrCreate(destDirName);

 // Get the files in the directory and copy them to the new location.
 var files = GetFilesWithExtension(dir.FullName, searchPattern).Select(n => new FileInfo(n));
@@ -176,7 +175,7 @@ namespace API.Services



-public string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
+public static string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
 {
 if (searchPatternExpression != string.Empty)
 {
@ -13,7 +13,6 @@ namespace API.Services
|
|||||||
public class ImageService : IImageService
|
public class ImageService : IImageService
|
||||||
{
|
{
|
||||||
private readonly ILogger<ImageService> _logger;
|
private readonly ILogger<ImageService> _logger;
|
||||||
private readonly IDirectoryService _directoryService;
|
|
||||||
public const string ChapterCoverImageRegex = @"v\d+_c\d+";
|
public const string ChapterCoverImageRegex = @"v\d+_c\d+";
|
||||||
public const string SeriesCoverImageRegex = @"seres\d+";
|
public const string SeriesCoverImageRegex = @"seres\d+";
|
||||||
public const string CollectionTagCoverImageRegex = @"tag\d+";
|
public const string CollectionTagCoverImageRegex = @"tag\d+";
|
||||||
@ -24,10 +23,9 @@ namespace API.Services
|
|||||||
/// </summary>
|
/// </summary>
|
||||||
private const int ThumbnailWidth = 320;
|
private const int ThumbnailWidth = 320;
|
||||||
|
|
||||||
public ImageService(ILogger<ImageService> logger, IDirectoryService directoryService)
|
public ImageService(ILogger<ImageService> logger)
|
||||||
{
|
{
|
||||||
_logger = logger;
|
_logger = logger;
|
||||||
_directoryService = directoryService;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// <summary>
|
/// <summary>
|
||||||
@ -44,7 +42,7 @@ namespace API.Services
|
|||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
var firstImage = _directoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
|
var firstImage = DirectoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
|
||||||
.OrderBy(f => f, new NaturalSortComparer()).FirstOrDefault();
|
.OrderBy(f => f, new NaturalSortComparer()).FirstOrDefault();
|
||||||
|
|
||||||
return firstImage;
|
return firstImage;
|
||||||
|
@@ -1,3 +1,4 @@
+using System;
 using System.Collections.Generic;
 using System.Diagnostics;
 using System.IO;
@@ -216,37 +217,45 @@ namespace API.Services
 var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
 var stopwatch = Stopwatch.StartNew();
 var totalTime = 0L;
-_logger.LogDebug($"[MetadataService] Refreshing Library {library.Name}. Total Items: {chunkInfo.TotalSize}. Total Chunks: {chunkInfo.TotalChunks} with {chunkInfo.ChunkSize} size.");
+_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);

-// This technically does
 for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
 {
+if (chunkInfo.TotalChunks == 0) continue;
 totalTime += stopwatch.ElapsedMilliseconds;
 stopwatch.Restart();
-_logger.LogDebug($"[MetadataService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
+_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
+chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
 var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
 new UserParams()
 {
 PageNumber = chunk,
 PageSize = chunkInfo.ChunkSize
 });
-_logger.LogDebug($"[MetadataService] Fetched {nonLibrarySeries.Count} series for refresh");
+_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
 Parallel.ForEach(nonLibrarySeries, series =>
 {
-_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
-var volumeUpdated = false;
-foreach (var volume in series.Volumes)
+try
 {
-var chapterUpdated = false;
-foreach (var chapter in volume.Chapters)
+_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
+var volumeUpdated = false;
+foreach (var volume in series.Volumes)
 {
-chapterUpdated = UpdateMetadata(chapter, forceUpdate);
+var chapterUpdated = false;
+foreach (var chapter in volume.Chapters)
+{
+chapterUpdated = UpdateMetadata(chapter, forceUpdate);
+}
+
+volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
 }

-volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
+UpdateMetadata(series, volumeUpdated || forceUpdate);
+}
+catch (Exception)
+{
+/* Swallow exception */
 }

-UpdateMetadata(series, volumeUpdated || forceUpdate);
 });

 if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
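The metadata refresh above works through the library in fixed-size chunks (now a flat page size of 50, per the repository hunk earlier). Here is a small self-contained sketch of that pattern; the Chunk shape mirrors the TotalSize/ChunkSize/TotalChunks fields used in the logging, but the GetChunkInfo math and the in-memory "fetch" are stand-ins, not the real repository API.

```csharp
using System;
using System.Linq;

public record Chunk(int TotalSize, int ChunkSize, int TotalChunks);

public static class ChunkedRefreshDemo
{
    // Assumed chunk math for the sketch: fixed page size, rounded-up chunk count.
    public static Chunk GetChunkInfo(int totalSeries, int chunkSize = 50) =>
        new Chunk(totalSeries, chunkSize, (int)Math.Ceiling(totalSeries / (float)chunkSize));

    public static void Main()
    {
        var allSeriesIds = Enumerable.Range(1, 120).ToList(); // pretend library with 120 series
        var chunkInfo = GetChunkInfo(allSeriesIds.Count);

        for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
        {
            // Stand-in for GetFullSeriesForLibraryIdAsync(libraryId, new UserParams { PageNumber = chunk, PageSize = chunkInfo.ChunkSize })
            var page = allSeriesIds.Skip((chunk - 1) * chunkInfo.ChunkSize).Take(chunkInfo.ChunkSize).ToList();
            Console.WriteLine($"chunk {chunk}/{chunkInfo.TotalChunks}: {page.Count} series");
        }
    }
}
```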
@@ -89,7 +89,7 @@ namespace API.Services
 }

 _logger.LogDebug("Scheduling stat collection daily");
-RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.CollectAndSendStatsData(), Cron.Daily, TimeZoneInfo.Local);
+RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.Send(), Cron.Daily, TimeZoneInfo.Local);
 }

 public void CancelStatsTasks()
@@ -102,7 +102,7 @@ namespace API.Services
 public void RunStatCollection()
 {
 _logger.LogInformation("Enqueuing stat collection");
-BackgroundJob.Enqueue(() => _statsService.CollectAndSendStatsData());
+BackgroundJob.Enqueue(() => _statsService.Send());
 }

 #endregion
@@ -138,8 +138,7 @@ namespace API.Services

 public void CleanupTemp()
 {
-var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
-BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(tempDirectory));
+BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(DirectoryService.TempDirectory));
 }

 public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = true)
@@ -9,6 +9,7 @@ using API.Extensions;
 using API.Interfaces;
 using API.Interfaces.Services;
 using Hangfire;
+using Kavita.Common.EnvironmentInfo;
 using Microsoft.Extensions.Configuration;
 using Microsoft.Extensions.Logging;

@@ -19,8 +20,8 @@ namespace API.Services.Tasks
 private readonly IUnitOfWork _unitOfWork;
 private readonly ILogger<BackupService> _logger;
 private readonly IDirectoryService _directoryService;
-private readonly string _tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
-private readonly string _logDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
+private readonly string _tempDirectory = DirectoryService.TempDirectory;
+private readonly string _logDirectory = DirectoryService.LogDirectory;

 private readonly IList<string> _backupFiles;

@@ -33,15 +34,32 @@ namespace API.Services.Tasks
 var maxRollingFiles = config.GetMaxRollingFiles();
 var loggingSection = config.GetLoggingFileName();
 var files = LogFiles(maxRollingFiles, loggingSection);
-_backupFiles = new List<string>()
+if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
 {
-"appsettings.json",
-"Hangfire.db",
-"Hangfire-log.db",
-"kavita.db",
-"kavita.db-shm", // This wont always be there
-"kavita.db-wal", // This wont always be there
-};
+_backupFiles = new List<string>()
+{
+"data/appsettings.json",
+"data/Hangfire.db",
+"data/Hangfire-log.db",
+"data/kavita.db",
+"data/kavita.db-shm", // This wont always be there
+"data/kavita.db-wal" // This wont always be there
+};
+}
+else
+{
+_backupFiles = new List<string>()
+{
+"appsettings.json",
+"Hangfire.db",
+"Hangfire-log.db",
+"kavita.db",
+"kavita.db-shm", // This wont always be there
+"kavita.db-wal" // This wont always be there
+};
+}
+
 foreach (var file in files.Select(f => (new FileInfo(f)).Name).ToList())
 {
 _backupFiles.Add(file);
@@ -54,7 +72,7 @@ namespace API.Services.Tasks
 var fi = new FileInfo(logFileName);

 var files = maxRollingFiles > 0
-? _directoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
+? DirectoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
 : new[] {"kavita.log"};
 return files;
 }
@@ -129,6 +147,11 @@ namespace API.Services.Tasks
 {
 // Swallow exception. This can be a duplicate cover being copied as chapter and volumes can share same file.
 }
+
+if (!DirectoryService.GetFiles(outputTempDir).Any())
+{
+DirectoryService.ClearAndDeleteDirectory(outputTempDir);
+}
 }

 /// <summary>
@@ -141,7 +164,7 @@ namespace API.Services.Tasks
 var backupDirectory = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Result.Value;
 if (!_directoryService.Exists(backupDirectory)) return;
 var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
-var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
+var allBackups = DirectoryService.GetFiles(backupDirectory).ToList();
 var expiredBackups = allBackups.Select(filename => new FileInfo(filename))
 .Where(f => f.CreationTime > deltaTime)
 .ToList();
@ -16,16 +16,14 @@ namespace API.Services.Tasks
|
|||||||
private readonly ILogger<CleanupService> _logger;
|
private readonly ILogger<CleanupService> _logger;
|
||||||
private readonly IBackupService _backupService;
|
private readonly IBackupService _backupService;
|
||||||
private readonly IUnitOfWork _unitOfWork;
|
private readonly IUnitOfWork _unitOfWork;
|
||||||
private readonly IDirectoryService _directoryService;
|
|
||||||
|
|
||||||
public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
|
public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
|
||||||
IBackupService backupService, IUnitOfWork unitOfWork, IDirectoryService directoryService)
|
IBackupService backupService, IUnitOfWork unitOfWork)
|
||||||
{
|
{
|
||||||
_cacheService = cacheService;
|
_cacheService = cacheService;
|
||||||
_logger = logger;
|
_logger = logger;
|
||||||
_backupService = backupService;
|
_backupService = backupService;
|
||||||
_unitOfWork = unitOfWork;
|
_unitOfWork = unitOfWork;
|
||||||
_directoryService = directoryService;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
public void CleanupCacheDirectory()
|
public void CleanupCacheDirectory()
|
||||||
@ -42,7 +40,7 @@ namespace API.Services.Tasks
|
|||||||
{
|
{
|
||||||
_logger.LogInformation("Starting Cleanup");
|
_logger.LogInformation("Starting Cleanup");
|
||||||
_logger.LogInformation("Cleaning temp directory");
|
_logger.LogInformation("Cleaning temp directory");
|
||||||
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
|
var tempDirectory = DirectoryService.TempDirectory;
|
||||||
DirectoryService.ClearDirectory(tempDirectory);
|
DirectoryService.ClearDirectory(tempDirectory);
|
||||||
CleanupCacheDirectory();
|
CleanupCacheDirectory();
|
||||||
_logger.LogInformation("Cleaning old database backups");
|
_logger.LogInformation("Cleaning old database backups");
|
||||||
@ -57,7 +55,7 @@ namespace API.Services.Tasks
|
|||||||
private async Task DeleteSeriesCoverImages()
|
private async Task DeleteSeriesCoverImages()
|
||||||
{
|
{
|
||||||
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
|
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
|
||||||
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
|
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
|
||||||
foreach (var file in files)
|
foreach (var file in files)
|
||||||
{
|
{
|
||||||
if (images.Contains(Path.GetFileName(file))) continue;
|
if (images.Contains(Path.GetFileName(file))) continue;
|
||||||
@ -69,7 +67,7 @@ namespace API.Services.Tasks
|
|||||||
private async Task DeleteChapterCoverImages()
|
private async Task DeleteChapterCoverImages()
|
||||||
{
|
{
|
||||||
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
|
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
|
||||||
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
|
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
|
||||||
foreach (var file in files)
|
foreach (var file in files)
|
||||||
{
|
{
|
||||||
if (images.Contains(Path.GetFileName(file))) continue;
|
if (images.Contains(Path.GetFileName(file))) continue;
|
||||||
@ -81,7 +79,7 @@ namespace API.Services.Tasks
|
|||||||
private async Task DeleteTagCoverImages()
|
private async Task DeleteTagCoverImages()
|
||||||
{
|
{
|
||||||
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
|
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
|
||||||
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
|
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
|
||||||
foreach (var file in files)
|
foreach (var file in files)
|
||||||
{
|
{
|
||||||
if (images.Contains(Path.GetFileName(file))) continue;
|
if (images.Contains(Path.GetFileName(file))) continue;
|
||||||
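The three Delete*CoverImages methods above share one pattern: load the cover file names the database still references, then walk the cover directory and delete anything not in that set. A minimal, self-contained sketch of that pattern follows; the helper names here are illustrative, not Kavita's actual API.

```csharp
// Sketch of the orphaned-cover cleanup pattern used above (illustrative names only):
// keep any file the database still references, delete the rest.
using System.Collections.Generic;
using System.IO;

public static class OrphanCleanup
{
    public static void DeleteUnreferenced(string coverDirectory, ISet<string> referencedFileNames)
    {
        foreach (var file in Directory.EnumerateFiles(coverDirectory))
        {
            // Compare by file name only, mirroring Path.GetFileName(file) in the service
            if (referencedFileNames.Contains(Path.GetFileName(file))) continue;
            File.Delete(file);
        }
    }
}
```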
@@ -73,9 +73,13 @@ namespace API.Services.Tasks.Scanner
                 info = Parser.Parser.Parse(path, rootPath, type);
             }

+            // If we couldn't match, log. But don't log if the file parses as a cover image
             if (info == null)
             {
-                _logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
+                if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
+                {
+                    _logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
+                }
                 return;
             }

@@ -133,13 +137,11 @@ namespace API.Services.Tasks.Scanner
         public string MergeName(ParserInfo info)
         {
             var normalizedSeries = Parser.Parser.Normalize(info.Series);
-            _logger.LogDebug("Checking if we can merge {NormalizedSeries}", normalizedSeries);
             var existingName =
                 _scannedSeries.SingleOrDefault(p => Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries && p.Key.Format == info.Format)
                     .Key;
             if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
             {
-                _logger.LogDebug("Found duplicate parsed infos, merged {Original} into {Merged}", info.Series, existingName.Name);
                 return existingName.Name;
             }

@@ -261,13 +261,15 @@ namespace API.Services.Tasks
             var totalTime = 0L;

             // Update existing series
-            _logger.LogDebug("[ScannerService] Updating existing series");
+            _logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size",
+                library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
             for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
             {
                 if (chunkInfo.TotalChunks == 0) continue;
                 totalTime += stopwatch.ElapsedMilliseconds;
                 stopwatch.Restart();
-                _logger.LogDebug($"[ScannerService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
+                _logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
+                    chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
                 var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams()
                 {
                     PageNumber = chunk,
@@ -299,7 +301,21 @@ namespace API.Services.Tasks
                     UpdateSeries(series, parsedSeries);
                 });

-                await _unitOfWork.CommitAsync();
+                try
+                {
+                    await _unitOfWork.CommitAsync();
+                }
+                catch (Exception ex)
+                {
+                    _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB. If debug mode, series to check will be printed", chunk);
+                    foreach (var series in nonLibrarySeries)
+                    {
+                        _logger.LogDebug("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName);
+                    }
+                    await _messageHub.Clients.All.SendAsync(SignalREvents.ScanLibraryError,
+                        MessageFactory.ScanLibraryError(library.Id));
+                    continue;
+                }
                 _logger.LogInformation(
                     "[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}",
                     chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name);
@@ -320,12 +336,14 @@ namespace API.Services.Tasks
             _logger.LogDebug("[ScannerService] Adding new series");
             var newSeries = new List<Series>();
             var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList();
+            _logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. There should be {DeltaToParsedSeries} new series",
+                allSeries.Count, parsedSeries.Count - allSeries.Count);
             foreach (var (key, infos) in parsedSeries)
             {
                 // Key is normalized already
                 Series existingSeries;
                 try
-                {
+                {// NOTE: Maybe use .Equals() here
                     existingSeries = allSeries.SingleOrDefault(s =>
                         (s.NormalizedName == key.NormalizedName || Parser.Parser.Normalize(s.OriginalName) == key.NormalizedName)
                         && (s.Format == key.Format || s.Format == MangaFormat.Unknown));
@@ -386,7 +404,7 @@ namespace API.Services.Tasks
                 }
             }

-            _logger.LogDebug(
+            _logger.LogInformation(
                 "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
                 newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name);
         }
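The scanner update loop above pages existing series in fixed-size chunks and, with this change, wraps each chunk's CommitAsync so a failing batch is logged and broadcast over SignalR instead of aborting the whole scan. A stripped-down sketch of that control flow follows; the delegate parameters are hypothetical stand-ins for the repository and unit-of-work calls.

```csharp
// Illustrative sketch (not Kavita's actual types) of the chunked update pattern above:
// page through items in fixed-size chunks and isolate persistence failures per chunk.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class ChunkedUpdate
{
    public static async Task ProcessAsync<T>(
        Func<int, Task<IList<T>>> fetchChunk,   // hypothetical: loads page 'chunk'
        Func<IList<T>, Task> commitChunk,       // hypothetical: persists one chunk
        int totalChunks)
    {
        for (var chunk = 1; chunk <= totalChunks; chunk++)
        {
            var items = await fetchChunk(chunk);
            try
            {
                await commitChunk(items);
            }
            catch (Exception)
            {
                // In the real service this is where the failure is logged and broadcast;
                // here we simply skip the chunk and keep scanning.
                continue;
            }
        }
    }
}
```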
@@ -1,6 +1,7 @@
 using System;
 using System.IO;
 using System.Linq;
+using System.Net.Http;
 using System.Runtime.InteropServices;
 using System.Text.Json;
 using System.Threading;
@@ -9,9 +10,11 @@ using API.Data;
 using API.DTOs.Stats;
 using API.Interfaces;
 using API.Interfaces.Services;
-using API.Services.Clients;
+using Flurl.Http;
+using Hangfire;
 using Kavita.Common;
 using Kavita.Common.EnvironmentInfo;
+using Microsoft.AspNetCore.Http;
 using Microsoft.EntityFrameworkCore;
 using Microsoft.Extensions.Logging;

@@ -19,32 +22,65 @@ namespace API.Services.Tasks
 {
     public class StatsService : IStatsService
     {
-        private const string TempFilePath = "stats/";
-        private const string TempFileName = "app_stats.json";
+        private const string StatFileName = "app_stats.json";

-        private readonly StatsApiClient _client;
         private readonly DataContext _dbContext;
         private readonly ILogger<StatsService> _logger;
         private readonly IUnitOfWork _unitOfWork;

-        public StatsService(StatsApiClient client, DataContext dbContext, ILogger<StatsService> logger,
+#pragma warning disable S1075
+        private const string ApiUrl = "http://stats.kavitareader.com";
+#pragma warning restore S1075
+        private static readonly string StatsFilePath = Path.Combine(DirectoryService.StatsDirectory, StatFileName);
+
+        private static bool FileExists => File.Exists(StatsFilePath);
+
+        public StatsService(DataContext dbContext, ILogger<StatsService> logger,
             IUnitOfWork unitOfWork)
         {
-            _client = client;
             _dbContext = dbContext;
             _logger = logger;
             _unitOfWork = unitOfWork;
         }

-        private static string FinalPath => Path.Combine(Directory.GetCurrentDirectory(), TempFilePath, TempFileName);
-        private static bool FileExists => File.Exists(FinalPath);
-
-        public async Task PathData(ClientInfoDto clientInfoDto)
+        /// <summary>
+        /// Due to all instances firing this at the same time, we can DDOS our server. This task when fired will schedule the task to be run
+        /// randomly over a 6 hour spread
+        /// </summary>
+        public async Task Send()
         {
-            _logger.LogDebug("Pathing client data to the file");
+            var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
+            if (!allowStatCollection)
+            {
+                return;
+            }
+
+            var rnd = new Random();
+            var offset = rnd.Next(0, 6);
+            if (offset == 0)
+            {
+                await SendData();
+            }
+            else
+            {
+                _logger.LogInformation("KavitaStats upload has been schedule to run in {Offset} hours", offset);
+                BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset));
+            }
+        }
+
+        /// <summary>
+        /// This must be public for Hangfire. Do not call this directly.
+        /// </summary>
+        // ReSharper disable once MemberCanBePrivate.Global
+        public async Task SendData()
+        {
+            await CollectRelevantData();
+            await FinalizeStats();
+        }
+
+        public async Task RecordClientInfo(ClientInfoDto clientInfoDto)
+        {
             var statisticsDto = await GetData();

             statisticsDto.AddClientInfo(clientInfoDto);

             await SaveFile(statisticsDto);
@@ -52,12 +88,7 @@ namespace API.Services.Tasks

         private async Task CollectRelevantData()
         {
-            _logger.LogDebug("Collecting data from the server and database");
-
-            _logger.LogDebug("Collecting usage info");
             var usageInfo = await GetUsageInfo();
-
-            _logger.LogDebug("Collecting server info");
             var serverInfo = GetServerInfo();

             await PathData(serverInfo, usageInfo);
@@ -67,39 +98,68 @@ namespace API.Services.Tasks
         {
             try
             {
-                _logger.LogDebug("Finalizing Stats collection flow");
-
                 var data = await GetExistingData<UsageStatisticsDto>();
+                var successful = await SendDataToStatsServer(data);

-                _logger.LogDebug("Sending data to the Stats server");
-                await _client.SendDataToStatsServer(data);
-
-                _logger.LogDebug("Deleting the file from disk");
-                if (FileExists) File.Delete(FinalPath);
+                if (successful)
+                {
+                    ResetStats();
+                }
             }
             catch (Exception ex)
             {
-                _logger.LogError(ex, "Error Finalizing Stats collection flow");
-                throw;
+                _logger.LogError(ex, "There was an exception while sending data to KavitaStats");
             }
         }

-        public async Task CollectAndSendStatsData()
+        private async Task<bool> SendDataToStatsServer(UsageStatisticsDto data)
         {
-            var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
-            if (!allowStatCollection)
+            var responseContent = string.Empty;
+
+            try
             {
-                _logger.LogDebug("User has opted out of stat collection, not registering tasks");
-                return;
+                var response = await (ApiUrl + "/api/InstallationStats")
+                    .WithHeader("Accept", "application/json")
+                    .WithHeader("User-Agent", "Kavita")
+                    .WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
+                    .WithHeader("api-key", "MsnvA2DfQqxSK5jh")
+                    .WithHeader("x-kavita-version", BuildInfo.Version)
+                    .WithTimeout(TimeSpan.FromSeconds(30))
+                    .PostJsonAsync(data);
+
+                if (response.StatusCode != StatusCodes.Status200OK)
+                {
+                    _logger.LogError("KavitaStats did not respond successfully. {Content}", response);
+                    return false;
+                }
+
+                return true;
             }
-            await CollectRelevantData();
-            await FinalizeStats();
+            catch (HttpRequestException e)
+            {
+                var info = new
+                {
+                    dataSent = data,
+                    response = responseContent
+                };
+
+                _logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
+            }
+            catch (Exception e)
+            {
+                _logger.LogError(e, "An error happened during the request to KavitaStats");
+            }
+
+            return false;
+        }
+
+        private static void ResetStats()
+        {
+            if (FileExists) File.Delete(StatsFilePath);
         }

         private async Task PathData(ServerInfoDto serverInfoDto, UsageInfoDto usageInfoDto)
         {
-            _logger.LogDebug("Pathing server and usage info to the file");
-
             var data = await GetData();

             data.ServerInfo = serverInfoDto;
@@ -110,7 +170,7 @@ namespace API.Services.Tasks
             await SaveFile(data);
         }

-        private async ValueTask<UsageStatisticsDto> GetData()
+        private static async ValueTask<UsageStatisticsDto> GetData()
         {
             if (!FileExists) return new UsageStatisticsDto {InstallId = HashUtil.AnonymousToken()};

@@ -156,39 +216,17 @@ namespace API.Services.Tasks
             return serverInfo;
         }

-        private async Task<T> GetExistingData<T>()
+        private static async Task<T> GetExistingData<T>()
         {
-            _logger.LogInformation("Fetching existing data from file");
-
-            var existingDataJson = await GetFileDataAsString();
-
-            _logger.LogInformation("Deserializing data from file to object");
-            var existingData = JsonSerializer.Deserialize<T>(existingDataJson);
-
-            return existingData;
+            var json = await File.ReadAllTextAsync(StatsFilePath);
+            return JsonSerializer.Deserialize<T>(json);
         }

-        private async Task<string> GetFileDataAsString()
+        private static async Task SaveFile(UsageStatisticsDto statisticsDto)
         {
-            _logger.LogInformation("Reading file from disk");
-            return await File.ReadAllTextAsync(FinalPath);
-        }
-
-        private async Task SaveFile(UsageStatisticsDto statisticsDto)
-        {
-            _logger.LogDebug("Saving file");
-
-            var finalDirectory = FinalPath.Replace(TempFileName, string.Empty);
-            if (!Directory.Exists(finalDirectory))
-            {
-                _logger.LogDebug("Creating tmp directory");
-                Directory.CreateDirectory(finalDirectory);
-            }
-
-            _logger.LogDebug("Serializing data to write");
-            var dataJson = JsonSerializer.Serialize(statisticsDto);
-
-            _logger.LogDebug("Writing file to the disk");
-            await File.WriteAllTextAsync(FinalPath, dataJson);
+            DirectoryService.ExistOrCreate(DirectoryService.StatsDirectory);
+
+            await File.WriteAllTextAsync(StatsFilePath, JsonSerializer.Serialize(statisticsDto));
         }
     }
 }
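Send() deliberately spreads uploads over a random 0-5 hour offset so every install does not hit stats.kavitareader.com at once. How Send() itself gets triggered is not part of this diff; a plausible wiring, assuming a recurring Hangfire job, would look roughly like the sketch below (the job id and cron schedule are made up for illustration).

```csharp
// Hypothetical wiring, not taken from this PR: register the daily stats upload with Hangfire.
// Send() then either uploads immediately or re-schedules itself a few hours out via BackgroundJob.Schedule.
using API.Interfaces.Services;
using Hangfire;

public static class StatsJobRegistration
{
    public static void Register()
    {
        // "report-stats" and the daily cron are illustrative values
        RecurringJob.AddOrUpdate<IStatsService>("report-stats", service => service.Send(), Cron.Daily());
    }
}
```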
@@ -96,5 +96,17 @@ namespace API.SignalR
                 }
             };
         }
+
+        public static SignalRMessage ScanLibraryError(int libraryId)
+        {
+            return new SignalRMessage
+            {
+                Name = SignalREvents.ScanLibraryError,
+                Body = new
+                {
+                    LibraryId = libraryId,
+                }
+            };
+        }
     }
 }
@@ -1,4 +1,5 @@
-using System.Collections.Generic;
+using System;
+using System.Collections.Generic;
 using System.Linq;
 using System.Threading.Tasks;
 using API.Interfaces;
@@ -27,7 +28,7 @@ namespace API.SignalR.Presence
             _unitOfWork = unitOfWork;
         }

-        public Task UserConnected(string username, string connectionId)
+        public async Task UserConnected(string username, string connectionId)
         {
             lock (OnlineUsers)
             {
@@ -41,7 +42,10 @@ namespace API.SignalR.Presence
                 }
             }

-            return Task.CompletedTask;
+            // Update the last active for the user
+            var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username);
+            user.LastActive = DateTime.Now;
+            await _unitOfWork.CommitAsync();
         }

         public Task UserDisconnected(string username, string connectionId)
@@ -11,5 +11,6 @@
         public const string ScanLibraryProgress = "ScanLibraryProgress";
         public const string OnlineUsers = "OnlineUsers";
         public const string SeriesAddedToCollection = "SeriesAddedToCollection";
+        public const string ScanLibraryError = "ScanLibraryError";
     }
 }
@@ -24,6 +24,7 @@ using Microsoft.AspNetCore.StaticFiles;
 using Microsoft.Extensions.Configuration;
 using Microsoft.Extensions.DependencyInjection;
 using Microsoft.Extensions.Hosting;
+using Microsoft.Extensions.Logging;
 using Microsoft.OpenApi.Models;

 namespace API
@@ -106,7 +107,11 @@ namespace API

             services.AddResponseCaching();

-            services.AddStatsClient(_config);
+            services.Configure<ForwardedHeadersOptions>(options =>
+            {
+                options.ForwardedHeaders =
+                    ForwardedHeaders.All;
+            });

             services.AddHangfire(configuration => configuration
                 .UseSimpleAssemblyNameTypeSerializer()
@@ -139,7 +144,10 @@ namespace API

             app.UseResponseCompression();

-            app.UseForwardedHeaders();
+            app.UseForwardedHeaders(new ForwardedHeadersOptions
+            {
+                ForwardedHeaders = ForwardedHeaders.All
+            });

             app.UseRouting();

@@ -210,6 +218,15 @@ namespace API
             applicationLifetime.ApplicationStopping.Register(OnShutdown);
             applicationLifetime.ApplicationStarted.Register(() =>
             {
+                try
+                {
+                    var logger = serviceProvider.GetRequiredService<ILogger<Startup>>();
+                    logger.LogInformation("Kavita - v{Version}", BuildInfo.Version);
+                }
+                catch (Exception)
+                {
+                    /* Swallow Exception */
+                }
                 Console.WriteLine($"Kavita - v{BuildInfo.Version}");
             });
         }
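Configuring ForwardedHeadersOptions with ForwardedHeaders.All matters when Kavita sits behind a reverse proxy: without it, the app sees the proxy's address and scheme rather than the client's. The snippet below is an illustration of the effect only, not code from this PR.

```csharp
// Illustrative only: after UseForwardedHeaders runs, downstream middleware sees the
// original caller's address and scheme (from X-Forwarded-For / X-Forwarded-Proto).
using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public static class ForwardedHeadersDemo
{
    public static void Configure(IApplicationBuilder app)
    {
        app.UseForwardedHeaders(new ForwardedHeadersOptions
        {
            ForwardedHeaders = ForwardedHeaders.All
        });

        app.Use(async (context, next) =>
        {
            // With a proxy in front, these now reflect the original caller, not the proxy
            Console.WriteLine($"{context.Connection.RemoteIpAddress} via {context.Request.Scheme}");
            await next();
        });
    }
}
```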
@@ -1,21 +1,21 @@
 {
   "ConnectionStrings": {
-    "DefaultConnection": "Data source=kavita.db"
+    "DefaultConnection": "Data source=config//kavita.db"
   },
   "TokenKey": "super secret unguessable key",
   "Logging": {
     "LogLevel": {
-      "Default": "Debug",
+      "Default": "Information",
       "Microsoft": "Information",
       "Microsoft.Hosting.Lifetime": "Error",
       "Hangfire": "Information",
       "Microsoft.AspNetCore.Hosting.Internal.WebHost": "Information"
     },
     "File": {
-      "Path": "logs/kavita.log",
+      "Path": "config//logs/kavita.log",
       "Append": "True",
-      "FileSizeLimitBytes": 10485760,
-      "MaxRollingFiles": 5
+      "FileSizeLimitBytes": 26214400,
+      "MaxRollingFiles": 2
     }
   },
   "Port": 5000
@@ -20,19 +20,14 @@ COPY --from=copytask /files/wwwroot /kavita/wwwroot

 #Installs program dependencies
 RUN apt-get update \
-  && apt-get install -y libicu-dev libssl1.1 pwgen libgdiplus \
+  && apt-get install -y libicu-dev libssl1.1 libgdiplus \
   && rm -rf /var/lib/apt/lists/*

-#Creates the data directory
-RUN mkdir /kavita/data
-
-RUN sed -i 's/Data source=kavita.db/Data source=data\/kavita.db/g' /kavita/appsettings.json
-
 COPY entrypoint.sh /entrypoint.sh

 EXPOSE 5000

 WORKDIR /kavita

-ENTRYPOINT ["/bin/bash"]
+ENTRYPOINT [ "/bin/bash" ]
 CMD ["/entrypoint.sh"]
@@ -3,3 +3,5 @@
 2. (Linux only) Chmod and Chown so Kavita can write to the directory you placed in.
 3. Run Kavita executable.
 4. Open localhost:5000 and setup your account and libraries in the UI.
+
+If updating, copy everything but the config/ directory over. Restart Kavita.
Kavita.Common/AppSettingsConfig.cs (new file)
@@ -0,0 +1,7 @@
+namespace Kavita.Common
+{
+    public class AppSettingsConfig
+    {
+
+    }
+}
@@ -6,236 +6,349 @@ using Microsoft.Extensions.Hosting;

 namespace Kavita.Common
 {
     public static class Configuration
     {
-        private static readonly string AppSettingsFilename = GetAppSettingFilename();
+        public static readonly string AppSettingsFilename = Path.Join("config", GetAppSettingFilename());

         public static string Branch
         {
             get => GetBranch(GetAppSettingFilename());
             set => SetBranch(GetAppSettingFilename(), value);
         }

         public static int Port
         {
             get => GetPort(GetAppSettingFilename());
             set => SetPort(GetAppSettingFilename(), value);
         }

         public static string JwtToken
         {
             get => GetJwtToken(GetAppSettingFilename());
             set => SetJwtToken(GetAppSettingFilename(), value);
         }

         public static string LogLevel
         {
             get => GetLogLevel(GetAppSettingFilename());
             set => SetLogLevel(GetAppSettingFilename(), value);
         }

+        public static string LogPath
+        {
+            get => GetLoggingFile(GetAppSettingFilename());
+            set => SetLoggingFile(GetAppSettingFilename(), value);
+        }
+
+        public static string DatabasePath
+        {
+            get => GetDatabasePath(GetAppSettingFilename());
+            set => SetDatabasePath(GetAppSettingFilename(), value);
+        }
+
         private static string GetAppSettingFilename()
         {
             if (!string.IsNullOrEmpty(AppSettingsFilename))
             {
                 return AppSettingsFilename;
             }
             var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
             var isDevelopment = environment == Environments.Development;
-            return "appsettings" + (isDevelopment ? ".Development" : "") + ".json";
+            return "appsettings" + (isDevelopment ? ".Development" : string.Empty) + ".json";
         }

         #region JWT Token

         private static string GetJwtToken(string filePath)
         {
             try
             {
                 var json = File.ReadAllText(filePath);
                 var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
                 const string key = "TokenKey";

                 if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
                 {
                     return tokenElement.GetString();
                 }

                 return string.Empty;
             }
             catch (Exception ex)
             {
                 Console.WriteLine("Error reading app settings: " + ex.Message);
             }

             return string.Empty;
         }

         private static void SetJwtToken(string filePath, string token)
         {
             try
             {
                 var currentToken = GetJwtToken(filePath);
                 var json = File.ReadAllText(filePath)
                     .Replace("\"TokenKey\": \"" + currentToken, "\"TokenKey\": \"" + token);
                 File.WriteAllText(filePath, json);
             }
             catch (Exception)
             {
                 /* Swallow exception */
             }
         }

         public static bool CheckIfJwtTokenSet()
         {
             try
             {
                 return GetJwtToken(GetAppSettingFilename()) != "super secret unguessable key";
             }
             catch (Exception ex)
             {
                 Console.WriteLine("Error writing app settings: " + ex.Message);
             }

             return false;
         }

         #endregion

         #region Port

         private static void SetPort(string filePath, int port)
         {
             if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
             {
                 return;
             }

             try
             {
                 var currentPort = GetPort(filePath);
                 var json = File.ReadAllText(filePath).Replace("\"Port\": " + currentPort, "\"Port\": " + port);
                 File.WriteAllText(filePath, json);
             }
             catch (Exception)
             {
                 /* Swallow Exception */
             }
         }

         private static int GetPort(string filePath)
         {
             const int defaultPort = 5000;
             if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
             {
                 return defaultPort;
             }

             try
             {
                 var json = File.ReadAllText(filePath);
                 var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
                 const string key = "Port";

                 if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
                 {
                     return tokenElement.GetInt32();
                 }
             }
             catch (Exception ex)
             {
                 Console.WriteLine("Error writing app settings: " + ex.Message);
             }

             return defaultPort;
         }

         #endregion

         #region LogLevel

         private static void SetLogLevel(string filePath, string logLevel)
         {
             try
             {
                 var currentLevel = GetLogLevel(filePath);
                 var json = File.ReadAllText(filePath)
                     .Replace($"\"Default\": \"{currentLevel}\"", $"\"Default\": \"{logLevel}\"");
                 File.WriteAllText(filePath, json);
             }
             catch (Exception)
             {
                 /* Swallow Exception */
             }
         }

         private static string GetLogLevel(string filePath)
         {
             try
             {
                 var json = File.ReadAllText(filePath);
                 var jsonObj = JsonSerializer.Deserialize<dynamic>(json);

                 if (jsonObj.TryGetProperty("Logging", out JsonElement tokenElement))
                 {
                     foreach (var property in tokenElement.EnumerateObject())
                     {
                         if (!property.Name.Equals("LogLevel")) continue;
                         foreach (var logProperty in property.Value.EnumerateObject())
                         {
                             if (logProperty.Name.Equals("Default"))
                             {
                                 return logProperty.Value.GetString();
                             }
                         }
                     }
                 }
             }
             catch (Exception ex)
             {
                 Console.WriteLine("Error writing app settings: " + ex.Message);
             }

             return "Information";
         }

         #endregion

         private static string GetBranch(string filePath)
         {
             const string defaultBranch = "main";

             try
             {
                 var json = File.ReadAllText(filePath);
                 var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
                 const string key = "Branch";

                 if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
                 {
                     return tokenElement.GetString();
                 }
             }
             catch (Exception ex)
             {
                 Console.WriteLine("Error reading app settings: " + ex.Message);
             }

             return defaultBranch;
         }

         private static void SetBranch(string filePath, string updatedBranch)
         {
             try
             {
                 var currentBranch = GetBranch(filePath);
                 var json = File.ReadAllText(filePath)
                     .Replace("\"Branch\": " + currentBranch, "\"Branch\": " + updatedBranch);
                 File.WriteAllText(filePath, json);
             }
             catch (Exception)
             {
                 /* Swallow Exception */
             }
         }
+
+        private static string GetLoggingFile(string filePath)
+        {
+            const string defaultFile = "config/logs/kavita.log";
+
+            try
+            {
+                var json = File.ReadAllText(filePath);
+                var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
+
+                if (jsonObj.TryGetProperty("Logging", out JsonElement tokenElement))
+                {
+                    foreach (var property in tokenElement.EnumerateObject())
+                    {
+                        if (!property.Name.Equals("File")) continue;
+                        foreach (var logProperty in property.Value.EnumerateObject())
+                        {
+                            if (logProperty.Name.Equals("Path"))
+                            {
+                                return logProperty.Value.GetString();
+                            }
+                        }
+                    }
+                }
+            }
+            catch (Exception ex)
+            {
+                Console.WriteLine("Error writing app settings: " + ex.Message);
+            }
+
+            return defaultFile;
+        }
+
+        /// <summary>
+        /// This should NEVER be called except by <see cref="MigrateConfigFiles"/>
+        /// </summary>
+        /// <param name="filePath"></param>
+        /// <param name="directory"></param>
+        private static void SetLoggingFile(string filePath, string directory)
+        {
+            try
+            {
+                var currentFile = GetLoggingFile(filePath);
+                var json = File.ReadAllText(filePath)
+                    .Replace("\"Path\": \"" + currentFile + "\"", "\"Path\": \"" + directory + "\"");
+                File.WriteAllText(filePath, json);
+            }
+            catch (Exception ex)
+            {
+                /* Swallow Exception */
+                Console.WriteLine(ex);
+            }
+        }
+
+        private static string GetDatabasePath(string filePath)
+        {
+            const string defaultFile = "config/kavita.db";
+
+            try
+            {
+                var json = File.ReadAllText(filePath);
+                var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
+
+                if (jsonObj.TryGetProperty("ConnectionStrings", out JsonElement tokenElement))
+                {
+                    foreach (var property in tokenElement.EnumerateObject())
+                    {
+                        if (!property.Name.Equals("DefaultConnection")) continue;
+                        return property.Value.GetString();
+                    }
+                }
+            }
+            catch (Exception ex)
+            {
+                Console.WriteLine("Error writing app settings: " + ex.Message);
+            }
+
+            return defaultFile;
+        }
+
+        /// <summary>
+        /// This should NEVER be called except by <see cref="MigrateConfigFiles"/>
+        /// </summary>
+        /// <param name="filePath"></param>
+        /// <param name="updatedPath"></param>
+        private static void SetDatabasePath(string filePath, string updatedPath)
+        {
+            try
+            {
+                var existingString = GetDatabasePath(filePath);
+                var json = File.ReadAllText(filePath)
+                    .Replace(existingString,
+                        "Data source=" + updatedPath);
+                File.WriteAllText(filePath, json);
+            }
+            catch (Exception)
+            {
+                /* Swallow Exception */
+            }
+        }
     }
 }
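The new LogPath and DatabasePath setters exist so a migration step can rewrite appsettings.json to point at the config/ folder; the doc comments reference MigrateConfigFiles, which is outside this excerpt. A purely hypothetical usage sketch:

```csharp
// Hypothetical usage only; MigrateConfigFiles itself is not included in this diff.
using Kavita.Common;

public static class ConfigMigrationSketch
{
    public static void PointAppSettingsAtConfigFolder()
    {
        // Rewrites appsettings.json so the database and logs live under config/
        Configuration.DatabasePath = "config/kavita.db";
        Configuration.LogPath = "config/logs/kavita.log";
    }
}
```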
@@ -41,6 +41,7 @@ namespace Kavita.Common.EnvironmentInfo
                     break;
                 }
             }
+
         }

         public OsInfo(IEnumerable<IOsVersionAdapter> versionAdapters)
@@ -4,7 +4,7 @@
     <TargetFramework>net5.0</TargetFramework>
     <Company>kavitareader.com</Company>
     <Product>Kavita</Product>
-    <AssemblyVersion>0.4.7.0</AssemblyVersion>
+    <AssemblyVersion>0.4.8.1</AssemblyVersion>
     <NeutralLanguage>en</NeutralLanguage>
   </PropertyGroup>

|
@ -2,4 +2,5 @@
|
|||||||
<s:String x:Key="/Default/CodeInspection/ExcludedFiles/FilesAndFoldersToSkip2/=1BC0273F_002DFEBE_002D4DA1_002DBC04_002D3A3167E4C86C_002Fd_003AData_002Fd_003AMigrations/@EntryIndexedValue">ExplicitlyExcluded</s:String>
|
<s:String x:Key="/Default/CodeInspection/ExcludedFiles/FilesAndFoldersToSkip2/=1BC0273F_002DFEBE_002D4DA1_002DBC04_002D3A3167E4C86C_002Fd_003AData_002Fd_003AMigrations/@EntryIndexedValue">ExplicitlyExcluded</s:String>
|
||||||
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunLongAnalysisInSwa/@EntryValue">True</s:Boolean>
|
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunLongAnalysisInSwa/@EntryValue">True</s:Boolean>
|
||||||
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunValueAnalysisInNullableWarningsEnabledContext2/@EntryValue">True</s:Boolean>
|
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunValueAnalysisInNullableWarningsEnabledContext2/@EntryValue">True</s:Boolean>
|
||||||
<s:Boolean x:Key="/Default/UserDictionary/Words/=Opds/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>
|
<s:Boolean x:Key="/Default/UserDictionary/Words/=Opds/@EntryIndexedValue">True</s:Boolean>
|
||||||
|
<s:Boolean x:Key="/Default/UserDictionary/Words/=rewinded/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>
|
11
README.md
11
README.md
@ -48,7 +48,7 @@ Password: Demouser64
|
|||||||
- Place in a directory that is writable. If on windows, do not place in Program Files
|
- Place in a directory that is writable. If on windows, do not place in Program Files
|
||||||
- Linux users must ensure the directory & kavita.db is writable by Kavita (might require starting server once)
|
- Linux users must ensure the directory & kavita.db is writable by Kavita (might require starting server once)
|
||||||
- Run Kavita
|
- Run Kavita
|
||||||
- If you are updating, do not copy appsettings.json from the new version over. It will override your TokenKey and you will have to reauthenticate on your devices.
|
- If you are updating, copy everything over into install location. All Kavita data is stored in config/, so nothing will be overwritten.
|
||||||
- Open localhost:5000 and setup your account and libraries in the UI.
|
- Open localhost:5000 and setup your account and libraries in the UI.
|
||||||
### Docker
|
### Docker
|
||||||
Running your Kavita server in docker is super easy! Barely an inconvenience. You can run it with this command:
|
Running your Kavita server in docker is super easy! Barely an inconvenience. You can run it with this command:
|
||||||
@ -56,7 +56,7 @@ Running your Kavita server in docker is super easy! Barely an inconvenience. You
|
|||||||
```
|
```
|
||||||
docker run --name kavita -p 5000:5000 \
|
docker run --name kavita -p 5000:5000 \
|
||||||
-v /your/manga/directory:/manga \
|
-v /your/manga/directory:/manga \
|
||||||
-v /kavita/data/directory:/kavita/data \
|
-v /kavita/data/directory:/kavita/config \
|
||||||
--restart unless-stopped \
|
--restart unless-stopped \
|
||||||
-d kizaing/kavita:latest
|
-d kizaing/kavita:latest
|
||||||
```
|
```
|
||||||
@ -64,19 +64,20 @@ docker run --name kavita -p 5000:5000 \
|
|||||||
You can also run it via the docker-compose file:
|
You can also run it via the docker-compose file:
|
||||||
|
|
||||||
```
|
```
|
||||||
version: '3.9'
|
version: '3'
|
||||||
services:
|
services:
|
||||||
kavita:
|
kavita:
|
||||||
image: kizaing/kavita:latest
|
image: kizaing/kavita:latest
|
||||||
|
container_name: kavita
|
||||||
volumes:
|
volumes:
|
||||||
- ./manga:/manga
|
- ./manga:/manga
|
||||||
- ./data:/kavita/data
|
- ./config:/kavita/config
|
||||||
ports:
|
ports:
|
||||||
- "5000:5000"
|
- "5000:5000"
|
||||||
restart: unless-stopped
|
restart: unless-stopped
|
||||||
```
|
```
|
||||||
|
|
||||||
**Note: Kavita is under heavy development and is being updated all the time, so the tag for current builds is `:nightly`. The `:latest` tag will be the latest stable release. There is also the `:alpine` tag if you want a smaller image, but it is only available for x64 systems.**
|
**Note: Kavita is under heavy development and is being updated all the time, so the tag for current builds is `:nightly`. The `:latest` tag will be the latest stable release.**
|
||||||
|
|
||||||
## Feature Requests
|
## Feature Requests
|
||||||
Got a great idea? Throw it up on the FeatHub or vote on another idea. Please check the [Project Board](https://github.com/Kareadita/Kavita/projects) first for a list of planned features.
|
Got a great idea? Throw it up on the FeatHub or vote on another idea. Please check the [Project Board](https://github.com/Kareadita/Kavita/projects) first for a list of planned features.
|
||||||
|
@ -111,11 +111,7 @@ export class ErrorInterceptor implements HttpInterceptor {
|
|||||||
// NOTE: Signin has error.error or error.statusText available.
|
// NOTE: Signin has error.error or error.statusText available.
|
||||||
// if statement is due to http/2 spec issue: https://github.com/angular/angular/issues/23334
|
// if statement is due to http/2 spec issue: https://github.com/angular/angular/issues/23334
|
||||||
this.accountService.currentUser$.pipe(take(1)).subscribe(user => {
|
this.accountService.currentUser$.pipe(take(1)).subscribe(user => {
|
||||||
if (user) {
|
|
||||||
this.toastr.error(error.statusText === 'OK' ? 'Unauthorized' : error.statusText, error.status);
|
|
||||||
}
|
|
||||||
this.accountService.logout();
|
this.accountService.logout();
|
||||||
});
|
});
|
||||||
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
@ -6,7 +6,7 @@ export interface SearchResult {
|
|||||||
libraryName: string;
|
libraryName: string;
|
||||||
name: string;
|
name: string;
|
||||||
originalName: string;
|
originalName: string;
|
||||||
|
localizedName: string;
|
||||||
sortName: string;
|
sortName: string;
|
||||||
coverImage: string; // byte64 encoded (not used)
|
|
||||||
format: MangaFormat;
|
format: MangaFormat;
|
||||||
}
|
}
|
||||||
|
@ -19,7 +19,8 @@ export enum Action {
|
|||||||
Download = 7,
|
Download = 7,
|
||||||
Bookmarks = 8,
|
Bookmarks = 8,
|
||||||
IncognitoRead = 9,
|
IncognitoRead = 9,
|
||||||
AddToReadingList = 10
|
AddToReadingList = 10,
|
||||||
|
AddToCollection = 11
|
||||||
}
|
}
|
||||||
|
|
||||||
export interface ActionItem<T> {
|
export interface ActionItem<T> {
|
||||||
@ -90,6 +91,13 @@ export class ActionFactoryService {
|
|||||||
requiresAdmin: true
|
requiresAdmin: true
|
||||||
});
|
});
|
||||||
|
|
||||||
|
this.seriesActions.push({
|
||||||
|
action: Action.AddToCollection,
|
||||||
|
title: 'Add to Collection',
|
||||||
|
callback: this.dummyCallback,
|
||||||
|
requiresAdmin: true
|
||||||
|
});
|
||||||
|
|
||||||
this.seriesActions.push({
|
this.seriesActions.push({
|
||||||
action: Action.Edit,
|
action: Action.Edit,
|
||||||
title: 'Edit',
|
title: 'Edit',
|
||||||
@ -209,7 +217,7 @@ export class ActionFactoryService {
|
|||||||
title: 'Add to Reading List',
|
title: 'Add to Reading List',
|
||||||
callback: this.dummyCallback,
|
callback: this.dummyCallback,
|
||||||
requiresAdmin: false
|
requiresAdmin: false
|
||||||
},
|
}
|
||||||
];
|
];
|
||||||
|
|
||||||
this.volumeActions = [
|
this.volumeActions = [
|
||||||
|
@ -4,6 +4,7 @@ import { ToastrService } from 'ngx-toastr';
|
|||||||
import { Subject } from 'rxjs';
|
import { Subject } from 'rxjs';
|
||||||
import { take } from 'rxjs/operators';
|
import { take } from 'rxjs/operators';
|
||||||
import { BookmarksModalComponent } from '../cards/_modals/bookmarks-modal/bookmarks-modal.component';
|
import { BookmarksModalComponent } from '../cards/_modals/bookmarks-modal/bookmarks-modal.component';
|
||||||
|
import { BulkAddToCollectionComponent } from '../cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component';
|
||||||
import { AddToListModalComponent, ADD_FLOW } from '../reading-list/_modals/add-to-list-modal/add-to-list-modal.component';
|
import { AddToListModalComponent, ADD_FLOW } from '../reading-list/_modals/add-to-list-modal/add-to-list-modal.component';
|
||||||
import { EditReadingListModalComponent } from '../reading-list/_modals/edit-reading-list-modal/edit-reading-list-modal.component';
|
import { EditReadingListModalComponent } from '../reading-list/_modals/edit-reading-list-modal/edit-reading-list-modal.component';
|
||||||
import { ConfirmService } from '../shared/confirm.service';
|
import { ConfirmService } from '../shared/confirm.service';
|
||||||
@ -34,6 +35,7 @@ export class ActionService implements OnDestroy {
|
|||||||
private readonly onDestroy = new Subject<void>();
|
private readonly onDestroy = new Subject<void>();
|
||||||
private bookmarkModalRef: NgbModalRef | null = null;
|
private bookmarkModalRef: NgbModalRef | null = null;
|
||||||
private readingListModalRef: NgbModalRef | null = null;
|
private readingListModalRef: NgbModalRef | null = null;
|
||||||
|
private collectionModalRef: NgbModalRef | null = null;
|
||||||
|
|
||||||
constructor(private libraryService: LibraryService, private seriesService: SeriesService,
|
constructor(private libraryService: LibraryService, private seriesService: SeriesService,
|
||||||
private readerService: ReaderService, private toastr: ToastrService, private modalService: NgbModal,
|
private readerService: ReaderService, private toastr: ToastrService, private modalService: NgbModal,
|
||||||
@ -358,6 +360,32 @@ export class ActionService implements OnDestroy {
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Adds a set of series to a collection tag
|
||||||
|
* @param series
|
||||||
|
* @param callback
|
||||||
|
* @returns
|
||||||
|
*/
|
||||||
|
addMultipleSeriesToCollectionTag(series: Array<Series>, callback?: VoidActionCallback) {
|
||||||
|
if (this.collectionModalRef != null) { return; }
|
||||||
|
this.collectionModalRef = this.modalService.open(BulkAddToCollectionComponent, { scrollable: true, size: 'md' });
|
||||||
|
this.collectionModalRef.componentInstance.seriesIds = series.map(v => v.id);
|
||||||
|
this.collectionModalRef.componentInstance.title = 'New Collection';
|
||||||
|
|
||||||
|
this.collectionModalRef.closed.pipe(take(1)).subscribe(() => {
|
||||||
|
this.collectionModalRef = null;
|
||||||
|
if (callback) {
|
||||||
|
callback();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
this.collectionModalRef.dismissed.pipe(take(1)).subscribe(() => {
|
||||||
|
this.collectionModalRef = null;
|
||||||
|
if (callback) {
|
||||||
|
callback();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
addSeriesToReadingList(series: Series, callback?: SeriesActionCallback) {
|
addSeriesToReadingList(series: Series, callback?: SeriesActionCallback) {
|
||||||
if (this.readingListModalRef != null) { return; }
|
if (this.readingListModalRef != null) { return; }
|
||||||
this.readingListModalRef = this.modalService.open(AddToListModalComponent, { scrollable: true, size: 'md' });
|
this.readingListModalRef = this.modalService.open(AddToListModalComponent, { scrollable: true, size: 'md' });
|
||||||
@ -439,4 +467,21 @@ export class ActionService implements OnDestroy {
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Mark all chapters and the volumes as Read. All volumes and chapters must belong to a series
|
||||||
|
* @param seriesId Series Id
|
||||||
|
* @param volumes Volumes, should have id, chapters and pagesRead populated
|
||||||
|
* @param chapters? Chapters, should have id
|
||||||
|
* @param callback Optional callback to perform actions after API completes
|
||||||
|
*/
|
||||||
|
deleteMultipleSeries(seriesIds: Array<Series>, callback?: VoidActionCallback) {
|
||||||
|
this.seriesService.deleteMultipleSeries(seriesIds.map(s => s.id)).pipe(take(1)).subscribe(() => {
|
||||||
|
this.toastr.success('Series deleted');
|
||||||
|
|
||||||
|
if (callback) {
|
||||||
|
callback();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
}
|
}
|
||||||
|
@@ -35,4 +35,8 @@ export class CollectionTagService {
   updateSeriesForTag(tag: CollectionTag, seriesIdsToRemove: Array<number>) {
     return this.httpClient.post(this.baseUrl + 'collection/update-series', {tag, seriesIdsToRemove}, {responseType: 'text' as 'json'});
   }
+
+  addByMultiple(tagId: number, seriesIds: Array<number>, tagTitle: string = '') {
+    return this.httpClient.post(this.baseUrl + 'collection/update-for-series', {collectionTagId: tagId, collectionTagTitle: tagTitle, seriesIds}, {responseType: 'text' as 'json'});
+  }
 }
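Note: `addByMultiple` covers both flows of the new bulk modal: pass an existing tag's id to add series to it, or pass `0` plus a title to create a new collection in the same request (this matches how BulkAddToCollectionComponent further down uses it). A minimal sketch of both calls, assuming an injected `collectionService` and `toastr`:

```typescript
// Add the selected series to an existing collection tag
this.collectionService.addByMultiple(tag.id, seriesIds, '').subscribe(() => {
  this.toastr.success('Series added to ' + tag.title + ' collection');
});

// Create a brand-new collection and add the series in one request (tagId = 0)
this.collectionService.addByMultiple(0, seriesIds, 'New Collection').subscribe(() => {
  this.toastr.success('Collection created');
});
```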
@@ -18,7 +18,8 @@ export enum EVENTS {
   SeriesAdded = 'SeriesAdded',
   ScanLibraryProgress = 'ScanLibraryProgress',
   OnlineUsers = 'OnlineUsers',
-  SeriesAddedToCollection = 'SeriesAddedToCollection'
+  SeriesAddedToCollection = 'SeriesAddedToCollection',
+  ScanLibraryError = 'ScanLibraryError'
 }

 export interface Message<T> {
@@ -93,6 +94,16 @@ export class MessageHubService {
       });
     });

+    this.hubConnection.on(EVENTS.ScanLibraryError, resp => {
+      this.messagesSource.next({
+        event: EVENTS.ScanLibraryError,
+        payload: resp.body
+      });
+      if (this.isAdmin) {
+        this.toastr.error('Library Scan had a critical error. Some series were not saved. Check logs');
+      }
+    });
+
     this.hubConnection.on(EVENTS.SeriesAdded, resp => {
       this.messagesSource.next({
         event: EVENTS.SeriesAdded,
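Note: the hub only toasts admins directly, so any other component can still react to the failure by filtering `messages$`. A minimal sketch of a subscriber, assuming the component already injects `MessageHubService` and tears the subscription down with an `onDestroy` subject:

```typescript
this.messageHub.messages$.pipe(takeUntil(this.onDestroy)).subscribe(msg => {
  if (msg.event === EVENTS.ScanLibraryError) {
    // payload carries the raw body forwarded from the SignalR message
    console.warn('Library scan reported a critical error', msg.payload);
  }
});
```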
@@ -80,6 +80,10 @@ export class SeriesService {
     return this.httpClient.delete<boolean>(this.baseUrl + 'series/' + seriesId);
   }

+  deleteMultipleSeries(seriesIds: Array<number>) {
+    return this.httpClient.post<boolean>(this.baseUrl + 'series/delete-multiple', {seriesIds});
+  }
+
   updateRating(seriesId: number, userRating: number, userReview: string) {
     return this.httpClient.post(this.baseUrl + 'series/update-rating', {seriesId, userRating, userReview});
   }
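Note: unlike the single-series DELETE above it, the new endpoint takes the ids in the POST body, so one request covers an arbitrary number of series. A quick sketch of a direct call (the id values are invented and the injected `toastr` is assumed):

```typescript
this.seriesService.deleteMultipleSeries([101, 102, 205]).pipe(take(1)).subscribe((success: boolean) => {
  if (success) {
    this.toastr.success('Series deleted');
  }
});
```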
@@ -56,13 +56,13 @@ export class ManageSettingsComponent implements OnInit {
   async saveSettings() {
     const modelSettings = this.settingsForm.value;

-    if (this.settingsForm.get('enableAuthentication')?.value === false) {
+    if (this.settingsForm.get('enableAuthentication')?.dirty && this.settingsForm.get('enableAuthentication')?.value === false) {
       if (!await this.confirmService.confirm('Disabling Authentication opens your server up to unauthorized access and possible hacking. Are you sure you want to continue with this?')) {
         return;
       }
     }

-    const informUserAfterAuthenticationEnabled = this.settingsForm.get('enableAuthentication')?.value && !this.serverSettings.enableAuthentication;
+    const informUserAfterAuthenticationEnabled = this.settingsForm.get('enableAuthentication')?.dirty && this.settingsForm.get('enableAuthentication')?.value && !this.serverSettings.enableAuthentication;

     this.settingsService.updateServerSettings(modelSettings).pipe(take(1)).subscribe(async (settings: ServerSettings) => {
       this.serverSettings = settings;
@@ -2,7 +2,7 @@

 <div class="float-right">
   <div class="d-inline-block" ngbDropdown #myDrop="ngbDropdown">
-    <button class="btn btn-outline-primary mr-2" id="dropdownManual" ngbDropdownAnchor (focus)="myDrop.open()">
+    <button class="btn btn-outline-primary mr-2" id="dropdownManual" ngbDropdownToggle>
       <ng-container *ngIf="backupDBInProgress || clearCacheInProgress || isCheckingForUpdate || downloadLogsInProgress">
         <span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"></span>
         <span class="sr-only">Loading...</span>
@@ -1,5 +1,6 @@
 import { HttpClient } from '@angular/common/http';
 import { Injectable } from '@angular/core';
+import { map } from 'rxjs/operators';
 import { environment } from 'src/environments/environment';
 import { ServerSettings } from './_models/server-settings';

@@ -37,6 +38,8 @@ export class SettingsService {
   }

   getAuthenticationEnabled() {
-    return this.http.get<boolean>(this.baseUrl + 'settings/authentication-enabled', {responseType: 'text' as 'json'});
+    return this.http.get<string>(this.baseUrl + 'settings/authentication-enabled', {responseType: 'text' as 'json'}).pipe(map((res: string) => {
+      return res === 'true';
+    }));
   }
 }
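Note: the endpoint replies with plain text, so the old `get<boolean>` call only pretended to return a boolean; the emitted value was a string and therefore always truthy. Mapping `'true'`/`'false'` explicitly gives consumers a real boolean again. An illustrative consumer (the component context and `take` import are assumed):

```typescript
this.settingsService.getAuthenticationEnabled().pipe(take(1)).subscribe((enabled: boolean) => {
  if (!enabled) {
    // e.g. the login page can skip asking non-admin users for a password
  }
});
```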
@@ -111,6 +111,9 @@
   <ng-container [ngTemplateOutlet]="actionBar"></ng-container>
   </div>
 </div>
+<!-- <div *ngIf="page !== undefined && scrollbarNeeded">
+  <ng-container [ngTemplateOutlet]="actionBar"></ng-container>
+</div> -->

 <ng-template #actionBar>
   <div class="reading-bar row no-gutters justify-content-between">
@@ -122,7 +125,7 @@
     </button>
     <button *ngIf="!this.adhocPageHistory.isEmpty()" class="btn btn-outline-secondary btn-icon col-2 col-xs-1" (click)="goBack()" title="Go Back"><i class="fa fa-reply" aria-hidden="true"></i><span class="phone-hidden"> Go Back</span></button>
     <button class="btn btn-secondary col-2 col-xs-1" (click)="toggleDrawer()"><i class="fa fa-bars" aria-hidden="true"></i><span class="phone-hidden"> Settings</span></button>
-    <div class="book-title col-2 phone-hidden">{{bookTitle}} <span *ngIf="incognitoMode">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
+    <div class="book-title col-2 phone-hidden">{{bookTitle}} <span *ngIf="incognitoMode" (click)="turnOffIncognito()" role="button" aria-label="Incognito mode is on. Toggle to turn off.">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
     <button class="btn btn-secondary col-2 col-xs-1" (click)="closeReader()"><i class="fa fa-times-circle" aria-hidden="true"></i><span class="phone-hidden"> Close</span></button>
     <button class="btn btn-outline-secondary btn-icon col-2 col-xs-1"
       [disabled]="IsNextDisabled"
@@ -155,6 +155,11 @@ $primary-color: #0062cc;

 .reading-section {
   height: 100vh;
+  width: 100%;
+}
+
+.book-content {
+  position: relative;
 }

 .drawer-body {
@@ -160,7 +160,11 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {
   readerStyles: string = '';
   darkModeStyleElem!: HTMLElement;
   topOffset: number = 0; // Offset for drawer and rendering canvas
-  scrollbarNeeded = false; // Used for showing/hiding bottom action bar
+  /**
+   * Used for showing/hiding bottom action bar. Calculates if there is enough scroll to show it.
+   * Will hide if all content in book is absolute positioned
+   */
+  scrollbarNeeded = false;
   readingDirection: ReadingDirection = ReadingDirection.LeftToRight;

   private readonly onDestroy = new Subject<void>();
@@ -713,7 +717,7 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {

   setupPage(part?: string | undefined, scrollTop?: number | undefined) {
     this.isLoading = false;
-    this.scrollbarNeeded = this.readingSectionElemRef.nativeElement.scrollHeight > this.readingSectionElemRef.nativeElement.clientHeight;
+    this.scrollbarNeeded = this.readingHtml.nativeElement.clientHeight > this.readingSectionElemRef.nativeElement.clientHeight;

     // Find all the part ids and their top offset
     this.setupPageAnchors();
@@ -995,4 +999,15 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {
     return '';
   }

+  /**
+   * Turns off Incognito mode. This can only happen once if the user clicks the icon. This will modify URL state
+   */
+  turnOffIncognito() {
+    this.incognitoMode = false;
+    const newRoute = this.readerService.getNextChapterUrl(this.router.url, this.chapterId, this.incognitoMode, this.readingListMode, this.readingListId);
+    window.history.replaceState({}, '', newRoute);
+    this.toastr.info('Incognito mode is off. Progress will now start being tracked.');
+    this.readerService.saveProgress(this.seriesId, this.volumeId, this.chapterId, this.pageNum).pipe(take(1)).subscribe(() => {/* No operation */});
+  }
+
 }
@@ -0,0 +1,46 @@
+
+<div class="modal-header">
+  <h4 class="modal-title" id="modal-basic-title">Add to Collection</h4>
+  <button type="button" class="close" aria-label="Close" (click)="close()">
+    <span aria-hidden="true">×</span>
+  </button>
+</div>
+<form style="width: 100%" [formGroup]="listForm">
+  <div class="modal-body">
+    <div class="form-group" *ngIf="lists.length >= 5">
+      <label for="filter">Filter</label>
+      <div class="input-group">
+        <input id="filter" autocomplete="off" class="form-control" formControlName="filterQuery" type="text" aria-describedby="reset-input">
+        <div class="input-group-append">
+          <button class="btn btn-outline-secondary" type="button" id="reset-input" (click)="listForm.get('filterQuery')?.setValue('');">Clear</button>
+        </div>
+      </div>
+    </div>
+    <ul class="list-group">
+      <li class="list-group-item clickable" tabindex="0" role="button" *ngFor="let collectionTag of lists | filter: filterList; let i = index" (click)="addToCollection(collectionTag)">
+        {{collectionTag.title}} <i class="fa fa-angle-double-up" *ngIf="collectionTag.promoted" title="Promoted"></i>
+      </li>
+      <li class="list-group-item" *ngIf="lists.length === 0 && !loading">No collections created yet</li>
+      <li class="list-group-item" *ngIf="loading">
+        <div class="spinner-border text-secondary" role="status">
+          <span class="sr-only">Loading...</span>
+        </div>
+      </li>
+    </ul>
+  </div>
+  <div class="modal-footer" style="justify-content: normal">
+    <div style="width: 100%;">
+      <div class="form-row">
+        <div class="col-9 col-lg-10">
+          <label class="sr-only" for="add-rlist">Collection</label>
+          <input width="100%" #title ngbAutofocus type="text" class="form-control mb-2" id="add-rlist" formControlName="title">
+        </div>
+        <div class="col-2">
+          <button type="submit" class="btn btn-primary" (click)="create()">Create</button>
+        </div>
+      </div>
+    </div>
+  </div>
+</form>
+
+
@@ -0,0 +1,7 @@
+.clickable {
+  cursor: pointer;
+}
+
+.clickable:hover, .clickable:focus {
+  background-color: lightgreen;
+}
@@ -0,0 +1,79 @@
+import { Component, ElementRef, Input, OnInit, ViewChild } from '@angular/core';
+import { FormGroup, FormControl } from '@angular/forms';
+import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
+import { ToastrService } from 'ngx-toastr';
+import { CollectionTag } from 'src/app/_models/collection-tag';
+import { ReadingList } from 'src/app/_models/reading-list';
+import { CollectionTagService } from 'src/app/_services/collection-tag.service';
+
+@Component({
+  selector: 'app-bulk-add-to-collection',
+  templateUrl: './bulk-add-to-collection.component.html',
+  styleUrls: ['./bulk-add-to-collection.component.scss']
+})
+export class BulkAddToCollectionComponent implements OnInit {
+
+  @Input() title!: string;
+  /**
+   * Series Ids to add to Collection Tag
+   */
+  @Input() seriesIds: Array<number> = [];
+
+  /**
+   * All existing collections sorted by recent use date
+   */
+  lists: Array<CollectionTag> = [];
+  loading: boolean = false;
+  listForm: FormGroup = new FormGroup({});
+
+  @ViewChild('title') inputElem!: ElementRef<HTMLInputElement>;
+
+
+  constructor(private modal: NgbActiveModal, private collectionService: CollectionTagService, private toastr: ToastrService) { }
+
+  ngOnInit(): void {
+
+    this.listForm.addControl('title', new FormControl(this.title, []));
+    this.listForm.addControl('filterQuery', new FormControl('', []));
+
+    this.loading = true;
+    this.collectionService.allTags().subscribe(tags => {
+      this.lists = tags;
+      this.loading = false;
+    });
+  }
+
+  ngAfterViewInit() {
+    // Shift focus to input
+    if (this.inputElem) {
+      this.inputElem.nativeElement.select();
+    }
+  }
+
+  close() {
+    this.modal.close();
+  }
+
+  create() {
+    const tagName = this.listForm.value.title;
+    this.collectionService.addByMultiple(0, this.seriesIds, tagName).subscribe(() => {
+      this.toastr.success('Series added to ' + tagName + ' collection');
+      this.modal.close();
+    });
+  }
+
+  addToCollection(tag: CollectionTag) {
+    if (this.seriesIds.length === 0) return;
+
+    this.collectionService.addByMultiple(tag.id, this.seriesIds, '').subscribe(() => {
+      this.toastr.success('Series added to ' + tag.title + ' collection');
+      this.modal.close();
+    });
+
+  }
+
+  filterList = (listItem: ReadingList) => {
+    return listItem.title.toLowerCase().indexOf((this.listForm.value.filterQuery || '').toLowerCase()) >= 0;
+  }
+
+}
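Note: the template above filters the list with `lists | filter: filterList`, where `filterList` is the predicate defined in the component and the `filter` pipe comes from the `PipeModule` import added to `cards.module.ts` below. The pipe's implementation is not part of this diff; a minimal sketch of what such a predicate-based pipe could look like (an assumption, not the shipped code):

```typescript
import { Pipe, PipeTransform } from '@angular/core';

// Hypothetical sketch of a predicate-based filter pipe; the real one lives in PipeModule
// and may differ. Marked impure so it re-evaluates as the filter query changes even
// though the array reference stays the same.
@Pipe({ name: 'filter', pure: false })
export class FilterPipe implements PipeTransform {
  transform<T>(items: T[], predicate: (item: T) => boolean): T[] {
    if (!items || !predicate) {
      return items;
    }
    return items.filter(predicate);
  }
}
```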
@@ -127,7 +127,7 @@ export class BulkSelectionService {
   getActions(callback: (action: Action, data: any) => void) {
     // checks if series is present. If so, returns only series actions
     // else returns volume/chapter items
-    const allowedActions = [Action.AddToReadingList, Action.MarkAsRead, Action.MarkAsUnread];
+    const allowedActions = [Action.AddToReadingList, Action.MarkAsRead, Action.MarkAsUnread, Action.AddToCollection, Action.Delete];
     if (Object.keys(this.selectedCards).filter(item => item === 'series').length > 0) {
       return this.actionFactory.getSeriesActions(callback).filter(item => allowedActions.includes(item.action));
     }
@@ -16,10 +16,11 @@ import { CardItemComponent } from './card-item/card-item.component';
 import { SharedModule } from '../shared/shared.module';
 import { RouterModule } from '@angular/router';
 import { TypeaheadModule } from '../typeahead/typeahead.module';
-import { BrowserModule } from '@angular/platform-browser';
 import { CardDetailLayoutComponent } from './card-detail-layout/card-detail-layout.component';
 import { CardDetailsModalComponent } from './_modals/card-details-modal/card-details-modal.component';
 import { BulkOperationsComponent } from './bulk-operations/bulk-operations.component';
+import { BulkAddToCollectionComponent } from './_modals/bulk-add-to-collection/bulk-add-to-collection.component';
+import { PipeModule } from '../pipe/pipe.module';



@@ -36,11 +37,11 @@ import { BulkOperationsComponent } from './bulk-operations/bulk-operations.compo
     CardActionablesComponent,
     CardDetailLayoutComponent,
     CardDetailsModalComponent,
-    BulkOperationsComponent
+    BulkOperationsComponent,
+    BulkAddToCollectionComponent
   ],
   imports: [
     CommonModule,
-    //BrowserModule,
     RouterModule,
     ReactiveFormsModule,
     FormsModule, // EditCollectionsModal
@@ -58,6 +59,7 @@ import { BulkOperationsComponent } from './bulk-operations/bulk-operations.compo
     NgbDropdownModule,
     NgbProgressbarModule,
     NgxFileDropModule, // Cover Chooser
+    PipeModule // filter for BulkAddToCollectionComponent
   ],
   exports: [
     CardItemComponent,
@@ -55,6 +55,11 @@ export class CollectionDetailComponent implements OnInit, OnDestroy {
           this.bulkSelectionService.deselectAll();
         });
         break;
+      case Action.AddToCollection:
+        this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
       case Action.MarkAsRead:
         this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
           this.loadPage();
@@ -67,6 +72,12 @@ export class CollectionDetailComponent implements OnInit, OnDestroy {
           this.bulkSelectionService.deselectAll();
         });
         break;
+      case Action.Delete:
+        this.actionService.deleteMultipleSeries(selectedSeries, () => {
+          this.loadPage();
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
     }
   }

@@ -1,14 +1,14 @@
-<ng-container>
+<app-bulk-operations [actionCallback]="bulkActionCallback"></app-bulk-operations>
 <app-card-detail-layout header="In Progress"
   [isLoading]="isLoading"
-  [items]="recentlyAdded"
+  [items]="series"
   [filters]="filters"
   [pagination]="pagination"
   (pageChange)="onPageChange($event)"
   (applyFilter)="updateFilter($event)"
+
 >
   <ng-template #cardItem let-item let-position="idx">
-    <app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="loadPage()"></app-series-card>
+    <app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="loadPage()" (selection)="bulkSelectionService.handleCardSelection('series', position, series.length, $event)" [selected]="bulkSelectionService.isCardSelected('series', position)" [allowSelection]="true"></app-series-card>
   </ng-template>
 </app-card-detail-layout>
-</ng-container>
@@ -1,11 +1,15 @@
-import { Component, OnInit } from '@angular/core';
+import { Component, HostListener, OnInit } from '@angular/core';
 import { Title } from '@angular/platform-browser';
 import { Router, ActivatedRoute } from '@angular/router';
 import { take } from 'rxjs/operators';
+import { BulkSelectionService } from '../cards/bulk-selection.service';
 import { UpdateFilterEvent } from '../cards/card-detail-layout/card-detail-layout.component';
+import { KEY_CODES } from '../shared/_services/utility.service';
 import { Pagination } from '../_models/pagination';
 import { Series } from '../_models/series';
 import { FilterItem, SeriesFilter, mangaFormatFilters } from '../_models/series-filter';
+import { Action } from '../_services/action-factory.service';
+import { ActionService } from '../_services/action.service';
 import { SeriesService } from '../_services/series.service';

 @Component({
@@ -16,7 +20,7 @@ import { SeriesService } from '../_services/series.service';
 export class InProgressComponent implements OnInit {

   isLoading: boolean = true;
-  recentlyAdded: Series[] = [];
+  series: Series[] = [];
   pagination!: Pagination;
   libraryId!: number;
   filters: Array<FilterItem> = mangaFormatFilters;
@@ -24,7 +28,8 @@ export class InProgressComponent implements OnInit {
     mangaFormat: null
   };

-  constructor(private router: Router, private route: ActivatedRoute, private seriesService: SeriesService, private titleService: Title) {
+  constructor(private router: Router, private route: ActivatedRoute, private seriesService: SeriesService, private titleService: Title,
+    private actionService: ActionService, public bulkSelectionService: BulkSelectionService) {
     this.router.routeReuseStrategy.shouldReuseRoute = () => false;
     this.titleService.setTitle('Kavita - In Progress');
     if (this.pagination === undefined || this.pagination === null) {
@@ -33,6 +38,20 @@ export class InProgressComponent implements OnInit {
       this.loadPage();
     }

+  @HostListener('document:keydown.shift', ['$event'])
+  handleKeypress(event: KeyboardEvent) {
+    if (event.key === KEY_CODES.SHIFT) {
+      this.bulkSelectionService.isShiftDown = true;
+    }
+  }
+
+  @HostListener('document:keyup.shift', ['$event'])
+  handleKeyUp(event: KeyboardEvent) {
+    if (event.key === KEY_CODES.SHIFT) {
+      this.bulkSelectionService.isShiftDown = false;
+    }
+  }
+
   ngOnInit() {}

   seriesClicked(series: Series) {
@@ -61,7 +80,7 @@ export class InProgressComponent implements OnInit {
     }
     this.isLoading = true;
     this.seriesService.getInProgress(this.libraryId, this.pagination?.currentPage, this.pagination?.itemsPerPage, this.filter).pipe(take(1)).subscribe(series => {
-      this.recentlyAdded = series.result;
+      this.series = series.result;
       this.pagination = series.pagination;
       this.isLoading = false;
       window.scrollTo(0, 0);
@@ -73,4 +92,40 @@ export class InProgressComponent implements OnInit {
     return urlParams.get('page');
   }

+  bulkActionCallback = (action: Action, data: any) => {
+    const selectedSeriesIndexies = this.bulkSelectionService.getSelectedCardsForSource('series');
+    const selectedSeries = this.series.filter((series, index: number) => selectedSeriesIndexies.includes(index + ''));
+
+    switch (action) {
+      case Action.AddToReadingList:
+        this.actionService.addMultipleSeriesToReadingList(selectedSeries, () => {
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
+      case Action.AddToCollection:
+        this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
+      case Action.MarkAsRead:
+        this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
+          this.loadPage();
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
+      case Action.MarkAsUnread:
+        this.actionService.markMultipleSeriesAsUnread(selectedSeries, () => {
+          this.loadPage();
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
+      case Action.Delete:
+        this.actionService.deleteMultipleSeries(selectedSeries, () => {
+          this.loadPage();
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
+    }
+  }
+
 }
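Note: `bulkActionCallback` is handed to the toolbar via `[actionCallback]="bulkActionCallback"` in the template change above. The BulkOperationsComponent itself is not part of this hunk; its contract is simply to call back with the chosen Action. A hedged sketch of that call site (the member names inside the component are assumptions):

```typescript
// Hypothetical sketch of the consuming side inside BulkOperationsComponent.
@Input() actionCallback!: (action: Action, data: any) => void;

performAction(item: ActionItem<any>) {
  if (typeof this.actionCallback === 'function') {
    this.actionCallback(item.action, null); // hands e.g. Action.Delete back to the page component
  }
}
```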
@@ -9,6 +9,6 @@
   (pageChange)="onPageChange($event)"
 >
   <ng-template #cardItem let-item let-position="idx">
     <app-series-card [data]="item" [libraryId]="libraryId" [suppressLibraryLink]="true" (reload)="loadPage()" (selection)="bulkSelectionService.handleCardSelection('series', position, series.length, $event)" [selected]="bulkSelectionService.isCardSelected('series', position)" [allowSelection]="true"></app-series-card>
   </ng-template>
 </app-card-detail-layout>
@@ -46,6 +46,11 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
           this.bulkSelectionService.deselectAll();
         });
         break;
+      case Action.AddToCollection:
+        this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
       case Action.MarkAsRead:
         this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
           this.loadPage();
@@ -59,6 +64,12 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
           this.bulkSelectionService.deselectAll();
         });
         break;
+      case Action.Delete:
+        this.actionService.deleteMultipleSeries(selectedSeries, () => {
+          this.loadPage();
+          this.bulkSelectionService.deselectAll();
+        });
+        break;
     }
   }

@@ -13,7 +13,7 @@

 <app-carousel-reel [items]="recentlyAdded" title="Recently Added" (sectionClick)="handleSectionClick($event)">
   <ng-template #carouselItem let-item let-position="idx">
-    <app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="reloadTags()" (dataChanged)="loadRecentlyAdded()"></app-series-card>
+    <app-series-card [data]="item" [libraryId]="item.libraryId" (dataChanged)="loadRecentlyAdded()"></app-series-card>
   </ng-template>
 </app-carousel-reel>

@@ -6,6 +6,7 @@ import { Subject } from 'rxjs';
 import { take, takeUntil } from 'rxjs/operators';
 import { EditCollectionTagsComponent } from '../cards/_modals/edit-collection-tags/edit-collection-tags.component';
 import { CollectionTag } from '../_models/collection-tag';
+import { SeriesAddedEvent } from '../_models/events/series-added-event';
 import { InProgressChapter } from '../_models/in-progress-chapter';
 import { Library } from '../_models/library';
 import { Series } from '../_models/series';
@@ -15,6 +16,7 @@ import { Action, ActionFactoryService, ActionItem } from '../_services/action-fa
 import { CollectionTagService } from '../_services/collection-tag.service';
 import { ImageService } from '../_services/image.service';
 import { LibraryService } from '../_services/library.service';
+import { EVENTS, MessageHubService } from '../_services/message-hub.service';
 import { SeriesService } from '../_services/series.service';

 @Component({
@@ -32,17 +34,24 @@ export class LibraryComponent implements OnInit, OnDestroy {
   recentlyAdded: Series[] = [];
   inProgress: Series[] = [];
   continueReading: InProgressChapter[] = [];
-  // collectionTags: CollectionTag[] = [];
-  // collectionTagActions: ActionItem<CollectionTag>[] = [];

   private readonly onDestroy = new Subject<void>();

   seriesTrackBy = (index: number, item: any) => `${item.name}_${item.pagesRead}`;

   constructor(public accountService: AccountService, private libraryService: LibraryService,
-    private seriesService: SeriesService, private actionFactoryService: ActionFactoryService,
-    private collectionService: CollectionTagService, private router: Router,
-    private modalService: NgbModal, private titleService: Title, public imageService: ImageService) { }
+    private seriesService: SeriesService, private router: Router,
+    private titleService: Title, public imageService: ImageService,
+    private messageHub: MessageHubService) {
+      this.messageHub.messages$.pipe(takeUntil(this.onDestroy)).subscribe(res => {
+        if (res.event == EVENTS.SeriesAdded) {
+          const seriesAddedEvent = res.payload as SeriesAddedEvent;
+          this.seriesService.getSeries(seriesAddedEvent.seriesId).subscribe(series => {
+            this.recentlyAdded.unshift(series);
+          });
+        }
+      });
+  }

   ngOnInit(): void {
     this.titleService.setTitle('Kavita - Dashboard');
@@ -56,8 +65,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
       });
     });

-    //this.collectionTagActions = this.actionFactoryService.getCollectionTagActions(this.handleCollectionActionCallback.bind(this));
-
     this.reloadSeries();
   }

@@ -68,10 +75,7 @@ export class LibraryComponent implements OnInit, OnDestroy {

   reloadSeries() {
     this.loadRecentlyAdded();
-
     this.loadInProgress();
-
-    this.reloadTags();
   }

   reloadInProgress(series: Series | boolean) {
@@ -85,7 +89,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
     }

     this.loadInProgress();
-    this.reloadTags();
   }

   loadInProgress() {
@@ -100,12 +103,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
     });
   }

-  reloadTags() {
-    // this.collectionService.allTags().pipe(takeUntil(this.onDestroy)).subscribe(tags => {
-    //   this.collectionTags = tags;
-    // });
-  }
-
   handleSectionClick(sectionTitle: string) {
     if (sectionTitle.toLowerCase() === 'collections') {
       this.router.navigate(['collections']);
@@ -115,26 +112,4 @@ export class LibraryComponent implements OnInit, OnDestroy {
       this.router.navigate(['in-progress']);
     }
   }
-
-  loadCollection(item: CollectionTag) {
-    //this.router.navigate(['collections', item.id]);
-  }
-
-  // handleCollectionActionCallback(action: Action, collectionTag: CollectionTag) {
-  //   switch (action) {
-  //     case(Action.Edit):
-  //       const modalRef = this.modalService.open(EditCollectionTagsComponent, { size: 'lg', scrollable: true });
-  //       modalRef.componentInstance.tag = collectionTag;
-  //       modalRef.closed.subscribe((results: {success: boolean, coverImageUpdated: boolean}) => {
-  //         this.reloadTags();
-  //         if (results.coverImageUpdated) {
-  //           collectionTag.coverImage = this.imageService.randomize(collectionTag.coverImage);
-  //         }
-  //       });
-  //       break;
-  //     default:
-  //       break;
-  //   }
-  // }
 }
|
|||||||
</button>
|
</button>
|
||||||
|
|
||||||
<div>
|
<div>
|
||||||
<div style="font-weight: bold;">{{title}} <span *ngIf="incognitoMode">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
|
<div style="font-weight: bold;">{{title}} <span class="clickable" *ngIf="incognitoMode" (click)="turnOffIncognito()" role="button" aria-label="Incognito mode is on. Toggle to turn off.">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode:</span>)</span></div>
|
||||||
<div class="subtitle">
|
<div class="subtitle">
|
||||||
{{subtitle}}
|
{{subtitle}}
|
||||||
</div>
|
</div>
|
||||||
|
@@ -1113,4 +1113,15 @@ export class MangaReaderComponent implements OnInit, AfterViewInit, OnDestroy {
     }

   }
+
+  /**
+   * Turns off Incognito mode. This can only happen once if the user clicks the icon. This will modify URL state
+   */
+  turnOffIncognito() {
+    this.incognitoMode = false;
+    const newRoute = this.readerService.getNextChapterUrl(this.router.url, this.chapterId, this.incognitoMode, this.readingListMode, this.readingListId);
+    window.history.replaceState({}, '', newRoute);
+    this.toastr.info('Incognito mode is off. Progress will now start being tracked.');
+    this.readerService.saveProgress(this.seriesId, this.volumeId, this.chapterId, this.pageNum).pipe(take(1)).subscribe(() => {/* No operation */});
+  }
 }
@@ -23,6 +23,7 @@
   (selected)='clickSearchResult($event)'
   (inputChanged)='onChangeSearch($event)'
   [isLoading]="isLoading"
+  [customFilter]="customFilter"
   [debounceTime]="debounceTime"
   [itemTemplate]="itemTemplate"
   [notFoundTemplate]="notFoundTemplate">
@@ -35,7 +36,7 @@
   </div>
   <div class="ml-1">
     <app-series-format [format]="item.format"></app-series-format>
-    <span *ngIf="item.name.toLowerCase().indexOf(searchTerm) >= 0; else localizedName" [innerHTML]="item.name"></span>
+    <span *ngIf="item.name.toLowerCase().trim().indexOf(searchTerm) >= 0; else localizedName" [innerHTML]="item.name"></span>
     <ng-template #localizedName>
       <span [innerHTML]="item.localizedName"></span>
     </ng-template>
@@ -3,6 +3,7 @@ import { Component, HostListener, Inject, OnDestroy, OnInit, ViewChild } from '@
 import { Router } from '@angular/router';
 import { Subject } from 'rxjs';
 import { takeUntil } from 'rxjs/operators';
+import { isTemplateSpan } from 'typescript';
 import { ScrollService } from '../scroll.service';
 import { SearchResult } from '../_models/search-result';
 import { AccountService } from '../_services/account.service';
@@ -24,6 +25,16 @@ export class NavHeaderComponent implements OnInit, OnDestroy {
   imageStyles = {width: '24px', 'margin-top': '5px'};
   searchResults: SearchResult[] = [];
   searchTerm = '';
+  customFilter: (items: SearchResult[], query: string) => SearchResult[] = (items: SearchResult[], query: string) => {
+    const normalizedQuery = query.trim().toLowerCase();
+    const matches = items.filter(item => {
+      const normalizedSeriesName = item.name.toLowerCase().trim();
+      const normalizedOriginalName = item.originalName.toLowerCase().trim();
+      const normalizedLocalizedName = item.localizedName.toLowerCase().trim();
+      return normalizedSeriesName.indexOf(normalizedQuery) >= 0 || normalizedOriginalName.indexOf(normalizedQuery) >= 0 || normalizedLocalizedName.indexOf(normalizedQuery) >= 0;
+    });
+    return matches;
+  };


   backToTopNeeded = false;
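Note: the custom filter trims and lower-cases both the query and each candidate field (name, originalName, localizedName), which is what fixes searches containing stray spaces or matching only the localized name. A tiny illustrative check (the object literal is abbreviated to the fields the filter touches, and the values are invented):

```typescript
const items = [
  { name: 'Space Brothers', originalName: 'Uchuu Kyoudai', localizedName: 'space brothers' }
] as SearchResult[];

// Extra whitespace and different casing in the query still produce a match
console.log(this.customFilter(items, '  SPACE brothers ').length); // 1
```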
@@ -1,6 +1,5 @@
-import { noUndefined } from '@angular/compiler/src/util';
 import { AfterViewInit, Component, ElementRef, Input, OnInit, ViewChild } from '@angular/core';
-import { FormControl, FormGroup, Validators } from '@angular/forms';
+import { FormControl, FormGroup } from '@angular/forms';
 import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
 import { ToastrService } from 'ngx-toastr';
 import { ReadingList } from 'src/app/_models/reading-list';
Some files were not shown because too many files have changed in this diff.