v0.4.8 Release (#720)

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Workflow updates (#658)

# Added
- Added: Automatic character-limit handling for the Discord notifier. If the PR body exceeds the character limit, it is trimmed and a link to the full changelog is appended (the release changelog for Stable, the PR for Dev).

# Removed
- Removed: The Sentry map task from the workflow, since Sentry is no longer used.

* Bump versions by dotnet-bump-version.

* Misc Updates (#665)

* Do not allow non-admins to change their passwords when authentication is disabled

* Clean up the login page so that input field text is black

* Cleaned up some resizing issues when typing a password while many users are listed

* Changed a user's LastActive to update not just on login, but also when they open an already authenticated session.

* Bump versions by dotnet-bump-version.

* Logging Cleanup (#668)

* Do not allow non-admins to change their passwords when authentication is disabled

* Clean up the login page so that input field text is black

* Cleaned up some resizing issues when typing a password while many users are listed

* Changed a user's LastActive to update not just on login, but also when they open an already authenticated session.

* Removed some verbose debugging statements and promoted some debug statements to information level so they are more prominent in the logs for default installs.

* In Progress now sends progress information on the Series

* Added the ability to add cards to Recently Added when new series are added in the backend

* Implemented the ability to click the glasses icon to turn off incognito mode from within the reader so you can start tracking progress

* Don't warn the user about authentication when they don't touch that control

* Bump versions by dotnet-bump-version.

* Changed the stats that are sent back to stat server from installed server.

* Revert "Changed the stats that are sent back to stat server from installed server."

This reverts commit 644cb6d1f67de9531ea1a1dfd3853709e0329ce7.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Bulk Add to Collection (#674)

* Fixed the typeahead not having the same size input box as other inputs

* Implemented the ability to add multiple series to a collection through the bulk operations flow. Updated the book parser to handle the "@import url('...');" syntax as well as @import '...';
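
A minimal sketch of the kind of rewrite involved (illustrative names only; the real pattern is the CssImportUrlRegex visible in the Parser diff further down): both @import forms are matched and the referenced path is re-rooted under the book's API base.

```csharp
using System.Text.RegularExpressions;

internal static class CssImportScoper
{
    // Matches @import 'file.css'; and @import url('file.css'); forms, capturing
    // the prefix, the referenced filename, and the closing quote/paren.
    private static readonly Regex CssImport = new(
        @"(?<Start>@import\s([""']|url\([""']))(?<Filename>[^'""]+)(?<End>[""']\)?;)",
        RegexOptions.IgnoreCase | RegexOptions.Multiline);

    // Re-roots every imported path under apiBase (e.g. "book-resources/42/").
    public static string ScopeImports(string css, string apiBase) =>
        CssImport.Replace(css, m =>
            m.Groups["Start"].Value + apiBase + m.Groups["Filename"].Value + m.Groups["End"].Value);
}

// CssImportScoper.ScopeImports("@import url('fonts/font.css');", "TEST/")
//   -> "@import url('TEST/fonts/font.css');"
```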

* Implemented the ability to create a new Collection tag via bulk operations flow.

* Bump versions by dotnet-bump-version.

* Bulk Operations for In Progress and Recently Added (#677)

* Don't log a message about a bad match if the file is a cover image

* Enable bulk operations for In Progress and Recently Added

* Fixed a bad logic case

* Bump versions by dotnet-bump-version.

* Regression Fix (#680)

* Ensure we mount the backups directory for Docker users

* Fixed a huge logic bug that deleted files in users' libraries

* Bump versions by dotnet-bump-version.

* Changed the chunk size to a fixed 50 to validate whether it is causing issues with refresh. Added some try/catches to see if exceptions are causing issues. (#681)

* Bump versions by dotnet-bump-version.

* Fixed a bug where searching on a localized name would fail to show results. Fixed a bug where extra spaces would cause search results not to display properly. (#682)

* Bump versions by dotnet-bump-version.

* When we have a special marker, ensure we fall back to folder parsing to try to group the file with the actual series before just accepting what we parsed. (#684)

Fixed a missed parsing case where comic special parsing wasn't being called on comic libraries.
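
A minimal sketch of the idea (illustrative only, not Kavita's actual parser code): when a file only parses out as a special, prefer the folder it lives in for the series name so it groups with the rest of that series.

```csharp
using System;
using System.IO;
using System.Linq;

internal static class FallbackSeriesResolver
{
    // If the filename only yielded a special marker (e.g. "SP01"), fall back to the
    // innermost folder between the library root and the file for the series name.
    public static string ResolveSeries(string parsedSeries, bool isSpecial, string filePath, string rootPath)
    {
        if (!isSpecial && !string.IsNullOrEmpty(parsedSeries)) return parsedSeries;

        var relative = Path.GetDirectoryName(Path.GetRelativePath(rootPath, filePath)) ?? string.Empty;
        var folders = relative.Split(Path.DirectorySeparatorChar, StringSplitOptions.RemoveEmptyEntries);

        return folders.LastOrDefault(f => !string.IsNullOrWhiteSpace(f)) ?? parsedSeries;
    }
}
```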

* Bump versions by dotnet-bump-version.

* iOS Admin page dropdown fix (#686)

# Fixed:
- Fixed: An issue where the dropdown on the admin server page would not work on Safari or other iOS browsers.

* When the DB fails to save, log all the series the user should look into for constraint issues and push a message to the admins connected to the web UI. (#687)
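
One plausible shape for this behavior, as a hedged sketch (the MessageHub class and the "ScanError" event name are hypothetical, not Kavita's actual ones):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

public class MessageHub : Hub { } // hypothetical hub, for this sketch only

public class ScanCommitHelper
{
    private readonly ILogger<ScanCommitHelper> _logger;
    private readonly IHubContext<MessageHub> _messageHub;

    public ScanCommitHelper(ILogger<ScanCommitHelper> logger, IHubContext<MessageHub> messageHub)
    {
        _logger = logger;
        _messageHub = messageHub;
    }

    public async Task<bool> TrySaveAsync(DbContext context, IReadOnlyCollection<string> seriesNames)
    {
        try
        {
            await context.SaveChangesAsync();
            return true;
        }
        catch (DbUpdateException ex)
        {
            // Log which series the user should look into for constraint issues...
            _logger.LogError(ex, "Could not save changes; check these series for constraint issues: {Series}",
                string.Join(", ", seriesNames));

            // ...and push the same information to admins connected to the web UI.
            await _messageHub.Clients.All.SendAsync("ScanError",
                new { message = "Database save failed", series = seriesNames });
            return false;
        }
    }
}
```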

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Stat uploads now schedule themselves for a time between midnight and 6am, server time. (#688)
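
A hedged sketch of the scheduling idea, assuming Hangfire (which the repo references, e.g. Hangfire.db in .gitignore); the IStatsService shape mirrors the interface in the diff below, but the exact call site is illustrative.

```csharp
using System;
using System.Threading.Tasks;
using Hangfire;

public interface IStatsService { Task Send(); } // trimmed to the relevant member

internal static class StatsScheduler
{
    private static readonly Random Rnd = new();

    // Pick a random moment between the next midnight and 6am (server time) and
    // schedule the upload for then, so installs don't all upload at once.
    public static void ScheduleStatsUpload()
    {
        var nextMidnight = DateTime.Today.AddDays(1);
        var runAt = nextMidnight.AddSeconds(Rnd.Next(0, 6 * 60 * 60));

        BackgroundJob.Schedule<IStatsService>(s => s.Send(), new DateTimeOffset(runAt));
    }
}
```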

* Bump versions by dotnet-bump-version.

* EPUB CSS Parsing Issues (#690)

* WIP. Rewrote some of the regexes to better support CSS escaping. We now escape background-image, border-image, and list-style-image URLs within CSS files.

* Added position relative to help with positioning on books that are just absolute positioned elements.

* When there is absolute positioning, as in some EPUB-based comics, suppress the bottom action bar since it won't render in the correct location.

* Fixed tests

* Commented out tests

* Bump versions by dotnet-bump-version.

* More EPUB Scoping Fixes (#691)

* Added better handling for importing CSS files that are empty. Moved comment removal on CSS files to before the CSS whitespace cleanup to get better matches.

* Enhanced the checks for whether the bottom action bar is needed in the reader. We no longer query the DOM and now have something that works more reliably.

* Bump versions by dotnet-bump-version.

* Fixed an issue where Docker users were not properly backing up the database. Removed an empty file that was created when covers/ had nothing in it. (#692)

* Bump versions by dotnet-bump-version.

* Fallback to Folder Parsing Issue (#694)

* Fixed a bug in the scanner's fallback to folder parsing for poorly named files: the code was exiting early if a chapter or volume could be parsed out.

* Fixed a unit test by tweaking a regex for fallback

* Bump versions by dotnet-bump-version.

* KavitaStats Cleanup (#695)

* Refactored the Stats code to be much cleaner and use better naming.

* Cleaned up the HTTP code to use Flurl and to return whether the upload was successful, so we can delete the file where appropriate.
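
A minimal sketch of that pattern with Flurl (assuming Flurl.Http 3.x; the URL, header name, and payload are illustrative):

```csharp
using System;
using System.Threading.Tasks;
using Flurl.Http;

internal static class StatsUploader
{
    // POST the stats payload and report success so the caller knows whether
    // it is safe to delete the locally cached stats file.
    public static async Task<bool> TryUploadAsync(string apiUrl, string apiKey, object payload)
    {
        try
        {
            var response = await apiUrl
                .WithHeader("api-key", apiKey)
                .PostJsonAsync(payload);

            return response.StatusCode >= 200 && response.StatusCode < 300;
        }
        catch (FlurlHttpException ex)
        {
            Console.WriteLine($"Stats upload failed: {ex.Message}");
            return false;
        }
    }
}
```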

* More refactoring for the stats code to clean it up and keep it consistent with our standards.

* Removed a confusing log statement

* Added support for old api key header from original stat server

* Use the correct endpoint, not the new one.

* Code smell

* Bump versions by dotnet-bump-version.

* Bulk Deletion (#697)

* Implemented bulk deletion of series

* Don't show unauthorized exception on UI, just redirect to the login page.

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (#700)

* Ensure Kavita knows about forwarded headers (fixes an issue with EPUB URLs not going through HTTPS behind a reverse proxy). Fixed a case where cover image selection preferred nested folders over files in the root directory.
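
A hedged sketch of the standard ASP.NET Core way to honor forwarded headers (not necessarily Kavita's exact Startup code); the later "Vacation Fixes" PR broadens this to accept all forwarded headers.

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

public static class ForwardedHeadersSetup
{
    // Call early in Startup.Configure so URLs built by the server (e.g. EPUB
    // resource links) honor the scheme/host supplied by a reverse proxy.
    public static void Apply(IApplicationBuilder app)
    {
        var options = new ForwardedHeadersOptions
        {
            ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
        };

        // To accept headers from any proxy, clear the default known networks/proxies lists.
        options.KnownNetworks.Clear();
        options.KnownProxies.Clear();

        app.UseForwardedHeaders(options);
    }
}
```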

* Fixed broken unit test

* Added the bug I fixed as a unit test case

* Cover Image Picking + Forwarding Headers with EPUBs (#702)

* Updating GA Bump version temporarily for fix (#703)

* Bump versions by dotnet-bump-version.

* Cover Image Picking + Forwarding Headers with EPUBs (GA Fix) (#704)

* Bump versions by dotnet-bump-version.

* Vacation Fixes (#709)

* Ignore system and hidden folders when performing directory scan.

* Fixed the comic parser tests not using Comic mode for parsing.

* Accept all forwarded headers and use them.

* Ignore some changes from another branch

* Bump versions by dotnet-bump-version.

* Breaking Changes: Docker Parity (#698)

* Refactored all of Kavita's config files to be loaded from config/. This allows Docker to mount just a single folder and makes the Update functionality trivial.

* Cleaned up documentation around new update method.

* Updated docker files to support config directory

* Removed entrypoint, no longer needed

* Update appsettings to point to config directory for logs

* Updated message for docker users that are upgrading

* Ensure that Docker users who have not updated their mount points after upgrading cannot start the server

* Code smells

* More cleanup

* Added entrypoint to fix bind mount issues

* Updated README with new folder structure

* Fixed build system for new setup

* Updated the string path when the user is on Docker

* Updated the migration flow for Docker to work properly and fixed LogFile configuration updating.

* Migrating docker images is now working 100%

* Fixed config from bad code

* Code cleanup

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

* Bump versions by dotnet-bump-version.

* Feature/docker parity (#714)

* Refactored all of Kavita's config files to be loaded from config/. This allows Docker to mount just a single folder and makes the Update functionality trivial.

* Cleaned up documentation around new update method.

* Updated docker files to support config directory

* Removed entrypoint, no longer needed

* Update appsettings to point to config directory for logs

* Updated message for docker users that are upgrading

* Ensure that Docker users who have not updated their mount points after upgrading cannot start the server

* Code smells

* More cleanup

* Added entrypoint to fix bind mount issues

* Updated README with new folder structure

* Fixed build system for new setup

* Updated the string path when the user is on Docker

* Updated the migration flow for Docker to work properly and fixed LogFile configuration updating.

* Migrating docker images is now working 100%

* Fixed config from bad code

* Code cleanup

* Fixed monorepo-build.sh

Co-authored-by: Chris Plaatjes <kizaing@gmail.com>

* Breaking Changes: Docker Parity (#715)

* Fixed a bug in the copy-directory-to-directory logic in the migration

* Somehow GetFiles lost its static modifier.

* Bump versions by dotnet-bump-version.

* Build issue (#716)

* Fixed a bug in the copy-directory-to-directory logic in the migration

* Somehow GetFiles lost its static modifier.

* Please work

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Shakeout Changes (#717)

* Made the appsettings filename public on Configuration and changed how we detect when to migrate for non-Docker users.

* Fixed up the non-Docker copy command and removed a duplicate check on the source directory for a copy.

* Don't delete files unless we know we are successful

* Bump versions by dotnet-bump-version.

* Fixed a migration issue on Docker where the migration ran too many times or threw an exception when the source wasn't there. (#719)
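
The gist of that kind of guard, as an illustrative sketch (not the actual fix): verify the source exists and the destination has not already been populated before copying anything.

```csharp
using System;
using System.IO;
using System.Linq;

internal static class MigrationGuards
{
    // Skip the copy when there is nothing to migrate or it has already been done,
    // so the migration neither throws on a missing folder nor runs repeatedly.
    public static void CopyFolderOnce(string source, string destination)
    {
        if (!Directory.Exists(source))
        {
            Console.WriteLine($"Skipping {source}: nothing to migrate");
            return;
        }

        if (Directory.Exists(destination) && Directory.EnumerateFileSystemEntries(destination).Any())
        {
            Console.WriteLine($"Skipping {destination}: already migrated");
            return;
        }

        Directory.CreateDirectory(destination);
        foreach (var file in Directory.GetFiles(source))
        {
            File.Copy(file, Path.Combine(destination, Path.GetFileName(file)), overwrite: false);
        }
    }
}
```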

* Bump versions by dotnet-bump-version.

* Version bump for release (#718)

* Bump versions by dotnet-bump-version.

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: YEGCSharpDev <89283498+YEGCSharpDev@users.noreply.github.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
Joseph Milazzo 2021-11-04 07:29:02 -05:00 committed by GitHub
parent cb9fa0dda8
commit 33db123e81
115 changed files with 1818 additions and 910 deletions

View File

@ -2,7 +2,7 @@
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
labels: needs-triage
assignees: ''
---
@ -24,7 +24,7 @@ A clear and concise description of what you expected to happen.
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- OS: [e.g. iOS, Docker]
- Browser [e.g. chrome, safari]
- Version [e.g. 22] (can be found on Server Settings -> System tab)

View File

@ -1,63 +0,0 @@
name: Sentry Map Release
on:
workflow_dispatch:
inputs:
version:
description: "version to update package.json"
required: true
# No default
jobs:
build:
name: Setup Sentry CLI
runs-on: ubuntu-latest
steps:
- uses: mathieu-bour/setup-sentry-cli@1.2.0
with:
version: latest
token: ${{ secrets.SENTRY_TOKEN }}
organization: kavita-7n
project: angular
- name: Check out repository
uses: actions/checkout@v2
- name: Parse Version
run: |
version='${{ github.event.inputs.version }}'
newVersion=${version%.*}
echo $newVersion
echo "::set-output name=VERSION::$newVersion"
id: parse-version
- name: NodeJS to Compile WebUI
uses: actions/setup-node@v2.1.5
with:
node-version: '14'
- run: |
cd UI/Web || exit
echo 'Installing web dependencies'
npm install
npm version --allow-same-version "${{ steps.parse-version.outputs.VERSION }}"
echo 'Building UI'
npm run prod
- name: Cache dependencies
uses: actions/cache@v2
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Create Release
run: sentry-cli releases new ${{ steps.parse-version.outputs.VERSION }}
- name: Upload Source Maps
run: sentry-cli releases files ${{ steps.parse-version.outputs.VERSION }} upload-sourcemaps UI/Web/dist
- name: Finalize Release
run: sentry-cli releases finalize ${{ steps.parse-version.outputs.VERSION }}

View File

@ -115,7 +115,7 @@ jobs:
run: dotnet build --configuration Release --no-restore
- name: Bump versions
uses: SiqiLu/dotnet-bump-version@master
uses: ThomasEg/dotnet-bump-version@patch-1
with:
version_files: Kavita.Common/Kavita.Common.csproj
github_token: ${{ secrets.REPO_GHA_PAT }}
@ -136,6 +136,13 @@ jobs:
id: parse-body
run: |
body="${{ steps.findPr.outputs.body }}"
if [[ ${#body} -gt 1870 ]] ; then
body=${body:0:1870}
body="${body}...and much more.
Read full changelog: https://github.com/Kareadita/Kavita/pull/${{ steps.findPr.outputs.pr }}"
fi
body=${body//\'/}
body=${body//'%'/'%25'}
body=${body//$'\n'/'%0A'}
@ -180,13 +187,6 @@ jobs:
dotnet-version: '5.0.x'
- run: ./monorepo-build.sh
- name: Trigger Sentry workflow
uses: benc-uk/workflow-dispatch@v1
with:
workflow: Sentry Map Release
token: ${{ secrets.REPO_GHA_PAT }}
inputs: '{ "version": "${{steps.get-version.outputs.assembly-version}}" }'
- name: Login to Docker Hub
uses: docker/login-action@v1
with:
@ -238,6 +238,13 @@ jobs:
id: parse-body
run: |
body="${{ steps.findPr.outputs.body }}"
if [[ ${#body} -gt 1870 ]] ; then
body=${body:0:1870}
body="${body}...and much more.
Read full changelog: https://github.com/Kareadita/Kavita/releases/latest"
fi
body=${body//\'/}
body=${body//'%'/'%25'}
body=${body//$'\n'/'%0A'}
@ -291,13 +298,6 @@ jobs:
dotnet-version: '5.0.x'
- run: ./monorepo-build.sh
- name: Trigger Sentry workflow
uses: benc-uk/workflow-dispatch@v1
with:
workflow: Sentry Map Release
token: ${{ secrets.REPO_GHA_PAT }}
inputs: '{ "version": "${{steps.get-version.outputs.assembly-version}}" }'
- name: Login to Docker Hub
uses: docker/login-action@v1
with:

20
.gitignore vendored
View File

@ -500,4 +500,22 @@ _output/
API/stats/
UI/Web/dist/
/API.Tests/Extensions/Test Data/modified on run.txt
/API/covers/
# All config files/folders in config except appsettings.json
/API/config/covers/
/API/config/logs/
/API/config/backups/
/API/config/cache/
/API/config/temp/
/API/config/stats/
/API/config/kavita.db
/API/config/kavita.db-shm
/API/config/kavita.db-wal
/API/config/Hangfire.db
/API/config/Hangfire-log.db
API/config/covers/
API/config/*.db
API/config/stats/*
API/config/stats/app_stats.json
UI/Web/.vscode/settings.json

View File

@ -8,42 +8,42 @@ namespace API.Tests.Comparers
public class NaturalSortComparerTest
{
private readonly NaturalSortComparer _nc = new NaturalSortComparer();
[Theory]
[InlineData(
new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x1.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
)]
[InlineData(
new[] {"Beelzebub_153b_RHS.zip", "Beelzebub_01_[Noodles].zip",},
new[] {"Beelzebub_153b_RHS.zip", "Beelzebub_01_[Noodles].zip",},
new[] {"Beelzebub_01_[Noodles].zip", "Beelzebub_153b_RHS.zip"}
)]
[InlineData(
new[] {"[SCX-Scans]_Vandread_v02_Act02.zip", "[SCX-Scans]_Vandread_v02_Act01.zip",},
new[] {"[SCX-Scans]_Vandread_v02_Act02.zip", "[SCX-Scans]_Vandread_v02_Act01.zip",},
new[] {"[SCX-Scans]_Vandread_v02_Act01.zip", "[SCX-Scans]_Vandread_v02_Act02.zip",}
)]
[InlineData(
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",},
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",},
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",}
)]
[InlineData(
new[] {"001.jpg", "10.jpg",},
new[] {"001.jpg", "10.jpg",},
new[] {"001.jpg", "10.jpg",}
)]
[InlineData(
new[] {"10/001.jpg", "10.jpg",},
new[] {"10/001.jpg", "10.jpg",},
new[] {"10.jpg", "10/001.jpg",}
)]
[InlineData(
new[] {"Batman - Black white vol 1 #04.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr"},
new[] {"Batman - Black white vol 1 #04.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr"},
new[] {"Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #04.cbr"}
)]
[InlineData(
new[] {"3and4.cbz", "The World God Only Knows - Oneshot.cbz", "5.cbz", "1and2.cbz"},
new[] {"3and4.cbz", "The World God Only Knows - Oneshot.cbz", "5.cbz", "1and2.cbz"},
new[] {"1and2.cbz", "3and4.cbz", "5.cbz", "The World God Only Knows - Oneshot.cbz"}
)]
[InlineData(
new[] {"Solo Leveling - c000 (v01) - p000 [Cover] [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p001 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p002 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p003 [dig] [Yen Press] [LuCaZ].jpg"},
new[] {"Solo Leveling - c000 (v01) - p000 [Cover] [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p001 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p002 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p003 [dig] [Yen Press] [LuCaZ].jpg"},
new[] {"Solo Leveling - c000 (v01) - p000 [Cover] [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p001 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p002 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p003 [dig] [Yen Press] [LuCaZ].jpg"}
)]
public void TestNaturalSortComparer(string[] input, string[] expected)
@ -57,39 +57,39 @@ namespace API.Tests.Comparers
i++;
}
}
[Theory]
[InlineData(
new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x1.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
)]
[InlineData(
new[] {"x2.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x2.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
new[] {"x2.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
)]
[InlineData(
new[] {"Beelzebub_153b_RHS.zip", "Beelzebub_01_[Noodles].zip",},
new[] {"Beelzebub_153b_RHS.zip", "Beelzebub_01_[Noodles].zip",},
new[] {"Beelzebub_01_[Noodles].zip", "Beelzebub_153b_RHS.zip"}
)]
[InlineData(
new[] {"[SCX-Scans]_Vandread_v02_Act02.zip", "[SCX-Scans]_Vandread_v02_Act01.zip","[SCX-Scans]_Vandread_v02_Act07.zip",},
new[] {"[SCX-Scans]_Vandread_v02_Act02.zip", "[SCX-Scans]_Vandread_v02_Act01.zip","[SCX-Scans]_Vandread_v02_Act07.zip",},
new[] {"[SCX-Scans]_Vandread_v02_Act01.zip", "[SCX-Scans]_Vandread_v02_Act02.zip","[SCX-Scans]_Vandread_v02_Act07.zip",}
)]
[InlineData(
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",},
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",},
new[] {"Frogman v01 001.jpg", "Frogman v01 ch01 p00 Credits.jpg",}
)]
[InlineData(
new[] {"001.jpg", "10.jpg",},
new[] {"001.jpg", "10.jpg",},
new[] {"001.jpg", "10.jpg",}
)]
[InlineData(
new[] {"10/001.jpg", "10.jpg",},
new[] {"10/001.jpg", "10.jpg",},
new[] {"10.jpg", "10/001.jpg",}
)]
[InlineData(
new[] {"Batman - Black white vol 1 #04.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr"},
new[] {"Batman - Black white vol 1 #04.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr"},
new[] {"Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #04.cbr"}
)]
public void TestNaturalSortComparerLinq(string[] input, string[] expected)
@ -104,4 +104,4 @@ namespace API.Tests.Comparers
}
}
}
}
}

View File

@ -17,5 +17,24 @@ namespace API.Tests.Parser
{
Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename));
}
// [Theory]
// [InlineData("@font-face{font-family:'syyskuu_repaleinen';src:url(data:font/opentype;base64,AAEAAAA", "@font-face{font-family:'syyskuu_repaleinen';src:url(data:font/opentype;base64,AAEAAAA")]
// [InlineData("@font-face{font-family:'syyskuu_repaleinen';src:url('fonts/font.css')", "@font-face{font-family:'syyskuu_repaleinen';src:url('TEST/fonts/font.css')")]
// public void ReplaceFontSrcUrl(string input, string expected)
// {
// var apiBase = "TEST/";
// var actual = API.Parser.Parser.FontSrcUrlRegex.Replace(input, "$1" + apiBase + "$2" + "$3");
// Assert.Equal(expected, actual);
// }
//
// [Theory]
// [InlineData("@import url('font.css');", "@import url('TEST/font.css');")]
// public void ReplaceImportSrcUrl(string input, string expected)
// {
// var apiBase = "TEST/";
// var actual = API.Parser.Parser.CssImportUrlRegex.Replace(input, "$1" + apiBase + "$2" + "$3");
// Assert.Equal(expected, actual);
// }
}
}

View File

@ -56,6 +56,8 @@ namespace API.Tests.Parser
[InlineData("Batgirl V2000 #57", "Batgirl")]
[InlineData("Fables 021 (2004) (Digital) (Nahga-Empire)", "Fables")]
[InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "2000 AD")]
[InlineData("Daredevil - v6 - 10 - (2019)", "Daredevil")]
[InlineData("Batman - The Man Who Laughs #1 (2005)", "Batman - The Man Who Laughs")]
public void ParseComicSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicSeries(filename));
@ -93,6 +95,7 @@ namespace API.Tests.Parser
[InlineData("Fables 021 (2004) (Digital) (Nahga-Empire).cbr", "0")]
[InlineData("Cyberpunk 2077 - Trauma Team 04.cbz", "0")]
[InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "0")]
[InlineData("Daredevil - v6 - 10 - (2019)", "6")]
public void ParseComicVolumeTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicVolume(filename));
@ -134,6 +137,7 @@ namespace API.Tests.Parser
[InlineData("Fables 021 (2004) (Digital) (Nahga-Empire).cbr", "21")]
[InlineData("Cyberpunk 2077 - Trauma Team #04.cbz", "4")]
[InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "366")]
[InlineData("Daredevil - v6 - 10 - (2019)", "10")]
public void ParseComicChapterTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicChapter(filename));
@ -172,10 +176,26 @@ namespace API.Tests.Parser
FullFilePath = filepath, IsSpecial = false
});
filepath = @"E:\Comics\Comics\Publisher\Batman the Detective (2021)\Batman the Detective - v6 - 11 - (2021).cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Batman the Detective", Volumes = "6", Edition = "",
Chapters = "11", Filename = "Batman the Detective - v6 - 11 - (2021).cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});
filepath = @"E:\Comics\Comics\Batman - The Man Who Laughs #1 (2005)\Batman - The Man Who Laughs #1 (2005).cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Batman - The Man Who Laughs", Volumes = "0", Edition = "",
Chapters = "1", Filename = "Batman - The Man Who Laughs #1 (2005).cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});
foreach (var file in expected.Keys)
{
var expectedInfo = expected[file];
var actual = API.Parser.Parser.Parse(file, rootPath);
var actual = API.Parser.Parser.Parse(file, rootPath, LibraryType.Comic);
if (expectedInfo == null)
{
Assert.Null(actual);

View File

@ -297,6 +297,7 @@ namespace API.Tests.Parser
[Theory]
[InlineData("/manga/Btooom!/Vol.1/Chapter 1/1.cbz", "Btooom!~1~1")]
[InlineData("/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Btooom!~1~2")]
[InlineData("/manga/Monster #8/Ch. 001-016 [MangaPlus] [Digital] [amit34521]/Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]/13.jpg", "Monster #8~0~1")]
public void ParseFromFallbackFoldersTest(string inputFile, string expectedParseInfo)
{
const string rootDirectory = "/manga/";
@ -438,6 +439,22 @@ namespace API.Tests.Parser
filepath = @"E:\Manga\Seraph of the End\cover.png";
expected.Add(filepath, null);
filepath = @"E:\Manga\The Beginning After the End\Chapter 001.cbz";
expected.Add(filepath, new ParserInfo
{
Series = "The Beginning After the End", Volumes = "0", Edition = "",
Chapters = "1", Filename = "Chapter 001.cbz", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});
filepath = @"E:\Manga\Monster #8\Ch. 001-016 [MangaPlus] [Digital] [amit34521]\Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]\13.jpg";
expected.Add(filepath, new ParserInfo
{
Series = "Monster #8", Volumes = "0", Edition = "",
Chapters = "1", Filename = "13.jpg", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});
foreach (var file in expected.Keys)
{

View File

@ -140,9 +140,10 @@ namespace API.Tests.Services
[InlineData(new [] {"page 2.jpg", "page 10.jpg"}, "page 2.jpg")]
[InlineData(new [] {"__MACOSX/cover.jpg", "vol1/page 01.jpg"}, "vol1/page 01.jpg")]
[InlineData(new [] {"Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c060 (v10) - p200 [Digital] [LuCaZ].jpg", "folder.jpg"}, "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg")]
[InlineData(new [] {"001.jpg", "001 - chapter 1/001.jpg"}, "001.jpg")]
public void FindFirstEntry(string[] files, string expected)
{
var foundFile = _archiveService.FirstFileEntry(files);
var foundFile = ArchiveService.FirstFileEntry(files, string.Empty);
Assert.Equal(expected, string.IsNullOrEmpty(foundFile) ? "" : foundFile);
}

View File

@ -36,7 +36,7 @@ namespace API.Tests.Services
public void GetFiles_WithCustomRegex_ShouldPass_Test()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/regex");
var files = _directoryService.GetFiles(testDirectory, @"file\d*.txt");
var files = DirectoryService.GetFiles(testDirectory, @"file\d*.txt");
Assert.Equal(2, files.Count());
}
@ -44,7 +44,7 @@ namespace API.Tests.Services
public void GetFiles_TopLevel_ShouldBeEmpty_Test()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService");
var files = _directoryService.GetFiles(testDirectory);
var files = DirectoryService.GetFiles(testDirectory);
Assert.Empty(files);
}
@ -52,7 +52,7 @@ namespace API.Tests.Services
public void GetFilesWithExtensions_ShouldBeEmpty_Test()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/extensions");
var files = _directoryService.GetFiles(testDirectory, "*.txt");
var files = DirectoryService.GetFiles(testDirectory, "*.txt");
Assert.Empty(files);
}
@ -60,7 +60,7 @@ namespace API.Tests.Services
public void GetFilesWithExtensions_Test()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/extension");
var files = _directoryService.GetFiles(testDirectory, ".cbz|.rar");
var files = DirectoryService.GetFiles(testDirectory, ".cbz|.rar");
Assert.Equal(3, files.Count());
}
@ -68,7 +68,7 @@ namespace API.Tests.Services
public void GetFilesWithExtensions_BadDirectory_ShouldBeEmpty_Test()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/DirectoryService/doesntexist");
var files = _directoryService.GetFiles(testDirectory, ".cbz|.rar");
var files = DirectoryService.GetFiles(testDirectory, ".cbz|.rar");
Assert.Empty(files);
}

View File

@ -6,6 +6,10 @@ using static System.String;
namespace API.Comparators
{
/// <summary>
/// Attempts to emulate Windows explorer sorting
/// </summary>
/// <remarks>This is not thread-safe</remarks>
public sealed class NaturalSortComparer : IComparer<string>, IDisposable
{
private readonly bool _isAscending;
@ -23,7 +27,6 @@ namespace API.Comparators
{
if (x == y) return 0;
// Should be fixed: Operations that change non-concurrent collections must have exclusive access. A concurrent update was performed on this collection and corrupted its state. The collection's state is no longer correct.
if (!_table.TryGetValue(x ?? Empty, out var x1))
{
x1 = Regex.Split(x ?? Empty, "([0-9]+)");
@ -33,7 +36,6 @@ namespace API.Comparators
if (!_table.TryGetValue(y ?? Empty, out var y1))
{
y1 = Regex.Split(y ?? Empty, "([0-9]+)");
// Should be fixed: EXCEPTION: An item with the same key has already been added. Key: M:\Girls of the Wild's\Girls of the Wild's - Ep. 083 (Season 1) [LINE Webtoon].cbz
_table.Add(y ?? Empty, y1);
}
@ -59,6 +61,7 @@ namespace API.Comparators
returnVal = 0;
}
return _isAscending ? returnVal : -returnVal;
}

View File

@ -259,7 +259,10 @@ namespace API.Controllers
}
var styleContent = await _bookService.ScopeStyles(await book.Content.Css[key].ReadContentAsync(), apiBase, book.Content.Css[key].FileName, book);
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
if (styleContent != null)
{
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}
}

View File

@ -2,7 +2,9 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Extensions;
using API.Interfaces;
@ -90,6 +92,40 @@ namespace API.Controllers
return BadRequest("Something went wrong, please try again");
}
/// <summary>
/// Adds a collection tag onto multiple Series. If tag id is 0, this will create a new tag.
/// </summary>
/// <param name="dto"></param>
/// <returns></returns>
[HttpPost("update-for-series")]
public async Task<ActionResult> AddToMultipleSeries(CollectionTagBulkAddDto dto)
{
var tag = await _unitOfWork.CollectionTagRepository.GetFullTagAsync(dto.CollectionTagId);
if (tag == null)
{
tag = DbFactory.CollectionTag(0, dto.CollectionTagTitle, String.Empty, false);
_unitOfWork.CollectionTagRepository.Add(tag);
}
var seriesMetadatas = await _unitOfWork.SeriesRepository.GetSeriesMetadataForIdsAsync(dto.SeriesIds);
foreach (var metadata in seriesMetadatas)
{
if (!metadata.CollectionTags.Any(t => t.Title.Equals(tag.Title, StringComparison.InvariantCulture)))
{
metadata.CollectionTags.Add(tag);
_unitOfWork.SeriesMetadataRepository.Update(metadata);
}
}
if (!_unitOfWork.HasChanges()) return Ok();
if (await _unitOfWork.CommitAsync())
{
return Ok();
}
return BadRequest("There was an issue updating series with collection tag");
}
/// <summary>
/// For a given tag, update the summary if summary has changed and remove a set of series from the tag.
/// </summary>

View File

@ -164,7 +164,7 @@ namespace API.Controllers
case MangaFormat.Archive:
case MangaFormat.Pdf:
_cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
var originalFiles = _directoryService.GetFilesWithExtension(chapterExtractPath,
var originalFiles = DirectoryService.GetFilesWithExtension(chapterExtractPath,
Parser.Parser.ImageFileExtensions);
_directoryService.CopyFilesToDirectory(originalFiles, chapterExtractPath, $"{chapterId}_");
DirectoryService.DeleteFiles(originalFiles);
@ -175,7 +175,7 @@ namespace API.Controllers
return BadRequest("Series is not in a valid format. Please rescan series and try again.");
}
var files = _directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
var files = DirectoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
// Filter out images that aren't in bookmarks
Array.Sort(files, _numericComparer);
totalFilePaths.AddRange(files.Where((_, i) => chapterPages.Contains(i)));

View File

@ -226,7 +226,7 @@ namespace API.Controllers
[HttpGet("search")]
public async Task<ActionResult<IEnumerable<SearchResultDto>>> Search(string queryString)
{
queryString = queryString.Trim().Replace(@"%", "");
queryString = Uri.UnescapeDataString(queryString).Trim().Replace(@"%", string.Empty);
var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
// Get libraries user has access to

View File

@ -6,6 +6,7 @@ using System.Threading.Tasks;
using System.Xml.Serialization;
using API.Comparators;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
using API.DTOs.OPDS;
using API.Entities;
@ -738,7 +739,7 @@ namespace API.Controllers
[HttpGet("{apiKey}/favicon")]
public async Task<ActionResult> GetFavicon(string apiKey)
{
var files = _directoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
var files = DirectoryService.GetFilesWithExtension(Path.Join(Directory.GetCurrentDirectory(), ".."), @"\.ico");
if (files.Length == 0) return BadRequest("Cannot find icon");
var path = files[0];
var content = await _directoryService.ReadFileAsync(path);

View File

@ -78,8 +78,9 @@ namespace API.Controllers
public async Task<ActionResult<bool>> DeleteSeries(int seriesId)
{
var username = User.GetUsername();
var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
_logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", seriesId, username);
var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
var result = await _unitOfWork.SeriesRepository.DeleteSeriesAsync(seriesId);
if (result)
@ -92,6 +93,34 @@ namespace API.Controllers
return Ok(result);
}
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("delete-multiple")]
public async Task<ActionResult> DeleteMultipleSeries(DeleteSeriesDto dto)
{
var username = User.GetUsername();
_logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", dto.SeriesIds, username);
var chapterMappings =
await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(dto.SeriesIds.ToArray());
var allChapterIds = new List<int>();
foreach (var mapping in chapterMappings)
{
allChapterIds.AddRange(mapping.Value);
}
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(dto.SeriesIds);
_unitOfWork.SeriesRepository.Remove(series);
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
_taskScheduler.CleanupChapters(allChapterIds.ToArray());
}
return Ok();
}
/// <summary>
/// Returns All volumes for a series with progress information and Chapters
/// </summary>
@ -212,6 +241,8 @@ namespace API.Controllers
.Take(userParams.PageSize).ToList();
var pagedList = new PagedList<SeriesDto>(listResults, listResults.Count, userParams.PageNumber, userParams.PageSize);
await _unitOfWork.SeriesRepository.AddSeriesModifiers(userId, pagedList);
Response.AddPaginationHeader(pagedList.CurrentPage, pagedList.PageSize, pagedList.TotalCount, pagedList.TotalPages);
return Ok(pagedList);

View File

@ -71,10 +71,10 @@ namespace API.Controllers
/// </summary>
/// <returns></returns>
[HttpPost("backup-db")]
public ActionResult BackupDatabase()
public async Task<ActionResult> BackupDatabase()
{
_logger.LogInformation("{UserName} is backing up database of server from admin dashboard", User.GetUsername());
_backupService.BackupDatabase();
await _backupService.BackupDatabase();
return Ok();
}

View File

@ -140,7 +140,7 @@ namespace API.Controllers
}
}
if (!_unitOfWork.HasChanges()) return Ok("Nothing was updated");
if (!_unitOfWork.HasChanges()) return Ok(updateSettingsDto);
try
{

View File

@ -25,7 +25,7 @@ namespace API.Controllers
{
try
{
await _statsService.PathData(clientInfoDto);
await _statsService.RecordClientInfo(clientInfoDto);
return Ok();
}

View File

@ -0,0 +1,18 @@
using System.Collections.Generic;
namespace API.DTOs.CollectionTags
{
public class CollectionTagBulkAddDto
{
/// <summary>
/// Collection Tag Id
/// </summary>
/// <remarks>Can be 0 which then will use Title to create a tag</remarks>
public int CollectionTagId { get; init; }
public string CollectionTagTitle { get; init; }
/// <summary>
/// Series Ids to add onto Collection Tag
/// </summary>
public IEnumerable<int> SeriesIds { get; init; }
}
}

View File

@ -1,4 +1,4 @@
namespace API.DTOs
namespace API.DTOs.CollectionTags
{
public class CollectionTagDto
{

View File

@ -1,10 +1,10 @@
using System.Collections.Generic;
namespace API.DTOs
namespace API.DTOs.CollectionTags
{
public class UpdateSeriesForTagDto
{
public CollectionTagDto Tag { get; init; }
public ICollection<int> SeriesIdsToRemove { get; init; }
public IEnumerable<int> SeriesIdsToRemove { get; init; }
}
}
}

View File

@ -0,0 +1,9 @@
using System.Collections.Generic;
namespace API.DTOs
{
public class DeleteSeriesDto
{
public IList<int> SeriesIds { get; set; }
}
}

View File

@ -1,4 +1,5 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
using API.Entities;
namespace API.DTOs

View File

@ -1,4 +1,5 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
namespace API.DTOs
{

View File

@ -0,0 +1,166 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Services;
using Kavita.Common;
namespace API.Data
{
public static class MigrateConfigFiles
{
private static readonly List<string> LooseLeafFiles = new List<string>()
{
"appsettings.json",
"appsettings.Development.json",
"kavita.db",
};
private static readonly List<string> AppFolders = new List<string>()
{
"covers",
"stats",
"logs",
"backups",
"cache",
"temp"
};
private static readonly string ConfigDirectory = Path.Join(Directory.GetCurrentDirectory(), "config");
/// <summary>
/// In v0.4.8 we moved all config files to config/ to match with how docker was setup. This will move all config files from current directory
/// to config/
/// </summary>
public static void Migrate(bool isDocker)
{
Console.WriteLine("Checking if migration to config/ is needed");
if (isDocker)
{
if (Configuration.LogPath.Contains("config"))
{
Console.WriteLine("Migration to config/ not needed");
return;
}
Console.WriteLine(
"Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");
CopyAppFolders();
DeleteAppFolders();
UpdateConfiguration();
Console.WriteLine("Migration complete. All config files are now in config/ directory");
return;
}
if (new FileInfo(Configuration.AppSettingsFilename).Exists)
{
Console.WriteLine("Migration to config/ not needed");
return;
}
Console.WriteLine(
"Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");
Console.WriteLine($"Creating {ConfigDirectory}");
DirectoryService.ExistOrCreate(ConfigDirectory);
try
{
CopyLooseLeafFiles();
CopyAppFolders();
// Then we need to update the config file to point to the new DB file
UpdateConfiguration();
}
catch (Exception)
{
Console.WriteLine("There was an exception during migration. Please move everything manually.");
return;
}
// Finally delete everything in the source directory
Console.WriteLine("Removing old files");
DeleteLooseFiles();
DeleteAppFolders();
Console.WriteLine("Removing old files...DONE");
Console.WriteLine("Migration complete. All config files are now in config/ directory");
}
private static void DeleteAppFolders()
{
foreach (var folderToDelete in AppFolders)
{
if (!new DirectoryInfo(Path.Join(Directory.GetCurrentDirectory(), folderToDelete)).Exists) continue;
DirectoryService.ClearAndDeleteDirectory(Path.Join(Directory.GetCurrentDirectory(), folderToDelete));
}
}
private static void DeleteLooseFiles()
{
var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
.Where(f => f.Exists);
DirectoryService.DeleteFiles(configFiles.Select(f => f.FullName));
}
private static void CopyAppFolders()
{
Console.WriteLine("Moving folders to config");
foreach (var folderToMove in AppFolders)
{
if (new DirectoryInfo(Path.Join(ConfigDirectory, folderToMove)).Exists) continue;
try
{
DirectoryService.CopyDirectoryToDirectory(
Path.Join(Directory.GetCurrentDirectory(), folderToMove),
Path.Join(ConfigDirectory, folderToMove));
}
catch (Exception)
{
/* Swallow Exception */
}
}
Console.WriteLine("Moving folders to config...DONE");
}
private static void CopyLooseLeafFiles()
{
var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
.Where(f => f.Exists);
// First step is to move all the files
Console.WriteLine("Moving files to config/");
foreach (var fileInfo in configFiles)
{
try
{
fileInfo.CopyTo(Path.Join(ConfigDirectory, fileInfo.Name));
}
catch (Exception)
{
/* Swallow exception when already exists */
}
}
Console.WriteLine("Moving files to config...DONE");
}
private static void UpdateConfiguration()
{
Console.WriteLine("Updating appsettings.json to new paths");
Configuration.DatabasePath = "config//kavita.db";
Configuration.LogPath = "config//logs/kavita.log";
Console.WriteLine("Updating appsettings.json to new paths...DONE");
}
}
}

View File

@ -3,6 +3,7 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Interfaces.Repositories;
using AutoMapper;
@ -22,6 +23,11 @@ namespace API.Data.Repositories
_mapper = mapper;
}
public void Add(CollectionTag tag)
{
_context.CollectionTag.Add(tag);
}
public void Remove(CollectionTag tag)
{
_context.CollectionTag.Remove(tag);

View File

@ -0,0 +1,20 @@
using API.Entities;
using API.Interfaces.Repositories;
namespace API.Data.Repositories
{
public class SeriesMetadataRepository : ISeriesMetadataRepository
{
private readonly DataContext _context;
public SeriesMetadataRepository(DataContext context)
{
_context = context;
}
public void Update(SeriesMetadata seriesMetadata)
{
_context.SeriesMetadata.Update(seriesMetadata);
}
}
}

View File

@ -4,6 +4,7 @@ using System.Linq;
using System.Threading.Tasks;
using API.Data.Scanner;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
using API.Entities;
using API.Extensions;
@ -41,6 +42,11 @@ namespace API.Data.Repositories
_context.Series.Remove(series);
}
public void Remove(IEnumerable<Series> series)
{
_context.Series.RemoveRange(series);
}
public async Task<bool> DoesSeriesNameExistInLibrary(string name)
{
var libraries = _context.Series
@ -171,6 +177,21 @@ namespace API.Data.Repositories
.SingleOrDefaultAsync();
}
/// <summary>
/// Returns Volumes, Metadata, and Collection Tags
/// </summary>
/// <param name="seriesIds"></param>
/// <returns></returns>
public async Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds)
{
return await _context.Series
.Include(s => s.Volumes)
.Include(s => s.Metadata)
.ThenInclude(m => m.CollectionTags)
.Where(s => seriesIds.Contains(s.Id))
.ToListAsync();
}
public async Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds)
{
var volumes = await _context.Volume
@ -454,15 +475,15 @@ namespace API.Data.Repositories
// TODO: Think about making this bigger depending on number of files a user has in said library
// and number of cores and amount of memory. We can then make an optimal choice
var totalSeries = await GetSeriesCount(libraryId);
var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
if (totalSeries < procCount * 2 || totalSeries < 50)
{
return new Tuple<int, int>(totalSeries, totalSeries);
}
return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
// var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
//
// if (totalSeries < procCount * 2 || totalSeries < 50)
// {
// return new Tuple<int, int>(totalSeries, totalSeries);
// }
//
// return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
return new Tuple<int, int>(totalSeries, 50);
}
public async Task<Chunk> GetChunkInfo(int libraryId = 0)
@ -485,5 +506,13 @@ namespace API.Data.Repositories
TotalChunks = totalChunks
};
}
public async Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds)
{
return await _context.SeriesMetadata
.Where(sm => seriesIds.Contains(sm.SeriesId))
.Include(sm => sm.CollectionTags)
.ToListAsync();
}
}
}

View File

@ -35,15 +35,6 @@ namespace API.Data.Repositories
return _mapper.Map<ServerSettingDto>(settings);
}
public ServerSettingDto GetSettingsDto()
{
var settings = _context.ServerSetting
.Select(x => x)
.AsNoTracking()
.ToList();
return _mapper.Map<ServerSettingDto>(settings);
}
public Task<ServerSetting> GetSettingAsync(ServerSettingKey key)
{
return _context.ServerSetting.SingleOrDefaultAsync(x => x.Key == key);

View File

@ -41,11 +41,11 @@ namespace API.Data
IList<ServerSetting> defaultSettings = new List<ServerSetting>()
{
new() {Key = ServerSettingKey.CacheDirectory, Value = CacheService.CacheDirectory},
new() {Key = ServerSettingKey.CacheDirectory, Value = DirectoryService.CacheDirectory},
new () {Key = ServerSettingKey.TaskScan, Value = "daily"},
new () {Key = ServerSettingKey.LoggingLevel, Value = "Information"}, // Not used from DB, but DB is sync with appSettings.json
new () {Key = ServerSettingKey.TaskBackup, Value = "weekly"},
new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "backups/"))},
new () {Key = ServerSettingKey.BackupDirectory, Value = Path.GetFullPath(DirectoryService.BackupDirectory)},
new () {Key = ServerSettingKey.Port, Value = "5000"}, // Not used from DB, but DB is sync with appSettings.json
new () {Key = ServerSettingKey.AllowStatCollection, Value = "true"},
new () {Key = ServerSettingKey.EnableOpds, Value = "false"},
@ -69,6 +69,8 @@ namespace API.Data
Configuration.Port + string.Empty;
context.ServerSetting.First(s => s.Key == ServerSettingKey.LoggingLevel).Value =
Configuration.LogLevel + string.Empty;
context.ServerSetting.First(s => s.Key == ServerSettingKey.CacheDirectory).Value =
DirectoryService.CacheDirectory + string.Empty;
await context.SaveChangesAsync();

View File

@ -34,6 +34,7 @@ namespace API.Data
public IFileRepository FileRepository => new FileRepository(_context);
public IChapterRepository ChapterRepository => new ChapterRepository(_context, _mapper);
public IReadingListRepository ReadingListRepository => new ReadingListRepository(_context, _mapper);
public ISeriesMetadataRepository SeriesMetadataRepository => new SeriesMetadataRepository(_context);
/// <summary>
/// Commits changes to the DB. Completes the open transaction.

View File

@ -8,6 +8,9 @@ namespace API.Entities
{
[Key]
public ServerSettingKey Key { get; set; }
/// <summary>
/// The value of the Setting. Converter knows how to convert to the correct type
/// </summary>
public string Value { get; set; }
/// <inheritdoc />

View File

@ -1,24 +0,0 @@
using API.Interfaces.Services;
using API.Services.Clients;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
namespace API.Extensions
{
public static class ServiceCollectionExtensions
{
public static IServiceCollection AddStartupTask<T>(this IServiceCollection services)
where T : class, IStartupTask
=> services.AddTransient<IStartupTask, T>();
public static IServiceCollection AddStatsClient(this IServiceCollection services, IConfiguration configuration)
{
services.AddHttpClient<StatsApiClient>(client =>
{
client.DefaultRequestHeaders.Add("api-key", "MsnvA2DfQqxSK5jh");
});
return services;
}
}
}

View File

@ -1,6 +1,7 @@
using System.Collections.Generic;
using System.Linq;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Reader;
using API.DTOs.ReadingLists;
using API.DTOs.Settings;

View File

@ -15,6 +15,7 @@ namespace API.Interfaces
IFileRepository FileRepository { get; }
IChapterRepository ChapterRepository { get; }
IReadingListRepository ReadingListRepository { get; }
ISeriesMetadataRepository SeriesMetadataRepository { get; }
bool Commit();
Task<bool> CommitAsync();
bool HasChanges();

View File

@ -1,12 +1,14 @@
using System.Collections.Generic;
using System.Threading.Tasks;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface ICollectionTagRepository
{
void Add(CollectionTag tag);
void Remove(CollectionTag tag);
Task<IEnumerable<CollectionTagDto>> GetAllTagDtosAsync();
Task<IEnumerable<CollectionTagDto>> SearchTagDtosAsync(string searchQuery);

View File

@ -0,0 +1,9 @@
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface ISeriesMetadataRepository
{
void Update(SeriesMetadata seriesMetadata);
}
}

View File

@ -13,6 +13,7 @@ namespace API.Interfaces.Repositories
void Attach(Series series);
void Update(Series series);
void Remove(Series series);
void Remove(IEnumerable<Series> series);
Task<bool> DoesSeriesNameExistInLibrary(string name);
/// <summary>
/// Adds user information like progress, ratings, etc
@ -33,6 +34,7 @@ namespace API.Interfaces.Repositories
Task<SeriesDto> GetSeriesDtoByIdAsync(int seriesId, int userId);
Task<bool> DeleteSeriesAsync(int seriesId);
Task<Series> GetSeriesByIdAsync(int seriesId);
Task<IList<Series>> GetSeriesByIdsAsync(IList<int> seriesIds);
Task<int[]> GetChapterIdsForSeriesAsync(int[] seriesIds);
Task<IDictionary<int, IList<int>>> GetChapterIdWithSeriesIdForSeriesAsync(int[] seriesIds);
/// <summary>
@ -54,5 +56,6 @@ namespace API.Interfaces.Repositories
Task<PagedList<Series>> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams);
Task<Series> GetFullSeriesForSeriesIdAsync(int seriesId);
Task<Chunk> GetChunkInfo(int libraryId = 0);
Task<IList<SeriesMetadata>> GetSeriesMetadataForIdsAsync(IEnumerable<int> seriesIds);
}
}

View File

@ -10,7 +10,6 @@ namespace API.Interfaces.Repositories
{
void Update(ServerSetting settings);
Task<ServerSettingDto> GetSettingsDtoAsync();
ServerSettingDto GetSettingsDto();
Task<ServerSetting> GetSettingAsync(ServerSettingKey key);
Task<IEnumerable<ServerSetting>> GetSettingsAsync();

View File

@ -12,21 +12,9 @@ namespace API.Interfaces.Services
/// <param name="rootPath">Absolute path of directory to scan.</param>
/// <returns>List of folder names</returns>
IEnumerable<string> ListDirectory(string rootPath);
/// <summary>
/// Gets files in a directory. If searchPatternExpression is passed, will match the regex against for filtering.
/// </summary>
/// <param name="path"></param>
/// <param name="searchPatternExpression"></param>
/// <returns></returns>
string[] GetFilesWithExtension(string path, string searchPatternExpression = "");
Task<byte[]> ReadFileAsync(string path);
bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "");
bool Exists(string directory);
IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly);
void CopyFileToDirectory(string fullFilePath, string targetDirectory);
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*");
}
}

View File

@ -5,7 +5,7 @@ namespace API.Interfaces.Services
{
public interface IStatsService
{
Task PathData(ClientInfoDto clientInfoDto);
Task CollectAndSendStatsData();
Task RecordClientInfo(ClientInfoDto clientInfoDto);
Task Send();
}
}

View File

@ -24,11 +24,25 @@ namespace API.Parser
private const RegexOptions MatchOptions =
RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant;
public static readonly Regex FontSrcUrlRegex = new Regex(@"(src:url\(.{1})" + "([^\"']*)" + @"(.{1}\))",
/// <summary>
/// Matches against font-family css syntax. Does not match if url import has data: starting, as that is binary data
/// </summary>
/// <remarks>See here for some examples https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face</remarks>
public static readonly Regex FontSrcUrlRegex = new Regex(@"(?<Start>(src:\s?)?url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(?<End>.{1}\))",
MatchOptions, RegexTimeout);
public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s[\"|'])(?<Filename>[\\w\\d/\\._-]+)([\"|'];?)",
/// <summary>
/// https://developer.mozilla.org/en-US/docs/Web/CSS/@import
/// </summary>
public static readonly Regex CssImportUrlRegex = new Regex("(@import\\s([\"|']|url\\([\"|']))(?<Filename>[^'\"]+)([\"|']\\)?);",
MatchOptions | RegexOptions.Multiline, RegexTimeout);
/// <summary>
/// Misc css image references, like background-image: url(), border-image, or list-style-image
/// </summary>
/// Original prepend: (background|border|list-style)-image:\s?)?
public static readonly Regex CssImageUrlRegex = new Regex(@"(url\((?!data:).(?!data:))" + "(?<Filename>(?!data:)[^\"']*)" + @"(.\))",
MatchOptions, RegexTimeout);
private static readonly string XmlRegexExtensions = @"\.xml";
private static readonly Regex ImageRegex = new Regex(ImageFileExtensions,
MatchOptions, RegexTimeout);
@ -212,7 +226,7 @@ namespace API.Parser
MatchOptions, RegexTimeout),
// Baketeriya ch01-05.zip, Akiiro Bousou Biyori - 01.jpg, Beelzebub_172_RHS.zip, Cynthia the Mission 29.rar, A Compendium of Ghosts - 031 - The Third Story_ Part 12 (Digital) (Cobalt001)
new Regex(
@"^(?!Vol\.?)(?<Series>.+?)( |_|-)(?<!-)(ch)?\d+-?\d*",
@"^(?!Vol\.?)(?!Chapter)(?<Series>.+?)(\s|_|-)(?<!-)(ch|chapter)?\.?\d+-?\d*",
MatchOptions, RegexTimeout),
// [BAA]_Darker_than_Black_c1 (This is very greedy, make sure it's close to last)
new Regex(
@ -533,14 +547,16 @@ namespace API.Parser
ret.Edition = edition;
}
var isSpecial = ParseMangaSpecial(fileName);
var isSpecial = type == LibraryType.Comic ? ParseComicSpecial(fileName) : ParseMangaSpecial(fileName);
// We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that
// could cause a problem as Omake is a special term, but there is valid volume/chapter information.
if (ret.Chapters == DefaultChapter && ret.Volumes == DefaultVolume && !string.IsNullOrEmpty(isSpecial))
{
ret.IsSpecial = true;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
// If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name
if (HasSpecialMarker(fileName))
{
ret.IsSpecial = true;
@ -549,8 +565,6 @@ namespace API.Parser
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
// here is the issue. If we are a special with marker, we need to ensure we use the correct series name.
// we can do this by falling back
if (string.IsNullOrEmpty(ret.Series))
{
@ -594,8 +608,6 @@ namespace API.Parser
{
ret.Chapters = parsedChapter;
}
continue;
}
var series = ParseSeries(folder);

View File

@ -1,98 +1,127 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Services;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
namespace API
{
public class Program
{
private static readonly int HttpPort = Configuration.Port;
public class Program
{
private static readonly int HttpPort = Configuration.Port;
protected Program()
{
}
protected Program()
{
}
public static async Task Main(string[] args)
{
Console.OutputEncoding = System.Text.Encoding.UTF8;
public static async Task Main(string[] args)
{
Console.OutputEncoding = System.Text.Encoding.UTF8;
var isDocker = new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker;
// Before anything, check if JWT has been generated properly or if user still has default
if (!Configuration.CheckIfJwtTokenSet() &&
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
{
Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
var rBytes = new byte[128];
using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
}
MigrateConfigFiles.Migrate(isDocker);
var host = CreateHostBuilder(args).Build();
// Before anything, check if JWT has been generated properly or if user still has default
if (!Configuration.CheckIfJwtTokenSet() &&
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") != Environments.Development)
{
Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
var rBytes = new byte[128];
using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
}
using var scope = host.Services.CreateScope();
var services = scope.ServiceProvider;
var host = CreateHostBuilder(args).Build();
try
{
var context = services.GetRequiredService<DataContext>();
var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
using var scope = host.Services.CreateScope();
var services = scope.ServiceProvider;
var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
try
{
// If this is a new install, tables won't exist yet
var context = services.GetRequiredService<DataContext>();
var roleManager = services.GetRequiredService<RoleManager<AppRole>>();
if (isDocker && new FileInfo("data/appsettings.json").Exists)
{
var logger = services.GetRequiredService<ILogger<Startup>>();
logger.LogCritical("WARNING! Mount point is incorrect, nothing here will persist. Please change your container mount from /kavita/data to /kavita/config");
return;
}
var requiresCoverImageMigration = !Directory.Exists(DirectoryService.CoverImageDirectory);
try
{
// If this is a new install, tables won't exist yet
if (requiresCoverImageMigration)
{
MigrateCoverImages.ExtractToImages(context);
}
}
catch (Exception)
{
requiresCoverImageMigration = false;
}
// Apply all migrations on startup
await context.Database.MigrateAsync();
if (requiresCoverImageMigration)
{
MigrateCoverImages.ExtractToImages(context);
await MigrateCoverImages.UpdateDatabaseWithImages(context);
}
await Seed.SeedRoles(roleManager);
await Seed.SeedSettings(context);
await Seed.SeedUserApiKeys(context);
}
catch (Exception )
catch (Exception ex)
{
requiresCoverImageMigration = false;
var logger = services.GetRequiredService<ILogger<Program>>();
logger.LogError(ex, "An error occurred during migration");
}
// Apply all migrations on startup
await context.Database.MigrateAsync();
await host.RunAsync();
}
if (requiresCoverImageMigration)
{
await MigrateCoverImages.UpdateDatabaseWithImages(context);
}
private static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureAppConfiguration((hostingContext, config) =>
{
config.Sources.Clear();
await Seed.SeedRoles(roleManager);
await Seed.SeedSettings(context);
await Seed.SeedUserApiKeys(context);
}
catch (Exception ex)
{
var logger = services.GetRequiredService<ILogger<Program>>();
logger.LogError(ex, "An error occurred during migration");
}
var env = hostingContext.HostingEnvironment;
await host.RunAsync();
}
config.AddJsonFile("config/appsettings.json", optional: true, reloadOnChange: false)
.AddJsonFile($"config/appsettings.{env.EnvironmentName}.json",
optional: true, reloadOnChange: false);
})
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((opts) =>
{
opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
});
private static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((opts) =>
{
opts.ListenAnyIP(HttpPort, options => { options.Protocols = HttpProtocols.Http1AndHttp2; });
});
webBuilder.UseStartup<Startup>();
});
webBuilder.UseStartup<Startup>();
});
}
}
}
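For context on the startup change above: when no TokenKey has been set outside Development, Kavita now generates a random 128-byte key before building the host. Below is a minimal sketch of the same idea, using RandomNumberGenerator in place of the RNGCryptoServiceProvider shown in the diff; the class and method names here are illustrative, not Kavita's.

```
using System;
using System.Security.Cryptography;

public static class TokenKeySketch
{
    // Sketch of the JWT TokenKey bootstrap: 128 random bytes, base64 encoded,
    // with '/' stripped, mirroring the startup code in the diff above.
    public static string GenerateTokenKey()
    {
        var bytes = new byte[128];
        using var rng = RandomNumberGenerator.Create();
        rng.GetBytes(bytes);
        return Convert.ToBase64String(bytes).Replace("/", string.Empty);
    }

    public static void Main() => Console.WriteLine(GenerateTokenKey());
}
```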

View File

@ -123,12 +123,24 @@ namespace API.Services
/// </summary>
/// <param name="entryFullNames"></param>
/// <returns>Entry name of match, null if no match</returns>
public string FirstFileEntry(IEnumerable<string> entryFullNames)
public static string FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
{
var result = entryFullNames.OrderBy(Path.GetFileName, new NaturalSortComparer())
.FirstOrDefault(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith));
// First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
// because NaturalSortComparer does not work with paths and doesn't see 001.jpg as coming before chapter 1/001.jpg.
var fullNames = entryFullNames.Where(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x)
&& !x.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)).ToList();
if (fullNames.Count == 0) return null;
var nonNestedFile = fullNames.Where(entry => (Path.GetDirectoryName(entry) ?? string.Empty).Equals(archiveName))
.OrderBy(Path.GetFullPath, new NaturalSortComparer())
.FirstOrDefault();
if (!string.IsNullOrEmpty(nonNestedFile)) return nonNestedFile;
var result = fullNames
.OrderBy(Path.GetFileName, new NaturalSortComparer())
.FirstOrDefault();
return string.IsNullOrEmpty(result) ? null : result;
}
@ -158,7 +170,7 @@ namespace API.Services
using var archive = ZipFile.OpenRead(archivePath);
var entryNames = archive.Entries.Select(e => e.FullName).ToArray();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entry = archive.Entries.Single(e => e.FullName == entryName);
using var stream = entry.Open();
@ -169,7 +181,7 @@ namespace API.Services
using var archive = ArchiveFactory.Open(archivePath);
var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames, Path.GetFileName(archivePath));
var entry = archive.Entries.Single(e => e.Key == entryName);
using var stream = entry.OpenEntryStream();
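The FirstFileEntry change above prefers entries that sit at the archive root before falling back to a filename sort, because sorting by Path.GetFileName alone cannot tell 001.jpg apart from chapter 1/001.jpg. Here is a stripped-down sketch of that selection logic; the example entries are made up and StringComparer stands in for Kavita's NaturalSortComparer.

```
using System;
using System.IO;
using System.Linq;

public static class FirstEntrySketch
{
    // Stand-in for FirstFileEntry: prefer root-level entries of the archive,
    // then fall back to sorting all image entries by file name.
    public static string PickCover(string[] entries, string archiveName)
    {
        var images = entries.Where(e => e.EndsWith(".jpg", StringComparison.OrdinalIgnoreCase)).ToList();
        if (images.Count == 0) return null;

        var rootLevel = images
            .Where(e => (Path.GetDirectoryName(e) ?? string.Empty).Equals(archiveName))
            .OrderBy(e => e, StringComparer.OrdinalIgnoreCase) // NaturalSortComparer in Kavita
            .FirstOrDefault();

        return rootLevel ?? images.OrderBy(Path.GetFileName, StringComparer.OrdinalIgnoreCase).First();
    }

    public static void Main()
    {
        var entries = new[] { "My Series/Chapter 1/001.jpg", "My Series/001.jpg" };
        Console.WriteLine(PickCover(entries, "My Series")); // the root-level 001.jpg wins
    }
}
```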

View File

@ -140,15 +140,22 @@ namespace API.Services
}
stylesheetHtml = stylesheetHtml.Insert(0, importBuilder.ToString());
stylesheetHtml =
Parser.Parser.CssImportUrlRegex.Replace(stylesheetHtml, "$1" + apiBase + prepend + "$2" + "$3");
var importMatches = Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml);
foreach (Match match in importMatches)
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + prepend + importFile);
}
// Check if there are any background images and rewrite those urls
EscapeCssImageReferences(ref stylesheetHtml, apiBase, book);
var styleContent = RemoveWhiteSpaceFromStylesheets(stylesheetHtml);
styleContent =
Parser.Parser.FontSrcUrlRegex.Replace(styleContent, "$1" + apiBase + "$2" + "$3");
styleContent = styleContent.Replace("body", ".reading-section");
if (string.IsNullOrEmpty(styleContent)) return string.Empty;
var stylesheet = await _cssParser.ParseAsync(styleContent);
foreach (var styleRule in stylesheet.StyleRules)
{
@ -165,6 +172,21 @@ namespace API.Services
return RemoveWhiteSpaceFromStylesheets(stylesheet.ToCss());
}
private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book)
{
var matches = Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml);
foreach (Match match in matches)
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
var key = CleanContentKeys(importFile);
if (!book.Content.AllFiles.ContainsKey(key)) continue;
stylesheetHtml = stylesheetHtml.Replace(importFile, apiBase + key);
}
}
public ComicInfo GetComicInfo(string filePath)
{
if (!IsValidFile(filePath) || Parser.Parser.IsPdf(filePath)) return null;
@ -488,15 +510,29 @@ namespace API.Services
private static string RemoveWhiteSpaceFromStylesheets(string body)
{
if (string.IsNullOrEmpty(body))
{
return string.Empty;
}
// Remove comments from CSS
body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
body = Regex.Replace(body, @"[a-zA-Z]+#", "#");
body = Regex.Replace(body, @"[\n\r]+\s*", string.Empty);
body = Regex.Replace(body, @"\s+", " ");
body = Regex.Replace(body, @"\s?([:,;{}])\s?", "$1");
body = body.Replace(";}", "}");
try
{
body = body.Replace(";}", "}");
}
catch (Exception)
{
/* Swallow exception. Some css doesn't have style rules ending in ; */
}
body = Regex.Replace(body, @"([\s:]0)(px|pt|%|em)", "$1");
// Remove comments from CSS
body = Regex.Replace(body, @"/\*[\d\D]*?\*/", string.Empty);
return body;
}
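The BookService change above stops doing a single regex Replace and instead walks every @import match, so both the `@import url('...')` and `@import '...'` forms get their filenames routed through the API. Below is a rough sketch of that rewriting step; the regex is a simplified stand-in, not Kavita's CssImportUrlRegex, and the apiBase value is illustrative.

```
using System;
using System.Text.RegularExpressions;

public static class ImportRewriteSketch
{
    // Simplified stand-in for Parser.CssImportUrlRegex; matches both
    // @import url('file.css') and @import 'file.css' forms.
    private static readonly Regex CssImport = new Regex(
        @"@import\s+(?:url\()?['""]?(?<Filename>[^'"")\s;]+)",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static string RewriteImports(string stylesheet, string apiBase)
    {
        // Mirrors the new loop: prefix every imported filename so the browser
        // requests it through the book API instead of the epub's internal path.
        foreach (Match match in CssImport.Matches(stylesheet))
        {
            var importFile = match.Groups["Filename"].Value;
            stylesheet = stylesheet.Replace(importFile, apiBase + importFile);
        }
        return stylesheet;
    }

    public static void Main()
    {
        const string css = "@import url('fonts.css');\n@import 'page.css';";
        Console.WriteLine(RewriteImports(css, "/api/book/1/book-resources?file="));
    }
}
```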

View File

@ -21,7 +21,6 @@ namespace API.Services
private readonly IDirectoryService _directoryService;
private readonly IBookService _bookService;
private readonly NumericComparer _numericComparer;
public static readonly string CacheDirectory = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "cache/"));
public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork, IArchiveService archiveService,
IDirectoryService directoryService, IBookService bookService)
@ -38,7 +37,7 @@ namespace API.Services
{
if (!DirectoryService.ExistOrCreate(DirectoryService.CacheDirectory))
{
_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", CacheDirectory);
_logger.LogError("Cache directory {CacheDirectory} is not accessible or does not exist. Creating...", DirectoryService.CacheDirectory);
}
}
@ -102,7 +101,7 @@ namespace API.Services
}
else
{
_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
DirectoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath,
Parser.Parser.ImageFileExtensions);
}
@ -147,7 +146,7 @@ namespace API.Services
try
{
DirectoryService.ClearDirectory(CacheDirectory);
DirectoryService.ClearDirectory(DirectoryService.CacheDirectory);
}
catch (Exception ex)
{
@ -198,7 +197,7 @@ namespace API.Services
if (page <= (mangaFile.Pages + pagesSoFar))
{
var path = GetCachePath(chapter.Id);
var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
var files = DirectoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
Array.Sort(files, _numericComparer);
if (files.Length == 0)

View File

@ -1,55 +0,0 @@
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;
using API.DTOs.Stats;
using Microsoft.Extensions.Logging;
namespace API.Services.Clients
{
public class StatsApiClient
{
private readonly HttpClient _client;
private readonly ILogger<StatsApiClient> _logger;
#pragma warning disable S1075
private const string ApiUrl = "http://stats.kavitareader.com";
#pragma warning restore S1075
public StatsApiClient(HttpClient client, ILogger<StatsApiClient> logger)
{
_client = client;
_logger = logger;
_client.Timeout = TimeSpan.FromSeconds(30);
}
public async Task SendDataToStatsServer(UsageStatisticsDto data)
{
var responseContent = string.Empty;
try
{
using var response = await _client.PostAsJsonAsync(ApiUrl + "/api/InstallationStats", data);
responseContent = await response.Content.ReadAsStringAsync();
response.EnsureSuccessStatusCode();
}
catch (HttpRequestException e)
{
var info = new
{
dataSent = data,
response = responseContent
};
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
throw;
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
throw;
}
}
}
}

View File

@ -16,10 +16,12 @@ namespace API.Services
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "cache");
public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "covers");
public static readonly string TempDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "temp");
public static readonly string LogDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "logs");
public static readonly string CacheDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "cache");
public static readonly string CoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "covers");
public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
public static readonly string StatsDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "stats");
public DirectoryService(ILogger<DirectoryService> logger)
{
@ -95,7 +97,7 @@ namespace API.Services
return di.Exists;
}
public IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
public static IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (searchPatternExpression != string.Empty)
@ -134,13 +136,10 @@ namespace API.Services
/// <param name="searchPattern">Defaults to *, meaning all files</param>
/// <returns></returns>
/// <exception cref="DirectoryNotFoundException"></exception>
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "*")
public static bool CopyDirectoryToDirectory(string sourceDirName, string destDirName, string searchPattern = "")
{
if (string.IsNullOrEmpty(sourceDirName)) return false;
var di = new DirectoryInfo(sourceDirName);
if (!di.Exists) return false;
// Get the subdirectories for the specified directory.
var dir = new DirectoryInfo(sourceDirName);
@ -154,7 +153,7 @@ namespace API.Services
var dirs = dir.GetDirectories();
// If the destination directory doesn't exist, create it.
Directory.CreateDirectory(destDirName);
ExistOrCreate(destDirName);
// Get the files in the directory and copy them to the new location.
var files = GetFilesWithExtension(dir.FullName, searchPattern).Select(n => new FileInfo(n));
@ -176,7 +175,7 @@ namespace API.Services
public string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
public static string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
{
if (searchPatternExpression != string.Empty)
{
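The DirectoryService change above moves every writable directory (temp, logs, cache, covers, backups, stats) under a single config/ root, which is what the new /kavita/config Docker mount maps to. A small sketch of the resulting layout, assuming the working directory is the install directory:

```
using System;
using System.IO;

public static class ConfigLayoutSketch
{
    // Sketch of the new layout: all writable state lives under <working dir>/config/.
    public static void Main()
    {
        var configRoot = Path.Join(Directory.GetCurrentDirectory(), "config");
        foreach (var sub in new[] { "temp", "logs", "cache", "covers", "backups", "stats" })
        {
            var dir = Path.Join(configRoot, sub);
            Directory.CreateDirectory(dir); // roughly what DirectoryService.ExistOrCreate does
            Console.WriteLine(dir);
        }
    }
}
```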

View File

@ -13,7 +13,6 @@ namespace API.Services
public class ImageService : IImageService
{
private readonly ILogger<ImageService> _logger;
private readonly IDirectoryService _directoryService;
public const string ChapterCoverImageRegex = @"v\d+_c\d+";
public const string SeriesCoverImageRegex = @"seres\d+";
public const string CollectionTagCoverImageRegex = @"tag\d+";
@ -24,10 +23,9 @@ namespace API.Services
/// </summary>
private const int ThumbnailWidth = 320;
public ImageService(ILogger<ImageService> logger, IDirectoryService directoryService)
public ImageService(ILogger<ImageService> logger)
{
_logger = logger;
_directoryService = directoryService;
}
/// <summary>
@ -44,9 +42,9 @@ namespace API.Services
return null;
}
var firstImage = _directoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
var firstImage = DirectoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
.OrderBy(f => f, new NaturalSortComparer()).FirstOrDefault();
return firstImage;
}

View File

@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
@ -216,37 +217,45 @@ namespace API.Services
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogDebug($"[MetadataService] Refreshing Library {library.Name}. Total Items: {chunkInfo.TotalSize}. Total Chunks: {chunkInfo.TotalChunks} with {chunkInfo.ChunkSize} size.");
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
// This technically does
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogDebug($"[MetadataService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug($"[MetadataService] Fetched {nonLibrarySeries.Count} series for refresh");
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
Parallel.ForEach(nonLibrarySeries, series =>
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
try
{
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
UpdateMetadata(series, volumeUpdated || forceUpdate);
}
catch (Exception)
{
/* Swallow exception */
}
UpdateMetadata(series, volumeUpdated || forceUpdate);
});
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
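The MetadataService change above wraps the per-series work in a try/catch so one series that throws during metadata refresh no longer faults the whole Parallel.ForEach chunk. A stripped-down sketch of that guard (the series names are made up):

```
using System;
using System.Threading.Tasks;

public static class ParallelRefreshSketch
{
    // Sketch of the new per-series guard: a faulty series is skipped instead of
    // aborting the whole chunk, matching the swallowed exception in the diff.
    public static void RefreshChunk(string[] seriesNames)
    {
        Parallel.ForEach(seriesNames, name =>
        {
            try
            {
                if (name.Length == 0) throw new InvalidOperationException("bad metadata");
                Console.WriteLine($"Refreshed {name}");
            }
            catch (Exception)
            {
                /* Swallow, mirroring the diff; real code may want to log here */
            }
        });
    }

    public static void Main() => RefreshChunk(new[] { "Series A", "", "Series B" });
}
```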

View File

@ -89,7 +89,7 @@ namespace API.Services
}
_logger.LogDebug("Scheduling stat collection daily");
RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.CollectAndSendStatsData(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate(SendDataTask, () => _statsService.Send(), Cron.Daily, TimeZoneInfo.Local);
}
public void CancelStatsTasks()
@ -102,7 +102,7 @@ namespace API.Services
public void RunStatCollection()
{
_logger.LogInformation("Enqueuing stat collection");
BackgroundJob.Enqueue(() => _statsService.CollectAndSendStatsData());
BackgroundJob.Enqueue(() => _statsService.Send());
}
#endregion
@ -138,8 +138,7 @@ namespace API.Services
public void CleanupTemp()
{
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(tempDirectory));
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(DirectoryService.TempDirectory));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = true)

View File

@ -9,6 +9,7 @@ using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using Hangfire;
using Kavita.Common.EnvironmentInfo;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
@ -19,8 +20,8 @@ namespace API.Services.Tasks
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<BackupService> _logger;
private readonly IDirectoryService _directoryService;
private readonly string _tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
private readonly string _logDirectory = Path.Join(Directory.GetCurrentDirectory(), "logs");
private readonly string _tempDirectory = DirectoryService.TempDirectory;
private readonly string _logDirectory = DirectoryService.LogDirectory;
private readonly IList<string> _backupFiles;
@ -33,15 +34,32 @@ namespace API.Services.Tasks
var maxRollingFiles = config.GetMaxRollingFiles();
var loggingSection = config.GetLoggingFileName();
var files = LogFiles(maxRollingFiles, loggingSection);
_backupFiles = new List<string>()
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
"appsettings.json",
"Hangfire.db",
"Hangfire-log.db",
"kavita.db",
"kavita.db-shm", // This wont always be there
"kavita.db-wal", // This wont always be there
};
_backupFiles = new List<string>()
{
"data/appsettings.json",
"data/Hangfire.db",
"data/Hangfire-log.db",
"data/kavita.db",
"data/kavita.db-shm", // This wont always be there
"data/kavita.db-wal" // This wont always be there
};
}
else
{
_backupFiles = new List<string>()
{
"appsettings.json",
"Hangfire.db",
"Hangfire-log.db",
"kavita.db",
"kavita.db-shm", // This wont always be there
"kavita.db-wal" // This wont always be there
};
}
foreach (var file in files.Select(f => (new FileInfo(f)).Name).ToList())
{
_backupFiles.Add(file);
@ -54,7 +72,7 @@ namespace API.Services.Tasks
var fi = new FileInfo(logFileName);
var files = maxRollingFiles > 0
? _directoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
? DirectoryService.GetFiles(_logDirectory, $@"{Path.GetFileNameWithoutExtension(fi.Name)}{multipleFileRegex}\.log")
: new[] {"kavita.log"};
return files;
}
@ -129,6 +147,11 @@ namespace API.Services.Tasks
{
// Swallow exception. This can be a duplicate cover being copied, as chapters and volumes can share the same file.
}
if (!DirectoryService.GetFiles(outputTempDir).Any())
{
DirectoryService.ClearAndDeleteDirectory(outputTempDir);
}
}
/// <summary>
@ -141,7 +164,7 @@ namespace API.Services.Tasks
var backupDirectory = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BackupDirectory)).Result.Value;
if (!_directoryService.Exists(backupDirectory)) return;
var deltaTime = DateTime.Today.Subtract(TimeSpan.FromDays(dayThreshold));
var allBackups = _directoryService.GetFiles(backupDirectory).ToList();
var allBackups = DirectoryService.GetFiles(backupDirectory).ToList();
var expiredBackups = allBackups.Select(filename => new FileInfo(filename))
.Where(f => f.CreationTime > deltaTime)
.ToList();

View File

@ -16,16 +16,14 @@ namespace API.Services.Tasks
private readonly ILogger<CleanupService> _logger;
private readonly IBackupService _backupService;
private readonly IUnitOfWork _unitOfWork;
private readonly IDirectoryService _directoryService;
public CleanupService(ICacheService cacheService, ILogger<CleanupService> logger,
IBackupService backupService, IUnitOfWork unitOfWork, IDirectoryService directoryService)
IBackupService backupService, IUnitOfWork unitOfWork)
{
_cacheService = cacheService;
_logger = logger;
_backupService = backupService;
_unitOfWork = unitOfWork;
_directoryService = directoryService;
}
public void CleanupCacheDirectory()
@ -42,7 +40,7 @@ namespace API.Services.Tasks
{
_logger.LogInformation("Starting Cleanup");
_logger.LogInformation("Cleaning temp directory");
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
var tempDirectory = DirectoryService.TempDirectory;
DirectoryService.ClearDirectory(tempDirectory);
CleanupCacheDirectory();
_logger.LogInformation("Cleaning old database backups");
@ -57,7 +55,7 @@ namespace API.Services.Tasks
private async Task DeleteSeriesCoverImages()
{
var images = await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.SeriesCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
@ -69,7 +67,7 @@ namespace API.Services.Tasks
private async Task DeleteChapterCoverImages()
{
var images = await _unitOfWork.ChapterRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.ChapterCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;
@ -81,7 +79,7 @@ namespace API.Services.Tasks
private async Task DeleteTagCoverImages()
{
var images = await _unitOfWork.CollectionTagRepository.GetAllCoverImagesAsync();
var files = _directoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
var files = DirectoryService.GetFiles(DirectoryService.CoverImageDirectory, ImageService.CollectionTagCoverImageRegex);
foreach (var file in files)
{
if (images.Contains(Path.GetFileName(file))) continue;

View File

@ -73,9 +73,13 @@ namespace API.Services.Tasks.Scanner
info = Parser.Parser.Parse(path, rootPath, type);
}
// If we couldn't match, log. But don't log if the file parses as a cover image
if (info == null)
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
}
return;
}
@ -133,13 +137,11 @@ namespace API.Services.Tasks.Scanner
public string MergeName(ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
_logger.LogDebug("Checking if we can merge {NormalizedSeries}", normalizedSeries);
var existingName =
_scannedSeries.SingleOrDefault(p => Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries && p.Key.Format == info.Format)
.Key;
if (existingName != null && !string.IsNullOrEmpty(existingName.Name))
{
_logger.LogDebug("Found duplicate parsed infos, merged {Original} into {Merged}", info.Series, existingName.Name);
return existingName.Name;
}

View File

@ -261,13 +261,15 @@ namespace API.Services.Tasks
var totalTime = 0L;
// Update existing series
_logger.LogDebug("[ScannerService] Updating existing series");
_logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size",
library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogDebug($"[ScannerService] Processing chunk {chunk} / {chunkInfo.TotalChunks} with size {chunkInfo.ChunkSize} Series ({chunk * chunkInfo.ChunkSize} - {(chunk + 1) * chunkInfo.ChunkSize}");
_logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams()
{
PageNumber = chunk,
@ -299,7 +301,21 @@ namespace API.Services.Tasks
UpdateSeries(series, parsedSeries);
});
await _unitOfWork.CommitAsync();
try
{
await _unitOfWork.CommitAsync();
}
catch (Exception ex)
{
_logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB. If debug mode is enabled, the series to check will be printed", chunk);
foreach (var series in nonLibrarySeries)
{
_logger.LogDebug("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName);
}
await _messageHub.Clients.All.SendAsync(SignalREvents.ScanLibraryError,
MessageFactory.ScanLibraryError(library.Id));
continue;
}
_logger.LogInformation(
"[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name);
@ -320,12 +336,14 @@ namespace API.Services.Tasks
_logger.LogDebug("[ScannerService] Adding new series");
var newSeries = new List<Series>();
var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList();
_logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. There should be {DeltaToParsedSeries} new series",
allSeries.Count, parsedSeries.Count - allSeries.Count);
foreach (var (key, infos) in parsedSeries)
{
// Key is normalized already
Series existingSeries;
try
{
{// NOTE: Maybe use .Equals() here
existingSeries = allSeries.SingleOrDefault(s =>
(s.NormalizedName == key.NormalizedName || Parser.Parser.Normalize(s.OriginalName) == key.NormalizedName)
&& (s.Format == key.Format || s.Format == MangaFormat.Unknown));
@ -386,7 +404,7 @@ namespace API.Services.Tasks
}
}
_logger.LogDebug(
_logger.LogInformation(
"[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name);
}

View File

@ -1,6 +1,7 @@
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Runtime.InteropServices;
using System.Text.Json;
using System.Threading;
@ -9,9 +10,11 @@ using API.Data;
using API.DTOs.Stats;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services.Clients;
using Flurl.Http;
using Hangfire;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Http;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
@ -19,32 +22,65 @@ namespace API.Services.Tasks
{
public class StatsService : IStatsService
{
private const string TempFilePath = "stats/";
private const string TempFileName = "app_stats.json";
private const string StatFileName = "app_stats.json";
private readonly StatsApiClient _client;
private readonly DataContext _dbContext;
private readonly ILogger<StatsService> _logger;
private readonly IUnitOfWork _unitOfWork;
public StatsService(StatsApiClient client, DataContext dbContext, ILogger<StatsService> logger,
#pragma warning disable S1075
private const string ApiUrl = "http://stats.kavitareader.com";
#pragma warning restore S1075
private static readonly string StatsFilePath = Path.Combine(DirectoryService.StatsDirectory, StatFileName);
private static bool FileExists => File.Exists(StatsFilePath);
public StatsService(DataContext dbContext, ILogger<StatsService> logger,
IUnitOfWork unitOfWork)
{
_client = client;
_dbContext = dbContext;
_logger = logger;
_unitOfWork = unitOfWork;
}
private static string FinalPath => Path.Combine(Directory.GetCurrentDirectory(), TempFilePath, TempFileName);
private static bool FileExists => File.Exists(FinalPath);
public async Task PathData(ClientInfoDto clientInfoDto)
/// <summary>
/// Due to all instances firing this at the same time, we can DDoS our own server. When fired, this task schedules the upload
/// to run at a random point over a 6 hour spread
/// </summary>
public async Task Send()
{
_logger.LogDebug("Pathing client data to the file");
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
{
return;
}
var rnd = new Random();
var offset = rnd.Next(0, 6);
if (offset == 0)
{
await SendData();
}
else
{
_logger.LogInformation("KavitaStats upload has been schedule to run in {Offset} hours", offset);
BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset));
}
}
/// <summary>
/// This must be public for Hangfire. Do not call this directly.
/// </summary>
// ReSharper disable once MemberCanBePrivate.Global
public async Task SendData()
{
await CollectRelevantData();
await FinalizeStats();
}
public async Task RecordClientInfo(ClientInfoDto clientInfoDto)
{
var statisticsDto = await GetData();
statisticsDto.AddClientInfo(clientInfoDto);
await SaveFile(statisticsDto);
@ -52,12 +88,7 @@ namespace API.Services.Tasks
private async Task CollectRelevantData()
{
_logger.LogDebug("Collecting data from the server and database");
_logger.LogDebug("Collecting usage info");
var usageInfo = await GetUsageInfo();
_logger.LogDebug("Collecting server info");
var serverInfo = GetServerInfo();
await PathData(serverInfo, usageInfo);
@ -67,39 +98,68 @@ namespace API.Services.Tasks
{
try
{
_logger.LogDebug("Finalizing Stats collection flow");
var data = await GetExistingData<UsageStatisticsDto>();
var successful = await SendDataToStatsServer(data);
_logger.LogDebug("Sending data to the Stats server");
await _client.SendDataToStatsServer(data);
_logger.LogDebug("Deleting the file from disk");
if (FileExists) File.Delete(FinalPath);
if (successful)
{
ResetStats();
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Error Finalizing Stats collection flow");
throw;
_logger.LogError(ex, "There was an exception while sending data to KavitaStats");
}
}
public async Task CollectAndSendStatsData()
private async Task<bool> SendDataToStatsServer(UsageStatisticsDto data)
{
var allowStatCollection = (await _unitOfWork.SettingsRepository.GetSettingsDtoAsync()).AllowStatCollection;
if (!allowStatCollection)
var responseContent = string.Empty;
try
{
_logger.LogDebug("User has opted out of stat collection, not registering tasks");
return;
var response = await (ApiUrl + "/api/InstallationStats")
.WithHeader("Accept", "application/json")
.WithHeader("User-Agent", "Kavita")
.WithHeader("x-api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("api-key", "MsnvA2DfQqxSK5jh")
.WithHeader("x-kavita-version", BuildInfo.Version)
.WithTimeout(TimeSpan.FromSeconds(30))
.PostJsonAsync(data);
if (response.StatusCode != StatusCodes.Status200OK)
{
_logger.LogError("KavitaStats did not respond successfully. {Content}", response);
return false;
}
return true;
}
await CollectRelevantData();
await FinalizeStats();
catch (HttpRequestException e)
{
var info = new
{
dataSent = data,
response = responseContent
};
_logger.LogError(e, "KavitaStats did not respond successfully. {Content}", info);
}
catch (Exception e)
{
_logger.LogError(e, "An error happened during the request to KavitaStats");
}
return false;
}
private static void ResetStats()
{
if (FileExists) File.Delete(StatsFilePath);
}
private async Task PathData(ServerInfoDto serverInfoDto, UsageInfoDto usageInfoDto)
{
_logger.LogDebug("Pathing server and usage info to the file");
var data = await GetData();
data.ServerInfo = serverInfoDto;
@ -110,7 +170,7 @@ namespace API.Services.Tasks
await SaveFile(data);
}
private async ValueTask<UsageStatisticsDto> GetData()
private static async ValueTask<UsageStatisticsDto> GetData()
{
if (!FileExists) return new UsageStatisticsDto {InstallId = HashUtil.AnonymousToken()};
@ -156,39 +216,17 @@ namespace API.Services.Tasks
return serverInfo;
}
private async Task<T> GetExistingData<T>()
private static async Task<T> GetExistingData<T>()
{
_logger.LogInformation("Fetching existing data from file");
var existingDataJson = await GetFileDataAsString();
_logger.LogInformation("Deserializing data from file to object");
var existingData = JsonSerializer.Deserialize<T>(existingDataJson);
return existingData;
var json = await File.ReadAllTextAsync(StatsFilePath);
return JsonSerializer.Deserialize<T>(json);
}
private async Task<string> GetFileDataAsString()
private static async Task SaveFile(UsageStatisticsDto statisticsDto)
{
_logger.LogInformation("Reading file from disk");
return await File.ReadAllTextAsync(FinalPath);
}
DirectoryService.ExistOrCreate(DirectoryService.StatsDirectory);
private async Task SaveFile(UsageStatisticsDto statisticsDto)
{
_logger.LogDebug("Saving file");
var finalDirectory = FinalPath.Replace(TempFileName, string.Empty);
if (!Directory.Exists(finalDirectory))
{
_logger.LogDebug("Creating tmp directory");
Directory.CreateDirectory(finalDirectory);
}
_logger.LogDebug("Serializing data to write");
var dataJson = JsonSerializer.Serialize(statisticsDto);
_logger.LogDebug("Writing file to the disk");
await File.WriteAllTextAsync(FinalPath, dataJson);
await File.WriteAllTextAsync(StatsFilePath, JsonSerializer.Serialize(statisticsDto));
}
}
}
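The new StatsService.Send() spreads uploads out so every install doesn't hit stats.kavitareader.com at the same moment: it picks a random 0-5 hour offset and either sends immediately or schedules SendData() through Hangfire. A dependency-free sketch of just the jitter decision:

```
using System;

public static class StatsJitterSketch
{
    // Minimal sketch of the Send() jitter: each instance picks a random 0-5 hour
    // offset so stat uploads from all installs don't arrive at the same time.
    public static TimeSpan PickUploadDelay(Random rnd)
    {
        var offsetHours = rnd.Next(0, 6); // 0..5 inclusive, matching rnd.Next(0, 6) in the diff
        return TimeSpan.FromHours(offsetHours);
    }

    public static void Main()
    {
        var delay = PickUploadDelay(new Random());
        Console.WriteLine(delay == TimeSpan.Zero
            ? "Sending stats immediately"
            : $"Stats upload scheduled to run in {delay.TotalHours} hours");
        // In Kavita this maps to BackgroundJob.Schedule(() => SendData(), DateTimeOffset.Now.AddHours(offset)).
    }
}
```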

View File

@ -96,5 +96,17 @@ namespace API.SignalR
}
};
}
public static SignalRMessage ScanLibraryError(int libraryId)
{
return new SignalRMessage
{
Name = SignalREvents.ScanLibraryError,
Body = new
{
LibraryId = libraryId,
}
};
}
}
}

View File

@ -1,4 +1,5 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Interfaces;
@ -27,7 +28,7 @@ namespace API.SignalR.Presence
_unitOfWork = unitOfWork;
}
public Task UserConnected(string username, string connectionId)
public async Task UserConnected(string username, string connectionId)
{
lock (OnlineUsers)
{
@ -41,7 +42,10 @@ namespace API.SignalR.Presence
}
}
return Task.CompletedTask;
// Update the last active for the user
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username);
user.LastActive = DateTime.Now;
await _unitOfWork.CommitAsync();
}
public Task UserDisconnected(string username, string connectionId)

View File

@ -11,5 +11,6 @@
public const string ScanLibraryProgress = "ScanLibraryProgress";
public const string OnlineUsers = "OnlineUsers";
public const string SeriesAddedToCollection = "SeriesAddedToCollection";
public const string ScanLibraryError = "ScanLibraryError";
}
}

View File

@ -24,6 +24,7 @@ using Microsoft.AspNetCore.StaticFiles;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.OpenApi.Models;
namespace API
@ -106,7 +107,11 @@ namespace API
services.AddResponseCaching();
services.AddStatsClient(_config);
services.Configure<ForwardedHeadersOptions>(options =>
{
options.ForwardedHeaders =
ForwardedHeaders.All;
});
services.AddHangfire(configuration => configuration
.UseSimpleAssemblyNameTypeSerializer()
@ -139,7 +144,10 @@ namespace API
app.UseResponseCompression();
app.UseForwardedHeaders();
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
ForwardedHeaders = ForwardedHeaders.All
});
app.UseRouting();
@ -210,6 +218,15 @@ namespace API
applicationLifetime.ApplicationStopping.Register(OnShutdown);
applicationLifetime.ApplicationStarted.Register(() =>
{
try
{
var logger = serviceProvider.GetRequiredService<ILogger<Startup>>();
logger.LogInformation("Kavita - v{Version}", BuildInfo.Version);
}
catch (Exception)
{
/* Swallow Exception */
}
Console.WriteLine($"Kavita - v{BuildInfo.Version}");
});
}

View File

@ -1,21 +1,21 @@
{
"ConnectionStrings": {
"DefaultConnection": "Data source=kavita.db"
"DefaultConnection": "Data source=config//kavita.db"
},
"TokenKey": "super secret unguessable key",
"Logging": {
"LogLevel": {
"Default": "Debug",
"Default": "Information",
"Microsoft": "Information",
"Microsoft.Hosting.Lifetime": "Error",
"Hangfire": "Information",
"Microsoft.AspNetCore.Hosting.Internal.WebHost": "Information"
},
"File": {
"Path": "logs/kavita.log",
"Path": "config//logs/kavita.log",
"Append": "True",
"FileSizeLimitBytes": 10485760,
"MaxRollingFiles": 5
"FileSizeLimitBytes": 26214400,
"MaxRollingFiles": 2
}
},
"Port": 5000

View File

@ -20,19 +20,14 @@ COPY --from=copytask /files/wwwroot /kavita/wwwroot
#Installs program dependencies
RUN apt-get update \
&& apt-get install -y libicu-dev libssl1.1 pwgen libgdiplus \
&& apt-get install -y libicu-dev libssl1.1 libgdiplus \
&& rm -rf /var/lib/apt/lists/*
#Creates the data directory
RUN mkdir /kavita/data
RUN sed -i 's/Data source=kavita.db/Data source=data\/kavita.db/g' /kavita/appsettings.json
COPY entrypoint.sh /entrypoint.sh
EXPOSE 5000
WORKDIR /kavita
ENTRYPOINT ["/bin/bash"]
ENTRYPOINT [ "/bin/bash" ]
CMD ["/entrypoint.sh"]

View File

@ -2,4 +2,6 @@
1. Unzip the archive to a directory that is writable. If on windows, do not place in Program Files.
2. (Linux only) Chmod and Chown so Kavita can write to the directory you placed in.
3. Run Kavita executable.
4. Open localhost:5000 and setup your account and libraries in the UI.
4. Open localhost:5000 and setup your account and libraries in the UI.
If updating, copy everything but the config/ directory over. Restart Kavita.

View File

@ -0,0 +1,7 @@
namespace Kavita.Common
{
public class AppSettingsConfig
{
}
}

View File

@ -6,236 +6,349 @@ using Microsoft.Extensions.Hosting;
namespace Kavita.Common
{
public static class Configuration
{
private static readonly string AppSettingsFilename = GetAppSettingFilename();
public static string Branch
{
get => GetBranch(GetAppSettingFilename());
set => SetBranch(GetAppSettingFilename(), value);
}
public static class Configuration
{
public static readonly string AppSettingsFilename = Path.Join("config", GetAppSettingFilename());
public static int Port
{
get => GetPort(GetAppSettingFilename());
set => SetPort(GetAppSettingFilename(), value);
}
public static string Branch
{
get => GetBranch(GetAppSettingFilename());
set => SetBranch(GetAppSettingFilename(), value);
}
public static string JwtToken
{
get => GetJwtToken(GetAppSettingFilename());
set => SetJwtToken(GetAppSettingFilename(), value);
}
public static int Port
{
get => GetPort(GetAppSettingFilename());
set => SetPort(GetAppSettingFilename(), value);
}
public static string LogLevel
{
get => GetLogLevel(GetAppSettingFilename());
set => SetLogLevel(GetAppSettingFilename(), value);
}
public static string JwtToken
{
get => GetJwtToken(GetAppSettingFilename());
set => SetJwtToken(GetAppSettingFilename(), value);
}
private static string GetAppSettingFilename()
{
if (!string.IsNullOrEmpty(AppSettingsFilename))
{
return AppSettingsFilename;
}
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
var isDevelopment = environment == Environments.Development;
return "appsettings" + (isDevelopment ? ".Development" : "") + ".json";
}
public static string LogLevel
{
get => GetLogLevel(GetAppSettingFilename());
set => SetLogLevel(GetAppSettingFilename(), value);
}
#region JWT Token
public static string LogPath
{
get => GetLoggingFile(GetAppSettingFilename());
set => SetLoggingFile(GetAppSettingFilename(), value);
}
private static string GetJwtToken(string filePath)
{
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "TokenKey";
public static string DatabasePath
{
get => GetDatabasePath(GetAppSettingFilename());
set => SetDatabasePath(GetAppSettingFilename(), value);
}
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
private static string GetAppSettingFilename()
{
if (!string.IsNullOrEmpty(AppSettingsFilename))
{
return tokenElement.GetString();
return AppSettingsFilename;
}
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
var isDevelopment = environment == Environments.Development;
return "appsettings" + (isDevelopment ? ".Development" : string.Empty) + ".json";
}
#region JWT Token
private static string GetJwtToken(string filePath)
{
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "TokenKey";
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
{
return tokenElement.GetString();
}
return string.Empty;
}
catch (Exception ex)
{
Console.WriteLine("Error reading app settings: " + ex.Message);
}
return string.Empty;
}
catch (Exception ex)
{
Console.WriteLine("Error reading app settings: " + ex.Message);
}
}
return string.Empty;
}
private static void SetJwtToken(string filePath, string token)
{
try
{
var currentToken = GetJwtToken(filePath);
var json = File.ReadAllText(filePath)
.Replace("\"TokenKey\": \"" + currentToken, "\"TokenKey\": \"" + token);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow exception */
}
}
private static void SetJwtToken(string filePath, string token)
{
try
{
var currentToken = GetJwtToken(filePath);
var json = File.ReadAllText(filePath)
.Replace("\"TokenKey\": \"" + currentToken, "\"TokenKey\": \"" + token);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow exception */
}
}
public static bool CheckIfJwtTokenSet()
{
try
{
return GetJwtToken(GetAppSettingFilename()) != "super secret unguessable key";
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
public static bool CheckIfJwtTokenSet()
{
try
{
return GetJwtToken(GetAppSettingFilename()) != "super secret unguessable key";
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
return false;
}
return false;
}
#endregion
#region Port
#endregion
private static void SetPort(string filePath, int port)
{
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
return;
}
#region Port
try
{
var currentPort = GetPort(filePath);
var json = File.ReadAllText(filePath).Replace("\"Port\": " + currentPort, "\"Port\": " + port);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
private static void SetPort(string filePath, int port)
{
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
return;
}
private static int GetPort(string filePath)
{
const int defaultPort = 5000;
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
return defaultPort;
}
try
{
var currentPort = GetPort(filePath);
var json = File.ReadAllText(filePath).Replace("\"Port\": " + currentPort, "\"Port\": " + port);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "Port";
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
{
return tokenElement.GetInt32();
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
private static int GetPort(string filePath)
{
const int defaultPort = 5000;
if (new OsInfo(Array.Empty<IOsVersionAdapter>()).IsDocker)
{
return defaultPort;
}
}
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "Port";
#endregion
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
#region LogLevel
private static void SetLogLevel(string filePath, string logLevel)
{
try
{
return tokenElement.GetInt32();
var currentLevel = GetLogLevel(filePath);
var json = File.ReadAllText(filePath)
.Replace($"\"Default\": \"{currentLevel}\"", $"\"Default\": \"{logLevel}\"");
File.WriteAllText(filePath, json);
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
return defaultPort;
}
#endregion
#region LogLevel
private static void SetLogLevel(string filePath, string logLevel)
{
try
{
var currentLevel = GetLogLevel(filePath);
var json = File.ReadAllText(filePath)
.Replace($"\"Default\": \"{currentLevel}\"", $"\"Default\": \"{logLevel}\"");
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
private static string GetLogLevel(string filePath)
{
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
if (jsonObj.TryGetProperty("Logging", out JsonElement tokenElement))
catch (Exception)
{
foreach (var property in tokenElement.EnumerateObject())
{
if (!property.Name.Equals("LogLevel")) continue;
foreach (var logProperty in property.Value.EnumerateObject())
{
if (logProperty.Name.Equals("Default"))
{
return logProperty.Value.GetString();
}
}
}
/* Swallow Exception */
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
}
return "Information";
}
#endregion
private static string GetBranch(string filePath)
{
const string defaultBranch = "main";
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "Branch";
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
private static string GetLogLevel(string filePath)
{
try
{
return tokenElement.GetString();
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
if (jsonObj.TryGetProperty("Logging", out JsonElement tokenElement))
{
foreach (var property in tokenElement.EnumerateObject())
{
if (!property.Name.Equals("LogLevel")) continue;
foreach (var logProperty in property.Value.EnumerateObject())
{
if (logProperty.Name.Equals("Default"))
{
return logProperty.Value.GetString();
}
}
}
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
}
catch (Exception ex)
{
Console.WriteLine("Error reading app settings: " + ex.Message);
}
return defaultBranch;
}
return "Information";
}
private static void SetBranch(string filePath, string updatedBranch)
{
try
{
var currentBranch = GetBranch(filePath);
var json = File.ReadAllText(filePath)
.Replace("\"Branch\": " + currentBranch, "\"Branch\": " + updatedBranch);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
}
#endregion
private static string GetBranch(string filePath)
{
const string defaultBranch = "main";
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
const string key = "Branch";
if (jsonObj.TryGetProperty(key, out JsonElement tokenElement))
{
return tokenElement.GetString();
}
}
catch (Exception ex)
{
Console.WriteLine("Error reading app settings: " + ex.Message);
}
return defaultBranch;
}
private static void SetBranch(string filePath, string updatedBranch)
{
try
{
var currentBranch = GetBranch(filePath);
var json = File.ReadAllText(filePath)
.Replace("\"Branch\": " + currentBranch, "\"Branch\": " + updatedBranch);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
private static string GetLoggingFile(string filePath)
{
const string defaultFile = "config/logs/kavita.log";
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
if (jsonObj.TryGetProperty("Logging", out JsonElement tokenElement))
{
foreach (var property in tokenElement.EnumerateObject())
{
if (!property.Name.Equals("File")) continue;
foreach (var logProperty in property.Value.EnumerateObject())
{
if (logProperty.Name.Equals("Path"))
{
return logProperty.Value.GetString();
}
}
}
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
return defaultFile;
}
/// <summary>
/// This should NEVER be called except by <see cref="MigrateConfigFiles"/>
/// </summary>
/// <param name="filePath"></param>
/// <param name="directory"></param>
private static void SetLoggingFile(string filePath, string directory)
{
try
{
var currentFile = GetLoggingFile(filePath);
var json = File.ReadAllText(filePath)
.Replace("\"Path\": \"" + currentFile + "\"", "\"Path\": \"" + directory + "\"");
File.WriteAllText(filePath, json);
}
catch (Exception ex)
{
/* Swallow Exception */
Console.WriteLine(ex);
}
}
private static string GetDatabasePath(string filePath)
{
const string defaultFile = "config/kavita.db";
try
{
var json = File.ReadAllText(filePath);
var jsonObj = JsonSerializer.Deserialize<dynamic>(json);
if (jsonObj.TryGetProperty("ConnectionStrings", out JsonElement tokenElement))
{
foreach (var property in tokenElement.EnumerateObject())
{
if (!property.Name.Equals("DefaultConnection")) continue;
return property.Value.GetString();
}
}
}
catch (Exception ex)
{
Console.WriteLine("Error writing app settings: " + ex.Message);
}
return defaultFile;
}
/// <summary>
/// This should NEVER be called except by <see cref="MigrateConfigFiles"/>
/// </summary>
/// <param name="filePath"></param>
/// <param name="updatedPath"></param>
private static void SetDatabasePath(string filePath, string updatedPath)
{
try
{
var existingString = GetDatabasePath(filePath);
var json = File.ReadAllText(filePath)
.Replace(existingString,
"Data source=" + updatedPath);
File.WriteAllText(filePath, json);
}
catch (Exception)
{
/* Swallow Exception */
}
}
}
}
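Configuration reads individual keys straight out of appsettings.json rather than going through IConfiguration, as the GetPort/GetJwtToken bodies above show. Below is an equivalent sketch using JsonDocument in place of the Deserialize<dynamic> pattern in the diff; the behaviour is the same, and the sample JSON is illustrative.

```
using System;
using System.Text.Json;

public static class AppSettingsReadSketch
{
    // Equivalent of Configuration.GetPort, shown with JsonDocument instead of
    // the JsonSerializer.Deserialize<dynamic> call used in the diff.
    public static int GetPort(string json, int defaultPort = 5000)
    {
        using var doc = JsonDocument.Parse(json);
        return doc.RootElement.TryGetProperty("Port", out var port)
            ? port.GetInt32()
            : defaultPort;
    }

    public static void Main()
    {
        Console.WriteLine(GetPort("{\"Port\": 5000, \"TokenKey\": \"...\"}"));
    }
}
```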

View File

@ -41,12 +41,13 @@ namespace Kavita.Common.EnvironmentInfo
break;
}
}
}
public OsInfo(IEnumerable<IOsVersionAdapter> versionAdapters)
{
OsVersionModel osInfo = null;
foreach (var osVersionAdapter in versionAdapters.Where(c => c.Enabled))
{
try
@ -57,13 +58,13 @@ namespace Kavita.Common.EnvironmentInfo
{
Console.WriteLine("Couldn't get OS Version info: " + e.Message);
}
if (osInfo != null)
{
break;
}
}
if (osInfo != null)
{
Name = osInfo.Name;
@ -75,7 +76,7 @@ namespace Kavita.Common.EnvironmentInfo
Name = Os.ToString();
FullName = Name;
}
if (IsLinux && File.Exists("/proc/1/cgroup") && File.ReadAllText("/proc/1/cgroup").Contains("/docker/"))
{
IsDocker = true;
@ -145,4 +146,4 @@ namespace Kavita.Common.EnvironmentInfo
LinuxMusl,
Bsd
}
}
}

View File

@ -4,7 +4,7 @@
<TargetFramework>net5.0</TargetFramework>
<Company>kavitareader.com</Company>
<Product>Kavita</Product>
<AssemblyVersion>0.4.7.0</AssemblyVersion>
<AssemblyVersion>0.4.8.1</AssemblyVersion>
<NeutralLanguage>en</NeutralLanguage>
</PropertyGroup>

View File

@ -2,4 +2,5 @@
<s:String x:Key="/Default/CodeInspection/ExcludedFiles/FilesAndFoldersToSkip2/=1BC0273F_002DFEBE_002D4DA1_002DBC04_002D3A3167E4C86C_002Fd_003AData_002Fd_003AMigrations/@EntryIndexedValue">ExplicitlyExcluded</s:String>
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunLongAnalysisInSwa/@EntryValue">True</s:Boolean>
<s:Boolean x:Key="/Default/CodeInspection/Highlighting/RunValueAnalysisInNullableWarningsEnabledContext2/@EntryValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Opds/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Opds/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=rewinded/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>

View File

@ -48,7 +48,7 @@ Password: Demouser64
- Place in a directory that is writable. If on windows, do not place in Program Files
- Linux users must ensure the directory & kavita.db is writable by Kavita (might require starting server once)
- Run Kavita
- If you are updating, do not copy appsettings.json from the new version over. It will override your TokenKey and you will have to reauthenticate on your devices.
- If you are updating, copy everything over into the install location. All Kavita data is stored in config/, so nothing will be overwritten.
- Open localhost:5000 and setup your account and libraries in the UI.
### Docker
Running your Kavita server in docker is super easy! Barely an inconvenience. You can run it with this command:
@ -56,7 +56,7 @@ Running your Kavita server in docker is super easy! Barely an inconvenience. You
```
docker run --name kavita -p 5000:5000 \
-v /your/manga/directory:/manga \
-v /kavita/data/directory:/kavita/data \
-v /kavita/data/directory:/kavita/config \
--restart unless-stopped \
-d kizaing/kavita:latest
```
@ -64,19 +64,20 @@ docker run --name kavita -p 5000:5000 \
You can also run it via the docker-compose file:
```
version: '3.9'
version: '3'
services:
kavita:
image: kizaing/kavita:latest
container_name: kavita
volumes:
- ./manga:/manga
- ./data:/kavita/data
- ./config:/kavita/config
ports:
- "5000:5000"
restart: unless-stopped
```
**Note: Kavita is under heavy development and is being updated all the time, so the tag for current builds is `:nightly`. The `:latest` tag will be the latest stable release. There is also the `:alpine` tag if you want a smaller image, but it is only available for x64 systems.**
**Note: Kavita is under heavy development and is being updated all the time, so the tag for current builds is `:nightly`. The `:latest` tag will be the latest stable release.**
## Feature Requests
Got a great idea? Throw it up on the FeatHub or vote on another idea. Please check the [Project Board](https://github.com/Kareadita/Kavita/projects) first for a list of planned features.

View File

@ -111,11 +111,7 @@ export class ErrorInterceptor implements HttpInterceptor {
// NOTE: Signin has error.error or error.statusText available.
// if statement is due to http/2 spec issue: https://github.com/angular/angular/issues/23334
this.accountService.currentUser$.pipe(take(1)).subscribe(user => {
if (user) {
this.toastr.error(error.statusText === 'OK' ? 'Unauthorized' : error.statusText, error.status);
}
this.accountService.logout();
});
}
}

View File

@ -6,7 +6,7 @@ export interface SearchResult {
libraryName: string;
name: string;
originalName: string;
localizedName: string;
sortName: string;
coverImage: string; // byte64 encoded (not used)
format: MangaFormat;
}

View File

@ -19,7 +19,8 @@ export enum Action {
Download = 7,
Bookmarks = 8,
IncognitoRead = 9,
AddToReadingList = 10
AddToReadingList = 10,
AddToCollection = 11
}
export interface ActionItem<T> {
@ -90,6 +91,13 @@ export class ActionFactoryService {
requiresAdmin: true
});
this.seriesActions.push({
action: Action.AddToCollection,
title: 'Add to Collection',
callback: this.dummyCallback,
requiresAdmin: true
});
this.seriesActions.push({
action: Action.Edit,
title: 'Edit',
@ -209,7 +217,7 @@ export class ActionFactoryService {
title: 'Add to Reading List',
callback: this.dummyCallback,
requiresAdmin: false
},
}
];
this.volumeActions = [

View File

@ -4,6 +4,7 @@ import { ToastrService } from 'ngx-toastr';
import { Subject } from 'rxjs';
import { take } from 'rxjs/operators';
import { BookmarksModalComponent } from '../cards/_modals/bookmarks-modal/bookmarks-modal.component';
import { BulkAddToCollectionComponent } from '../cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component';
import { AddToListModalComponent, ADD_FLOW } from '../reading-list/_modals/add-to-list-modal/add-to-list-modal.component';
import { EditReadingListModalComponent } from '../reading-list/_modals/edit-reading-list-modal/edit-reading-list-modal.component';
import { ConfirmService } from '../shared/confirm.service';
@ -34,6 +35,7 @@ export class ActionService implements OnDestroy {
private readonly onDestroy = new Subject<void>();
private bookmarkModalRef: NgbModalRef | null = null;
private readingListModalRef: NgbModalRef | null = null;
private collectionModalRef: NgbModalRef | null = null;
constructor(private libraryService: LibraryService, private seriesService: SeriesService,
private readerService: ReaderService, private toastr: ToastrService, private modalService: NgbModal,
@ -358,6 +360,32 @@ export class ActionService implements OnDestroy {
});
}
/**
* Adds a set of series to a collection tag
* @param series Series to add to the collection tag
* @param callback Optional callback to run after the modal closes
*/
addMultipleSeriesToCollectionTag(series: Array<Series>, callback?: VoidActionCallback) {
if (this.collectionModalRef != null) { return; }
this.collectionModalRef = this.modalService.open(BulkAddToCollectionComponent, { scrollable: true, size: 'md' });
this.collectionModalRef.componentInstance.seriesIds = series.map(v => v.id);
this.collectionModalRef.componentInstance.title = 'New Collection';
this.collectionModalRef.closed.pipe(take(1)).subscribe(() => {
this.collectionModalRef = null;
if (callback) {
callback();
}
});
this.collectionModalRef.dismissed.pipe(take(1)).subscribe(() => {
this.collectionModalRef = null;
if (callback) {
callback();
}
});
}
addSeriesToReadingList(series: Series, callback?: SeriesActionCallback) {
if (this.readingListModalRef != null) { return; }
this.readingListModalRef = this.modalService.open(AddToListModalComponent, { scrollable: true, size: 'md' });
@ -439,4 +467,21 @@ export class ActionService implements OnDestroy {
});
}
/**
* Deletes all series in the list in one API call
* @param seriesIds Series objects to delete; only the id is used
* @param callback Optional callback to perform actions after API completes
*/
deleteMultipleSeries(seriesIds: Array<Series>, callback?: VoidActionCallback) {
this.seriesService.deleteMultipleSeries(seriesIds.map(s => s.id)).pipe(take(1)).subscribe(() => {
this.toastr.success('Series deleted');
if (callback) {
callback();
}
});
}
}

View File

@ -35,4 +35,8 @@ export class CollectionTagService {
updateSeriesForTag(tag: CollectionTag, seriesIdsToRemove: Array<number>) {
return this.httpClient.post(this.baseUrl + 'collection/update-series', {tag, seriesIdsToRemove}, {responseType: 'text' as 'json'});
}
addByMultiple(tagId: number, seriesIds: Array<number>, tagTitle: string = '') {
return this.httpClient.post(this.baseUrl + 'collection/update-for-series', {collectionTagId: tagId, collectionTagTitle: tagTitle, seriesIds}, {responseType: 'text' as 'json'});
}
}
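
For context, here is a minimal usage sketch of the new `addByMultiple()` wrapper, mirroring how the bulk flow further down in this diff calls it; the component shell and names below are illustrative, not part of the change. An existing tag id adds the series to that tag, while a tag id of 0 plus a title appears to create a new collection.
```
import { Component, Input } from '@angular/core';
import { CollectionTag } from 'src/app/_models/collection-tag';
import { CollectionTagService } from 'src/app/_services/collection-tag.service';

// Illustrative component, not part of the diff.
@Component({ selector: 'app-add-to-collection-example', template: '' })
export class AddToCollectionExampleComponent {
  @Input() seriesIds: Array<number> = [];

  constructor(private collectionService: CollectionTagService) { }

  // Add the selected series to an existing collection tag.
  addToExisting(tag: CollectionTag) {
    this.collectionService.addByMultiple(tag.id, this.seriesIds, '').subscribe(() => {
      // e.g. toast success and close a modal
    });
  }

  // A tag id of 0 plus a non-empty title appears to create the collection first.
  createNew(title: string) {
    this.collectionService.addByMultiple(0, this.seriesIds, title).subscribe(() => {
      // e.g. toast success and close a modal
    });
  }
}
```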

View File

@ -18,7 +18,8 @@ export enum EVENTS {
SeriesAdded = 'SeriesAdded',
ScanLibraryProgress = 'ScanLibraryProgress',
OnlineUsers = 'OnlineUsers',
SeriesAddedToCollection = 'SeriesAddedToCollection'
SeriesAddedToCollection = 'SeriesAddedToCollection',
ScanLibraryError = 'ScanLibraryError'
}
export interface Message<T> {
@ -93,6 +94,16 @@ export class MessageHubService {
});
});
this.hubConnection.on(EVENTS.ScanLibraryError, resp => {
this.messagesSource.next({
event: EVENTS.ScanLibraryError,
payload: resp.body
});
if (this.isAdmin) {
this.toastr.error('Library Scan had a critical error. Some series were not saved. Check logs');
}
});
this.hubConnection.on(EVENTS.SeriesAdded, resp => {
this.messagesSource.next({
event: EVENTS.SeriesAdded,

View File

@ -80,6 +80,10 @@ export class SeriesService {
return this.httpClient.delete<boolean>(this.baseUrl + 'series/' + seriesId);
}
deleteMultipleSeries(seriesIds: Array<number>) {
return this.httpClient.post<boolean>(this.baseUrl + 'series/delete-multiple', {seriesIds});
}
updateRating(seriesId: number, userRating: number, userReview: string) {
return this.httpClient.post(this.baseUrl + 'series/update-rating', {seriesId, userRating, userReview});
}

View File

@ -56,13 +56,13 @@ export class ManageSettingsComponent implements OnInit {
async saveSettings() {
const modelSettings = this.settingsForm.value;
if (this.settingsForm.get('enableAuthentication')?.value === false) {
if (this.settingsForm.get('enableAuthentication')?.dirty && this.settingsForm.get('enableAuthentication')?.value === false) {
if (!await this.confirmService.confirm('Disabling Authentication opens your server up to unauthorized access and possible hacking. Are you sure you want to continue with this?')) {
return;
}
}
const informUserAfterAuthenticationEnabled = this.settingsForm.get('enableAuthentication')?.value && !this.serverSettings.enableAuthentication;
const informUserAfterAuthenticationEnabled = this.settingsForm.get('enableAuthentication')?.dirty && this.settingsForm.get('enableAuthentication')?.value && !this.serverSettings.enableAuthentication;
this.settingsService.updateServerSettings(modelSettings).pipe(take(1)).subscribe(async (settings: ServerSettings) => {
this.serverSettings = settings;
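
For readers less familiar with Angular reactive forms, the `dirty` check added above means the warning only fires when the user actually changed the toggle in this session. A standalone sketch of that behaviour (illustrative only, not part of the diff):
```
import { FormControl, FormGroup } from '@angular/forms';

// `dirty` is false until the control is changed through the UI
// (directives call markAsDirty() on user input); programmatic setValue() does not set it.
const form = new FormGroup({ enableAuthentication: new FormControl(false) });

console.log(form.get('enableAuthentication')?.dirty);  // false: untouched form
form.get('enableAuthentication')?.markAsDirty();        // what a user edit effectively does
console.log(form.get('enableAuthentication')?.dirty);  // true: confirmation would now be shown
```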

View File

@ -2,7 +2,7 @@
<div class="float-right">
<div class="d-inline-block" ngbDropdown #myDrop="ngbDropdown">
<button class="btn btn-outline-primary mr-2" id="dropdownManual" ngbDropdownAnchor (focus)="myDrop.open()">
<button class="btn btn-outline-primary mr-2" id="dropdownManual" ngbDropdownToggle>
<ng-container *ngIf="backupDBInProgress || clearCacheInProgress || isCheckingForUpdate || downloadLogsInProgress">
<span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"></span>
<span class="sr-only">Loading...</span>

View File

@ -1,5 +1,6 @@
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { map } from 'rxjs/operators';
import { environment } from 'src/environments/environment';
import { ServerSettings } from './_models/server-settings';
@ -37,6 +38,8 @@ export class SettingsService {
}
getAuthenticationEnabled() {
return this.http.get<boolean>(this.baseUrl + 'settings/authentication-enabled', {responseType: 'text' as 'json'});
return this.http.get<string>(this.baseUrl + 'settings/authentication-enabled', {responseType: 'text' as 'json'}).pipe(map((res: string) => {
return res === 'true';
}));
}
}
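
The change above matters because the endpoint is read with `responseType: 'text'`: without the `map`, subscribers receive the raw string `'true'`/`'false'` (merely typed as a boolean), and the string `'false'` is truthy. A tiny standalone illustration (not part of the diff):
```
// Why mapping the text response to a real boolean is needed.
const asText: string = 'false';        // what the raw text response looks like
console.log(Boolean(asText));          // true  -> an !enabled guard would never trigger
console.log(asText === 'true');        // false -> what the new map() now returns
```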

View File

@ -111,6 +111,9 @@
<ng-container [ngTemplateOutlet]="actionBar"></ng-container>
</div>
</div>
<!-- <div *ngIf="page !== undefined && scrollbarNeeded">
<ng-container [ngTemplateOutlet]="actionBar"></ng-container>
</div> -->
<ng-template #actionBar>
<div class="reading-bar row no-gutters justify-content-between">
@ -122,7 +125,7 @@
</button>
<button *ngIf="!this.adhocPageHistory.isEmpty()" class="btn btn-outline-secondary btn-icon col-2 col-xs-1" (click)="goBack()" title="Go Back"><i class="fa fa-reply" aria-hidden="true"></i><span class="phone-hidden">&nbsp;Go Back</span></button>
<button class="btn btn-secondary col-2 col-xs-1" (click)="toggleDrawer()"><i class="fa fa-bars" aria-hidden="true"></i><span class="phone-hidden">&nbsp;Settings</span></button>
<div class="book-title col-2 phone-hidden">{{bookTitle}} <span *ngIf="incognitoMode">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
<div class="book-title col-2 phone-hidden">{{bookTitle}} <span *ngIf="incognitoMode" (click)="turnOffIncognito()" role="button" aria-label="Incognito mode is on. Toggle to turn off.">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
<button class="btn btn-secondary col-2 col-xs-1" (click)="closeReader()"><i class="fa fa-times-circle" aria-hidden="true"></i><span class="phone-hidden">&nbsp;Close</span></button>
<button class="btn btn-outline-secondary btn-icon col-2 col-xs-1"
[disabled]="IsNextDisabled"

View File

@ -155,6 +155,11 @@ $primary-color: #0062cc;
.reading-section {
height: 100vh;
width: 100%;
}
.book-content {
position: relative;
}
.drawer-body {

View File

@ -160,7 +160,11 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {
readerStyles: string = '';
darkModeStyleElem!: HTMLElement;
topOffset: number = 0; // Offset for drawer and rendering canvas
scrollbarNeeded = false; // Used for showing/hiding bottom action bar
/**
* Used for showing/hiding bottom action bar. Calculates if there is enough scroll to show it.
* Will hide if all content in book is absolute positioned
*/
scrollbarNeeded = false;
readingDirection: ReadingDirection = ReadingDirection.LeftToRight;
private readonly onDestroy = new Subject<void>();
@ -713,7 +717,7 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {
setupPage(part?: string | undefined, scrollTop?: number | undefined) {
this.isLoading = false;
this.scrollbarNeeded = this.readingSectionElemRef.nativeElement.scrollHeight > this.readingSectionElemRef.nativeElement.clientHeight;
this.scrollbarNeeded = this.readingHtml.nativeElement.clientHeight > this.readingSectionElemRef.nativeElement.clientHeight;
// Find all the part ids and their top offset
this.setupPageAnchors();
@ -995,4 +999,15 @@ export class BookReaderComponent implements OnInit, AfterViewInit, OnDestroy {
return '';
}
/**
* Turns off Incognito mode. This can only happen once, when the user clicks the icon, and it will modify the URL state
*/
turnOffIncognito() {
this.incognitoMode = false;
const newRoute = this.readerService.getNextChapterUrl(this.router.url, this.chapterId, this.incognitoMode, this.readingListMode, this.readingListId);
window.history.replaceState({}, '', newRoute);
this.toastr.info('Incognito mode is off. Progress will now start being tracked.');
this.readerService.saveProgress(this.seriesId, this.volumeId, this.chapterId, this.pageNum).pipe(take(1)).subscribe(() => {/* No operation */});
}
}

View File

@ -0,0 +1,46 @@
<div class="modal-header">
<h4 class="modal-title" id="modal-basic-title">Add to Collection</h4>
<button type="button" class="close" aria-label="Close" (click)="close()">
<span aria-hidden="true">&times;</span>
</button>
</div>
<form style="width: 100%" [formGroup]="listForm">
<div class="modal-body">
<div class="form-group" *ngIf="lists.length >= 5">
<label for="filter">Filter</label>
<div class="input-group">
<input id="filter" autocomplete="off" class="form-control" formControlName="filterQuery" type="text" aria-describedby="reset-input">
<div class="input-group-append">
<button class="btn btn-outline-secondary" type="button" id="reset-input" (click)="listForm.get('filterQuery')?.setValue('');">Clear</button>
</div>
</div>
</div>
<ul class="list-group">
<li class="list-group-item clickable" tabindex="0" role="button" *ngFor="let collectionTag of lists | filter: filterList; let i = index" (click)="addToCollection(collectionTag)">
{{collectionTag.title}} <i class="fa fa-angle-double-up" *ngIf="collectionTag.promoted" title="Promoted"></i>
</li>
<li class="list-group-item" *ngIf="lists.length === 0 && !loading">No collections created yet</li>
<li class="list-group-item" *ngIf="loading">
<div class="spinner-border text-secondary" role="status">
<span class="sr-only">Loading...</span>
</div>
</li>
</ul>
</div>
<div class="modal-footer" style="justify-content: normal">
<div style="width: 100%;">
<div class="form-row">
<div class="col-9 col-lg-10">
<label class="sr-only" for="add-rlist">Collection</label>
<input width="100%" #title ngbAutofocus type="text" class="form-control mb-2" id="add-rlist" formControlName="title">
</div>
<div class="col-2">
<button type="submit" class="btn btn-primary" (click)="create()">Create</button>
</div>
</div>
</div>
</div>
</form>

View File

@ -0,0 +1,7 @@
.clickable {
cursor: pointer;
}
.clickable:hover, .clickable:focus {
background-color: lightgreen;
}

View File

@ -0,0 +1,79 @@
import { Component, ElementRef, Input, OnInit, ViewChild } from '@angular/core';
import { FormGroup, FormControl } from '@angular/forms';
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
import { ToastrService } from 'ngx-toastr';
import { CollectionTag } from 'src/app/_models/collection-tag';
import { ReadingList } from 'src/app/_models/reading-list';
import { CollectionTagService } from 'src/app/_services/collection-tag.service';
@Component({
selector: 'app-bulk-add-to-collection',
templateUrl: './bulk-add-to-collection.component.html',
styleUrls: ['./bulk-add-to-collection.component.scss']
})
export class BulkAddToCollectionComponent implements OnInit {
@Input() title!: string;
/**
* Series Ids to add to Collection Tag
*/
@Input() seriesIds: Array<number> = [];
/**
* All existing collections sorted by recent use date
*/
lists: Array<CollectionTag> = [];
loading: boolean = false;
listForm: FormGroup = new FormGroup({});
@ViewChild('title') inputElem!: ElementRef<HTMLInputElement>;
constructor(private modal: NgbActiveModal, private collectionService: CollectionTagService, private toastr: ToastrService) { }
ngOnInit(): void {
this.listForm.addControl('title', new FormControl(this.title, []));
this.listForm.addControl('filterQuery', new FormControl('', []));
this.loading = true;
this.collectionService.allTags().subscribe(tags => {
this.lists = tags;
this.loading = false;
});
}
ngAfterViewInit() {
// Shift focus to input
if (this.inputElem) {
this.inputElem.nativeElement.select();
}
}
close() {
this.modal.close();
}
create() {
const tagName = this.listForm.value.title;
this.collectionService.addByMultiple(0, this.seriesIds, tagName).subscribe(() => {
this.toastr.success('Series added to ' + tagName + ' collection');
this.modal.close();
});
}
addToCollection(tag: CollectionTag) {
if (this.seriesIds.length === 0) return;
this.collectionService.addByMultiple(tag.id, this.seriesIds, '').subscribe(() => {
this.toastr.success('Series added to ' + tag.title + ' collection');
this.modal.close();
});
}
filterList = (listItem: ReadingList) => {
return listItem.title.toLowerCase().indexOf((this.listForm.value.filterQuery || '').toLowerCase()) >= 0;
}
}

View File

@ -127,7 +127,7 @@ export class BulkSelectionService {
getActions(callback: (action: Action, data: any) => void) {
// checks if series is present. If so, returns only series actions
// else returns volume/chapter items
const allowedActions = [Action.AddToReadingList, Action.MarkAsRead, Action.MarkAsUnread];
const allowedActions = [Action.AddToReadingList, Action.MarkAsRead, Action.MarkAsUnread, Action.AddToCollection, Action.Delete];
if (Object.keys(this.selectedCards).filter(item => item === 'series').length > 0) {
return this.actionFactory.getSeriesActions(callback).filter(item => allowedActions.includes(item.action));
}

View File

@ -16,10 +16,11 @@ import { CardItemComponent } from './card-item/card-item.component';
import { SharedModule } from '../shared/shared.module';
import { RouterModule } from '@angular/router';
import { TypeaheadModule } from '../typeahead/typeahead.module';
import { BrowserModule } from '@angular/platform-browser';
import { CardDetailLayoutComponent } from './card-detail-layout/card-detail-layout.component';
import { CardDetailsModalComponent } from './_modals/card-details-modal/card-details-modal.component';
import { BulkOperationsComponent } from './bulk-operations/bulk-operations.component';
import { BulkAddToCollectionComponent } from './_modals/bulk-add-to-collection/bulk-add-to-collection.component';
import { PipeModule } from '../pipe/pipe.module';
@ -36,11 +37,11 @@ import { BulkOperationsComponent } from './bulk-operations/bulk-operations.compo
CardActionablesComponent,
CardDetailLayoutComponent,
CardDetailsModalComponent,
BulkOperationsComponent
BulkOperationsComponent,
BulkAddToCollectionComponent
],
imports: [
CommonModule,
//BrowserModule,
RouterModule,
ReactiveFormsModule,
FormsModule, // EditCollectionsModal
@ -58,6 +59,7 @@ import { BulkOperationsComponent } from './bulk-operations/bulk-operations.compo
NgbDropdownModule,
NgbProgressbarModule,
NgxFileDropModule, // Cover Chooser
PipeModule // filter for BulkAddToCollectionComponent
],
exports: [
CardItemComponent,

View File

@ -55,6 +55,11 @@ export class CollectionDetailComponent implements OnInit, OnDestroy {
this.bulkSelectionService.deselectAll();
});
break;
case Action.AddToCollection:
this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
this.bulkSelectionService.deselectAll();
});
break;
case Action.MarkAsRead:
this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
this.loadPage();
@ -67,6 +72,12 @@ export class CollectionDetailComponent implements OnInit, OnDestroy {
this.bulkSelectionService.deselectAll();
});
break;
case Action.Delete:
this.actionService.deleteMultipleSeries(selectedSeries, () => {
this.loadPage();
this.bulkSelectionService.deselectAll();
});
break;
}
}

View File

@ -1,14 +1,14 @@
<ng-container>
<app-card-detail-layout header="In Progress"
<app-bulk-operations [actionCallback]="bulkActionCallback"></app-bulk-operations>
<app-card-detail-layout header="In Progress"
[isLoading]="isLoading"
[items]="recentlyAdded"
[items]="series"
[filters]="filters"
[pagination]="pagination"
(pageChange)="onPageChange($event)"
(applyFilter)="updateFilter($event)"
>
<ng-template #cardItem let-item let-position="idx">
<app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="loadPage()"></app-series-card>
</ng-template>
</app-card-detail-layout>
</ng-container>
<ng-template #cardItem let-item let-position="idx">
<app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="loadPage()" (selection)="bulkSelectionService.handleCardSelection('series', position, series.length, $event)" [selected]="bulkSelectionService.isCardSelected('series', position)" [allowSelection]="true"></app-series-card>
</ng-template>
</app-card-detail-layout>

View File

@ -1,11 +1,15 @@
import { Component, OnInit } from '@angular/core';
import { Component, HostListener, OnInit } from '@angular/core';
import { Title } from '@angular/platform-browser';
import { Router, ActivatedRoute } from '@angular/router';
import { take } from 'rxjs/operators';
import { BulkSelectionService } from '../cards/bulk-selection.service';
import { UpdateFilterEvent } from '../cards/card-detail-layout/card-detail-layout.component';
import { KEY_CODES } from '../shared/_services/utility.service';
import { Pagination } from '../_models/pagination';
import { Series } from '../_models/series';
import { FilterItem, SeriesFilter, mangaFormatFilters } from '../_models/series-filter';
import { Action } from '../_services/action-factory.service';
import { ActionService } from '../_services/action.service';
import { SeriesService } from '../_services/series.service';
@Component({
@ -16,7 +20,7 @@ import { SeriesService } from '../_services/series.service';
export class InProgressComponent implements OnInit {
isLoading: boolean = true;
recentlyAdded: Series[] = [];
series: Series[] = [];
pagination!: Pagination;
libraryId!: number;
filters: Array<FilterItem> = mangaFormatFilters;
@ -24,7 +28,8 @@ export class InProgressComponent implements OnInit {
mangaFormat: null
};
constructor(private router: Router, private route: ActivatedRoute, private seriesService: SeriesService, private titleService: Title) {
constructor(private router: Router, private route: ActivatedRoute, private seriesService: SeriesService, private titleService: Title,
private actionService: ActionService, public bulkSelectionService: BulkSelectionService) {
this.router.routeReuseStrategy.shouldReuseRoute = () => false;
this.titleService.setTitle('Kavita - In Progress');
if (this.pagination === undefined || this.pagination === null) {
@ -33,6 +38,20 @@ export class InProgressComponent implements OnInit {
this.loadPage();
}
@HostListener('document:keydown.shift', ['$event'])
handleKeypress(event: KeyboardEvent) {
if (event.key === KEY_CODES.SHIFT) {
this.bulkSelectionService.isShiftDown = true;
}
}
@HostListener('document:keyup.shift', ['$event'])
handleKeyUp(event: KeyboardEvent) {
if (event.key === KEY_CODES.SHIFT) {
this.bulkSelectionService.isShiftDown = false;
}
}
ngOnInit() {}
seriesClicked(series: Series) {
@ -61,7 +80,7 @@ export class InProgressComponent implements OnInit {
}
this.isLoading = true;
this.seriesService.getInProgress(this.libraryId, this.pagination?.currentPage, this.pagination?.itemsPerPage, this.filter).pipe(take(1)).subscribe(series => {
this.recentlyAdded = series.result;
this.series = series.result;
this.pagination = series.pagination;
this.isLoading = false;
window.scrollTo(0, 0);
@ -73,4 +92,40 @@ export class InProgressComponent implements OnInit {
return urlParams.get('page');
}
bulkActionCallback = (action: Action, data: any) => {
const selectedSeriesIndexies = this.bulkSelectionService.getSelectedCardsForSource('series');
const selectedSeries = this.series.filter((series, index: number) => selectedSeriesIndexies.includes(index + ''));
switch (action) {
case Action.AddToReadingList:
this.actionService.addMultipleSeriesToReadingList(selectedSeries, () => {
this.bulkSelectionService.deselectAll();
});
break;
case Action.AddToCollection:
this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
this.bulkSelectionService.deselectAll();
});
break;
case Action.MarkAsRead:
this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
this.loadPage();
this.bulkSelectionService.deselectAll();
});
break;
case Action.MarkAsUnread:
this.actionService.markMultipleSeriesAsUnread(selectedSeries, () => {
this.loadPage();
this.bulkSelectionService.deselectAll();
});
break;
case Action.Delete:
this.actionService.deleteMultipleSeries(selectedSeries, () => {
this.loadPage();
this.bulkSelectionService.deselectAll();
});
break;
}
}
}

View File

@ -9,6 +9,6 @@
(pageChange)="onPageChange($event)"
>
<ng-template #cardItem let-item let-position="idx">
<app-series-card [data]="item" [libraryId]="libraryId" [suppressLibraryLink]="true" (reload)="loadPage()" (selection)="bulkSelectionService.handleCardSelection('series', position, series.length, $event)" [selected]="bulkSelectionService.isCardSelected('series', position)" [allowSelection]="true"></app-series-card>
<app-series-card [data]="item" [libraryId]="libraryId" [suppressLibraryLink]="true" (reload)="loadPage()" (selection)="bulkSelectionService.handleCardSelection('series', position, series.length, $event)" [selected]="bulkSelectionService.isCardSelected('series', position)" [allowSelection]="true"></app-series-card>
</ng-template>
</app-card-detail-layout>

View File

@ -46,6 +46,11 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
this.bulkSelectionService.deselectAll();
});
break;
case Action.AddToCollection:
this.actionService.addMultipleSeriesToCollectionTag(selectedSeries, () => {
this.bulkSelectionService.deselectAll();
});
break;
case Action.MarkAsRead:
this.actionService.markMultipleSeriesAsRead(selectedSeries, () => {
this.loadPage();
@ -59,6 +64,12 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
this.bulkSelectionService.deselectAll();
});
break;
case Action.Delete:
this.actionService.deleteMultipleSeries(selectedSeries, () => {
this.loadPage();
this.bulkSelectionService.deselectAll();
});
break;
}
}

View File

@ -13,7 +13,7 @@
<app-carousel-reel [items]="recentlyAdded" title="Recently Added" (sectionClick)="handleSectionClick($event)">
<ng-template #carouselItem let-item let-position="idx">
<app-series-card [data]="item" [libraryId]="item.libraryId" (reload)="reloadTags()" (dataChanged)="loadRecentlyAdded()"></app-series-card>
<app-series-card [data]="item" [libraryId]="item.libraryId" (dataChanged)="loadRecentlyAdded()"></app-series-card>
</ng-template>
</app-carousel-reel>

View File

@ -6,6 +6,7 @@ import { Subject } from 'rxjs';
import { take, takeUntil } from 'rxjs/operators';
import { EditCollectionTagsComponent } from '../cards/_modals/edit-collection-tags/edit-collection-tags.component';
import { CollectionTag } from '../_models/collection-tag';
import { SeriesAddedEvent } from '../_models/events/series-added-event';
import { InProgressChapter } from '../_models/in-progress-chapter';
import { Library } from '../_models/library';
import { Series } from '../_models/series';
@ -15,6 +16,7 @@ import { Action, ActionFactoryService, ActionItem } from '../_services/action-fa
import { CollectionTagService } from '../_services/collection-tag.service';
import { ImageService } from '../_services/image.service';
import { LibraryService } from '../_services/library.service';
import { EVENTS, MessageHubService } from '../_services/message-hub.service';
import { SeriesService } from '../_services/series.service';
@Component({
@ -32,17 +34,24 @@ export class LibraryComponent implements OnInit, OnDestroy {
recentlyAdded: Series[] = [];
inProgress: Series[] = [];
continueReading: InProgressChapter[] = [];
// collectionTags: CollectionTag[] = [];
// collectionTagActions: ActionItem<CollectionTag>[] = [];
private readonly onDestroy = new Subject<void>();
seriesTrackBy = (index: number, item: any) => `${item.name}_${item.pagesRead}`;
constructor(public accountService: AccountService, private libraryService: LibraryService,
private seriesService: SeriesService, private actionFactoryService: ActionFactoryService,
private collectionService: CollectionTagService, private router: Router,
private modalService: NgbModal, private titleService: Title, public imageService: ImageService) { }
private seriesService: SeriesService, private router: Router,
private titleService: Title, public imageService: ImageService,
private messageHub: MessageHubService) {
this.messageHub.messages$.pipe(takeUntil(this.onDestroy)).subscribe(res => {
if (res.event == EVENTS.SeriesAdded) {
const seriesAddedEvent = res.payload as SeriesAddedEvent;
this.seriesService.getSeries(seriesAddedEvent.seriesId).subscribe(series => {
this.recentlyAdded.unshift(series);
});
}
});
}
ngOnInit(): void {
this.titleService.setTitle('Kavita - Dashboard');
@ -56,8 +65,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
});
});
//this.collectionTagActions = this.actionFactoryService.getCollectionTagActions(this.handleCollectionActionCallback.bind(this));
this.reloadSeries();
}
@ -68,10 +75,7 @@ export class LibraryComponent implements OnInit, OnDestroy {
reloadSeries() {
this.loadRecentlyAdded();
this.loadInProgress();
this.reloadTags();
}
reloadInProgress(series: Series | boolean) {
@ -85,7 +89,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
}
this.loadInProgress();
this.reloadTags();
}
loadInProgress() {
@ -100,12 +103,6 @@ export class LibraryComponent implements OnInit, OnDestroy {
});
}
reloadTags() {
// this.collectionService.allTags().pipe(takeUntil(this.onDestroy)).subscribe(tags => {
// this.collectionTags = tags;
// });
}
handleSectionClick(sectionTitle: string) {
if (sectionTitle.toLowerCase() === 'collections') {
this.router.navigate(['collections']);
@ -115,26 +112,4 @@ export class LibraryComponent implements OnInit, OnDestroy {
this.router.navigate(['in-progress']);
}
}
loadCollection(item: CollectionTag) {
//this.router.navigate(['collections', item.id]);
}
// handleCollectionActionCallback(action: Action, collectionTag: CollectionTag) {
// switch (action) {
// case(Action.Edit):
// const modalRef = this.modalService.open(EditCollectionTagsComponent, { size: 'lg', scrollable: true });
// modalRef.componentInstance.tag = collectionTag;
// modalRef.closed.subscribe((results: {success: boolean, coverImageUpdated: boolean}) => {
// this.reloadTags();
// if (results.coverImageUpdated) {
// collectionTag.coverImage = this.imageService.randomize(collectionTag.coverImage);
// }
// });
// break;
// default:
// break;
// }
// }
}

View File

@ -7,7 +7,7 @@
</button>
<div>
<div style="font-weight: bold;">{{title}} <span *ngIf="incognitoMode">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode</span>)</span></div>
<div style="font-weight: bold;">{{title}} <span class="clickable" *ngIf="incognitoMode" (click)="turnOffIncognito()" role="button" aria-label="Incognito mode is on. Toggle to turn off.">(<i class="fa fa-glasses" aria-hidden="true"></i><span class="sr-only">Incognito Mode:</span>)</span></div>
<div class="subtitle">
{{subtitle}}
</div>

View File

@ -1113,4 +1113,15 @@ export class MangaReaderComponent implements OnInit, AfterViewInit, OnDestroy {
}
}
/**
* Turns off Incognito mode. This can only happen once, when the user clicks the icon, and it will modify the URL state
*/
turnOffIncognito() {
this.incognitoMode = false;
const newRoute = this.readerService.getNextChapterUrl(this.router.url, this.chapterId, this.incognitoMode, this.readingListMode, this.readingListId);
window.history.replaceState({}, '', newRoute);
this.toastr.info('Incognito mode is off. Progress will now start being tracked.');
this.readerService.saveProgress(this.seriesId, this.volumeId, this.chapterId, this.pageNum).pipe(take(1)).subscribe(() => {/* No operation */});
}
}

View File

@ -23,6 +23,7 @@
(selected)='clickSearchResult($event)'
(inputChanged)='onChangeSearch($event)'
[isLoading]="isLoading"
[customFilter]="customFilter"
[debounceTime]="debounceTime"
[itemTemplate]="itemTemplate"
[notFoundTemplate]="notFoundTemplate">
@ -35,7 +36,7 @@
</div>
<div class="ml-1">
<app-series-format [format]="item.format"></app-series-format>
<span *ngIf="item.name.toLowerCase().indexOf(searchTerm) >= 0; else localizedName" [innerHTML]="item.name"></span>
<span *ngIf="item.name.toLowerCase().trim().indexOf(searchTerm) >= 0; else localizedName" [innerHTML]="item.name"></span>
<ng-template #localizedName>
<span [innerHTML]="item.localizedName"></span>
</ng-template>

View File

@ -3,6 +3,7 @@ import { Component, HostListener, Inject, OnDestroy, OnInit, ViewChild } from '@
import { Router } from '@angular/router';
import { Subject } from 'rxjs';
import { takeUntil } from 'rxjs/operators';
import { isTemplateSpan } from 'typescript';
import { ScrollService } from '../scroll.service';
import { SearchResult } from '../_models/search-result';
import { AccountService } from '../_services/account.service';
@ -24,6 +25,16 @@ export class NavHeaderComponent implements OnInit, OnDestroy {
imageStyles = {width: '24px', 'margin-top': '5px'};
searchResults: SearchResult[] = [];
searchTerm = '';
customFilter: (items: SearchResult[], query: string) => SearchResult[] = (items: SearchResult[], query: string) => {
const normalizedQuery = query.trim().toLowerCase();
const matches = items.filter(item => {
const normalizedSeriesName = item.name.toLowerCase().trim();
const normalizedOriginalName = item.originalName.toLowerCase().trim();
const normalizedLocalizedName = item.localizedName.toLowerCase().trim();
return normalizedSeriesName.indexOf(normalizedQuery) >= 0 || normalizedOriginalName.indexOf(normalizedQuery) >= 0 || normalizedLocalizedName.indexOf(normalizedQuery) >= 0;
});
return matches;
};
backToTopNeeded = false;
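
A quick standalone illustration of what the `customFilter` above changes (values are made up; not part of the diff): the query and all three name fields are trimmed and lower-cased, so extra whitespace or casing differences no longer hide matches, and localized names are searched as well.
```
// Illustrative only: the normalization applied by customFilter above.
const item = { name: 'Fruits Basket ', originalName: 'Fruits Basket', localizedName: 'Furutsu Basuketto' };
const query = '  FURUTSU '.trim().toLowerCase();

const matches = [item.name, item.originalName, item.localizedName]
  .some(field => field.toLowerCase().trim().indexOf(query) >= 0);
console.log(matches); // true
```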

Some files were not shown because too many files have changed in this diff.