v0.5.6 - Performance Part 2 (Is that a new scan loop?) (#1500)

* New Scan Loop (#1447)

* Staging the code for the new scan loop.

* Implemented a basic idea of changes on drives triggering the scan loop. Issues: 1. Scan by folder does not work; 2. The queuing system is very hacky and needs a separate thread; 3. Performance degradation could be very real.

* Started writing unit test for new loop code

* Implemented a basic method to scan a folder path with ignore support (not yet implemented; code is in place)

* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.

* Scan Series now uses the new code (folder-based parsing) and handles the LocalizedSeries issue.

* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).

* Wrote some notes on update library scan loop.

* Removed migration for merge

* Reapplied the SeriesFolder migration after merge

* Refactored a check that used multiple db calls into one.

* Made lots of progress on ignore support, but hit some confusion with the underlying library. Ticket created; on hold till then.

* Updated Scan Library and Scan Series to exit early if no changes are on the underlying folders that need to be scanned.

* Implemented the ability to have .kavitaignore files within your directories; Kavita will parse them and ignore files and directories based on the rules within them.

* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores
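
Conceptually, each .kavitaignore is parsed into glob rules, and nested files stack with the rules inherited from parent directories. A minimal sketch using the DotNet.Globbing library (which this PR pulls into the test project); the class and method names here are illustrative, not Kavita's actual API:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;
using DotNet.Globbing;

public static class IgnoreMatcher
{
    // Parse a .kavitaignore file into compiled globs, skipping blanks and comments.
    public static IList<Glob> LoadRules(string ignoreFilePath)
    {
        if (!File.Exists(ignoreFilePath)) return new List<Glob>();
        return File.ReadAllLines(ignoreFilePath)
            .Select(line => line.Trim())
            .Where(line => line.Length > 0 && !line.StartsWith("#"))
            .Select(Glob.Parse)
            .ToList();
    }

    // Stacking: callers pass the union of rules from this directory and all its
    // ancestors, so a nested ignore file adds to, rather than replaces, higher-level rules.
    public static bool IsIgnored(string path, IEnumerable<Glob> rules)
    {
        return rules.Any(glob => glob.IsMatch(path));
    }
}
```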

* Wrote out some basic code that showcases how we can scan a series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking.

* Things are working kinda. I'm getting lost in my own code and complexity. I'm not sure it's worth it.

* Refactored ScanFiles out to Directory Service.

* Refactored more code out to keep the code clean.

* More unit tests

* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked the UpdateLibrary to work how it used to with new scan loop code (note: using async update library/series does not work).

* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work), but with folder-based scanning.

* Prep for unit tests (updating broken ones with new implementations)

* Just some notes. Not sure I want to finish this work.

* Refactored the LibraryWatcher with some comments and state variables.

* Undid the migrations in case I don't move forward with this branch

* Started to clean the code and prepare for finishing this work.

* Fixed a bad merge

* Updated signatures to cleanup the code and commit to the new strategy for scanning.

* Swapped out the code with async processing of series on a small library

* The new scan loop is working in both Sync and Async methods. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.

* Refactored UpdateSeries out of Scanner and into a dedicated file.

* Refactored how ProcessTasks are awaited to allow more async

* Fixed an issue where the side nav item wouldn't show the correct highlight and migrated to OnPush

* Moved where we start the stopwatch to encapsulate the full scan

* Cleaned up SignalR events to report correctly (still needs a redesign)

* Remove the "remove" code until I figure it out

* Put in extremely expensive series deletion code for library scan.

* Have Genre and Tag update the DB immediately to avoid dup issues

* Taking a break

* Moving to a lock with People was successful. Need to apply to others.

* Refactored the series-level code and Tag and Genre handling to the new locking strategy.
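
The locking strategy boils down to get-or-create under a shared lock, so two series processed in parallel can't both miss a lookup and insert the same Person/Genre/Tag. A rough sketch with hypothetical names:

```csharp
using System.Collections.Generic;
using System.Linq;

public class Person
{
    public string NormalizedName { get; set; }
}

public class PersonCache
{
    private readonly object _lock = new object();
    private readonly List<Person> _people;

    public PersonCache(IEnumerable<Person> seed) => _people = seed.ToList();

    // Without the lock, two threads could both miss the lookup below and
    // add duplicate rows for the same person.
    public Person GetOrCreate(string normalizedName)
    {
        lock (_lock)
        {
            var person = _people.FirstOrDefault(p => p.NormalizedName == normalizedName);
            if (person != null) return person;
            person = new Person { NormalizedName = normalizedName };
            _people.Add(person);
            return person;
        }
    }
}
```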

* New scan loop works. Next up optimization

* Swapped out the Kavita logo with an SVG for faster load

* Refactored metadata updates to occur when the series are being updated.

* Code cleanup

* Added a new type of generic message (Info) to inform the user.

* Code cleanup

* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds.

Fixed a bug where File Analysis was running every time for each non-epub file.
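
The core of the optimization is that a folder's last write time is a single attribute lookup, so an unchanged folder can be skipped without enumerating anything. A minimal sketch, assuming a stored last-scanned timestamp per folder (later entries in this log refine what exactly gets compared):

```csharp
using System;
using System.IO;

public static class ScanSkipCheck
{
    // Returns true when the folder was written to after our last scan and a
    // real scan is needed; reading LastWriteTime does no directory enumeration.
    public static bool HasFolderChangedSinceLastScan(string folderPath, DateTime lastFolderScanned)
    {
        return new DirectoryInfo(folderPath).LastWriteTime > lastFolderScanned;
    }
}
```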

* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.

* Some code cleanup

* Added experimental signalr update code to have a more natural refresh of library-detail page

* Hooked in ability to send new series events to UI

* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check if any existing task is running and reschedule for 3 hours later (10 minutes for Scan Series).

* Implemented the info event in the events widget and added a clear all button to dismiss all infos and errors.  Added --event-widget-info-bg-color

* Remove --drawer-background-color since it's not used

* When new series added, inject directly into the view.

* Some debug code cleanup

* Fixed up the unit tests

* Ensure all config directories exist on startup

* Disabled Library Watching (that will go in next build)

* Ensure update for series is admin only

* Lots of code changes, scan series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.

* Removed SeriesFolder migration

* Added the SeriesFolder migration

* Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail.

* The scan optimizations now work for NTFS systems.

* Removed a TODO

* Migrated all the times to use DateTime.Now and not Utc.

* Refactored some repo calls to use the includes flag pattern

* Implemented a check for the library scan optimization check to validate if the library was updated (type change, library rename, folder change, or series deleted) and let the optimization be bypassed.

* Added another optimization which will use just folder attribute of last write time if the drive is not NTFS.

* Fixed a unit test

* Some code cleanup

* Bump versions by dotnet-bump-version.

* Misc UI Fixes (#1450)

* Fixed collection cover images not rendering

* added a try/catch on sending email, so we fail silently if it doesn't send.

* Fixed Go Back not returning to the last scroll position due to the layout mode change resetting, despite nothing changing.

* Fixed a bug where when turning between pages on default mode, the height calculations could get skewed.

* Fixed a missing case for card item where it wouldn't show tooltip title for series.

* Bump versions by dotnet-bump-version.

* New Scan Loop Fixes (#1452)

* Refactored ScanSeries to avoid a lot of extra work and fixed a bug where Scan Series would invoke the processing twice.

Refactored the series selection code during process such that we use Localized Name as well, for cases where the original name was changed.

Undid an optimization around Last Write time, since Linux file systems match how NTFS works.

* Fixed part of the query

* Added a NormalizedLocalizedName for quickly searching which series needs grouping. Reworked the scan loop code a bit to ensure we don't do extra work.

Tweaked the widget logic to display better and not show "Nothing going on here".

* Fixed a bug where archives with ._ files would be counted as valid files, while they are actually just macOS metadata files.
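
For reference, macOS writes AppleDouble companion entries prefixed with ._ alongside real files; a sketch of the filter (the exact check in Kavita may differ):

```csharp
using System.IO;

public static class MacMetadata
{
    // "._page01.jpg" is resource-fork metadata, not a real page.
    public static bool IsMacOsMetadataFile(string entryPath) =>
        Path.GetFileName(entryPath).StartsWith("._");
}
```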

* Fixed a broken unit test

* Bump versions by dotnet-bump-version.

* Simplify parent lookup with Directory.GetParent (#1455)

* Simplify parent lookup with Directory.GetParent

* Address comments
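
Directory.GetParent is a built-in .NET call that replaces hand-rolled string slicing and handles both file and directory inputs. A sketch of how the lookup can be wrapped (the wrapper name is illustrative):

```csharp
using System.IO;

public static class PathHelper
{
    public static string GetParentDirectoryName(string path)
    {
        var parent = Directory.GetParent(path);
        // Roots like "C:/" have no parent; an empty string signals "stop".
        return parent?.FullName ?? string.Empty;
    }
}
```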

* Bump versions by dotnet-bump-version.

* Scan Loop Fixes (#1459)

* Added Last Folder Scanned time to series info modal.

Tweaked the info event detail modal to have a primary button and thus be auto-dismissible

* Added an error event when multiple series are found in processing a series.

* Fixed a bug where a series could get stuck with other series due to a bad select query.

Started adding the force flag hook for the UI and designing the confirm.

Confirm service now also has ability to hide the close button.

Updated error events and logging in the loop, to be more informative

* Fixed a bug where confirm service wasn't showing the proper body content.

* Hooked up force scan series

* refresh metadata now has force update

* Fixed up the messaging on the scan prompt and hooked it up properly in Scan Library to skip the check of whether the whole library even needs to be scanned. Fixed a bug where NormalizedLocalizedName wasn't being calculated on new entities.

Started adding unit tests for this problematic repo method.

* Fixed a bug where we updated NormalizedLocalizedName before we set it.

* Send an info to the UI when series are spread between multiple library level folders.

* Added some logger output when there are no files found in a folder. Return early if there are no files found, so we can avoid some small loops of code.

* Fixed an issue where multiple series in a folder with localized series would cause unintended grouping. This is not supported and hence we will warn them and allow the bad grouping.

* Added a case where scan series fails due to the folder being removed. We will now log an error

* Normalize paths when finding the highest directory till root.
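
The normalization matches the NormalizePath tests added in this commit: backslashes become forward slashes and doubled separators collapse, so e.g. "/manga/1/1//1" and @"C:/manga/1\1\1.jpg" end up in single-slash form. A sketch:

```csharp
using System.Text.RegularExpressions;

public static class PathNormalizer
{
    public static string NormalizePath(string path) =>
        string.IsNullOrEmpty(path)
            ? path
            : Regex.Replace(path.Replace('\\', '/'), "/{2,}", "/");
}
```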

* Fixed an issue with Scan Series where changing a series' folder to a different path, while the original series folder still existed with another series in it, would cause the series to not be deleted.

* Fixed some bugs around specials causing a series merge issue on scan series.

* Removed a bug marker

* Cleaned up some of the scan loop and removed a test I don't need.

* Remove any prompts for force flow, it doesn't work well. Leave the API as is though.

* Fixed up a check for duplicate ScanLibrary calls

* Bump versions by dotnet-bump-version.

* Scroll Resume (#1460)

* When we navigate from a page then back, resume back on the last scroll key (if clicked)

* Resume jump key position when navigating back to a page. Removed some extra blank space on collection detail when a collection doesn't have a summary or cover image.

* Ignore progress events on series cards

* Added a url to swagger for /, which could be reverse proxy url

* Bump versions by dotnet-bump-version.

* Misc UI fixes (#1461)

* Misc fixes

- Fixed modal being stretched when not needed.
- Fixed Logo vertical align
- Fixed drawer content scrolling and it being squished due to styles overridden by Bootstrap.

* series detail cover image stretch fix

- Fixed: Fixed series detail cover image being stretched on larger resolutions

* fixing empty lists scrollbar

* Fixing want to read error

* fixing unnecessary scrollbar

* Fixing recently updated tooltip

* Bump versions by dotnet-bump-version.

* Folder Watching (#1467)

* Hooked in a server setting to enable/disable folder watching

* Validated the file rename change event

* Validated delete file works

* Tweaked some logic to determine if a change occurs on a folder or a file.

* Added a note for an upcoming branch

* Some minor changes in the loop that just shift where code runs.

* Implemented ScanFolder api

* Ensure we restart watchers when we modify a library folder.

* Fixed a unit test

* Bump versions by dotnet-bump-version.

* More Scan Loop Bugfixes (#1471)

* Updated scan time for watcher to 30 seconds for non-dev. Moved ScanFolder off the Scan queue as it doesn't need to be there. Updated loggers

* Fixed jumpbar missing

* Tweaked the messaging for CoverGen

* When we return early due to nothing being done on library and series scan, make sure we kick off other tasks that need to occur.

* Fixed a foreign constraint issue on Volumes when we were adding to a new series.

* Fixed a case where when picking normalized series, capitalization differences wouldn't stack when they should.

* Reduced the logging output on dev and prod settings.

* Fixed a bug in the code that finds the highest directory from a file, where we were not checking against a normalized path.

* Cleaned up some code

* Fixed broken unit tests

* Bump versions by dotnet-bump-version.

* More Scan Loop Fixes (#1473)

* Added a ToList() to avoid a bug where a person could be removed from a list while iterating over the list.
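
For illustration (hypothetical names): enumerating a ToList() snapshot means removals mutate the original collection, not the sequence being iterated, which avoids the "Collection was modified" exception.

```csharp
using System.Collections.Generic;
using System.Linq;

public static class StaleCleanup
{
    public static void RemoveStalePeople(ICollection<string> people, ISet<string> stillReferenced)
    {
        foreach (var person in people.ToList()) // snapshot copy
        {
            if (!stillReferenced.Contains(person))
            {
                people.Remove(person); // safe: we're enumerating the copy
            }
        }
    }
}
```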

* When deleting a series, want to read page will now automatically remove that series from the view.

* Fixed a series lookup which was ignoring format

* Ignore XML comment warnings

* Removed a note since it was already working that way

* Fixed unit test

* Bump versions by dotnet-bump-version.

* Misc UI Fixes (#1477)

* Tweaked a Migration to log correctly only if something is going to be done.

* Refactored Reading List Controller code into a dedicated service and cleaned up some methods that aren't needed anymore.

* Fixed a bug where adding a new item to a reading list wasn't adding it at the end.

* Fixed an issue where collection page would re-render the same covers on multiple items.

* Fixed a missing margin-top which made the page extras drawer not render correctly and hence unclosable on small screens.

* Added some timeout on manage users screen to give data time to flush.

Added a dedicated token log for account flows, in case url encoding plays a part (but from testing it doesn't).

* Reverted back to building for ES6 instead of ES2020 for old Safari 12.5.5 browsers (10MB difference in build size).

* Cleaned up the logic in removing series not found during scan loop.

* Tweaked the timings for Library Watcher to 1 min and reprocess queue every 30 seconds.

* Bump versions by dotnet-bump-version.

* Added fixes for libvips (#1479)

* Bump versions by dotnet-bump-version.

* Tachiyomi + Fixes (#1481)

* Fixed a bootstrap bug

* Fixed repeating images on collection detail

* Fixed up some logic in library watcher which wasn't processing all of the queue.

* When parsing non-epubs in Book library, use Manga parsing for Volume support to better support Light Novels

* Fixed some bugs with the Tachiyomi plugin APIs for progress tracking

* Bump versions by dotnet-bump-version.

* Adding Health controller (#1480)

* Adding Health controller

- Added: Added an API endpoint for a health check to streamline Docker health status.
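
An endpoint for this purpose can be as small as the sketch below (Kavita's actual controller may differ in details); a Docker HEALTHCHECK can then probe it with something like curl --fail http://localhost:5000/api/health (port assumed).

```csharp
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

// Anonymous access is required so the container probe works before any login.
[AllowAnonymous]
[ApiController]
[Route("api/[controller]")]
public class HealthController : ControllerBase
{
    [HttpGet]
    public ActionResult GetHealth() => Ok("Ok");
}
```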

* review comment fixes

* Bump versions by dotnet-bump-version.

* Simplify Folder Watcher (#1484)

* Refactored Library Watcher to use Hangfire under the hood.

* Support .kavitaignore at root level.

* Refactored a lot of the library watching code to process faster and handle when FileSystemWatcher runs out of internal buffer space. It's still not perfect, but good enough for basic use.
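
The shape of that refactor, as a hedged sketch (names illustrative): the FileSystemWatcher only records that something changed, Hangfire debounces the actual work, and a buffer overflow falls back to a wider rescan because dropped events can't be trusted.

```csharp
using System;
using System.IO;
using Hangfire;

public class LibraryWatcher
{
    private readonly FileSystemWatcher _watcher;

    public LibraryWatcher(string libraryFolder)
    {
        _watcher = new FileSystemWatcher(libraryFolder)
        {
            IncludeSubdirectories = true,
            // A larger buffer reduces (but cannot eliminate) overflow on bursty I/O.
            InternalBufferSize = 64 * 1024,
            EnableRaisingEvents = true
        };
        _watcher.Changed += (_, e) => QueueScan(e.FullPath);
        _watcher.Created += (_, e) => QueueScan(e.FullPath);
        _watcher.Error += OnError;
    }

    private static void QueueScan(string path)
    {
        // The delay lets a burst of events settle before a single scan runs.
        BackgroundJob.Schedule(() => Console.WriteLine($"Scan folder: {path}"),
            TimeSpan.FromSeconds(30));
    }

    private void OnError(object sender, ErrorEventArgs e)
    {
        // InternalBufferOverflowException means events were dropped; rescan
        // rather than trust the event stream.
        if (e.GetException() is InternalBufferOverflowException)
        {
            BackgroundJob.Enqueue(() => Console.WriteLine("Full library scan"));
        }
    }
}
```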

* Marked folder watching as experimental and defaulted it to off.

* Revert #1479

* Tweaked the messaging for OPDS to remove a note about download role.

Moved some code closer to where it's used.

* Cleaned up how the events widget reports

* Fixed a null issue when deleting series in the UI

* Cleaned up some debug code

* Added more information for when we skip a scan

* Cleaned up some logging messages in CoverGen tasks

* More log message tweaks

* Added some debug to help identify a rare issue

* Fixed a bug where save bookmarks as webp could get reset to false when saving other server settings

* Updated some documentation on library watcher.

* Make LibraryWatcher fire every 5 mins

* Bump versions by dotnet-bump-version.

* Sort series by chapter number only when some chapters have no volume (#1487)

* Sort series by chapter number only when some chapters have no volume information

* Implement a Default static instance of ChapterSortComparer

* Further use Default static Comparers

* Add missing ToList() as per comments
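
The Default-instance pattern from the comparer bullets above, sketched; the comparison logic (chapter 0 sorting last) is an assumption about the intended ordering, not a copy of Kavita's comparer:

```csharp
using System.Collections.Generic;

public class ChapterSortComparer : IComparer<double>
{
    // One shared instance avoids allocating a comparer per sort call.
    public static readonly ChapterSortComparer Default = new ChapterSortComparer();

    public int Compare(double x, double y)
    {
        if (x == 0.0 && y == 0.0) return 0;
        if (x == 0.0) return 1;  // chapter 0 ("volume only") sorts last
        if (y == 0.0) return -1;
        return x.CompareTo(y);
    }
}
```

Callers can then write chapters.OrderBy(c => c.Number, ChapterSortComparer.Default) with no per-call allocation.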

* SQLite Hangfire  (#1488)

* Update to use SQLite for Hangfire to retain information on tasks
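
Wiring Hangfire to SQLite-backed storage looks roughly like the sketch below, assuming the Hangfire.Storage.SQLite package (the exact package and options in Kavita may differ, and a later entry in this log reverts the change):

```csharp
using Hangfire;
using Hangfire.Storage.SQLite;
using Microsoft.Extensions.DependencyInjection;

public static class HangfireSetup
{
    public static void AddPersistentHangfire(this IServiceCollection services)
    {
        services.AddHangfire(config => config
            .SetDataCompatibilityLevel(CompatibilityLevel.Version_170)
            .UseSimpleAssemblyNameTypeSerializer()
            .UseRecommendedSerializerSettings()
            // Jobs persist in a SQLite file instead of dying with the process.
            .UseSQLiteStorage("config/jobs.db"));
        services.AddHangfireServer();
    }
}
```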

* Updated all external links to have noopener noreferrer

* When watching folders, ensure the folders exist before creating watchers.

* Tweaked the messaging for Email Service and added link to the project.

* Bump versions by dotnet-bump-version.

* Bump versions by dotnet-bump-version.

* Fixed typeahead not working correctly (#1490)

* Bump versions by dotnet-bump-version.

* Release Testing Day 1 (#1491)

* Fixed a bug where typeahead wouldn't automatically show results on relationship screen without an additional click.

* Tweaked the code which checks if a modification occurred to check on seconds rather than minutes

* Clear cache will now clear temp/ directory as well.

* Fixed an issue where Chrome was caching API responses when it shouldn't have.

* Added cleanup code for the temp directory

* Ensure genres get removed during series scan when removed from metadata.

* Fixed a bug where all epubs with a volume would show as Volume 0 in reading list

* When a scan is in progress, don't let the user delete the library.

* Bump versions by dotnet-bump-version.

* Scan Loop Last Write Time Change (#1492)

* Refactored invite user flow to separate error handling on create user flow and email flow. This should help users that have unique situations.

* Switched to using files to check LastWriteTime. Debug code is in for Robbie to test on rclone

* Updated the Parser namespace. Changed the LastWriteTime check to look at all files and folders.

* Bump versions by dotnet-bump-version.

* Release Testing Day 2 (#1493)

* Added a no data section to collection detail.

* Remove an optimization for skipping the whole library scan as it wasn't reliable

* When resetting password, ensure the input is colored correctly

* Fixed setting a new password after a reset throwing an error despite actually succeeding.

Fixed incorrect messaging for the Password Reset page.

* Fixed a bug where reset password would show the side nav button and skew the page.

Updated a lot of references to use the typed version of form controls.

* Removed a migration from 0.5.0, 6 releases ago.

* Added a null check so we don't throw an exception when connecting with signalR on unauthenticated users.

* Bump versions by dotnet-bump-version.

* Fixed a bug where a series with a relationship couldn't be deleted. (#1495)

* Bump versions by dotnet-bump-version.

* Release Testing Day 3 (#1496)

* Tweaked log messaging for library scan when no files were scanned.

* When a theme that is set gets removed due to a scan, inform the user to refresh.

* Fixed a typo and made Darkness -> Brightness

* Allowed theme file downloads to be invoked by non-authenticated users, so new users can get the default theme.

* Hide all series side nav item if there are no libraries exposed to the user

* Fixed an API for Tachiyomi when syncing progress

* Fixed dashboard not responding to Series Removed and Added events.

Ensure we send SeriesRemoved events when they are deleted.

* Reverted Hangfire SQLite due to aborted jobs being resumed when they shouldn't be. Fixed some scan loop issues where cover gen wouldn't always be invoked on new libraries.

* Bump versions by dotnet-bump-version.

* Updating series detail cover style (#1498)

# Fixed
- Fixed: Fixed an issue with series detail cover when scaled down.

* Bump versions by dotnet-bump-version.

* Version bump

* v0.5.6 Release (#1499)

Co-authored-by: tjarls <tjarls@gmail.com>
Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
Joseph Milazzo 2022-09-02 07:52:51 -05:00 committed by GitHub
parent b40fcdb6a6
commit 150e67031a
207 changed files with 8183 additions and 2218 deletions

.gitignore (vendored)

@@ -485,7 +485,6 @@ Thumbs.db
ssl/
# App specific
appsettings.json
/API/kavita.db
/API/kavita.db-shm
/API/kavita.db-wal

@@ -1,69 +0,0 @@
using System.IO;
using System.IO.Abstractions;
using System.Threading.Tasks;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using API.Services.Tasks.Scanner;
using API.SignalR;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Order;
using Microsoft.Extensions.Logging;
using NSubstitute;
namespace API.Benchmark
{
[MemoryDiagnoser]
[Orderer(SummaryOrderPolicy.FastestToSlowest)]
[RankColumn]
//[SimpleJob(launchCount: 1, warmupCount: 3, targetCount: 5, invocationCount: 100, id: "Test"), ShortRunJob]
public class ParseScannedFilesBenchmarks
{
private readonly ParseScannedFiles _parseScannedFiles;
private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
private readonly ILogger<BookService> _bookLogger = Substitute.For<ILogger<BookService>>();
private readonly IArchiveService _archiveService = Substitute.For<ArchiveService>();
public ParseScannedFilesBenchmarks()
{
var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());
_parseScannedFiles = new ParseScannedFiles(
Substitute.For<ILogger>(),
directoryService,
new ReadingItemService(_archiveService, new BookService(_bookLogger, directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), directoryService)), Substitute.For<ImageService>(), directoryService),
Substitute.For<IEventHub>());
}
// [Benchmark]
// public void Test()
// {
// var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
// "../../../Services/Test Data/ScannerService/Manga");
// var parsedSeries = _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new string[] {libraryPath},
// out var totalFiles, out var scanElapsedTime);
// }
/// <summary>
/// Generate a list of Series and another list with
/// </summary>
[Benchmark]
public async Task MergeName()
{
var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
"../../../Services/Test Data/ScannerService/Manga");
var p1 = new ParserInfo()
{
Chapters = "0",
Edition = "",
Format = MangaFormat.Archive,
FullFilePath = Path.Join(libraryPath, "A Town Where You Live", "A_Town_Where_You_Live_v01.zip"),
IsSpecial = false,
Series = "A Town Where You Live",
Title = "A Town Where You Live",
Volumes = "1"
};
await _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new [] {libraryPath}, "Manga");
_parseScannedFiles.MergeName(p1);
}
}
}

@@ -12,7 +12,7 @@ namespace API.Tests.Entities
[InlineData("Darker than Black")]
public void CreateSeries(string name)
{
var key = API.Parser.Parser.Normalize(name);
var key = API.Services.Tasks.Scanner.Parser.Parser.Normalize(name);
var series = DbFactory.Series(name);
Assert.Equal(0, series.Id);
Assert.Equal(0, series.Pages);

@@ -14,7 +14,7 @@ namespace API.Tests.Extensions
{
public class ParserInfoListExtensions
{
private readonly DefaultParser _defaultParser;
private readonly IDefaultParser _defaultParser;
public ParserInfoListExtensions()
{
_defaultParser =

@@ -28,7 +28,7 @@ namespace API.Tests.Extensions
Name = seriesInput[0],
LocalizedName = seriesInput[1],
OriginalName = seriesInput[2],
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata()
};
@@ -52,14 +52,14 @@ namespace API.Tests.Extensions
Name = seriesInput[0],
LocalizedName = seriesInput[1],
OriginalName = seriesInput[2],
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata(),
};
var parserInfos = list.Select(s => new ParsedSeries()
{
Name = s,
NormalizedName = API.Parser.Parser.Normalize(s),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(s),
}).ToList();
// This doesn't do any checks against format
@@ -78,7 +78,7 @@ namespace API.Tests.Extensions
Name = seriesInput[0],
LocalizedName = seriesInput[1],
OriginalName = seriesInput[2],
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata()
};
var info = new ParserInfo();

@@ -18,7 +18,7 @@ namespace API.Tests.Helpers
Name = name,
SortName = name,
LocalizedName = name,
NormalizedName = API.Parser.Parser.Normalize(name),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(name),
Volumes = new List<Volume>(),
Metadata = new SeriesMetadata()
};
@@ -31,7 +31,7 @@
return new Volume()
{
Name = volumeNumber,
Number = (int) API.Parser.Parser.MinNumberFromRange(volumeNumber),
Number = (int) API.Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(volumeNumber),
Pages = pages,
Chapters = chaps
};
@@ -43,7 +43,7 @@
{
IsSpecial = isSpecial,
Range = range,
Number = API.Parser.Parser.MinNumberFromRange(range) + string.Empty,
Number = API.Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(range) + string.Empty,
Files = files ?? new List<MangaFile>(),
Pages = pageCount,
@@ -73,7 +73,7 @@
return new CollectionTag()
{
Id = id,
NormalizedTitle = API.Parser.Parser.Normalize(title).ToUpper(),
NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize(title).ToUpper(),
Title = title,
Summary = summary,
Promoted = promoted

@@ -26,19 +26,19 @@ namespace API.Tests.Helpers
};
}
public static void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
public static void AddToParsedInfo(IDictionary<ParsedSeries, IList<ParserInfo>> collectedSeries, ParserInfo info)
{
var existingKey = collectedSeries.Keys.FirstOrDefault(ps =>
ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series));
ps.Format == info.Format && ps.NormalizedName == API.Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series));
existingKey ??= new ParsedSeries()
{
Format = info.Format,
Name = info.Series,
NormalizedName = API.Parser.Parser.Normalize(info.Series)
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series)
};
if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>))
{
((ConcurrentDictionary<ParsedSeries, List<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
((ConcurrentDictionary<ParsedSeries, IList<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))

@@ -16,7 +16,7 @@ public class ParserInfoHelperTests
[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeFalse()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -34,7 +34,7 @@ public class ParserInfoHelperTests
Name = "1"
}
},
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Epub
};
@@ -45,7 +45,7 @@
[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeTrue()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -63,7 +63,7 @@
Name = "1"
}
},
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Epub
};

@@ -22,21 +22,21 @@ public class SeriesHelperTests
{
Format = MangaFormat.Archive,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
}
@@ -50,21 +50,21 @@
{
Format = MangaFormat.Image,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black")
}));
}
@@ -78,28 +78,28 @@
{
Format = MangaFormat.Image,
Name = "Something Random",
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "SomethingRandom".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("SomethingRandom")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("SomethingRandom")
}));
}
@@ -113,28 +113,28 @@
{
Format = MangaFormat.Image,
Name = "Something Random",
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "SomethingRandom".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("SomethingRandom")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("SomethingRandom")
}));
}
@@ -148,14 +148,14 @@
{
Format = MangaFormat.Archive,
Name = "My Dress-Up Darling",
NormalizedName = API.Parser.Parser.Normalize("My Dress-Up Darling")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("My Dress-Up Darling")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Sono Bisque Doll wa Koi wo Suru".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Sono Bisque Doll wa Koi wo Suru")
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Sono Bisque Doll wa Koi wo Suru")
}));
}
#endregion

@@ -7,16 +7,18 @@ namespace API.Tests.Parser
[Theory]
[InlineData("Gifting The Wonderful World With Blessings! - 3 Side Stories [yuNS][Unknown]", "Gifting The Wonderful World With Blessings!")]
[InlineData("BBC Focus 00 The Science of Happiness 2nd Edition (2018)", "BBC Focus 00 The Science of Happiness 2nd Edition")]
[InlineData("Faust - Volume 01 [Del Rey][Scans_Compressed]", "Faust")]
public void ParseSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseSeries(filename));
}
[Theory]
[InlineData("Harrison, Kim - Dates from Hell - Hollows Vol 2.5.epub", "2.5")]
[InlineData("Faust - Volume 01 [Del Rey][Scans_Compressed]", "1")]
public void ParseVolumeTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseVolume(filename));
}
// [Theory]

@@ -79,7 +79,7 @@ namespace API.Tests.Parser
[InlineData("Fables 2010 Vol. 1 Legends in Exile", "Fables 2010")]
public void ParseComicSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicSeries(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(filename));
}
[Theory]
@@ -126,7 +126,7 @@
[InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "1")]
public void ParseComicVolumeTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicVolume(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(filename));
}
[Theory]
@@ -171,7 +171,7 @@
[InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "0")]
public void ParseComicChapterTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicChapter(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(filename));
}
@@ -190,7 +190,7 @@
[InlineData("Adventure Time 2013_-_Annual #001 (2013)", true)]
public void ParseComicSpecialTest(string input, bool expected)
{
Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseComicSpecial(input)));
Assert.Equal(expected, !string.IsNullOrEmpty(API.Services.Tasks.Scanner.Parser.Parser.ParseComicSpecial(input)));
}
}
}

@@ -75,7 +75,7 @@ namespace API.Tests.Parser
[InlineData("スライム倒して300年、知らないうちにレベルMAXになってました 1-3巻", "1-3")]
public void ParseVolumeTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseVolume(filename));
}
[Theory]
@@ -180,9 +180,10 @@
[InlineData("Highschool of the Dead - Full Color Edition v02 [Uasaha] (Yen Press)", "Highschool of the Dead - Full Color Edition")]
[InlineData("諌山創] 23", "] ")]
[InlineData("(一般コミック) [奥浩哉] 09", "")]
[InlineData("Highschool of the Dead - 02", "Highschool of the Dead")]
public void ParseSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseSeries(filename));
}
[Theory]
@@ -260,7 +261,7 @@
[InlineData("[ハレム] SMごっこ 10", "10")]
public void ParseChaptersTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseChapter(filename));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseChapter(filename));
}
@@ -276,7 +277,7 @@
[InlineData("Love Hina Omnibus v05 (2015) (Digital-HD) (Asgard-Empire).cbz", "Omnibus")]
public void ParseEditionTest(string input, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseEdition(input));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseEdition(input));
}
[Theory]
[InlineData("Beelzebub Special OneShot - Minna no Kochikame x Beelzebub (2016) [Mangastream].cbz", true)]
@@ -295,7 +296,7 @@
[InlineData("The League of Extra-ordinary Gentlemen", false)]
public void ParseMangaSpecialTest(string input, bool expected)
{
Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseMangaSpecial(input)));
Assert.Equal(expected, !string.IsNullOrEmpty(API.Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(input)));
}
[Theory]
@@ -304,14 +305,14 @@
[InlineData("image.txt", MangaFormat.Unknown)]
public void ParseFormatTest(string inputFile, MangaFormat expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseFormat(inputFile));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseFormat(inputFile));
}
[Theory]
[InlineData("Gifting The Wonderful World With Blessings! - 3 Side Stories [yuNS][Unknown].epub", "Side Stories")]
public void ParseSpecialTest(string inputFile, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseMangaSpecial(inputFile));
Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(inputFile));
}

@@ -1,6 +1,6 @@
using System.Linq;
using Xunit;
using static API.Parser.Parser;
using static API.Services.Tasks.Scanner.Parser.Parser;
namespace API.Tests.Parser
{
@@ -223,7 +223,7 @@ namespace API.Tests.Parser
[InlineData("/manga/1/1/1", "/manga/1/1/1")]
[InlineData("/manga/1/1/1.jpg", "/manga/1/1/1.jpg")]
[InlineData(@"/manga/1/1\1.jpg", @"/manga/1/1/1.jpg")]
[InlineData("/manga/1/1//1", "/manga/1/1//1")]
[InlineData("/manga/1/1//1", "/manga/1/1/1")]
[InlineData("/manga/1\\1\\1", "/manga/1/1/1")]
[InlineData("C:/manga/1\\1\\1.jpg", "C:/manga/1/1/1.jpg")]
public void NormalizePathTest(string inputPath, string expected)

@@ -0,0 +1,160 @@
using System.Collections.Generic;
using System.Data.Common;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Services;
using AutoMapper;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;
namespace API.Tests.Repository;
public class SeriesRepositoryTests
{
private readonly IUnitOfWork _unitOfWork;
private readonly DbConnection _connection;
private readonly DataContext _context;
private const string CacheDirectory = "C:/kavita/config/cache/";
private const string CoverImageDirectory = "C:/kavita/config/covers/";
private const string BackupDirectory = "C:/kavita/config/backups/";
private const string DataDirectory = "C:/data/";
public SeriesRepositoryTests()
{
var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options;
_connection = RelationalOptionsExtension.Extract(contextOptions).Connection;
_context = new DataContext(contextOptions);
Task.Run(SeedDb).GetAwaiter().GetResult();
var config = new MapperConfiguration(cfg => cfg.AddProfile<AutoMapperProfiles>());
var mapper = config.CreateMapper();
_unitOfWork = new UnitOfWork(_context, mapper, null);
}
#region Setup
private static DbConnection CreateInMemoryDatabase()
{
var connection = new SqliteConnection("Filename=:memory:");
connection.Open();
return connection;
}
private async Task<bool> SeedDb()
{
await _context.Database.MigrateAsync();
var filesystem = CreateFileSystem();
await Seed.SeedSettings(_context,
new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));
var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
setting.Value = CacheDirectory;
setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
setting.Value = BackupDirectory;
_context.ServerSetting.Update(setting);
var lib = new Library()
{
Name = "Manga", Folders = new List<FolderPath>() {new FolderPath() {Path = "C:/data/"}}
};
_context.AppUser.Add(new AppUser()
{
UserName = "majora2007",
Libraries = new List<Library>()
{
lib
}
});
return await _context.SaveChangesAsync() > 0;
}
private async Task ResetDb()
{
_context.Series.RemoveRange(_context.Series.ToList());
_context.AppUserRating.RemoveRange(_context.AppUserRating.ToList());
_context.Genre.RemoveRange(_context.Genre.ToList());
_context.CollectionTag.RemoveRange(_context.CollectionTag.ToList());
_context.Person.RemoveRange(_context.Person.ToList());
await _context.SaveChangesAsync();
}
private static MockFileSystem CreateFileSystem()
{
var fileSystem = new MockFileSystem();
fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
fileSystem.AddDirectory("C:/kavita/config/");
fileSystem.AddDirectory(CacheDirectory);
fileSystem.AddDirectory(CoverImageDirectory);
fileSystem.AddDirectory(BackupDirectory);
fileSystem.AddDirectory(DataDirectory);
return fileSystem;
}
#endregion
private async Task SetupSeriesData()
{
var library = new Library()
{
Name = "Manga",
Type = LibraryType.Manga,
Folders = new List<FolderPath>()
{
new FolderPath() {Path = "C:/data/manga/"}
}
};
var s = DbFactory.Series("The Idaten Deities Know Only Peace", "Heion Sedai no Idaten-tachi");
s.Format = MangaFormat.Archive;
library.Series = new List<Series>()
{
s,
};
_unitOfWork.LibraryRepository.Add(library);
await _unitOfWork.CommitAsync();
}
[Theory]
[InlineData("Heion Sedai no Idaten-tachi", "", MangaFormat.Archive, "The Idaten Deities Know Only Peace")] // Matching on localized name in DB
[InlineData("Heion Sedai no Idaten-tachi", "", MangaFormat.Pdf, null)]
public async Task GetFullSeriesByAnyName_Should(string seriesName, string localizedName, MangaFormat format, string? expected)
{
var firstSeries = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(1);
var series =
await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(seriesName, localizedName,
1, MangaFormat.Unknown);
if (expected == null)
{
Assert.Null(series);
}
else
{
Assert.NotNull(series);
Assert.Equal(expected, series.Name);
}
}
}

@@ -68,6 +68,7 @@ namespace API.Tests.Services
[InlineData("macos_none.zip", 0)]
[InlineData("macos_one.zip", 1)]
[InlineData("macos_native.zip", 21)]
[InlineData("macos_withdotunder_one.zip", 1)]
public void GetNumberOfPagesFromArchiveTest(string archivePath, int expected)
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives");
@@ -197,7 +198,7 @@
var imageService = new ImageService(Substitute.For<ILogger<ImageService>>(), _directoryService);
var archiveService = Substitute.For<ArchiveService>(_logger,
new DirectoryService(_directoryServiceLogger, new FileSystem()), imageService);
var testDirectory = API.Parser.Parser.NormalizePath(Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages")));
var testDirectory = API.Services.Tasks.Scanner.Parser.Parser.NormalizePath(Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages")));
var outputDir = Path.Join(testDirectory, "output");
_directoryService.ClearDirectory(outputDir);

@@ -147,7 +147,7 @@ public class BackupServiceTests
var backupLogFiles = backupService.GetLogFiles(0, LogDirectory).ToList();
Assert.Single(backupLogFiles);
Assert.Equal(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log"), API.Parser.Parser.NormalizePath(backupLogFiles.First()));
Assert.Equal(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log"), API.Services.Tasks.Scanner.Parser.Parser.NormalizePath(backupLogFiles.First()));
}
[Fact]
@@ -168,8 +168,8 @@
var backupService = new BackupService(_logger, _unitOfWork, ds, configuration, _messageHub);
var backupLogFiles = backupService.GetLogFiles(1, LogDirectory).Select(API.Parser.Parser.NormalizePath).ToList();
Assert.NotEmpty(backupLogFiles.Where(file => file.Equals(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log")) || file.Equals(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita1.log"))));
var backupLogFiles = backupService.GetLogFiles(1, LogDirectory).Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList();
Assert.NotEmpty(backupLogFiles.Where(file => file.Equals(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log")) || file.Equals(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita1.log"))));
}

@@ -401,9 +401,79 @@ public class BookmarkServiceTests
var files = await bookmarkService.GetBookmarkFilesById(new[] {1});
var actualFiles = ds.GetFiles(BookmarkDirectory, searchOption: SearchOption.AllDirectories);
Assert.Equal(files.Select(API.Parser.Parser.NormalizePath).ToList(), actualFiles.Select(API.Parser.Parser.NormalizePath).ToList());
Assert.Equal(files.Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList(), actualFiles.Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList());
}
#endregion
#region Misc
[Fact]
public async Task ShouldNotDeleteBookmarkOnChapterDeletion()
{
var filesystem = CreateFileSystem();
filesystem.AddFile($"{CacheDirectory}1/0001.jpg", new MockFileData("123"));
filesystem.AddFile($"{BookmarkDirectory}1/1/0001.jpg", new MockFileData("123"));
// Delete all Series to reset state
await ResetDB();
_context.Series.Add(new Series()
{
Name = "Test",
Library = new Library() {
Name = "Test LIb",
Type = LibraryType.Manga,
},
Volumes = new List<Volume>()
{
new Volume()
{
Chapters = new List<Chapter>()
{
new Chapter()
{
}
}
}
}
});
_context.AppUser.Add(new AppUser()
{
UserName = "Joe",
Bookmarks = new List<AppUserBookmark>()
{
new AppUserBookmark()
{
Page = 1,
ChapterId = 1,
FileName = $"1/1/0001.jpg",
SeriesId = 1,
VolumeId = 1
}
}
});
await _context.SaveChangesAsync();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
var bookmarkService = Create(ds);
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Bookmarks);
var vol = await _unitOfWork.VolumeRepository.GetVolumeAsync(1);
vol.Chapters = new List<Chapter>();
_unitOfWork.VolumeRepository.Update(vol);
await _unitOfWork.CommitAsync();
Assert.Equal(1, ds.GetFiles(BookmarkDirectory, searchOption:SearchOption.AllDirectories).Count());
Assert.NotNull(await _unitOfWork.UserRepository.GetBookmarkAsync(1));
}
#endregion
}

@@ -55,6 +55,11 @@ namespace API.Tests.Services
{
throw new System.NotImplementedException();
}
public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
{
throw new System.NotImplementedException();
}
}
public class CacheServiceTests
{

@@ -312,13 +312,13 @@ public class CleanupServiceTests
new ReadingList()
{
Title = "Something",
NormalizedTitle = API.Parser.Parser.Normalize("Something"),
NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something"),
CoverImage = $"{ImageService.GetReadingListFormat(1)}.jpg"
},
new ReadingList()
{
Title = "Something 2",
NormalizedTitle = API.Parser.Parser.Normalize("Something 2"),
NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something 2"),
CoverImage = $"{ImageService.GetReadingListFormat(2)}.jpg"
}
}

@@ -34,7 +34,7 @@ namespace API.Tests.Services
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var files = new List<string>();
var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
API.Parser.Parser.ArchiveFileExtensions, _logger);
API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions, _logger);
Assert.Equal(28, fileCount);
Assert.Equal(28, files.Count);
@@ -59,7 +59,7 @@
try
{
var fileCount = ds.TraverseTreeParallelForEach("/manga/", s => files.Add(s),
API.Parser.Parser.ImageFileExtensions, _logger);
API.Services.Tasks.Scanner.Parser.Parser.ImageFileExtensions, _logger);
Assert.Equal(1, fileCount);
}
catch (Exception ex)
@@ -90,7 +90,7 @@
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var files = new List<string>();
var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
API.Parser.Parser.ArchiveFileExtensions, _logger);
API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions, _logger);
Assert.Equal(28, fileCount);
Assert.Equal(28, files.Count);
@@ -111,7 +111,7 @@
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var files = ds.GetFilesWithExtension(testDirectory, API.Parser.Parser.ArchiveFileExtensions);
var files = ds.GetFilesWithExtension(testDirectory, API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions);
Assert.Equal(10, files.Length);
Assert.All(files, s => fileSystem.Path.GetExtension(s).Equals(".zip"));
@@ -150,7 +150,7 @@
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var files = ds.GetFiles(testDirectory, API.Parser.Parser.ArchiveFileExtensions).ToList();
var files = ds.GetFiles(testDirectory, API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions).ToList();
Assert.Equal(10, files.Count());
Assert.All(files, s => fileSystem.Path.GetExtension(s).Equals(".zip"));
@@ -586,12 +586,12 @@
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
ds.CopyFilesToDirectory(new []{MockUnixSupport.Path($"{testDirectory}file.zip")}, "/manga/output/");
ds.CopyFilesToDirectory(new []{MockUnixSupport.Path($"{testDirectory}file.zip")}, "/manga/output/");
var outputFiles = ds.GetFiles("/manga/output/").Select(API.Parser.Parser.NormalizePath).ToList();
var outputFiles = ds.GetFiles("/manga/output/").Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList();
Assert.Equal(4, outputFiles.Count()); // we have 2 already there and 2 copies
// For some reason, this has C:/ on directory even though everything is emulated (System.IO.Abstractions issue, not changing)
// https://github.com/TestableIO/System.IO.Abstractions/issues/831
Assert.True(outputFiles.Contains(API.Parser.Parser.NormalizePath("/manga/output/file (3).zip"))
|| outputFiles.Contains(API.Parser.Parser.NormalizePath("C:/manga/output/file (3).zip")));
Assert.True(outputFiles.Contains(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath("/manga/output/file (3).zip"))
|| outputFiles.Contains(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath("C:/manga/output/file (3).zip")));
}
#endregion
@@ -677,6 +677,8 @@
[InlineData(new [] {"C:/Manga/"}, new [] {"C:/Manga/Love Hina/Vol. 01.cbz"}, "C:/Manga/Love Hina")]
[InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/Dir 2/"}, new [] {"C:/Manga/Dir 1/Love Hina/Vol. 01.cbz"}, "C:/Manga/Dir 1/Love Hina")]
[InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/"}, new [] {"D:/Manga/Love Hina/Vol. 01.cbz", "D:/Manga/Vol. 01.cbz"}, "")]
[InlineData(new [] {"C:/Manga/"}, new [] {"C:/Manga//Love Hina/Vol. 01.cbz"}, "C:/Manga/Love Hina")]
[InlineData(new [] {@"C:\mount\drive\Library\Test Library\Comics\"}, new [] {@"C:\mount\drive\Library\Test Library\Comics\Bruce Lee (1994)\Bruce Lee #001 (1994).cbz"}, @"C:/mount/drive/Library/Test Library/Comics/Bruce Lee (1994)")]
public void FindHighestDirectoriesFromFilesTest(string[] rootDirectories, string[] files, string expectedDirectory)
{
var fileSystem = new MockFileSystem();
@@ -841,5 +843,158 @@
Assert.Equal(expected, DirectoryService.GetHumanReadableBytes(bytes));
}
#endregion
#region ScanFiles
[Fact]
public Task ScanFiles_ShouldFindNoFiles_AllAreIgnored()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("*.*"));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var allFiles = ds.ScanFiles("C:/Data/");
Assert.Equal(0, allFiles.Count);
return Task.CompletedTask;
}
[Fact]
public Task ScanFiles_ShouldFindNoNestedFiles_IgnoreNestedFiles()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var allFiles = ds.ScanFiles("C:/Data/");
Assert.Equal(1, allFiles.Count); // Ignore files are not counted in files, only valid extensions
return Task.CompletedTask;
}
[Fact]
public Task ScanFiles_NestedIgnore_IgnoreNestedFilesInOneDirectoryOnly()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddDirectory("C:/Data/Specials/");
fileSystem.AddDirectory("C:/Data/Specials/ArtBooks/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
fileSystem.AddFile("C:/Data/Specials/.kavitaignore", new MockFileData("**/ArtBooks/*"));
fileSystem.AddFile("C:/Data/Specials/Hi.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Specials/ArtBooks/art book 01.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var allFiles = ds.ScanFiles("C:/Data/");
Assert.Equal(2, allFiles.Count); // Ignore files are not counted in files, only valid extensions
return Task.CompletedTask;
}
[Fact]
public Task ScanFiles_ShouldFindAllFiles()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.txt", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Nothing.pdf", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var allFiles = ds.ScanFiles("C:/Data/");
Assert.Equal(5, allFiles.Count);
return Task.CompletedTask;
}
#endregion
#region GetAllDirectories
[Fact]
public void GetAllDirectories_ShouldFindAllNestedDirectories()
{
const string testDirectory = "C:/manga/base/";
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 2"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "A"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "B"));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
Assert.Equal(2, ds.GetAllDirectories(fileSystem.Path.Join(testDirectory, "folder 1")).Count());
}
#endregion
#region GetParentDirectory
[Theory]
[InlineData(@"C:/file.txt", "C:/")]
[InlineData(@"C:/folder/file.txt", "C:/folder")]
[InlineData(@"C:/folder/subfolder/file.txt", "C:/folder/subfolder")]
public void GetParentDirectoryName_ShouldFindParentOfFiles(string path, string expected)
{
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ path, new MockFileData(string.Empty)}
});
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
Assert.Equal(expected, ds.GetParentDirectoryName(path));
}
[Theory]
[InlineData(@"C:/folder", "C:/")]
[InlineData(@"C:/folder/subfolder", "C:/folder")]
[InlineData(@"C:/folder/subfolder/another", "C:/folder/subfolder")]
public void GetParentDirectoryName_ShouldFindParentOfDirectories(string path, string expected)
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory(path);
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
Assert.Equal(expected, ds.GetParentDirectoryName(path));
}
#endregion
}
}

View File

@ -1,4 +1,5 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Data.Common;
using System.IO.Abstractions.TestingHelpers;
@ -14,6 +15,8 @@ using API.Services.Tasks.Scanner;
using API.SignalR;
using API.Tests.Helpers;
using AutoMapper;
using DotNet.Globbing;
using Flurl.Util;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
@ -25,9 +28,9 @@ namespace API.Tests.Services;
internal class MockReadingItemService : IReadingItemService
{
private readonly DefaultParser _defaultParser;
private readonly IDefaultParser _defaultParser;
public MockReadingItemService(DefaultParser defaultParser)
public MockReadingItemService(IDefaultParser defaultParser)
{
_defaultParser = defaultParser;
}
@ -56,6 +59,11 @@ internal class MockReadingItemService : IReadingItemService
{
return _defaultParser.Parse(path, rootPath, type);
}
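// Mirrors Parse above; presumably satisfies a new IReadingItemService member introduced in this PR.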
public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
{
return _defaultParser.Parse(path, rootPath, type);
}
}
public class ParseScannedFilesTests
@ -148,138 +156,73 @@ public class ParseScannedFilesTests
#endregion
#region GetInfosByName
[Fact]
public void GetInfosByName_ShouldReturnGivenMatchingSeriesName()
{
var fileSystem = new MockFileSystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var infos = new List<ParserInfo>()
{
ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
};
var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
{
{
new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Accel World",
NormalizedName = API.Parser.Parser.Normalize("Accel World")
},
infos
},
{
new ParsedSeries()
{
Format = MangaFormat.Pdf,
Name = "Accel World",
NormalizedName = API.Parser.Parser.Normalize("Accel World")
},
new List<ParserInfo>()
}
};
var series = DbFactory.Series("Accel World");
series.Format = MangaFormat.Pdf;
Assert.Empty(ParseScannedFiles.GetInfosByName(parsedSeries, series));
series.Format = MangaFormat.Archive;
Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count());
}
[Fact]
public void GetInfosByName_ShouldReturnGivenMatchingNormalizedSeriesName()
{
var fileSystem = new MockFileSystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var infos = new List<ParserInfo>()
{
ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
};
var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
{
{
new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Accel World",
NormalizedName = API.Parser.Parser.Normalize("Accel World")
},
infos
},
{
new ParsedSeries()
{
Format = MangaFormat.Pdf,
Name = "Accel World",
NormalizedName = API.Parser.Parser.Normalize("Accel World")
},
new List<ParserInfo>()
}
};
var series = DbFactory.Series("accel world");
series.Format = MangaFormat.Archive;
Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count());
}
#endregion
#region MergeName
[Fact]
public async Task MergeName_ShouldMergeMatchingFormatAndName()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
}
[Fact]
public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
}
// NOTE: I don't think I can test MergeName as it relies on Tracking Files, which is more complicated than I need
// [Fact]
// public async Task MergeName_ShouldMergeMatchingFormatAndName()
// {
// var fileSystem = new MockFileSystem();
// fileSystem.AddDirectory("C:/Data/");
// fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
//
// var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
// var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
// new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
//
// var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
// var parsedFiles = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
//
// void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
// {
// var skippedScan = parsedInfo.Item1;
// var parsedFiles = parsedInfo.Item2;
// if (parsedFiles.Count == 0) return;
//
// var foundParsedSeries = new ParsedSeries()
// {
// Name = parsedFiles.First().Series,
// NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series),
// Format = parsedFiles.First().Format
// };
//
// parsedSeries.Add(foundParsedSeries, parsedFiles);
// }
//
// await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName",
// false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);
//
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
// }
//
// [Fact]
// public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
// {
// var fileSystem = new MockFileSystem();
// fileSystem.AddDirectory("C:/Data/");
// fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
//
// var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
// var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
// new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
//
//
// await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
//
// Assert.Equal("Accel World",
// psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
// Assert.Equal("Accel World",
// psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
// }
#endregion
@ -299,14 +242,150 @@ public class ParseScannedFilesTests
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
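// Local callback handed to the scanner: invoked once per processed folder with
// (skippedScan, parsedFiles) and folds each batch into the parsedSeries map.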
void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
{
var skippedScan = parsedInfo.Item1;
var parsedFiles = parsedInfo.Item2;
if (parsedFiles.Count == 0) return;
var foundParsedSeries = new ParsedSeries()
{
Name = parsedFiles.First().Series,
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(parsedFiles.First().Series),
Format = parsedFiles.First().Format
};
parsedSeries.Add(foundParsedSeries, parsedFiles);
}
await psf.ScanLibrariesForSeries(LibraryType.Manga,
new List<string>() {"C:/Data/"}, "libraryName", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);
var parsedSeries = await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
Assert.Equal(3, parsedSeries.Values.Count);
Assert.NotEmpty(parsedSeries.Keys.Where(p => p.Format == MangaFormat.Archive && p.Name.Equals("Accel World")));
}
#endregion
#region ProcessFiles
private static MockFileSystem CreateTestFilesystem()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));
return fileSystem;
}
[Fact]
public async Task ProcessFiles_ForLibraryMode_OnlyCallsFolderActionForEachTopLevelFolder()
{
var fileSystem = CreateTestFilesystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var directoriesSeen = new HashSet<string>();
await psf.ProcessFiles("C:/Data/", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),
(files, directoryPath) =>
{
directoriesSeen.Add(directoryPath);
return Task.CompletedTask;
});
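// CreateTestFilesystem defines two top-level series folders (Accel World, Black World),
// so library mode should invoke the folder action exactly twice.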
Assert.Equal(2, directoriesSeen.Count);
}
[Fact]
public async Task ProcessFiles_ForNonLibraryMode_CallsFolderActionOnce()
{
var fileSystem = CreateTestFilesystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var directoriesSeen = new HashSet<string>();
await psf.ProcessFiles("C:/Data/", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, directoryPath) =>
{
directoriesSeen.Add(directoryPath);
return Task.CompletedTask;
});
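// Non-library mode treats the entire root as a single series folder, so only C:/Data/ is reported.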
Assert.Single(directoriesSeen);
directoriesSeen.TryGetValue("C:/Data/", out var actual);
Assert.Equal("C:/Data/", actual);
}
[Fact]
public async Task ProcessFiles_ShouldCallFolderActionTwice()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var callCount = 0;
await psf.ProcessFiles("C:/Data", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) =>
{
callCount++;
return Task.CompletedTask;
});
Assert.Equal(2, callCount);
}
/// <summary>
/// Because this is not a library-mode scan, everything under C:/Data is treated as one folder, i.e. a single series folder
/// </summary>
[Fact]
public async Task ProcessFiles_ShouldCallFolderActionOnce()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
var callCount = 0;
await psf.ProcessFiles("C:/Data", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) =>
{
callCount++;
return Task.CompletedTask;
});
Assert.Equal(1, callCount);
}
#endregion
}

View File

@ -0,0 +1,109 @@
using System.Collections.Generic;
using System.Data.Common;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Services;
using AutoMapper;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;
namespace API.Tests.Services;
public class ReadingListServiceTests
{
private readonly IUnitOfWork _unitOfWork;
private readonly IReadingListService _readingListService;
private readonly DataContext _context;
private const string CacheDirectory = "C:/kavita/config/cache/";
private const string CoverImageDirectory = "C:/kavita/config/covers/";
private const string BackupDirectory = "C:/kavita/config/backups/";
private const string DataDirectory = "C:/data/";
public ReadingListServiceTests()
{
var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options;
_context = new DataContext(contextOptions);
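// xUnit test-class constructors cannot be async, so the seed runs synchronously here.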
Task.Run(SeedDb).GetAwaiter().GetResult();
var config = new MapperConfiguration(cfg => cfg.AddProfile<AutoMapperProfiles>());
var mapper = config.CreateMapper();
_unitOfWork = new UnitOfWork(_context, mapper, null);
_readingListService = new ReadingListService(_unitOfWork, Substitute.For<ILogger<ReadingListService>>());
}
#region Setup
private static DbConnection CreateInMemoryDatabase()
{
var connection = new SqliteConnection("Filename=:memory:");
connection.Open();
return connection;
}
private async Task<bool> SeedDb()
{
await _context.Database.MigrateAsync();
var filesystem = CreateFileSystem();
await Seed.SeedSettings(_context,
new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));
var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
setting.Value = CacheDirectory;
setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
setting.Value = BackupDirectory;
_context.ServerSetting.Update(setting);
_context.Library.Add(new Library()
{
Name = "Manga", Folders = new List<FolderPath>() {new FolderPath() {Path = "C:/data/"}}
});
return await _context.SaveChangesAsync() > 0;
}
private async Task ResetDb()
{
_context.Series.RemoveRange(_context.Series.ToList());
await _context.SaveChangesAsync();
}
private static MockFileSystem CreateFileSystem()
{
var fileSystem = new MockFileSystem();
fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
fileSystem.AddDirectory("C:/kavita/config/");
fileSystem.AddDirectory(CacheDirectory);
fileSystem.AddDirectory(CoverImageDirectory);
fileSystem.AddDirectory(BackupDirectory);
fileSystem.AddDirectory(DataDirectory);
return fileSystem;
}
#endregion
#region RemoveFullyReadItems
// TODO: Implement all methods here
#endregion
}

View File

@ -16,7 +16,7 @@ namespace API.Tests.Services
[Fact]
public void FindSeriesNotOnDisk_Should_Remove1()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@ -36,7 +36,7 @@ namespace API.Tests.Services
Name = "1"
}
},
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Epub
}
@ -48,7 +48,7 @@ namespace API.Tests.Services
[Fact]
public void FindSeriesNotOnDisk_Should_RemoveNothing_Test()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive});
@ -61,7 +61,7 @@ namespace API.Tests.Services
Name = "Cage of Eden",
LocalizedName = "Cage of Eden",
OriginalName = "Cage of Eden",
NormalizedName = API.Parser.Parser.Normalize("Cage of Eden"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Cage of Eden"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Archive
},
@ -70,7 +70,7 @@ namespace API.Tests.Services
Name = "Darker Than Black",
LocalizedName = "Darker Than Black",
OriginalName = "Darker Than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Archive
}
@ -125,6 +125,8 @@ namespace API.Tests.Services
// }
// TODO: I want a test for UpdateSeries where if I have chapter 10 and now it's mapping into Vol 2 Chapter 10,
// if I can do it without deleting the underlying chapter (aka id change)
}
}

View File

@ -157,7 +157,7 @@ public class SiteThemeServiceTests
await siteThemeService.Scan();
var customThemes = (await _unitOfWork.SiteThemeRepository.GetThemeDtos()).Where(t =>
API.Parser.Parser.Normalize(t.Name).Equals(API.Parser.Parser.Normalize("custom")));
API.Services.Tasks.Scanner.Parser.Parser.Normalize(t.Name).Equals(API.Services.Tasks.Scanner.Parser.Parser.Normalize("custom")));
Assert.Single(customThemes);
}
@ -177,7 +177,7 @@ public class SiteThemeServiceTests
await siteThemeService.Scan();
var customThemes = (await _unitOfWork.SiteThemeRepository.GetThemeDtos()).Where(t =>
API.Parser.Parser.Normalize(t.Name).Equals(API.Parser.Parser.Normalize("custom")));
API.Services.Tasks.Scanner.Parser.Parser.Normalize(t.Name).Equals(API.Services.Tasks.Scanner.Parser.Parser.Normalize("custom")));
Assert.Empty(customThemes);
}
@ -194,7 +194,7 @@ public class SiteThemeServiceTests
_context.SiteTheme.Add(new SiteTheme()
{
Name = "Custom",
NormalizedName = API.Parser.Parser.Normalize("Custom"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"),
Provider = ThemeProvider.User,
FileName = "custom.css",
IsDefault = false
@ -219,7 +219,7 @@ public class SiteThemeServiceTests
_context.SiteTheme.Add(new SiteTheme()
{
Name = "Custom",
NormalizedName = API.Parser.Parser.Normalize("Custom"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"),
Provider = ThemeProvider.User,
FileName = "custom.css",
IsDefault = false
@ -247,7 +247,7 @@ public class SiteThemeServiceTests
_context.SiteTheme.Add(new SiteTheme()
{
Name = "Custom",
NormalizedName = API.Parser.Parser.Normalize("Custom"),
NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"),
Provider = ThemeProvider.User,
FileName = "custom.css",
IsDefault = false

View File

@ -19,6 +19,12 @@
<NoWarn>1701;1702;1591</NoWarn>
</PropertyGroup>
<!-- Ignore XML comments -->
<PropertyGroup>
<GenerateDocumentationFile>True</GenerateDocumentationFile>
<NoWarn>$(NoWarn);1591</NoWarn>
</PropertyGroup>
<PropertyGroup>
<SatelliteResourceLanguages>en</SatelliteResourceLanguages>
</PropertyGroup>
@ -48,6 +54,7 @@
<PackageReference Include="Hangfire.AspNetCore" Version="1.7.30" />
<PackageReference Include="Hangfire.MaximumConcurrentExecutions" Version="1.1.0" />
<PackageReference Include="Hangfire.MemoryStorage.Core" Version="1.4.0" />
<PackageReference Include="Hangfire.Storage.SQLite" Version="0.3.2" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.43" />
<PackageReference Include="MarkdownDeep.NET.Core" Version="1.5.0.4" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="6.0.7" />

View File

@ -23,6 +23,8 @@ namespace API.Comparators
return x.CompareTo(y);
}
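// Shared instance so call sites can sort without allocating a new comparer each time.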
public static readonly ChapterSortComparer Default = new ChapterSortComparer();
}
/// <summary>
@ -44,6 +46,8 @@ namespace API.Comparators
return x.CompareTo(y);
}
public static readonly ChapterSortComparerZeroFirst Default = new ChapterSortComparerZeroFirst();
}
public class SortComparerZeroLast : IComparer<double>

View File

@ -354,7 +354,7 @@ namespace API.Controllers
lib.AppUsers.Remove(user);
}
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList();
}
foreach (var lib in libraries)
@ -458,11 +458,11 @@ namespace API.Controllers
{
_logger.LogInformation("{UserName} is being registered as admin. Granting access to all libraries",
user.UserName);
libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync()).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync(LibraryIncludes.AppUser)).ToList();
}
else
{
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList();
}
foreach (var lib in libraries)
@ -472,37 +472,55 @@ namespace API.Controllers
}
var token = await _userManager.GenerateEmailConfirmationTokenAsync(user);
if (string.IsNullOrEmpty(token)) return BadRequest("There was an issue sending email");
if (string.IsNullOrEmpty(token))
{
_logger.LogError("There was an issue generating a token for the email");
return BadRequest("There was an creating the invite user");
}
user.ConfirmationToken = token;
await _unitOfWork.CommitAsync();
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an error during invite user flow, unable to create user. Deleting user for retry");
_unitOfWork.UserRepository.Delete(user);
await _unitOfWork.CommitAsync();
}
var emailLink = GenerateEmailLink(token, "confirm-email", dto.Email);
try
{
var emailLink = GenerateEmailLink(user.ConfirmationToken, "confirm-email", dto.Email);
_logger.LogCritical("[Invite User]: Email Link for {UserName}: {Link}", user.UserName, emailLink);
_logger.LogCritical("[Invite User]: Token {UserName}: {Token}", user.UserName, user.ConfirmationToken);
var host = _environment.IsDevelopment() ? "localhost:4200" : Request.Host.ToString();
var accessible = await _emailService.CheckIfAccessible(host);
if (accessible)
{
await _emailService.SendConfirmationEmail(new ConfirmationEmailDto()
try
{
EmailAddress = dto.Email,
InvitingUser = adminUser.UserName,
ServerConfirmationLink = emailLink
});
await _emailService.SendConfirmationEmail(new ConfirmationEmailDto()
{
EmailAddress = dto.Email,
InvitingUser = adminUser.UserName,
ServerConfirmationLink = emailLink
});
}
catch (Exception)
{
/* Swallow exception */
}
}
user.ConfirmationToken = token;
await _unitOfWork.CommitAsync();
return Ok(new InviteUserResponse
{
EmailLink = emailLink,
EmailSent = accessible
});
}
catch (Exception)
catch (Exception ex)
{
_unitOfWork.UserRepository.Delete(user);
await _unitOfWork.CommitAsync();
_logger.LogError(ex, "There was an error during invite user flow, unable to send an email");
}
return BadRequest("There was an error setting up your account. Please check the logs");
@ -561,17 +579,26 @@ namespace API.Controllers
[HttpPost("confirm-password-reset")]
public async Task<ActionResult<string>> ConfirmForgotPassword(ConfirmPasswordResetDto dto)
{
var user = await _unitOfWork.UserRepository.GetUserByEmailAsync(dto.Email);
if (user == null)
try
{
return BadRequest("Invalid Details");
var user = await _unitOfWork.UserRepository.GetUserByEmailAsync(dto.Email);
if (user == null)
{
return BadRequest("Invalid Details");
}
var result = await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider,
"ResetPassword", dto.Token);
if (!result) return BadRequest("Unable to reset password, your email token is not correct.");
var errors = await _accountService.ChangeUserPassword(user, dto.Password);
return errors.Any() ? BadRequest(errors) : Ok("Password updated");
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an unexpected error when confirming new password");
return BadRequest("There was an unexpected error when confirming new password");
}
var result = await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider, "ResetPassword", dto.Token);
if (!result) return BadRequest("Unable to reset password, your email token is not correct.");
var errors = await _accountService.ChangeUserPassword(user, dto.Password);
return errors.Any() ? BadRequest(errors) : Ok("Password updated");
}
@ -597,8 +624,10 @@ namespace API.Controllers
if (!roles.Any(r => r is PolicyConstants.AdminRole or PolicyConstants.ChangePasswordRole))
return Unauthorized("You are not permitted to this operation.");
var emailLink = GenerateEmailLink(await _userManager.GeneratePasswordResetTokenAsync(user), "confirm-reset-password", user.Email);
var token = await _userManager.GeneratePasswordResetTokenAsync(user);
var emailLink = GenerateEmailLink(token, "confirm-reset-password", user.Email);
_logger.LogCritical("[Forgot Password]: Email Link for {UserName}: {Link}", user.UserName, emailLink);
_logger.LogCritical("[Forgot Password]: Token {UserName}: {Token}", user.UserName, token);
var host = _environment.IsDevelopment() ? "localhost:4200" : Request.Host.ToString();
if (await _emailService.CheckIfAccessible(host))
{
@ -651,8 +680,10 @@ namespace API.Controllers
"This user needs to migrate. Have them log out and login to trigger a migration flow");
if (user.EmailConfirmed) return BadRequest("User already confirmed");
var emailLink = GenerateEmailLink(await _userManager.GenerateEmailConfirmationTokenAsync(user), "confirm-email", user.Email);
var token = await _userManager.GenerateEmailConfirmationTokenAsync(user);
var emailLink = GenerateEmailLink(token, "confirm-email", user.Email);
_logger.LogCritical("[Email Migration]: Email Link: {Link}", emailLink);
_logger.LogCritical("[Email Migration]: Token {UserName}: {Token}", user.UserName, token);
await _emailService.SendMigrationEmail(new EmailMigrationDto()
{
EmailAddress = user.Email,
@ -729,6 +760,8 @@ namespace API.Controllers
var result = await _userManager.ConfirmEmailAsync(user, token);
if (result.Succeeded) return true;
_logger.LogCritical("[Account] Email validation failed");
if (!result.Errors.Any()) return false;

View File

@ -76,7 +76,7 @@ namespace API.Controllers
existingTag.Promoted = updatedTag.Promoted;
existingTag.Title = updatedTag.Title.Trim();
existingTag.NormalizedTitle = Parser.Parser.Normalize(updatedTag.Title).ToUpper();
existingTag.NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(updatedTag.Title).ToUpper();
existingTag.Summary = updatedTag.Summary.Trim();
if (_unitOfWork.HasChanges())

View File

@ -0,0 +1,17 @@
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
namespace API.Controllers;
[AllowAnonymous]
public class HealthController : BaseApiController
{
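// Anonymous liveness probe; assuming BaseApiController's conventional
// "api/[controller]" route, this answers at GET /api/health for container health checks.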
[HttpGet]
public ActionResult GetHealth()
{
return Ok("Ok");
}
}

View File

@ -13,11 +13,14 @@ using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Services;
using API.Services.Tasks.Scanner;
using API.SignalR;
using AutoMapper;
using Kavita.Common;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using TaskScheduler = API.Services.TaskScheduler;
namespace API.Controllers
{
@ -30,10 +33,11 @@ namespace API.Controllers
private readonly ITaskScheduler _taskScheduler;
private readonly IUnitOfWork _unitOfWork;
private readonly IEventHub _eventHub;
private readonly ILibraryWatcher _libraryWatcher;
public LibraryController(IDirectoryService directoryService,
ILogger<LibraryController> logger, IMapper mapper, ITaskScheduler taskScheduler,
IUnitOfWork unitOfWork, IEventHub eventHub)
IUnitOfWork unitOfWork, IEventHub eventHub, ILibraryWatcher libraryWatcher)
{
_directoryService = directoryService;
_logger = logger;
@ -41,6 +45,7 @@ namespace API.Controllers
_taskScheduler = taskScheduler;
_unitOfWork = unitOfWork;
_eventHub = eventHub;
_libraryWatcher = libraryWatcher;
}
/// <summary>
@ -77,6 +82,7 @@ namespace API.Controllers
if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical issue. Please try again.");
_logger.LogInformation("Created a new library: {LibraryName}", library.Name);
await _libraryWatcher.RestartWatching();
_taskScheduler.ScanLibrary(library.Id);
await _eventHub.SendMessageAsync(MessageFactory.LibraryModified,
MessageFactory.LibraryModifiedEvent(library.Id, "create"), false);
@ -129,7 +135,7 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(updateLibraryForUserDto.Username);
if (user == null) return BadRequest("Could not validate user");
var libraryString = String.Join(",", updateLibraryForUserDto.SelectedLibraries.Select(x => x.Name));
var libraryString = string.Join(",", updateLibraryForUserDto.SelectedLibraries.Select(x => x.Name));
_logger.LogInformation("Granting user {UserName} access to: {Libraries}", updateLibraryForUserDto.Username, libraryString);
var allLibraries = await _unitOfWork.LibraryRepository.GetLibrariesAsync();
@ -168,17 +174,17 @@ namespace API.Controllers
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("scan")]
public ActionResult Scan(int libraryId)
public ActionResult Scan(int libraryId, bool force = false)
{
_taskScheduler.ScanLibrary(libraryId);
_taskScheduler.ScanLibrary(libraryId, force);
return Ok();
}
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("refresh-metadata")]
public ActionResult RefreshMetadata(int libraryId)
public ActionResult RefreshMetadata(int libraryId, bool force = true)
{
_taskScheduler.RefreshMetadata(libraryId);
_taskScheduler.RefreshMetadata(libraryId, force);
return Ok();
}
@ -196,6 +202,37 @@ namespace API.Controllers
return Ok(await _unitOfWork.LibraryRepository.GetLibraryDtosForUsernameAsync(User.GetUsername()));
}
/// <summary>
/// Given a valid path, will invoke either a Scan Series or a Scan Library. If the folder is not registered within Kavita, the request will be ignored.
/// </summary>
/// <param name="dto">API key and the folder path to scan</param>
/// <returns></returns>
[AllowAnonymous]
[HttpPost("scan-folder")]
public async Task<ActionResult> ScanFolder(ScanFolderDto dto)
{
var userId = await _unitOfWork.UserRepository.GetUserIdByApiKeyAsync(dto.ApiKey);
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId);
// Validate user has Admin privileges
var isAdmin = await _unitOfWork.UserRepository.IsUserAdminAsync(user);
if (!isAdmin) return BadRequest("API key must belong to an admin");
if (dto.FolderPath.Contains("..")) return BadRequest("Invalid Path");
dto.FolderPath = Services.Tasks.Scanner.Parser.Parser.NormalizePath(dto.FolderPath);
var libraryFolder = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
.SelectMany(l => l.Folders)
.Distinct()
.Select(Services.Tasks.Scanner.Parser.Parser.NormalizePath);
var seriesFolder = _directoryService.FindHighestDirectoriesFromFiles(libraryFolder,
new List<string>() {dto.FolderPath});
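// If the path resolves to exactly one highest-level (series) folder, scan that folder;
// otherwise fall back to scanning the requested path as-is.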
_taskScheduler.ScanFolder(seriesFolder.Keys.Count == 1 ? seriesFolder.Keys.First() : dto.FolderPath);
return Ok();
}
[Authorize(Policy = "RequireAdminRole")]
[HttpDelete("delete")]
public async Task<ActionResult<bool>> DeleteLibrary(int libraryId)
@ -207,10 +244,16 @@ namespace API.Controllers
var chapterIds =
await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(seriesIds);
try
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
if (TaskScheduler.HasScanTaskRunningForLibrary(libraryId))
{
// TODO: Figure out how to cancel a job
_logger.LogInformation("User is attempting to delete a library while a scan is in progress");
return BadRequest(
"You cannot delete a library while a scan is in progress. Please wait for scan to continue then try to delete");
}
_unitOfWork.LibraryRepository.Delete(library);
await _unitOfWork.CommitAsync();
@ -221,6 +264,8 @@ namespace API.Controllers
_taskScheduler.CleanupChapters(chapterIds);
}
await _libraryWatcher.RestartWatching();
foreach (var seriesId in seriesIds)
{
await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved,
@ -264,6 +309,7 @@ namespace API.Controllers
if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical issue updating the library.");
if (originalFolders.Count != libraryForUserDto.Folders.Count() || typeUpdate)
{
await _libraryWatcher.RestartWatching();
_taskScheduler.ScanLibrary(library.Id);
}

View File

@ -6,6 +6,7 @@ using System.Threading.Tasks;
using System.Xml.Serialization;
using API.Comparators;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
@ -305,7 +306,7 @@ public class OpdsController : BaseApiController
var userId = await GetUser(apiKey);
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId);
var userWithLists = await _unitOfWork.UserRepository.GetUserWithReadingListsByUsernameAsync(user.UserName);
var userWithLists = await _unitOfWork.UserRepository.GetUserByUsernameAsync(user.UserName, AppUserIncludes.ReadingListsWithItems);
var readingList = userWithLists.ReadingLists.SingleOrDefault(t => t.Id == readingListId);
if (readingList == null)
{

View File

@ -1,4 +1,5 @@
using System.Threading.Tasks;
using System.ComponentModel.DataAnnotations;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.Services;
@ -24,12 +25,13 @@ namespace API.Controllers
/// <summary>
/// Authenticate with the Server given an apiKey. This will log you in by returning the user object and the JWT token.
/// </summary>
/// <param name="apiKey"></param>
/// <remarks>This API is not fully built out and may require more information in later releases</remarks>
/// <param name="apiKey">API key which will be used to authenticate and return a valid user token back</param>
/// <param name="pluginName">Name of the Plugin</param>
/// <returns></returns>
[AllowAnonymous]
[HttpPost("authenticate")]
public async Task<ActionResult<UserDto>> Authenticate(string apiKey, string pluginName)
public async Task<ActionResult<UserDto>> Authenticate([Required] string apiKey, [Required] string pluginName)
{
// NOTE: In order to log information about plugins, we need some Plugin Description information for each request
// Should log into access table so we can tell the user

View File

@ -11,6 +11,7 @@ using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Services;
using API.Services.Tasks;
using Hangfire;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
@ -60,6 +61,7 @@ namespace API.Controllers
try
{
var path = _cacheService.GetCachedFile(chapter);
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest("Pdf doesn't exist when it should.");
@ -90,7 +92,7 @@ namespace API.Controllers
try
{
var path = _cacheService.GetCachedPagePath(chapter, page);
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}");
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}. Try refreshing to allow a re-cache.");
var format = Path.GetExtension(path).Replace(".", "");
return PhysicalFile(path, "image/" + format, Path.GetFileName(path), true);
@ -177,17 +179,17 @@ namespace API.Controllers
info.Title += " - " + info.ChapterTitle;
}
if (info.IsSpecial && dto.VolumeNumber.Equals(Parser.Parser.DefaultVolume))
if (info.IsSpecial && dto.VolumeNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume))
{
info.Subtitle = info.FileName;
} else if (!info.IsSpecial && info.VolumeNumber.Equals(Parser.Parser.DefaultVolume))
} else if (!info.IsSpecial && info.VolumeNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume))
{
info.Subtitle = _readerService.FormatChapterName(info.LibraryType, true, true) + info.ChapterNumber;
}
else
{
info.Subtitle = "Volume " + info.VolumeNumber;
if (!info.ChapterNumber.Equals(Parser.Parser.DefaultChapter))
if (!info.ChapterNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter))
{
info.Subtitle += " " + _readerService.FormatChapterName(info.LibraryType, true, true) +
info.ChapterNumber;

View File

@ -8,6 +8,7 @@ using API.DTOs.ReadingLists;
using API.Entities;
using API.Extensions;
using API.Helpers;
using API.Services;
using API.SignalR;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
@ -19,12 +20,14 @@ namespace API.Controllers
{
private readonly IUnitOfWork _unitOfWork;
private readonly IEventHub _eventHub;
private readonly IReadingListService _readingListService;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public ReadingListController(IUnitOfWork unitOfWork, IEventHub eventHub)
public ReadingListController(IUnitOfWork unitOfWork, IEventHub eventHub, IReadingListService readingListService)
{
_unitOfWork = unitOfWork;
_eventHub = eventHub;
_readingListService = readingListService;
}
/// <summary>
@ -55,6 +58,11 @@ namespace API.Controllers
return Ok(items);
}
/// <summary>
/// Returns all Reading Lists the user has access to that contain the given series.
/// </summary>
/// <param name="seriesId">Id of the series the lists must contain</param>
/// <returns></returns>
[HttpGet("lists-for-series")]
public async Task<ActionResult<IEnumerable<ReadingListDto>>> GetListsForSeries(int seriesId)
{
@ -78,17 +86,6 @@ namespace API.Controllers
return Ok(items);
}
private async Task<AppUser?> UserHasReadingListAccess(int readingListId)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(),
AppUserIncludes.ReadingLists);
if (user.ReadingLists.SingleOrDefault(rl => rl.Id == readingListId) == null && !await _unitOfWork.UserRepository.IsUserAdminAsync(user))
{
return null;
}
return user;
}
/// <summary>
/// Updates an items position
@ -99,25 +96,14 @@ namespace API.Controllers
public async Task<ActionResult> UpdateListItemPosition(UpdateReadingListPosition dto)
{
// Make sure UI buffers events
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
}
var items = (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(dto.ReadingListId)).ToList();
var item = items.Find(r => r.Id == dto.ReadingListItemId);
items.Remove(item);
items.Insert(dto.ToPosition, item);
for (var i = 0; i < items.Count; i++)
{
items[i].Order = i;
}
if (await _readingListService.UpdateReadingListItemPosition(dto)) return Ok("Updated");
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
return Ok("Updated");
}
return BadRequest("Couldn't update position");
}
@ -130,25 +116,13 @@ namespace API.Controllers
[HttpPost("delete-item")]
public async Task<ActionResult> DeleteListItem(UpdateReadingListPosition dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
}
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId);
readingList.Items = readingList.Items.Where(r => r.Id != dto.ReadingListItemId).ToList();
var index = 0;
foreach (var readingListItem in readingList.Items)
{
readingListItem.Order = index;
index++;
}
if (!_unitOfWork.HasChanges()) return Ok();
if (await _unitOfWork.CommitAsync())
if (await _readingListService.DeleteReadingListItem(dto))
{
return Ok("Updated");
}
@ -164,34 +138,16 @@ namespace API.Controllers
[HttpPost("remove-read")]
public async Task<ActionResult> DeleteReadFromList([FromQuery] int readingListId)
{
var user = await UserHasReadingListAccess(readingListId);
var user = await _readingListService.UserHasReadingListAccess(readingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
}
var items = await _unitOfWork.ReadingListRepository.GetReadingListItemDtosByIdAsync(readingListId, user.Id);
items = await _unitOfWork.ReadingListRepository.AddReadingProgressModifiers(user.Id, items.ToList());
// Collect all Ids to remove
var itemIdsToRemove = items.Where(item => item.PagesRead == item.PagesTotal).Select(item => item.Id);
try
if (await _readingListService.RemoveFullyReadItems(readingListId, user))
{
var listItems =
(await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(readingListId)).Where(r =>
itemIdsToRemove.Contains(r.Id));
_unitOfWork.ReadingListRepository.BulkRemove(listItems);
if (!_unitOfWork.HasChanges()) return Ok("Nothing to remove");
await _unitOfWork.CommitAsync();
return Ok("Updated");
}
catch
{
await _unitOfWork.RollbackAsync();
}
return BadRequest("Could not remove read items");
}
@ -204,20 +160,13 @@ namespace API.Controllers
[HttpDelete]
public async Task<ActionResult> DeleteList([FromQuery] int readingListId)
{
var user = await UserHasReadingListAccess(readingListId);
var user = await _readingListService.UserHasReadingListAccess(readingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
}
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(readingListId);
user.ReadingLists.Remove(readingList);
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
return Ok("Deleted");
}
if (await _readingListService.DeleteReadingList(readingListId, user)) return Ok("List was deleted");
return BadRequest("There was an issue deleting reading list");
}
@ -230,7 +179,8 @@ namespace API.Controllers
[HttpPost("create")]
public async Task<ActionResult<ReadingListDto>> CreateList(CreateReadingListDto dto)
{
var user = await _unitOfWork.UserRepository.GetUserWithReadingListsByUsernameAsync(User.GetUsername());
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), AppUserIncludes.ReadingListsWithItems);
// When creating, we need to make sure Title is unique
var hasExisting = user.ReadingLists.Any(l => l.Title.Equals(dto.Title));
@ -260,7 +210,7 @@ namespace API.Controllers
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId);
if (readingList == null) return BadRequest("List does not exist");
var user = await UserHasReadingListAccess(readingList.Id);
var user = await _readingListService.UserHasReadingListAccess(readingList.Id, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -269,7 +219,7 @@ namespace API.Controllers
if (!string.IsNullOrEmpty(dto.Title))
{
readingList.Title = dto.Title; // Should I check if this is unique?
readingList.NormalizedTitle = Parser.Parser.Normalize(readingList.Title);
readingList.NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(readingList.Title);
}
if (!string.IsNullOrEmpty(dto.Title))
{
@ -308,7 +258,7 @@ namespace API.Controllers
[HttpPost("update-by-series")]
public async Task<ActionResult> UpdateListBySeries(UpdateReadingListBySeriesDto dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -320,7 +270,7 @@ namespace API.Controllers
await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new [] {dto.SeriesId});
// If there are adds, tell tracking this has been modified
if (await AddChaptersToReadingList(dto.SeriesId, chapterIdsForSeries, readingList))
if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIdsForSeries, readingList))
{
_unitOfWork.ReadingListRepository.Update(readingList);
}
@ -350,7 +300,7 @@ namespace API.Controllers
[HttpPost("update-by-multiple")]
public async Task<ActionResult> UpdateListByMultiple(UpdateReadingListByMultipleDto dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -365,7 +315,7 @@ namespace API.Controllers
}
// If there are adds, tell tracking this has been modified
if (await AddChaptersToReadingList(dto.SeriesId, chapterIds, readingList))
if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIds, readingList))
{
_unitOfWork.ReadingListRepository.Update(readingList);
}
@ -394,7 +344,7 @@ namespace API.Controllers
[HttpPost("update-by-multiple-series")]
public async Task<ActionResult> UpdateListByMultipleSeries(UpdateReadingListByMultipleSeriesDto dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -407,7 +357,7 @@ namespace API.Controllers
foreach (var seriesId in ids.Keys)
{
// If there are adds, tell tracking this has been modified
if (await AddChaptersToReadingList(seriesId, ids[seriesId], readingList))
if (await _readingListService.AddChaptersToReadingList(seriesId, ids[seriesId], readingList))
{
_unitOfWork.ReadingListRepository.Update(readingList);
}
@ -432,7 +382,7 @@ namespace API.Controllers
[HttpPost("update-by-volume")]
public async Task<ActionResult> UpdateListByVolume(UpdateReadingListByVolumeDto dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -444,7 +394,7 @@ namespace API.Controllers
(await _unitOfWork.ChapterRepository.GetChaptersAsync(dto.VolumeId)).Select(c => c.Id).ToList();
// If there are adds, tell tracking this has been modified
if (await AddChaptersToReadingList(dto.SeriesId, chapterIdsForVolume, readingList))
if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIdsForVolume, readingList))
{
_unitOfWork.ReadingListRepository.Update(readingList);
}
@ -468,7 +418,7 @@ namespace API.Controllers
[HttpPost("update-by-chapter")]
public async Task<ActionResult> UpdateListByChapter(UpdateReadingListByChapterDto dto)
{
var user = await UserHasReadingListAccess(dto.ReadingListId);
var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername());
if (user == null)
{
return BadRequest("You do not have permissions on this reading list or the list doesn't exist");
@ -477,7 +427,7 @@ namespace API.Controllers
if (readingList == null) return BadRequest("Reading List does not exist");
// If there are adds, tell tracking this has been modified
if (await AddChaptersToReadingList(dto.SeriesId, new List<int>() { dto.ChapterId }, readingList))
if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, new List<int>() { dto.ChapterId }, readingList))
{
_unitOfWork.ReadingListRepository.Update(readingList);
}
@ -498,39 +448,7 @@ namespace API.Controllers
return Ok("Nothing to do");
}
/// <summary>
/// Adds a list of Chapters as reading list items to the passed reading list.
/// </summary>
/// <param name="seriesId"></param>
/// <param name="chapterIds"></param>
/// <param name="readingList"></param>
/// <returns>True if new chapters were added</returns>
private async Task<bool> AddChaptersToReadingList(int seriesId, IList<int> chapterIds,
ReadingList readingList)
{
// TODO: Move to ReadingListService and Unit Test
readingList.Items ??= new List<ReadingListItem>();
var lastOrder = 0;
if (readingList.Items.Any())
{
lastOrder = readingList.Items.DefaultIfEmpty().Max(rli => rli.Order);
}
var existingChapterExists = readingList.Items.Select(rli => rli.ChapterId).ToHashSet();
var chaptersForSeries = (await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds))
.OrderBy(c => Parser.Parser.MinNumberFromRange(c.Volume.Name))
.ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting);
var index = lastOrder + 1;
foreach (var chapter in chaptersForSeries)
{
if (existingChapterExists.Contains(chapter.Id)) continue;
readingList.Items.Add(DbFactory.ReadingListItem(index, seriesId, chapter.VolumeId, chapter.Id));
index += 1;
}
return index > lastOrder + 1;
}
/// <summary>
/// Returns the next chapter within the reading list

View File

@ -156,12 +156,14 @@ namespace API.Controllers
}
series.Name = updateSeries.Name.Trim();
series.NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name);
if (!string.IsNullOrEmpty(updateSeries.SortName.Trim()))
{
series.SortName = updateSeries.SortName.Trim();
}
series.LocalizedName = updateSeries.LocalizedName.Trim();
series.NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName);
series.NameLocked = updateSeries.NameLocked;
series.SortNameLocked = updateSeries.SortNameLocked;

View File

@ -10,6 +10,7 @@ using API.Entities.Enums;
using API.Extensions;
using API.Helpers.Converters;
using API.Services;
using API.Services.Tasks.Scanner;
using AutoMapper;
using Flurl.Http;
using Kavita.Common;
@ -29,9 +30,10 @@ namespace API.Controllers
private readonly IDirectoryService _directoryService;
private readonly IMapper _mapper;
private readonly IEmailService _emailService;
private readonly ILibraryWatcher _libraryWatcher;
public SettingsController(ILogger<SettingsController> logger, IUnitOfWork unitOfWork, ITaskScheduler taskScheduler,
IDirectoryService directoryService, IMapper mapper, IEmailService emailService)
IDirectoryService directoryService, IMapper mapper, IEmailService emailService, ILibraryWatcher libraryWatcher)
{
_logger = logger;
_unitOfWork = unitOfWork;
@ -39,6 +41,7 @@ namespace API.Controllers
_directoryService = directoryService;
_mapper = mapper;
_emailService = emailService;
_libraryWatcher = libraryWatcher;
}
[AllowAnonymous]
@ -227,6 +230,21 @@ namespace API.Controllers
_unitOfWork.SettingsRepository.Update(setting);
}
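// "+ string.Empty" stringifies the bool so it can be compared with the stored setting value.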
if (setting.Key == ServerSettingKey.EnableFolderWatching && updateSettingsDto.EnableFolderWatching + string.Empty != setting.Value)
{
setting.Value = updateSettingsDto.EnableFolderWatching + string.Empty;
_unitOfWork.SettingsRepository.Update(setting);
if (updateSettingsDto.EnableFolderWatching)
{
await _libraryWatcher.StartWatching();
}
else
{
_libraryWatcher.StopWatching();
}
}
}
if (!_unitOfWork.HasChanges()) return Ok(updateSettingsDto);

View File

@ -49,9 +49,8 @@ public class TachiyomiController : BaseApiController
// If prevChapterId is -1, this means either nothing is read or everything is read.
if (prevChapterId == -1)
{
var userWithProgress = await _unitOfWork.UserRepository.GetUserByIdAsync(userId, AppUserIncludes.Progress);
var userHasProgress =
userWithProgress.Progresses.Any(x => x.SeriesId == seriesId);
var series = await _unitOfWork.SeriesRepository.GetSeriesDtoByIdAsync(seriesId, userId);
var userHasProgress = series.PagesRead != 0 && series.PagesRead < series.Pages;
// If the user doesn't have progress, then return null, which the extension will catch as 204 (no content) and report nothing as read
if (!userHasProgress) return null;
@ -61,21 +60,22 @@ public class TachiyomiController : BaseApiController
var looseLeafChapterVolume = volumes.FirstOrDefault(v => v.Number == 0);
if (looseLeafChapterVolume == null)
{
var volumeChapter = _mapper.Map<ChapterDto>(volumes.Last().Chapters.OrderBy(c => float.Parse(c.Number), new ChapterSortComparerZeroFirst()).Last());
var volumeChapter = _mapper.Map<ChapterDto>(volumes.Last().Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparerZeroFirst.Default).Last());
return Ok(new ChapterDto()
{
Number = $"{int.Parse(volumeChapter.Number) / 100f}"
});
}
var lastChapter = looseLeafChapterVolume.Chapters.OrderBy(c => float.Parse(c.Number), new ChapterSortComparer()).Last();
var lastChapter = looseLeafChapterVolume.Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default).Last();
return Ok(_mapper.Map<ChapterDto>(lastChapter));
}
// There is progress, we now need to figure out the highest volume or chapter and return that.
var prevChapter = await _unitOfWork.ChapterRepository.GetChapterDtoAsync(prevChapterId);
var volumeWithProgress = await _unitOfWork.VolumeRepository.GetVolumeDtoAsync(prevChapter.VolumeId, userId);
if (volumeWithProgress.Number != 0)
// We only encode for single-file volumes
if (volumeWithProgress.Number != 0 && volumeWithProgress.Chapters.Count == 1)
{
// The progress is on a volume, encode it as a fake chapterDTO
return Ok(new ChapterDto()

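A worked example of the volume encoding above: for a volume chapter whose Number is "3", int.Parse("3") / 100f yields 0.03, so the fake ChapterDto is returned with Number = "0.03". Encoded volumes therefore stay fractional and below 1, which keeps them distinguishable from real chapter numbers; the new Chapters.Count == 1 guard restricts this encoding to single-file volumes, as the added comment notes.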

@ -51,6 +51,7 @@ public class ThemeController : BaseApiController
/// Returns css content to the UI. UI is expected to escape the content
/// </summary>
/// <returns></returns>
[AllowAnonymous]
[HttpGet("download-content")]
public async Task<ActionResult<string>> GetThemeContent(int themeId)
{


@ -1,11 +1,17 @@
namespace API.DTOs.Reader
using System.ComponentModel.DataAnnotations;
namespace API.DTOs.Reader
{
public class BookmarkDto
{
public int Id { get; set; }
[Required]
public int Page { get; set; }
[Required]
public int VolumeId { get; set; }
[Required]
public int SeriesId { get; set; }
[Required]
public int ChapterId { get; set; }
}
}


@ -1,10 +1,18 @@
namespace API.DTOs.ReadingLists
using System.ComponentModel.DataAnnotations;
namespace API.DTOs.ReadingLists
{
/// <summary>
/// DTO for moving a reading list item to another position within the same list
/// </summary>
public class UpdateReadingListPosition
{
[Required]
public int ReadingListId { get; set; }
[Required]
public int ReadingListItemId { get; set; }
public int FromPosition { get; set; }
[Required]
public int ToPosition { get; set; }
}
}

API/DTOs/ScanFolderDto.cs (new file)

@ -0,0 +1,17 @@
namespace API.DTOs;
/// <summary>
/// DTO for requesting a folder to be scanned
/// </summary>
public class ScanFolderDto
{
/// <summary>
/// Api key for a user with Admin permissions
/// </summary>
public string ApiKey { get; set; }
/// <summary>
/// Folder Path to Scan
/// </summary>
/// <remarks>JSON cannot accept unescaped \, so you may need to use \\ escaping on Windows paths</remarks>
public string FolderPath { get; set; }
}

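To illustrate the escaping remark on FolderPath, a sketch of building the request body with System.Text.Json (the key and path values are placeholders):

using System.Text.Json;

var dto = new ScanFolderDto
{
    ApiKey = "placeholder-key",
    FolderPath = @"C:\Manga\Some Series" // verbatim Windows path
};
var json = JsonSerializer.Serialize(dto);
// json: {"ApiKey":"placeholder-key","FolderPath":"C:\\Manga\\Some Series"}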

@ -54,5 +54,13 @@ namespace API.DTOs
public int MaxHoursToRead { get; set; }
/// <inheritdoc cref="IHasReadTimeEstimate.AvgHoursToRead"/>
public int AvgHoursToRead { get; set; }
/// <summary>
/// The highest level folder for this Series
/// </summary>
public string FolderPath { get; set; }
/// <summary>
/// The last time the folder for this series was scanned
/// </summary>
public DateTime LastFolderScanned { get; set; }
}
}


@ -1,5 +1,4 @@
using System.Collections.Generic;
using API.Services;
using API.Services;
namespace API.DTOs.Settings
{
@ -43,7 +42,9 @@ namespace API.DTOs.Settings
/// Represents a unique Id to this Kavita installation. Only used in Stats to identify unique installs.
/// </summary>
public string InstallId { get; set; }
/// <summary>
/// If the server should save bookmarks as WebP encoding
/// </summary>
public bool ConvertBookmarkToWebP { get; set; }
/// <summary>
/// If the Swagger UI Should be exposed. Does not require authentication, but does require a JWT.
@ -55,5 +56,9 @@ namespace API.DTOs.Settings
/// </summary>
/// <remarks>Value should be between 1 and 30</remarks>
public int TotalBackups { get; set; } = 30;
/// <summary>
/// If Kavita should watch the library folders and process changes
/// </summary>
public bool EnableFolderWatching { get; set; } = true;
}
}


@ -43,6 +43,7 @@ namespace API.Data
public DbSet<Tag> Tag { get; set; }
public DbSet<SiteTheme> SiteTheme { get; set; }
public DbSet<SeriesRelation> SeriesRelation { get; set; }
public DbSet<FolderPath> FolderPath { get; set; }
protected override void OnModelCreating(ModelBuilder builder)
@ -71,7 +72,9 @@ namespace API.Data
builder.Entity<SeriesRelation>()
.HasOne(pt => pt.TargetSeries)
.WithMany(t => t.RelationOf)
.HasForeignKey(pt => pt.TargetSeriesId);
.HasForeignKey(pt => pt.TargetSeriesId)
.OnDelete(DeleteBehavior.ClientCascade);
builder.Entity<AppUserPreferences>()
.Property(b => b.BookThemeName)

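A note on the DeleteBehavior.ClientCascade change above: EF Core deletes dependent SeriesRelation rows in memory during SaveChanges instead of relying on a database cascade, so the relations must be loaded and tracked before the principal Series is removed; this is also why the SeriesRepository queries below gain .Include(s => s.Relations). A minimal sketch, assuming a DataContext instance and a seriesId:

using Microsoft.EntityFrameworkCore;

var series = await context.Series
    .Include(s => s.Relations)
    .Include(s => s.RelationOf) // both navigation names appear elsewhere in this diff
    .SingleAsync(s => s.Id == seriesId);
context.Series.Remove(series); // loaded SeriesRelation rows are removed client-side
await context.SaveChangesAsync();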

@ -23,7 +23,27 @@ namespace API.Data
Name = name,
OriginalName = name,
LocalizedName = name,
NormalizedName = Parser.Parser.Normalize(name),
NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
SortName = name,
Volumes = new List<Volume>(),
Metadata = SeriesMetadata(Array.Empty<CollectionTag>())
};
}
public static Series Series(string name, string localizedName)
{
if (string.IsNullOrEmpty(localizedName))
{
localizedName = name;
}
return new Series
{
Name = name,
OriginalName = name,
LocalizedName = localizedName,
NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(localizedName),
SortName = name,
Volumes = new List<Volume>(),
Metadata = SeriesMetadata(Array.Empty<CollectionTag>())
@ -35,7 +55,7 @@ namespace API.Data
return new Volume()
{
Name = volumeNumber,
Number = (int) Parser.Parser.MinNumberFromRange(volumeNumber),
Number = (int) Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(volumeNumber),
Chapters = new List<Chapter>()
};
}
@ -46,7 +66,7 @@ namespace API.Data
var specialTitle = specialTreatment ? info.Filename : info.Chapters;
return new Chapter()
{
Number = specialTreatment ? "0" : Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty,
Number = specialTreatment ? "0" : Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty,
Range = specialTreatment ? info.Filename : info.Chapters,
Title = (specialTreatment && info.Format == MangaFormat.Epub)
? info.Title
@ -75,7 +95,7 @@ namespace API.Data
return new CollectionTag()
{
Id = id,
NormalizedTitle = API.Parser.Parser.Normalize(title?.Trim()).ToUpper(),
NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(title?.Trim()).ToUpper(),
Title = title?.Trim(),
Summary = summary?.Trim(),
Promoted = promoted
@ -86,7 +106,7 @@ namespace API.Data
{
return new ReadingList()
{
NormalizedTitle = API.Parser.Parser.Normalize(title?.Trim()).ToUpper(),
NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(title?.Trim()).ToUpper(),
Title = title?.Trim(),
Summary = summary?.Trim(),
Promoted = promoted,
@ -110,7 +130,7 @@ namespace API.Data
return new Genre()
{
Title = name.Trim().SentenceCase(),
NormalizedTitle = Parser.Parser.Normalize(name),
NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
ExternalTag = external
};
}
@ -120,7 +140,7 @@ namespace API.Data
return new Tag()
{
Title = name.Trim().SentenceCase(),
NormalizedTitle = Parser.Parser.Normalize(name),
NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
ExternalTag = external
};
}
@ -130,7 +150,7 @@ namespace API.Data
return new Person()
{
Name = name.Trim(),
NormalizedName = Parser.Parser.Normalize(name),
NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name),
Role = role
};
}

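A usage sketch for the new Series(name, localizedName) factory above; the series names are hypothetical:

var withLocalized = DbFactory.Series("Shingeki no Kyojin", "Attack on Titan");
// NormalizedLocalizedName is computed from "Attack on Titan"

var withoutLocalized = DbFactory.Series("Berserk", string.Empty);
// localizedName is empty, so LocalizedName falls back to "Berserk"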

@ -107,16 +107,16 @@ namespace API.Data.Metadata
info.SeriesSort = info.SeriesSort.Trim();
info.LocalizedSeries = info.LocalizedSeries.Trim();
info.Writer = Parser.Parser.CleanAuthor(info.Writer);
info.Colorist = Parser.Parser.CleanAuthor(info.Colorist);
info.Editor = Parser.Parser.CleanAuthor(info.Editor);
info.Inker = Parser.Parser.CleanAuthor(info.Inker);
info.Letterer = Parser.Parser.CleanAuthor(info.Letterer);
info.Penciller = Parser.Parser.CleanAuthor(info.Penciller);
info.Publisher = Parser.Parser.CleanAuthor(info.Publisher);
info.Characters = Parser.Parser.CleanAuthor(info.Characters);
info.Translator = Parser.Parser.CleanAuthor(info.Translator);
info.CoverArtist = Parser.Parser.CleanAuthor(info.CoverArtist);
info.Writer = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Writer);
info.Colorist = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Colorist);
info.Editor = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Editor);
info.Inker = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Inker);
info.Letterer = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Letterer);
info.Penciller = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Penciller);
info.Publisher = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Publisher);
info.Characters = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Characters);
info.Translator = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Translator);
info.CoverArtist = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.CoverArtist);
}


@ -1,105 +0,0 @@
using System;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Entities.Enums;
using API.Services;
using Microsoft.Extensions.Logging;
namespace API.Data;
/// <summary>
/// Responsible to migrate existing bookmarks to files. Introduced in v0.4.9.27
/// </summary>
public static class MigrateBookmarks
{
/// <summary>
/// This will migrate existing bookmarks to bookmark folder based.
/// If the bookmarks folder already exists, this will not run.
/// </summary>
/// <remarks>Bookmark directory is configurable. This will always use the default bookmark directory.</remarks>
/// <param name="directoryService"></param>
/// <param name="unitOfWork"></param>
/// <param name="logger"></param>
/// <param name="cacheService"></param>
/// <returns></returns>
public static async Task Migrate(IDirectoryService directoryService, IUnitOfWork unitOfWork,
ILogger<Program> logger, ICacheService cacheService)
{
var bookmarkDirectory = (await unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory))
.Value;
if (string.IsNullOrEmpty(bookmarkDirectory))
{
bookmarkDirectory = directoryService.BookmarkDirectory;
}
if (directoryService.Exists(bookmarkDirectory)) return;
logger.LogInformation("Bookmark migration is needed....This may take some time");
var allBookmarks = (await unitOfWork.UserRepository.GetAllBookmarksAsync()).ToList();
var uniqueChapterIds = allBookmarks.Select(b => b.ChapterId).Distinct().ToList();
var uniqueUserIds = allBookmarks.Select(b => b.AppUserId).Distinct().ToList();
foreach (var userId in uniqueUserIds)
{
foreach (var chapterId in uniqueChapterIds)
{
var chapterBookmarks = allBookmarks.Where(b => b.ChapterId == chapterId).ToList();
var chapterPages = chapterBookmarks
.Select(b => b.Page).ToList();
var seriesId = chapterBookmarks
.Select(b => b.SeriesId).First();
var mangaFiles = await unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId);
var chapterExtractPath = directoryService.FileSystem.Path.Join(directoryService.TempDirectory, $"bookmark_c{chapterId}_u{userId}_s{seriesId}");
var numericComparer = new NumericComparer();
if (!mangaFiles.Any()) continue;
switch (mangaFiles.First().Format)
{
case MangaFormat.Image:
directoryService.ExistOrCreate(chapterExtractPath);
directoryService.CopyFilesToDirectory(mangaFiles.Select(f => f.FilePath), chapterExtractPath);
break;
case MangaFormat.Archive:
case MangaFormat.Pdf:
cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
break;
case MangaFormat.Epub:
continue;
default:
continue;
}
var files = directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
// Filter out images that aren't in bookmarks
Array.Sort(files, numericComparer);
foreach (var chapterPage in chapterPages)
{
var file = files.ElementAt(chapterPage);
var bookmark = allBookmarks.FirstOrDefault(b =>
b.ChapterId == chapterId && b.SeriesId == seriesId && b.AppUserId == userId &&
b.Page == chapterPage);
if (bookmark == null) continue;
var filename = directoryService.FileSystem.Path.GetFileName(file);
var newLocation = directoryService.FileSystem.Path.Join(
ReaderService.FormatBookmarkFolderPath(String.Empty, userId, seriesId, chapterId),
filename);
bookmark.FileName = newLocation;
directoryService.CopyFileToDirectory(file,
ReaderService.FormatBookmarkFolderPath(bookmarkDirectory, userId, seriesId, chapterId));
unitOfWork.UserRepository.Update(bookmark);
}
}
// Clear temp after each user to avoid too much space being eaten
directoryService.ClearDirectory(directoryService.TempDirectory);
}
await unitOfWork.CommitAsync();
// Run CleanupService as we cache a ton of files
directoryService.ClearDirectory(directoryService.TempDirectory);
}
}


@ -0,0 +1,38 @@
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
namespace API.Data;
/// <summary>
/// v0.5.6 introduced Normalized Localized Name, which allows for faster lookups and less memory usage. This migration will calculate them once
/// </summary>
public static class MigrateNormalizedLocalizedName
{
public static async Task Migrate(IUnitOfWork unitOfWork, DataContext dataContext, ILogger<Program> logger)
{
if (!await dataContext.Series.Where(s => s.NormalizedLocalizedName == null).AnyAsync())
{
return;
}
logger.LogInformation("Running MigrateNormalizedLocalizedName migration. Please be patient, this may take some time");
foreach (var series in await dataContext.Series.ToListAsync())
{
series.NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName ?? string.Empty);
logger.LogInformation("Updated {SeriesName} normalized localized name: {LocalizedName}", series.Name, series.NormalizedLocalizedName);
unitOfWork.SeriesRepository.Update(series);
}
if (unitOfWork.HasChanges())
{
await unitOfWork.CommitAsync();
}
logger.LogInformation("MigrateNormalizedLocalizedName migration finished");
}
}


@ -13,16 +13,15 @@ public static class MigrateRemoveExtraThemes
{
public static async Task Migrate(IUnitOfWork unitOfWork, IThemeService themeService)
{
Console.WriteLine("Removing Dark and E-Ink themes");
var themes = (await unitOfWork.SiteThemeRepository.GetThemes()).ToList();
if (themes.FirstOrDefault(t => t.Name.Equals("Light")) == null)
{
Console.WriteLine("Done. Nothing to do");
return;
}
Console.WriteLine("Removing Dark and E-Ink themes");
var darkTheme = themes.Single(t => t.Name.Equals("Dark"));
var lightTheme = themes.Single(t => t.Name.Equals("Light"));
var eInkTheme = themes.Single(t => t.Name.Equals("E-Ink"));

File diff suppressed because it is too large.


@ -0,0 +1,37 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class SeriesFolder : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<string>(
name: "FolderPath",
table: "Series",
type: "TEXT",
nullable: true);
migrationBuilder.AddColumn<DateTime>(
name: "LastFolderScanned",
table: "Series",
type: "TEXT",
nullable: false,
defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified));
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "FolderPath",
table: "Series");
migrationBuilder.DropColumn(
name: "LastFolderScanned",
table: "Series");
}
}
}

File diff suppressed because it is too large.


@ -0,0 +1,25 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class NormalizedLocalizedName : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<string>(
name: "NormalizedLocalizedName",
table: "Series",
type: "TEXT",
nullable: true);
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "NormalizedLocalizedName",
table: "Series");
}
}
}


@ -782,12 +782,18 @@ namespace API.Data.Migrations
b.Property<DateTime>("Created")
.HasColumnType("TEXT");
b.Property<string>("FolderPath")
.HasColumnType("TEXT");
b.Property<int>("Format")
.HasColumnType("INTEGER");
b.Property<DateTime>("LastChapterAdded")
.HasColumnType("TEXT");
b.Property<DateTime>("LastFolderScanned")
.HasColumnType("TEXT");
b.Property<DateTime>("LastModified")
.HasColumnType("TEXT");
@ -812,6 +818,9 @@ namespace API.Data.Migrations
b.Property<bool>("NameLocked")
.HasColumnType("INTEGER");
b.Property<string>("NormalizedLocalizedName")
.HasColumnType("TEXT");
b.Property<string>("NormalizedName")
.HasColumnType("TEXT");


@ -56,6 +56,7 @@ public class CollectionTagRepository : ICollectionTagRepository
/// </summary>
public async Task<int> RemoveTagsWithoutSeries()
{
// TODO: Write a Unit test to validate this works
var tagsToDelete = await _context.CollectionTag
.Include(c => c.SeriesMetadatas)
.Where(c => c.SeriesMetadatas.Count == 0)


@ -44,7 +44,7 @@ public class GenreRepository : IGenreRepository
public async Task<Genre> FindByNameAsync(string genreName)
{
var normalizedName = Parser.Parser.Normalize(genreName);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(genreName);
return await _context.Genre
.FirstOrDefaultAsync(g => g.NormalizedTitle.Equals(normalizedName));
}


@ -34,19 +34,19 @@ public interface ILibraryRepository
Task<IEnumerable<LibraryDto>> GetLibraryDtosAsync();
Task<bool> LibraryExists(string libraryName);
Task<Library> GetLibraryForIdAsync(int libraryId, LibraryIncludes includes);
Task<Library> GetFullLibraryForIdAsync(int libraryId);
Task<Library> GetFullLibraryForIdAsync(int libraryId, int seriesId);
Task<IEnumerable<LibraryDto>> GetLibraryDtosForUsernameAsync(string userName);
Task<IEnumerable<Library>> GetLibrariesAsync();
Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None);
Task<bool> DeleteLibrary(int libraryId);
Task<IEnumerable<Library>> GetLibrariesForUserIdAsync(int userId);
Task<LibraryType> GetLibraryTypeAsync(int libraryId);
Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds);
Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None);
Task<int> GetTotalFiles();
IEnumerable<JumpKeyDto> GetJumpBarAsync(int libraryId);
Task<IList<AgeRatingDto>> GetAllAgeRatingsDtosForLibrariesAsync(List<int> libraryIds);
Task<IList<LanguageDto>> GetAllLanguagesForLibrariesAsync(List<int> libraryIds);
IEnumerable<PublicationStatusDto> GetAllPublicationStatusesDtosForLibrariesAsync(List<int> libraryIds);
Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders);
Library GetLibraryByFolder(string folder);
}
public class LibraryRepository : ILibraryRepository
@ -87,11 +87,19 @@ public class LibraryRepository : ILibraryRepository
.ToListAsync();
}
public async Task<IEnumerable<Library>> GetLibrariesAsync()
/// <summary>
/// Returns all libraries including their AppUsers + extra includes
/// </summary>
/// <param name="includes"></param>
/// <returns></returns>
public async Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None)
{
return await _context.Library
var query = _context.Library
.Include(l => l.AppUsers)
.ToListAsync();
.Select(l => l);
query = AddIncludesToQuery(query, includes);
return await query.ToListAsync();
}
public async Task<bool> DeleteLibrary(int libraryId)
@ -120,11 +128,13 @@ public class LibraryRepository : ILibraryRepository
.SingleAsync();
}
public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds)
public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None)
{
return await _context.Library
.Where(x => libraryIds.Contains(x.Id))
.ToListAsync();
var query = _context.Library
.Where(x => libraryIds.Contains(x.Id));
query = AddIncludesToQuery(query, includes);
return await query.ToListAsync();
}
public async Task<int> GetTotalFiles()
@ -317,4 +327,23 @@ public class LibraryRepository : ILibraryRepository
.OrderBy(s => s.Title);
}
/// <summary>
/// Checks if any series folders match the folders passed in
/// </summary>
/// <param name="folders"></param>
/// <returns></returns>
public async Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders)
{
var normalized = folders.Select(Services.Tasks.Scanner.Parser.Parser.NormalizePath);
return await _context.Series.AnyAsync(s => normalized.Contains(s.FolderPath));
}
public Library? GetLibraryByFolder(string folder)
{
var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder);
return _context.Library
.Include(l => l.Folders)
.AsSplitQuery()
.SingleOrDefault(l => l.Folders.Select(f => f.Path).Contains(normalized));
}
}

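With the LibraryIncludes flags parameter above, callers now opt into related data instead of always paying for every navigation. A usage sketch, assuming Folders is one of the LibraryIncludes flag values:

var libraries = await _unitOfWork.LibraryRepository
    .GetLibrariesAsync(LibraryIncludes.Folders);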

@ -42,7 +42,7 @@ public class PersonRepository : IPersonRepository
public async Task<Person> FindByNameAsync(string name)
{
var normalizedName = Parser.Parser.Normalize(name);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name);
return await _context.Person
.Where(p => normalizedName.Equals(p.NormalizedName))
.SingleOrDefaultAsync();


@ -1,6 +1,5 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
@ -19,12 +18,11 @@ using API.Extensions;
using API.Helpers;
using API.Services;
using API.Services.Tasks;
using API.Services.Tasks.Scanner;
using AutoMapper;
using AutoMapper.QueryableExtensions;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using SQLitePCL;
namespace API.Data.Repositories;
@ -120,6 +118,12 @@ public interface ISeriesRepository
Task<SeriesDto> GetSeriesForMangaFile(int mangaFileId, int userId);
Task<SeriesDto> GetSeriesForChapter(int chapterId, int userId);
Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter);
Task<int> GetSeriesIdByFolder(string folder);
Task<Series> GetSeriesByFolderPath(string folder);
Task<Series> GetFullSeriesByName(string series, int libraryId);
Task<Series> GetFullSeriesByAnyName(string seriesName, string localizedName, int libraryId, MangaFormat format, bool withFullIncludes = true);
Task<List<Series>> RemoveSeriesNotInList(IList<ParsedSeries> seenSeries, int libraryId);
Task<IDictionary<string, IList<SeriesModified>>> GetFolderPathMap(int libraryId);
}
public class SeriesRepository : ISeriesRepository
@ -156,6 +160,7 @@ public class SeriesRepository : ISeriesRepository
/// Returns if a series name and format exists already in a library
/// </summary>
/// <param name="name">Name of series</param>
/// <param name="libraryId"></param>
/// <param name="format">Format of series</param>
/// <returns></returns>
public async Task<bool> DoesSeriesNameExistInLibrary(string name, int libraryId, MangaFormat format)
@ -179,6 +184,7 @@ public class SeriesRepository : ISeriesRepository
/// Used for <see cref="ScannerService"/> to
/// </summary>
/// <param name="libraryId"></param>
/// <param name="userParams"></param>
/// <returns></returns>
public async Task<PagedList<Series>> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams)
{
@ -224,6 +230,7 @@ public class SeriesRepository : ISeriesRepository
{
return await _context.Series
.Where(s => s.Id == seriesId)
.Include(s => s.Relations)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
@ -295,7 +302,7 @@ public class SeriesRepository : ISeriesRepository
{
const int maxRecords = 15;
var result = new SearchResultGroupDto();
var searchQueryNormalized = Parser.Parser.Normalize(searchQuery);
var searchQueryNormalized = Services.Tasks.Scanner.Parser.Parser.Normalize(searchQuery);
var seriesIds = _context.Series
.Where(s => libraryIds.Contains(s.LibraryId))
@ -432,6 +439,7 @@ public class SeriesRepository : ISeriesRepository
/// Returns Volumes, Metadata (Incl Genres and People), and Collection Tags
/// </summary>
/// <param name="seriesId"></param>
/// <param name="includes"></param>
/// <returns></returns>
public async Task<Series> GetSeriesByIdAsync(int seriesId, SeriesIncludes includes = SeriesIncludes.Volumes | SeriesIncludes.Metadata)
{
@ -477,6 +485,7 @@ public class SeriesRepository : ISeriesRepository
.Include(s => s.Volumes)
.Include(s => s.Metadata)
.ThenInclude(m => m.CollectionTags)
.Include(s => s.Relations)
.Where(s => seriesIds.Contains(s.Id))
.AsSplitQuery()
.ToListAsync();
@ -1136,21 +1145,162 @@ public class SeriesRepository : ISeriesRepository
.SingleOrDefaultAsync();
}
public async Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter)
/// <summary>
/// Given a folder path, returns the Id of the Series whose <see cref="Series.FolderPath"/> matches.
/// </summary>
/// <remarks>This will apply normalization on the path.</remarks>
/// <param name="folder"></param>
/// <returns></returns>
public async Task<int> GetSeriesIdByFolder(string folder)
{
var libraryIds = GetLibraryIdsForUser(userId);
var query = _context.AppUser
.Where(user => user.Id == userId)
.SelectMany(u => u.WantToRead)
.Where(s => libraryIds.Contains(s.LibraryId))
.AsSplitQuery()
.AsNoTracking();
var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query);
return await PagedList<SeriesDto>.CreateAsync(filteredQuery.ProjectTo<SeriesDto>(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize);
var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder);
var series = await _context.Series
.Where(s => s.FolderPath.Equals(normalized))
.SingleOrDefaultAsync();
return series?.Id ?? 0;
}
/// <summary>
/// Return a Series by Folder path. Null if not found.
/// </summary>
/// <param name="folder">This will be normalized in the query</param>
/// <returns></returns>
public async Task<Series> GetSeriesByFolderPath(string folder)
{
var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder);
return await _context.Series.SingleOrDefaultAsync(s => s.FolderPath.Equals(normalized));
}
/// <summary>
/// Finds a series by series name for a given library.
/// </summary>
/// <remarks>This pulls everything with the Series, so should be used only when needing tracking on all related tables</remarks>
/// <param name="series"></param>
/// <param name="libraryId"></param>
/// <returns></returns>
public Task<Series> GetFullSeriesByName(string series, int libraryId)
{
var normalizedSeries = Services.Tasks.Scanner.Parser.Parser.Normalize(series);
return _context.Series
.Where(s => (s.NormalizedName.Equals(normalizedSeries)
|| s.LocalizedName.Equals(series)) && s.LibraryId == libraryId)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Library)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(cm => cm.People)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Tags)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Genres)
.Include(s => s.Metadata)
.ThenInclude(m => m.Tags)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
.AsSplitQuery()
.SingleOrDefaultAsync();
}
/// <summary>
/// Finds a series by series name or localized name for a given library.
/// </summary>
/// <remarks>This pulls everything with the Series, so should be used only when needing tracking on all related tables</remarks>
/// <param name="seriesName"></param>
/// <param name="localizedName"></param>
/// <param name="libraryId"></param>
/// <param name="withFullIncludes">Defaults to true. This will query against all foreign keys (deep). If false, just the series will come back</param>
/// <returns></returns>
public Task<Series> GetFullSeriesByAnyName(string seriesName, string localizedName, int libraryId, MangaFormat format, bool withFullIncludes = true)
{
var normalizedSeries = Services.Tasks.Scanner.Parser.Parser.Normalize(seriesName);
var normalizedLocalized = Services.Tasks.Scanner.Parser.Parser.Normalize(localizedName);
var query = _context.Series
.Where(s => s.LibraryId == libraryId)
.Where(s => s.Format == format && format != MangaFormat.Unknown)
.Where(s => s.NormalizedName.Equals(normalizedSeries)
|| (s.NormalizedLocalizedName.Equals(normalizedSeries) && s.NormalizedLocalizedName != string.Empty));
if (!string.IsNullOrEmpty(normalizedLocalized))
{
query = query.Where(s =>
s.NormalizedName.Equals(normalizedLocalized) || s.NormalizedLocalizedName.Equals(normalizedLocalized));
}
if (!withFullIncludes)
{
return query.SingleOrDefaultAsync();
}
return query.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Library)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(cm => cm.People)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Tags)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Genres)
.Include(s => s.Metadata)
.ThenInclude(m => m.Tags)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
.AsSplitQuery()
.SingleOrDefaultAsync();
}
/// <summary>
/// Removes series that are not in the seenSeries list. Does not commit.
/// </summary>
/// <param name="seenSeries"></param>
/// <param name="libraryId"></param>
public async Task<List<Series>> RemoveSeriesNotInList(IList<ParsedSeries> seenSeries, int libraryId)
{
if (seenSeries.Count == 0) return new List<Series>();
var ids = new List<int>();
foreach (var parsedSeries in seenSeries)
{
var seriesId = await _context.Series
.Where(s => s.Format == parsedSeries.Format && s.NormalizedName == parsedSeries.NormalizedName &&
s.LibraryId == libraryId)
.Select(s => s.Id)
.SingleOrDefaultAsync();
if (seriesId > 0)
{
ids.Add(seriesId);
}
}
var seriesToRemove = await _context.Series
.Where(s => s.LibraryId == libraryId)
.Where(s => !ids.Contains(s.Id))
.ToListAsync();
_context.Series.RemoveRange(seriesToRemove);
return seriesToRemove;
}
public async Task<PagedList<SeriesDto>> GetHighlyRated(int userId, int libraryId, UserParams userParams)
{
@ -1320,4 +1470,53 @@ public class SeriesRepository : ISeriesRepository
.AsEnumerable();
return ret;
}
public async Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter)
{
var libraryIds = GetLibraryIdsForUser(userId);
var query = _context.AppUser
.Where(user => user.Id == userId)
.SelectMany(u => u.WantToRead)
.Where(s => libraryIds.Contains(s.LibraryId))
.AsSplitQuery()
.AsNoTracking();
var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query);
return await PagedList<SeriesDto>.CreateAsync(filteredQuery.ProjectTo<SeriesDto>(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize);
}
public async Task<IDictionary<string, IList<SeriesModified>>> GetFolderPathMap(int libraryId)
{
var info = await _context.Series
.Where(s => s.LibraryId == libraryId)
.AsNoTracking()
.Where(s => s.FolderPath != null)
.Select(s => new SeriesModified()
{
LastScanned = s.LastFolderScanned,
SeriesName = s.Name,
FolderPath = s.FolderPath,
Format = s.Format
}).ToListAsync();
var map = new Dictionary<string, IList<SeriesModified>>();
foreach (var series in info)
{
if (!map.ContainsKey(series.FolderPath))
{
map.Add(series.FolderPath, new List<SeriesModified>()
{
series
});
}
else
{
map[series.FolderPath].Add(series);
}
}
return map;
}
}

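GetSeriesByFolderPath and GetSeriesIdByFolder above both normalize their argument before comparing, so lookups are separator-insensitive as long as Series.FolderPath was itself normalized on write (as its remarks require). A sketch, assuming NormalizePath canonicalizes directory separators; the path is hypothetical:

var fromWindowsPath = await _unitOfWork.SeriesRepository
    .GetSeriesByFolderPath(@"C:\Manga\Some Series");
var fromPosixPath = await _unitOfWork.SeriesRepository
    .GetSeriesByFolderPath("C:/Manga/Some Series");
// both resolve to the same row when the stored FolderPath is normalized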

@ -43,7 +43,7 @@ public class TagRepository : ITagRepository
public async Task<Tag> FindByNameAsync(string tagName)
{
var normalizedName = Parser.Parser.Normalize(tagName);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(tagName);
return await _context.Tag
.FirstOrDefaultAsync(g => g.NormalizedTitle.Equals(normalizedName));
}


@ -23,7 +23,9 @@ public enum AppUserIncludes
ReadingLists = 8,
Ratings = 16,
UserPreferences = 32,
WantToRead = 64
WantToRead = 64,
ReadingListsWithItems = 128,
}
public interface IUserRepository
@ -36,7 +38,6 @@ public interface IUserRepository
Task<IEnumerable<MemberDto>> GetEmailConfirmedMemberDtosAsync();
Task<IEnumerable<MemberDto>> GetPendingMemberDtosAsync();
Task<IEnumerable<AppUser>> GetAdminUsersAsync();
Task<IEnumerable<AppUser>> GetNonAdminUsersAsync();
Task<bool> IsUserAdminAsync(AppUser user);
Task<AppUserRating> GetUserRatingAsync(int seriesId, int userId);
Task<AppUserPreferences> GetPreferencesAsync(string username);
@ -51,11 +52,9 @@ public interface IUserRepository
Task<AppUser> GetUserByUsernameAsync(string username, AppUserIncludes includeFlags = AppUserIncludes.None);
Task<AppUser> GetUserByIdAsync(int userId, AppUserIncludes includeFlags = AppUserIncludes.None);
Task<int> GetUserIdByUsernameAsync(string username);
Task<AppUser> GetUserWithReadingListsByUsernameAsync(string username);
Task<IList<AppUserBookmark>> GetAllBookmarksByIds(IList<int> bookmarkIds);
Task<AppUser> GetUserByEmailAsync(string email);
Task<IEnumerable<AppUser>> GetAllUsers();
Task<IEnumerable<AppUserPreferences>> GetAllPreferencesByThemeAsync(int themeId);
Task<bool> HasAccessToLibrary(int libraryId, int userId);
Task<IEnumerable<AppUser>> GetAllUsersAsync(AppUserIncludes includeFlags);
@ -167,6 +166,11 @@ public class UserRepository : IUserRepository
query = query.Include(u => u.ReadingLists);
}
if (includeFlags.HasFlag(AppUserIncludes.ReadingListsWithItems))
{
query = query.Include(u => u.ReadingLists).ThenInclude(r => r.Items);
}
if (includeFlags.HasFlag(AppUserIncludes.Ratings))
{
query = query.Include(u => u.Ratings);
@ -201,19 +205,6 @@ public class UserRepository : IUserRepository
.SingleOrDefaultAsync();
}
/// <summary>
/// Gets an AppUser by username. Returns back Reading List and their Items.
/// </summary>
/// <param name="username"></param>
/// <returns></returns>
public async Task<AppUser> GetUserWithReadingListsByUsernameAsync(string username)
{
return await _context.Users
.Include(u => u.ReadingLists)
.ThenInclude(l => l.Items)
.AsSplitQuery()
.SingleOrDefaultAsync(x => x.UserName == username);
}
/// <summary>
/// Returns all Bookmarks for a given set of Ids
@ -267,11 +258,6 @@ public class UserRepository : IUserRepository
return await _userManager.GetUsersInRoleAsync(PolicyConstants.AdminRole);
}
public async Task<IEnumerable<AppUser>> GetNonAdminUsersAsync()
{
return await _userManager.GetUsersInRoleAsync(PolicyConstants.PlebRole);
}
public async Task<bool> IsUserAdminAsync(AppUser user)
{
return await _userManager.IsInRoleAsync(user, PolicyConstants.AdminRole);
@ -404,14 +390,4 @@ public class UserRepository : IUserRepository
.AsNoTracking()
.ToListAsync();
}
public async Task<bool> ValidateUserExists(string username)
{
if (await _userManager.Users.AnyAsync(x => x.NormalizedUserName == username.ToUpper()))
{
throw new ValidationException("Username is taken.");
}
return true;
}
}

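The new ReadingListsWithItems flag appears to replace the removed GetUserWithReadingListsByUsernameAsync; because AppUserIncludes is a flags enum, it also composes with other includes. A usage sketch:

var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username,
    AppUserIncludes.ReadingListsWithItems | AppUserIncludes.Progress);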

@ -29,7 +29,7 @@ namespace API.Data
new()
{
Name = "Dark",
NormalizedName = Parser.Parser.Normalize("Dark"),
NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize("Dark"),
Provider = ThemeProvider.System,
FileName = "dark.scss",
IsDefault = true,
@ -103,6 +103,7 @@ namespace API.Data
new() {Key = ServerSettingKey.ConvertBookmarkToWebP, Value = "false"},
new() {Key = ServerSettingKey.EnableSwaggerUi, Value = "false"},
new() {Key = ServerSettingKey.TotalBackups, Value = "30"},
new() {Key = ServerSettingKey.EnableFolderWatching, Value = "false"},
}.ToArray());
foreach (var defaultSetting in DefaultSettings)


@ -1,4 +1,5 @@
using System.Threading.Tasks;
using System;
using System.Threading.Tasks;
using API.Data.Repositories;
using API.Entities;
using AutoMapper;
@ -26,7 +27,6 @@ public interface IUnitOfWork
bool Commit();
Task<bool> CommitAsync();
bool HasChanges();
bool Rollback();
Task<bool> RollbackAsync();
}
public class UnitOfWork : IUnitOfWork
@ -93,16 +93,15 @@ public class UnitOfWork : IUnitOfWork
/// <returns></returns>
public async Task<bool> RollbackAsync()
{
await _context.DisposeAsync();
return true;
}
/// <summary>
/// Rollback transaction
/// </summary>
/// <returns></returns>
public bool Rollback()
{
_context.Dispose();
try
{
await _context.Database.RollbackTransactionAsync();
}
catch (Exception)
{
// Swallow exception (this might be used in places where a transaction isn't setup)
}
return true;
}
}

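RollbackAsync above no longer disposes the DbContext; it rolls back an explicit database transaction and swallows the exception when none was started, so it is now safe to call from generic error handlers. A usage sketch:

try
{
    _unitOfWork.SeriesRepository.Update(series);
    await _unitOfWork.CommitAsync();
}
catch (Exception)
{
    await _unitOfWork.RollbackAsync(); // no-op when no explicit transaction exists
    throw;
}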

@ -9,13 +9,13 @@ namespace API.Entities.Enums
{
/// <summary>
/// Image file
/// See <see cref="Parser.Parser.ImageFileExtensions"/> for supported extensions
/// See <see cref="Services.Tasks.Scanner.Parser.Parser.ImageFileExtensions"/> for supported extensions
/// </summary>
[Description("Image")]
Image = 0,
/// <summary>
/// Archive based file
/// See <see cref="Parser.Parser.ArchiveFileExtensions"/> for supported extensions
/// See <see cref="Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions"/> for supported extensions
/// </summary>
[Description("Archive")]
Archive = 1,


@ -91,5 +91,10 @@ namespace API.Entities.Enums
/// </summary>
[Description("TotalBackups")]
TotalBackups = 16,
/// <summary>
/// If Kavita should watch the library folders and process changes
/// </summary>
[Description("EnableFolderWatching")]
EnableFolderWatching = 17,
}
}


@ -8,8 +8,9 @@ namespace API.Entities
public int Id { get; set; }
public string Path { get; set; }
/// <summary>
/// Used when scanning to see if we can skip if nothing has changed. (not implemented)
/// Used when scanning to see if we can skip the folder when nothing has changed
/// </summary>
/// <remarks>Time stored in UTC</remarks>
public DateTime LastScanned { get; set; }
// Relationship


@ -1,5 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Entities.Enums;
using API.Entities.Interfaces;
@ -9,6 +11,10 @@ namespace API.Entities
{
public int Id { get; set; }
public string Name { get; set; }
/// <summary>
/// Update this summary with a way it's used, else let's remove it.
/// </summary>
[Obsolete("This has never been coded for. Likely we can remove it.")]
public string CoverImage { get; set; }
public LibraryType Type { get; set; }
public DateTime Created { get; set; }
@ -16,10 +22,22 @@ namespace API.Entities
/// <summary>
/// Last time Library was scanned
/// </summary>
/// <remarks>Time stored in UTC</remarks>
public DateTime LastScanned { get; set; }
public ICollection<FolderPath> Folders { get; set; }
public ICollection<AppUser> AppUsers { get; set; }
public ICollection<Series> Series { get; set; }
// Methods
/// <summary>
/// Have there been any modifications to the FolderPath's directory since the <see cref="FolderPath.LastScanned"/> date
/// </summary>
/// <returns></returns>
public bool AnyModificationsSinceLastScan()
{
// NOTE: I don't think we can do this due to NTFS
return Folders.All(folder => File.GetLastWriteTimeUtc(folder.Path) > folder.LastScanned);
}
}
}

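Context for the NOTE in AnyModificationsSinceLastScan: on NTFS (and most file systems) a directory's last-write time changes only when its immediate children are added, removed, or renamed, not when files deeper in the tree change, so comparing File.GetLastWriteTimeUtc(folder.Path) against LastScanned can miss nested modifications.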

@ -14,10 +14,14 @@ public class Series : IEntityDate, IHasReadTimeEstimate
/// </summary>
public string Name { get; set; }
/// <summary>
/// Used internally for name matching. <see cref="Parser.Parser.Normalize"/>
/// Used internally for name matching. <see cref="Services.Tasks.Scanner.Parser.Parser.Normalize"/>
/// </summary>
public string NormalizedName { get; set; }
/// <summary>
/// Used internally for localized name matching. <see cref="Services.Tasks.Scanner.Parser.Parser.Normalize"/>
/// </summary>
public string NormalizedLocalizedName { get; set; }
/// <summary>
/// The name used to sort the Series. By default, will be the same as Name.
/// </summary>
public string SortName { get; set; }
@ -50,7 +54,15 @@ public class Series : IEntityDate, IHasReadTimeEstimate
/// Sum of all Volume page counts
/// </summary>
public int Pages { get; set; }
/// <summary>
/// Highest path (that is under library root) that contains the series.
/// </summary>
/// <remarks><see cref="Services.Tasks.Scanner.Parser.Parser.NormalizePath"/> must be used before setting</remarks>
public string FolderPath { get; set; }
/// <summary>
/// Last time the folder was scanned
/// </summary>
public DateTime LastFolderScanned { get; set; }
/// <summary>
/// The type of all the files attached to this series
/// </summary>


@ -4,6 +4,7 @@ using API.Helpers;
using API.Services;
using API.Services.Tasks;
using API.Services.Tasks.Metadata;
using API.Services.Tasks.Scanner;
using API.SignalR;
using API.SignalR.Presence;
using Kavita.Common;
@ -46,10 +47,13 @@ namespace API.Extensions
services.AddScoped<IBookmarkService, BookmarkService>();
services.AddScoped<IThemeService, ThemeService>();
services.AddScoped<ISeriesService, SeriesService>();
services.AddScoped<IProcessSeries, ProcessSeries>();
services.AddScoped<IReadingListService, ReadingListService>();
services.AddScoped<IScannerService, ScannerService>();
services.AddScoped<IMetadataService, MetadataService>();
services.AddScoped<IWordCountAnalyzerService, WordCountAnalyzerService>();
services.AddScoped<ILibraryWatcher, LibraryWatcher>();


@ -16,8 +16,8 @@ namespace API.Extensions
/// <returns></returns>
public static bool NameInList(this Series series, IEnumerable<string> list)
{
return list.Any(name => Parser.Parser.Normalize(name) == series.NormalizedName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.Name)
|| name == series.Name || name == series.LocalizedName || name == series.OriginalName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.OriginalName));
return list.Any(name => Services.Tasks.Scanner.Parser.Parser.Normalize(name) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name)
|| name == series.Name || name == series.LocalizedName || name == series.OriginalName || Services.Tasks.Scanner.Parser.Parser.Normalize(name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName));
}
/// <summary>
@ -28,8 +28,8 @@ namespace API.Extensions
/// <returns></returns>
public static bool NameInList(this Series series, IEnumerable<ParsedSeries> list)
{
return list.Any(name => Parser.Parser.Normalize(name.Name) == series.NormalizedName || Parser.Parser.Normalize(name.Name) == Parser.Parser.Normalize(series.Name)
|| name.Name == series.Name || name.Name == series.LocalizedName || name.Name == series.OriginalName || Parser.Parser.Normalize(name.Name) == Parser.Parser.Normalize(series.OriginalName) && series.Format == name.Format);
return list.Any(name => Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name)
|| name.Name == series.Name || name.Name == series.LocalizedName || name.Name == series.OriginalName || Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName) && series.Format == name.Format);
}
/// <summary>
@ -41,9 +41,9 @@ namespace API.Extensions
public static bool NameInParserInfo(this Series series, ParserInfo info)
{
if (info == null) return false;
return Parser.Parser.Normalize(info.Series) == series.NormalizedName || Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.Name)
|| info.Series == series.Name || info.Series == series.LocalizedName || info.Series == series.OriginalName
|| Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.OriginalName);
return Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name)
|| info.Series == series.Name || info.Series == series.LocalizedName || info.Series == series.OriginalName
|| Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName);
}
}
}


@ -60,6 +60,9 @@ namespace API.Helpers.Converters
case ServerSettingKey.InstallId:
destination.InstallId = row.Value;
break;
case ServerSettingKey.EnableFolderWatching:
destination.EnableFolderWatching = bool.Parse(row.Value);
break;
}
}


@ -1,4 +1,5 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using API.Data;
@ -21,7 +22,7 @@ public static class GenreHelper
{
if (string.IsNullOrEmpty(name.Trim())) continue;
var normalizedName = Parser.Parser.Normalize(name);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name);
var genre = allGenres.FirstOrDefault(p =>
p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
if (genre == null)
@ -34,6 +35,7 @@ public static class GenreHelper
}
}
public static void KeepOnlySameGenreBetweenLists(ICollection<Genre> existingGenres, ICollection<Genre> removeAllExcept, Action<Genre> action = null)
{
var existing = existingGenres.ToList();
@ -55,7 +57,17 @@ public static class GenreHelper
public static void AddGenreIfNotExists(ICollection<Genre> metadataGenres, Genre genre)
{
var existingGenre = metadataGenres.FirstOrDefault(p =>
p.NormalizedTitle == Parser.Parser.Normalize(genre.Title));
p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(genre.Title));
if (existingGenre == null)
{
metadataGenres.Add(genre);
}
}
public static void AddGenreIfNotExists(BlockingCollection<Genre> metadataGenres, Genre genre)
{
var existingGenre = metadataGenres.FirstOrDefault(p =>
p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(genre.Title));
if (existingGenre == null)
{
metadataGenres.Add(genre);


@ -16,20 +16,20 @@ public static class ParserInfoHelpers
/// <param name="parsedSeries"></param>
/// <returns></returns>
public static bool SeriesHasMatchingParserInfoFormat(Series series,
Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries)
Dictionary<ParsedSeries, IList<ParserInfo>> parsedSeries)
{
var format = MangaFormat.Unknown;
foreach (var pSeries in parsedSeries.Keys)
{
var name = pSeries.Name;
var normalizedName = Parser.Parser.Normalize(name);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name);
//if (series.NameInParserInfo(pSeries.))
if (normalizedName == series.NormalizedName ||
normalizedName == Parser.Parser.Normalize(series.Name) ||
normalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name) ||
name == series.Name || name == series.LocalizedName ||
name == series.OriginalName ||
normalizedName == Parser.Parser.Normalize(series.OriginalName))
normalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName))
{
format = pSeries.Format;
if (format == series.Format)


@ -1,4 +1,5 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using API.Data;
@ -25,7 +26,7 @@ public static class PersonHelper
foreach (var name in names)
{
var normalizedName = Parser.Parser.Normalize(name);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name);
var person = allPeopleTypeRole.FirstOrDefault(p =>
p.NormalizedName.Equals(normalizedName));
if (person == null)
@ -48,7 +49,7 @@ public static class PersonHelper
/// <param name="action">Callback which will be executed for each person removed</param>
public static void RemovePeople(ICollection<Person> existingPeople, IEnumerable<string> people, PersonRole role, Action<Person> action = null)
{
var normalizedPeople = people.Select(Parser.Parser.Normalize).ToList();
var normalizedPeople = people.Select(Services.Tasks.Scanner.Parser.Parser.Normalize).ToList();
if (normalizedPeople.Count == 0)
{
var peopleToRemove = existingPeople.Where(p => p.Role == role).ToList();
@ -81,7 +82,8 @@ public static class PersonHelper
{
foreach (var person in existingPeople)
{
var existingPerson = removeAllExcept.FirstOrDefault(p => p.Role == person.Role && person.NormalizedName.Equals(p.NormalizedName));
var existingPerson = removeAllExcept
.FirstOrDefault(p => p.Role == person.Role && person.NormalizedName.Equals(p.NormalizedName));
if (existingPerson == null)
{
action?.Invoke(person);
@ -97,7 +99,22 @@ public static class PersonHelper
public static void AddPersonIfNotExists(ICollection<Person> metadataPeople, Person person)
{
var existingPerson = metadataPeople.SingleOrDefault(p =>
p.NormalizedName == Parser.Parser.Normalize(person.Name) && p.Role == person.Role);
p.NormalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(person.Name) && p.Role == person.Role);
if (existingPerson == null)
{
metadataPeople.Add(person);
}
}
/// <summary>
/// Adds the person to the list if it's not already in there
/// </summary>
/// <param name="metadataPeople"></param>
/// <param name="person"></param>
public static void AddPersonIfNotExists(BlockingCollection<Person> metadataPeople, Person person)
{
var existingPerson = metadataPeople.SingleOrDefault(p =>
p.NormalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(person.Name) && p.Role == person.Role);
if (existingPerson == null)
{
metadataPeople.Add(person);


@ -17,8 +17,8 @@ public static class SeriesHelper
public static bool FindSeries(Series series, ParsedSeries parsedInfoKey)
{
return (series.NormalizedName.Equals(parsedInfoKey.NormalizedName) ||
Parser.Parser.Normalize(series.LocalizedName).Equals(parsedInfoKey.NormalizedName) ||
Parser.Parser.Normalize(series.OriginalName).Equals(parsedInfoKey.NormalizedName))
Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName).Equals(parsedInfoKey.NormalizedName) ||
Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName).Equals(parsedInfoKey.NormalizedName))
&& (series.Format == parsedInfoKey.Format || series.Format == MangaFormat.Unknown);
}


@ -1,4 +1,5 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using API.Data;
@ -22,7 +23,7 @@ public static class TagHelper
if (string.IsNullOrEmpty(name.Trim())) continue;
var added = false;
var normalizedName = Parser.Parser.Normalize(name);
var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name);
var genre = allTags.FirstOrDefault(p =>
p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
@ -58,7 +59,17 @@ public static class TagHelper
public static void AddTagIfNotExists(ICollection<Tag> metadataTags, Tag tag)
{
var existingGenre = metadataTags.FirstOrDefault(p =>
p.NormalizedTitle == Parser.Parser.Normalize(tag.Title));
p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(tag.Title));
if (existingGenre == null)
{
metadataTags.Add(tag);
}
}
public static void AddTagIfNotExists(BlockingCollection<Tag> metadataTags, Tag tag)
{
var existingGenre = metadataTags.FirstOrDefault(p =>
p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(tag.Title));
if (existingGenre == null)
{
metadataTags.Add(tag);
@ -75,7 +86,7 @@ public static class TagHelper
/// <param name="action">Callback which will be executed for each tag removed</param>
public static void RemoveTags(ICollection<Tag> existingTags, IEnumerable<string> tags, bool isExternal, Action<Tag> action = null)
{
var normalizedTags = tags.Select(Parser.Parser.Normalize).ToList();
var normalizedTags = tags.Select(Services.Tasks.Scanner.Parser.Parser.Normalize).ToList();
foreach (var person in normalizedTags)
{
var existingTag = existingTags.FirstOrDefault(p => p.ExternalTag == isExternal && person.Equals(p.NormalizedTitle));

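The BlockingCollection overloads added here and in GenreHelper/PersonHelper above exist so multiple writers can add safely while the new scan loop processes series concurrently. A minimal sketch (parsedTagNames is hypothetical); note the FirstOrDefault existence check is not atomic with the Add, so de-duplication is best-effort under contention:

using System.Collections.Concurrent;
using System.Threading.Tasks;

var tags = new BlockingCollection<Tag>();
Parallel.ForEach(parsedTagNames, name =>
{
    // Add is thread-safe; concurrent writers are fine
    TagHelper.AddTagIfNotExists(tags, DbFactory.Tag(name, false));
});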

@ -60,7 +60,7 @@ namespace API.Services
/// <returns></returns>
public virtual ArchiveLibrary CanOpen(string archivePath)
{
if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;
if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;
var ext = _directoryService.FileSystem.Path.GetExtension(archivePath).ToUpper();
if (ext.Equals(".CBR") || ext.Equals(".RAR")) return ArchiveLibrary.SharpCompress;
@ -100,14 +100,14 @@ namespace API.Services
case ArchiveLibrary.Default:
{
using var archive = ZipFile.OpenRead(archivePath);
return archive.Entries.Count(e => !Parser.Parser.HasBlacklistedFolderInPath(e.FullName) && Parser.Parser.IsImage(e.FullName));
return archive.Entries.Count(e => !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName) && Tasks.Scanner.Parser.Parser.IsImage(e.FullName));
}
case ArchiveLibrary.SharpCompress:
{
using var archive = ArchiveFactory.Open(archivePath);
return archive.Entries.Count(entry => !entry.IsDirectory &&
!Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Parser.Parser.IsImage(entry.Key));
!Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Tasks.Scanner.Parser.Parser.IsImage(entry.Key));
}
case ArchiveLibrary.NotSupported:
_logger.LogWarning("[GetNumberOfPagesFromArchive] This archive cannot be read: {ArchivePath}. Defaulting to 0 pages", archivePath);
@ -132,24 +132,25 @@ namespace API.Services
public static string FindFolderEntry(IEnumerable<string> entryFullNames)
{
var result = entryFullNames
.Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)))
.Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)))
.OrderByNatural(Path.GetFileNameWithoutExtension)
.FirstOrDefault(Parser.Parser.IsCoverImage);
.FirstOrDefault(Tasks.Scanner.Parser.Parser.IsCoverImage);
return string.IsNullOrEmpty(result) ? null : result;
}
/// <summary>
/// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="OrderByNatural"/> for ordering files
/// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="EnumerableExtensions.OrderByNatural"/> for ordering files
/// </summary>
/// <param name="entryFullNames"></param>
/// <param name="archiveName"></param>
/// <returns>Entry name of match, null if no match</returns>
public static string? FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
{
// First check if there are any files that are not in a nested folder before just comparing by filename. This is needed
// because NaturalSortComparer does not work with paths and doesn't sort 001.jpg before chapter 1/001.jpg.
var fullNames = entryFullNames
.Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)) && Parser.Parser.IsImage(path))
.Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)) && Tasks.Scanner.Parser.Parser.IsImage(path))
.OrderByNatural(c => c.GetFullPathWithoutExtension())
.ToList();
if (fullNames.Count == 0) return null;
@ -186,7 +187,7 @@ namespace API.Services
/// <summary>
/// Generates byte array of cover image.
/// Given a path to a compressed file <see cref="Parser.Parser.ArchiveFileExtensions"/>, will ensure the first image (respects directory structure) is returned unless
/// Given a path to a compressed file <see cref="Tasks.Scanner.Parser.Parser.ArchiveFileExtensions"/>, will ensure the first image (respects directory structure) is returned unless
/// a folder/cover.(image extension) exists in the compressed file (if duplicate, the first is chosen)
///
/// This skips over any __MACOSX folder/file iteration.
@ -264,7 +265,7 @@ namespace API.Services
// Sometimes ZipArchive will list the directory and other times it will just keep it in the FullName
return archive.Entries.Count > 0 &&
!Path.HasExtension(archive.Entries.ElementAt(0).FullName) ||
archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Parser.Parser.HasBlacklistedFolderInPath(e.FullName));
archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName));
}
/// <summary>
@ -321,7 +322,7 @@ namespace API.Services
return false;
}
if (Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath)) return true;
_logger.LogWarning("Archive {ArchivePath} is not a valid archive", archivePath);
return false;
@ -330,10 +331,10 @@ namespace API.Services
private static bool ValidComicInfoArchiveEntry(string fullName, string name)
{
var filenameWithoutExtension = Path.GetFileNameWithoutExtension(name).ToLower();
return !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(fullName)
&& filenameWithoutExtension.Equals(ComicInfoFilename, StringComparison.InvariantCultureIgnoreCase)
&& !filenameWithoutExtension.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)
&& Tasks.Scanner.Parser.Parser.IsXml(name);
}
/// <summary>
@ -466,8 +467,8 @@ namespace API.Services
{
using var archive = ArchiveFactory.Open(archivePath);
ExtractArchiveEntities(archive.Entries.Where(entry => !entry.IsDirectory
&& !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Tasks.Scanner.Parser.Parser.IsImage(entry.Key)), extractPath);
break;
}
case ArchiveLibrary.NotSupported:

View File

@ -167,7 +167,7 @@ namespace API.Services
// @Import statements will be handled by the browser, so we must inline the css into the original file that requests it, so they can be Scoped
var prepend = filename.Length > 0 ? filename.Replace(Path.GetFileName(filename), string.Empty) : string.Empty;
var importBuilder = new StringBuilder();
foreach (Match match in Tasks.Scanner.Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
@ -218,7 +218,7 @@ namespace API.Services
private static void EscapeCssImportReferences(ref string stylesheetHtml, string apiBase, string prepend)
{
foreach (Match match in Tasks.Scanner.Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
@ -228,7 +228,7 @@ namespace API.Services
private static void EscapeFontFamilyReferences(ref string stylesheetHtml, string apiBase, string prepend)
{
foreach (Match match in Tasks.Scanner.Parser.Parser.FontSrcUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
@ -238,7 +238,7 @@ namespace API.Services
private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book)
{
var matches = Tasks.Scanner.Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml);
foreach (Match match in matches)
{
if (!match.Success) continue;
@ -394,7 +394,7 @@ namespace API.Services
public ComicInfo GetComicInfo(string filePath)
{
if (!IsValidFile(filePath) || Tasks.Scanner.Parser.Parser.IsPdf(filePath)) return null;
try
{
@ -425,7 +425,7 @@ namespace API.Services
var info = new ComicInfo()
{
Summary = epubBook.Schema.Package.Metadata.Description,
Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Parser.Parser.CleanAuthor(c.Creator))),
Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Tasks.Scanner.Parser.Parser.CleanAuthor(c.Creator))),
Publisher = string.Join(",", epubBook.Schema.Package.Metadata.Publishers),
Month = month,
Day = day,
@ -468,7 +468,7 @@ namespace API.Services
return false;
}
if (Tasks.Scanner.Parser.Parser.IsBook(filePath)) return true;
_logger.LogWarning("[BookService] Book {EpubFile} is not a valid EPUB/PDF", filePath);
return false;
@ -480,7 +480,7 @@ namespace API.Services
try
{
if (Tasks.Scanner.Parser.Parser.IsPdf(filePath))
{
using var docReader = DocLib.Instance.GetDocReader(filePath, new PageDimensions(1080, 1920));
return docReader.GetPageCount();
@ -536,7 +536,7 @@ namespace API.Services
/// <returns></returns>
public ParserInfo ParseInfo(string filePath)
{
if (!Tasks.Scanner.Parser.Parser.IsEpub(filePath)) return null;
try
{
@ -601,7 +601,7 @@ namespace API.Services
}
var info = new ParserInfo()
{
Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter,
Edition = string.Empty,
Format = MangaFormat.Epub,
Filename = Path.GetFileName(filePath),
@ -628,7 +628,7 @@ namespace API.Services
return new ParserInfo()
{
Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter,
Edition = string.Empty,
Format = MangaFormat.Epub,
Filename = Path.GetFileName(filePath),
@ -636,7 +636,7 @@ namespace API.Services
FullFilePath = filePath,
IsSpecial = false,
Series = epubBook.Title.Trim(),
Volumes = Tasks.Scanner.Parser.Parser.DefaultVolume,
};
}
catch (Exception ex)
@ -876,7 +876,7 @@ namespace API.Services
{
if (!IsValidFile(fileFilePath)) return string.Empty;
if (Tasks.Scanner.Parser.Parser.IsPdf(fileFilePath))
{
return GetPdfCoverImage(fileFilePath, fileName, outputDirectory);
}
@ -887,7 +887,7 @@ namespace API.Services
{
// Try to get the cover image from the OPF file; if not set, try to parse it from all the files, then fall back to the first one.
var coverImageContent = epubBook.Content.Cover
?? epubBook.Content.Images.Values.FirstOrDefault(file => Tasks.Scanner.Parser.Parser.IsCoverImage(file.FileName))
?? epubBook.Content.Images.Values.FirstOrDefault();
if (coverImageContent == null) return string.Empty;

View File

@ -51,7 +51,7 @@ public class BookmarkService : IBookmarkService
var bookmarkDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
var bookmarkFilesToDelete = bookmarks.Select(b => Tasks.Scanner.Parser.Parser.NormalizePath(
_directoryService.FileSystem.Path.Join(bookmarkDirectory,
b.FileName))).ToList();
@ -165,7 +165,7 @@ public class BookmarkService : IBookmarkService
var bookmarks = await _unitOfWork.UserRepository.GetAllBookmarksByIds(bookmarkIds.ToList());
return bookmarks
.Select(b => Tasks.Scanner.Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(bookmarkDirectory,
b.FileName)));
}

View File

@ -57,7 +57,7 @@ namespace API.Services
{
// Calculate what chapter the page belongs to
var path = GetBookmarkCachePath(seriesId);
var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions);
files = files
.AsEnumerable()
.OrderByNatural(Path.GetFileNameWithoutExtension)
@ -100,11 +100,9 @@ namespace API.Services
var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId);
var extractPath = GetCachePath(chapterId);
if (_directoryService.Exists(extractPath)) return chapter;
var files = chapter.Files.ToList();
ExtractChapterFiles(extractPath, files);
return chapter;
}
@ -215,9 +213,8 @@ namespace API.Services
{
// Calculate what chapter the page belongs to
var path = GetCachePath(chapter.Id);
// TODO: We can optimize this by extracting and renaming, so we don't need to scan for the files and can do a direct access
var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions)
.OrderByNatural(Path.GetFileNameWithoutExtension)
.ToArray();

View File

@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.IO;
using System.IO.Abstractions;
using System.Linq;
@ -9,6 +10,8 @@ using System.Threading.Tasks;
using API.DTOs.System;
using API.Entities.Enums;
using API.Extensions;
using Kavita.Common.Helpers;
using Microsoft.Extensions.FileSystemGlobbing;
using Microsoft.Extensions.Logging;
namespace API.Services
@ -57,9 +60,23 @@ namespace API.Services
void RemoveNonImages(string directoryName);
void Flatten(string directoryName);
Task<bool> CheckWriteAccess(string directoryName);
IEnumerable<string> GetFilesWithCertainExtensions(string path,
string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly);
IEnumerable<string> GetDirectories(string folderPath);
IEnumerable<string> GetDirectories(string folderPath, GlobMatcher matcher);
string GetParentDirectoryName(string fileOrFolder);
#nullable enable
IList<string> ScanFiles(string folderPath, GlobMatcher? matcher = null);
DateTime GetLastWriteTime(string folderPath);
GlobMatcher CreateMatcherFromFile(string filePath);
#nullable disable
}
public class DirectoryService : IDirectoryService
{
public const string KavitaIgnoreFile = ".kavitaignore";
public IFileSystem FileSystem { get; }
public string CacheDirectory { get; }
public string CoverImageDirectory { get; }
@ -100,12 +117,12 @@ namespace API.Services
/// <summary>
/// Given a set of regex search criteria, get files in the given path.
/// </summary>
/// <remarks>This will always exclude <see cref="Parser.Parser.MacOsMetadataFileStartsWith"/> patterns</remarks>
/// <remarks>This will always exclude <see cref="Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith"/> patterns</remarks>
/// <param name="path">Directory to search</param>
/// <param name="searchPatternExpression">Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files.</param>
/// <param name="searchOption">SearchOption to use, defaults to TopDirectoryOnly</param>
/// <returns>List of file paths</returns>
public IEnumerable<string> GetFilesWithCertainExtensions(string path,
string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
@ -114,7 +131,7 @@ namespace API.Services
return FileSystem.Directory.EnumerateFiles(path, "*", searchOption)
.Where(file =>
reSearchPattern.IsMatch(FileSystem.Path.GetExtension(file)) && !FileSystem.Path.GetFileName(file).StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith));
}
@ -191,12 +208,12 @@ namespace API.Services
{
var fileName = FileSystem.Path.GetFileName(file);
return reSearchPattern.IsMatch(fileName) &&
!fileName.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith);
});
}
return FileSystem.Directory.EnumerateFiles(path, "*", searchOption).Where(file =>
!FileSystem.Path.GetFileName(file).StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith));
}
/// <summary>
@ -480,10 +497,10 @@ namespace API.Services
{
var stopLookingForDirectories = false;
var dirs = new Dictionary<string, string>();
foreach (var folder in libraryFolders.Select(Tasks.Scanner.Parser.Parser.NormalizePath))
{
if (stopLookingForDirectories) break;
foreach (var file in filePaths.Select(Tasks.Scanner.Parser.Parser.NormalizePath))
{
if (!file.Contains(folder)) continue;
@ -496,7 +513,7 @@ namespace API.Services
break;
}
var fullPath = Tasks.Scanner.Parser.Parser.NormalizePath(Path.Join(folder, parts.Last()));
if (!dirs.ContainsKey(fullPath))
{
dirs.Add(fullPath, string.Empty);
@ -507,10 +524,161 @@ namespace API.Services
return dirs;
}
/// <summary>
/// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope.
/// </summary>
/// <param name="folderPath"></param>
/// <returns>List of directory paths, empty if path doesn't exist</returns>
public IEnumerable<string> GetDirectories(string folderPath)
{
if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray<string>.Empty;
return FileSystem.Directory.GetDirectories(folderPath)
.Where(path => ExcludeDirectories.Matches(path).Count == 0);
}
/// <summary>
/// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope.
/// </summary>
/// <param name="folderPath"></param>
/// <param name="matcher">A set of glob rules that will filter directories out</param>
/// <returns>List of directory paths, empty if path doesn't exist</returns>
public IEnumerable<string> GetDirectories(string folderPath, GlobMatcher matcher)
{
if (matcher == null) return GetDirectories(folderPath);
return GetDirectories(folderPath)
.Where(folder => !matcher.ExcludeMatches(
$"{FileSystem.DirectoryInfo.FromDirectoryName(folder).Name}{FileSystem.Path.AltDirectorySeparatorChar}"));
}
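The trailing AltDirectorySeparatorChar appended above is what lets folder-style glob rules (for example Extras/) match a directory name at all. A minimal sketch of the assumed behavior, using the GlobMatcher from Kavita.Common.Helpers seen in the imports; the rule strings are hypothetical:
var matcher = new GlobMatcher();
matcher.AddExclude("Extras/"); // hypothetical rule: skip any folder named Extras
matcher.AddExclude("*.tmp");   // file-style rule; irrelevant for directory checks
// Directory names are tested with a trailing separator, so the folder rule applies
var skipsExtras = matcher.ExcludeMatches("Extras/");    // expected: true
var keepsVolume = matcher.ExcludeMatches("Volume 01/"); // expected: false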
/// <summary>
/// Returns all directories, including subdirectories. Automatically excludes directories that shouldn't be in scope.
/// </summary>
/// <param name="folderPath"></param>
/// <returns></returns>
public IEnumerable<string> GetAllDirectories(string folderPath)
{
if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray<string>.Empty;
var directories = new List<string>();
var foundDirs = GetDirectories(folderPath);
foreach (var foundDir in foundDirs)
{
directories.Add(foundDir);
directories.AddRange(GetAllDirectories(foundDir));
}
return directories;
}
/// <summary>
/// Returns the parent directory's name for a file or folder. Returns an empty string if the path is not valid.
/// </summary>
/// <param name="fileOrFolder"></param>
/// <returns></returns>
public string GetParentDirectoryName(string fileOrFolder)
{
try
{
return Tasks.Scanner.Parser.Parser.NormalizePath(Directory.GetParent(fileOrFolder)?.FullName);
}
catch (Exception)
{
return string.Empty;
}
}
/// <summary>
/// Scans a directory by utilizing a recursive folder search. If a .kavitaignore file is found, will ignore matching patterns
/// </summary>
/// <param name="folderPath"></param>
/// <param name="matcher"></param>
/// <returns></returns>
public IList<string> ScanFiles(string folderPath, GlobMatcher? matcher = null)
{
_logger.LogDebug("[ScanFiles] called on {Path}", folderPath);
var files = new List<string>();
if (!Exists(folderPath)) return files;
var potentialIgnoreFile = FileSystem.Path.Join(folderPath, KavitaIgnoreFile);
if (matcher == null)
{
matcher = CreateMatcherFromFile(potentialIgnoreFile);
}
else
{
matcher.Merge(CreateMatcherFromFile(potentialIgnoreFile));
}
var directories = GetDirectories(folderPath, matcher);
foreach (var directory in directories)
{
files.AddRange(ScanFiles(directory, matcher));
}
// Get the matcher from either ignore or global (default setup)
if (matcher == null)
{
files.AddRange(GetFilesWithCertainExtensions(folderPath, Tasks.Scanner.Parser.Parser.SupportedExtensions));
}
else
{
var foundFiles = GetFilesWithCertainExtensions(folderPath,
Tasks.Scanner.Parser.Parser.SupportedExtensions)
.Where(file => !matcher.ExcludeMatches(FileSystem.FileInfo.FromFileName(file).Name));
files.AddRange(foundFiles);
}
return files;
}
/// <summary>
/// Recursively scans a folder and returns the max last write time on any folders and files
/// </summary>
/// <param name="folderPath"></param>
/// <returns>Max Last Write Time</returns>
public DateTime GetLastWriteTime(string folderPath)
{
if (!FileSystem.Directory.Exists(folderPath)) throw new IOException($"{folderPath} does not exist");
return Directory.GetFileSystemEntries(folderPath, "*.*", SearchOption.AllDirectories).Max(path => FileSystem.File.GetLastWriteTime(path));
}
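This feeds the scan loop's early-exit optimization: when nothing under a folder was written since the last scan, the folder can be skipped without re-parsing. A hedged sketch; lastScanned stands in for whatever timestamp the caller persists (the property name here is illustrative):
var lastScanned = library.LastScanned; // assumed persisted timestamp
if (_directoryService.GetLastWriteTime(folderPath) <= lastScanned)
{
    // No file or folder under folderPath changed; skip the expensive re-scan
    _logger.LogDebug("{FolderPath} unchanged since last scan, skipping", folderPath);
    return;
}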
/// <summary>
/// Generates a GlobMatcher from a .kavitaignore file found at path. Returns null otherwise.
/// </summary>
/// <param name="filePath"></param>
/// <returns></returns>
public GlobMatcher CreateMatcherFromFile(string filePath)
{
if (!FileSystem.File.Exists(filePath))
{
return null;
}
// Read file in and add each line to Matcher
var lines = FileSystem.File.ReadAllLines(filePath);
if (lines.Length == 0)
{
return null;
}
GlobMatcher matcher = new();
foreach (var line in lines)
{
matcher.AddExclude(line);
}
return matcher;
}
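Putting ScanFiles and CreateMatcherFromFile together: each recursion level merges the folder's own .kavitaignore into the matcher inherited from its parent, so rules stack downward. A sketch against a hypothetical library layout:
// Hypothetical layout:
//   /manga/.kavitaignore             contains: Extras/
//   /manga/Series A/chapter1.cbz
//   /manga/Series A/Extras/bonus.cbz   (excluded by the root ignore file)
//   /manga/Series B/.kavitaignore    contains: *.pdf
//   /manga/Series B/guide.pdf          (excluded only within Series B)
var files = directoryService.ScanFiles("/manga");
// expected: files == { "/manga/Series A/chapter1.cbz" }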
/// <summary>
/// Recursively scans files and applies an action on them. This uses as many cores as the underlying PC has to speed
/// up processing.
/// NOTE: This is no longer parallel due to users' machines locking up
/// </summary>
/// <param name="root">Directory to scan</param>
/// <param name="action">Action to apply on file path</param>
@ -538,18 +706,16 @@ namespace API.Services
string[] files;
try {
subDirs = GetDirectories(currentDir);
}
// Thrown if we do not have discovery permission on the directory.
catch (UnauthorizedAccessException e) {
logger.LogCritical(e, "Unauthorized access on {Directory}", currentDir);
continue;
}
// Thrown if another process has deleted the directory after we retrieved its name.
catch (DirectoryNotFoundException e) {
logger.LogCritical(e, "Directory not found on {Directory}", currentDir);
continue;
}
@ -558,15 +724,15 @@ namespace API.Services
.ToArray();
}
catch (UnauthorizedAccessException e) {
logger.LogCritical(e, "Unauthorized access on a file in {Directory}", currentDir);
continue;
}
catch (DirectoryNotFoundException e) {
logger.LogCritical(e, "Directory not found on a file in {Directory}", currentDir);
continue;
}
catch (IOException e) {
logger.LogCritical(e, "IO exception on a file in {Directory}", currentDir);
continue;
}
@ -577,19 +743,16 @@ namespace API.Services
foreach (var file in files) {
action(file);
fileCount++;
}
}
}
catch (AggregateException ae) {
ae.Handle((ex) => {
if (ex is not UnauthorizedAccessException) return false;
// Here we just output a message and go on.
_logger.LogError(ex, "Unauthorized access on file");
return true;
// Handle other exceptions here if necessary...
});
}
@ -682,7 +845,7 @@ namespace API.Services
/// <param name="directoryName">Fully qualified directory</param>
public void RemoveNonImages(string directoryName)
{
DeleteFiles(GetFiles(directoryName, searchOption:SearchOption.AllDirectories).Where(file => !Tasks.Scanner.Parser.Parser.IsImage(file)));
}
@ -755,9 +918,9 @@ namespace API.Services
foreach (var file in directory.EnumerateFiles().OrderByNatural(file => file.FullName))
{
if (file.Directory == null) continue;
var paddedIndex = Tasks.Scanner.Parser.Parser.PadZeros(directoryIndex + "");
// We need to rename the files so that after flattening, they are in the order we found them
var newName = $"{paddedIndex}_{Parser.Parser.PadZeros(fileIndex + "")}{file.Extension}";
var newName = $"{paddedIndex}_{Tasks.Scanner.Parser.Parser.PadZeros(fileIndex + "")}{file.Extension}";
var newPath = Path.Join(root.FullName, newName);
if (!File.Exists(newPath)) file.MoveTo(newPath);
fileIndex++;
@ -769,7 +932,7 @@ namespace API.Services
foreach (var subDirectory in directory.EnumerateDirectories().OrderByNatural(d => d.FullName))
{
// We need to check if the directory is not a blacklisted (ie __MACOSX)
if (Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(subDirectory.FullName)) continue;
FlattenDirectory(root, subDirectory, ref directoryIndex);
}

View File

@ -82,8 +82,15 @@ public class EmailService : IEmailService
public async Task<bool> CheckIfAccessible(string host)
{
// This is the only exception for using the default because we need an external service to check if the server is accessible for emails
try
{
if (IsLocalIpAddress(host)) return false;
return await SendEmailWithGet(DefaultApiUrl + "/api/email/reachable?host=" + host);
}
catch (Exception)
{
return false;
}
}
public async Task<bool> SendMigrationEmail(EmailMigrationDto data)

View File

@ -1,6 +1,8 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using API.Data;
using API.Services.Tasks.Scanner;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
@ -23,6 +25,8 @@ namespace API.Services.HostedServices
await taskScheduler.ScheduleTasks();
taskScheduler.ScheduleUpdaterTasks();
try
{
// These methods will automatically check if stat collection is disabled to prevent sending any data regardless
@ -34,6 +38,21 @@ namespace API.Services.HostedServices
{
// If stats startup fails, the user can keep using the app
}
try
{
var unitOfWork = scope.ServiceProvider.GetRequiredService<IUnitOfWork>();
if ((await unitOfWork.SettingsRepository.GetSettingsDtoAsync()).EnableFolderWatching)
{
var libraryWatcher = scope.ServiceProvider.GetRequiredService<ILibraryWatcher>();
await libraryWatcher.StartWatching();
}
}
catch (Exception)
{
// Fail silently
}
}
public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;

View File

@ -63,7 +63,7 @@ public class ImageService : IImageService
else
{
_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(fileFilePath), targetDirectory,
Tasks.Scanner.Parser.Parser.ImageFileExtensions);
}
}

View File

@ -36,11 +36,15 @@ public interface IMetadataService
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true);
Task GenerateCoversForSeries(Series series, bool forceUpdate = false);
Task RemoveAbandonedMetadataKeys();
}
public class MetadataService : IMetadataService
{
public const string Name = "MetadataService";
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<MetadataService> _logger;
private readonly IEventHub _eventHub;
@ -77,9 +81,7 @@ public class MetadataService : IMetadataService
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile.FilePath);
chapter.CoverImage = _readingItemService.GetCoverImage(firstFile.FilePath, ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId), firstFile.Format);
_unitOfWork.ChapterRepository.Update(chapter);
_updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter));
return Task.FromResult(true);
}
@ -110,7 +112,6 @@ public class MetadataService : IMetadataService
if (firstChapter == null) return Task.FromResult(false);
volume.CoverImage = firstChapter.CoverImage;
_updateEvents.Add(MessageFactory.CoverUpdateEvent(volume.Id, MessageFactoryEntityTypes.Volume));
return Task.FromResult(true);
@ -147,7 +148,6 @@ public class MetadataService : IMetadataService
}
}
series.CoverImage = firstCover?.CoverImage ?? coverImage;
_updateEvents.Add(MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series));
return Task.CompletedTask;
}
@ -160,7 +160,7 @@ public class MetadataService : IMetadataService
/// <param name="forceUpdate"></param>
private async Task ProcessSeriesCoverGen(Series series, bool forceUpdate)
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
_logger.LogDebug("[MetadataService] Processing cover image generation for series: {SeriesName}", series.OriginalName);
try
{
var volumeIndex = 0;
@ -194,7 +194,7 @@ public class MetadataService : IMetadataService
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during updating metadata for {SeriesName} ", series.Name);
_logger.LogError(ex, "[MetadataService] There was an exception during cover generation for {SeriesName} ", series.Name);
}
}
@ -210,14 +210,14 @@ public class MetadataService : IMetadataService
public async Task GenerateCoversForLibrary(int libraryId, bool forceUpdate = false)
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name);
_logger.LogInformation("[MetadataService] Beginning cover generation refresh of {LibraryName}", library.Name);
_updateEvents.Clear();
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName} for cover generation. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.CoverUpdateProgressEvent(library.Id, 0F, ProgressEventType.Started, $"Starting {library.Name}"));
@ -228,7 +228,7 @@ public class MetadataService : IMetadataService
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}",
_logger.LogDebug("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd})",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
@ -254,7 +254,7 @@ public class MetadataService : IMetadataService
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during metadata refresh for {SeriesName}", series.Name);
_logger.LogError(ex, "[MetadataService] There was an exception during cover generation refresh for {SeriesName}", series.Name);
}
seriesIndex++;
}
@ -271,17 +271,18 @@ public class MetadataService : IMetadataService
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.CoverUpdateProgressEvent(library.Id, 1F, ProgressEventType.Ended, $"Complete"));
await RemoveAbandonedMetadataKeys();
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
_logger.LogInformation("[MetadataService] Updated covers for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
}
public async Task RemoveAbandonedMetadataKeys()
{
await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated();
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
}
/// <summary>
@ -292,7 +293,6 @@ public class MetadataService : IMetadataService
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
public async Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true)
{
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
if (series == null)
{
@ -300,8 +300,19 @@ public class MetadataService : IMetadataService
return;
}
await GenerateCoversForSeries(series, forceUpdate);
}
/// <summary>
/// Generate Cover for a Series. This is used by Scan Loop and should not be invoked directly via User Interaction.
/// </summary>
/// <param name="series">A full Series, with metadata, chapters, etc</param>
/// <param name="forceUpdate"></param>
public async Task GenerateCoversForSeries(Series series, bool forceUpdate = false)
{
var sw = Stopwatch.StartNew();
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 0F, ProgressEventType.Started, series.Name));
await ProcessSeriesCoverGen(series, forceUpdate);
@ -309,17 +320,14 @@ public class MetadataService : IMetadataService
if (_unitOfWork.HasChanges())
{
await _unitOfWork.CommitAsync();
_logger.LogInformation("[MetadataService] Updated covers for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
}
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 1F, ProgressEventType.Ended, series.Name));
await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series), false);
await FlushEvents();
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
}
private async Task FlushEvents()

View File

@ -59,7 +59,7 @@ public class ReaderService : IReaderService
public static string FormatBookmarkFolderPath(string baseDirectory, int userId, int seriesId, int chapterId)
{
return Tasks.Scanner.Parser.Parser.NormalizePath(Path.Join(baseDirectory, $"{userId}", $"{seriesId}", $"{chapterId}"));
}
/// <summary>
@ -496,7 +496,7 @@ public class ReaderService : IReaderService
{
var chapters = volume.Chapters
.OrderBy(c => float.Parse(c.Number))
.Where(c => !c.IsSpecial && Tasks.Scanner.Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber);
await MarkChaptersAsRead(user, volume.SeriesId, chapters);
}
}

View File

@ -12,6 +12,7 @@ public interface IReadingItemService
string GetCoverImage(string filePath, string fileName, MangaFormat format);
void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1);
ParserInfo Parse(string path, string rootPath, LibraryType type);
ParserInfo ParseFile(string path, string rootPath, LibraryType type);
}
public class ReadingItemService : IReadingItemService
@ -20,7 +21,7 @@ public class ReadingItemService : IReadingItemService
private readonly IBookService _bookService;
private readonly IImageService _imageService;
private readonly IDirectoryService _directoryService;
private readonly IDefaultParser _defaultParser;
public ReadingItemService(IArchiveService archiveService, IBookService bookService, IImageService imageService, IDirectoryService directoryService)
{
@ -39,12 +40,12 @@ public class ReadingItemService : IReadingItemService
/// <returns></returns>
public ComicInfo? GetComicInfo(string filePath)
{
if (Tasks.Scanner.Parser.Parser.IsEpub(filePath))
{
return _bookService.GetComicInfo(filePath);
}
if (Tasks.Scanner.Parser.Parser.IsComicInfoExtension(filePath))
{
return _archiveService.GetComicInfo(filePath);
}
@ -52,6 +53,71 @@ public class ReadingItemService : IReadingItemService
return null;
}
/// <summary>
/// Processes files found during a library scan.
/// </summary>
/// <param name="path">Path of a file</param>
/// <param name="rootPath"></param>
/// <param name="type">Library type to determine parsing to perform</param>
public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
{
var info = Parse(path, rootPath, type);
if (info == null)
{
return null;
}
// This catches when original library type is Manga/Comic and when parsing with non
if (Tasks.Scanner.Parser.Parser.IsEpub(path) && Tasks.Scanner.Parser.Parser.ParseVolume(info.Series) != Tasks.Scanner.Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume?
{
info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
var info2 = Parse(path, rootPath, type);
info.Merge(info2);
}
info.ComicInfo = GetComicInfo(path);
if (info.ComicInfo == null) return info;
if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
{
info.Volumes = info.ComicInfo.Volume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Series))
{
info.Series = info.ComicInfo.Series.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.Number))
{
info.Chapters = info.ComicInfo.Number;
}
// Patch in SeriesSort from ComicInfo
if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort))
{
info.SeriesSort = info.ComicInfo.TitleSort.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Tasks.Scanner.Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format))
{
info.IsSpecial = true;
info.Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter;
info.Volumes = Tasks.Scanner.Parser.Parser.DefaultVolume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort))
{
info.SeriesSort = info.ComicInfo.SeriesSort.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries))
{
info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim();
}
return info;
}
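The net effect: non-empty ComicInfo fields override what was parsed from the filename, and a Format flagged as special resets chapters and volumes to their defaults. A hedged walk-through with made-up values:
// Filename parse: Series = "My Series", Volumes = "1", Chapters = "3"
// ComicInfo.xml:  Series = "My Series: Remastered", Volume = "2", Number = "" (empty)
var info = readingItemService.ParseFile("/manga/My Series/My Series v01 c03.cbz", "/manga", LibraryType.Manga);
// info.Series   == "My Series: Remastered"  (ComicInfo wins)
// info.Volumes  == "2"                      (ComicInfo wins)
// info.Chapters == "3"                      (Number was empty, so the filename value survives)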
/// <summary>
///
/// </summary>
@ -134,6 +200,6 @@ public class ReadingItemService : IReadingItemService
/// <returns></returns>
public ParserInfo Parse(string path, string rootPath, LibraryType type)
{
return Tasks.Scanner.Parser.Parser.IsEpub(path) ? _bookService.ParseInfo(path) : _defaultParser.Parse(path, rootPath, type);
}
}

View File

@ -0,0 +1,182 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Data.Repositories;
using API.DTOs.ReadingLists;
using API.Entities;
using Microsoft.Extensions.Logging;
namespace API.Services;
public interface IReadingListService
{
Task<bool> RemoveFullyReadItems(int readingListId, AppUser user);
Task<bool> UpdateReadingListItemPosition(UpdateReadingListPosition dto);
Task<bool> DeleteReadingListItem(UpdateReadingListPosition dto);
Task<AppUser?> UserHasReadingListAccess(int readingListId, string username);
Task<bool> DeleteReadingList(int readingListId, AppUser user);
Task<bool> AddChaptersToReadingList(int seriesId, IList<int> chapterIds,
ReadingList readingList);
}
/// <summary>
/// Methods responsible for management of Reading Lists
/// </summary>
/// <remarks>If called from API layer, expected for <see cref="UserHasReadingListAccess"/> to be called beforehand</remarks>
public class ReadingListService : IReadingListService
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<ReadingListService> _logger;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public ReadingListService(IUnitOfWork unitOfWork, ILogger<ReadingListService> logger)
{
_unitOfWork = unitOfWork;
_logger = logger;
}
/// <summary>
/// Removes all entries that are fully read from the reading list
/// </summary>
/// <remarks>If called from API layer, expected for <see cref="UserHasReadingListAccess"/> to be called beforehand</remarks>
/// <param name="readingListId">Reading List Id</param>
/// <param name="user">User</param>
/// <returns></returns>
public async Task<bool> RemoveFullyReadItems(int readingListId, AppUser user)
{
var items = await _unitOfWork.ReadingListRepository.GetReadingListItemDtosByIdAsync(readingListId, user.Id);
items = await _unitOfWork.ReadingListRepository.AddReadingProgressModifiers(user.Id, items.ToList());
// Collect all Ids to remove
var itemIdsToRemove = items.Where(item => item.PagesRead == item.PagesTotal).Select(item => item.Id);
try
{
var listItems =
(await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(readingListId)).Where(r =>
itemIdsToRemove.Contains(r.Id));
_unitOfWork.ReadingListRepository.BulkRemove(listItems);
if (!_unitOfWork.HasChanges()) return true;
await _unitOfWork.CommitAsync();
return true;
}
catch
{
await _unitOfWork.RollbackAsync();
}
return false;
}
/// <summary>
/// Updates a reading list item from one position to another. This will cause items at that position to be pushed one index.
/// </summary>
/// <param name="dto"></param>
/// <returns></returns>
public async Task<bool> UpdateReadingListItemPosition(UpdateReadingListPosition dto)
{
var items = (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(dto.ReadingListId)).ToList();
var item = items.Find(r => r.Id == dto.ReadingListItemId);
items.Remove(item);
items.Insert(dto.ToPosition, item);
for (var i = 0; i < items.Count; i++)
{
items[i].Order = i;
}
if (!_unitOfWork.HasChanges()) return true;
return await _unitOfWork.CommitAsync();
}
public async Task<bool> DeleteReadingListItem(UpdateReadingListPosition dto)
{
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId);
readingList.Items = readingList.Items.Where(r => r.Id != dto.ReadingListItemId).ToList();
var index = 0;
foreach (var readingListItem in readingList.Items)
{
readingListItem.Order = index;
index++;
}
if (!_unitOfWork.HasChanges()) return true;
return await _unitOfWork.CommitAsync();
}
/// <summary>
/// Validates the user has access to the reading list to perform actions on it
/// </summary>
/// <param name="readingListId"></param>
/// <param name="username"></param>
/// <returns></returns>
public async Task<AppUser?> UserHasReadingListAccess(int readingListId, string username)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username,
AppUserIncludes.ReadingListsWithItems);
if (user.ReadingLists.SingleOrDefault(rl => rl.Id == readingListId) == null && !await _unitOfWork.UserRepository.IsUserAdminAsync(user))
{
return null;
}
return user;
}
/// <summary>
/// Removes the Reading List from Kavita
/// </summary>
/// <param name="readingListId"></param>
/// <param name="user">User should have ReadingLists populated</param>
/// <returns></returns>
public async Task<bool> DeleteReadingList(int readingListId, AppUser user)
{
var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(readingListId);
user.ReadingLists.Remove(readingList);
if (!_unitOfWork.HasChanges()) return true;
return await _unitOfWork.CommitAsync();
}
/// <summary>
/// Adds a list of Chapters as reading list items to the passed reading list.
/// </summary>
/// <param name="seriesId"></param>
/// <param name="chapterIds"></param>
/// <param name="readingList"></param>
/// <returns>True if new chapters were added</returns>
public async Task<bool> AddChaptersToReadingList(int seriesId, IList<int> chapterIds, ReadingList readingList)
{
readingList.Items ??= new List<ReadingListItem>();
var lastOrder = 0;
if (readingList.Items.Any())
{
lastOrder = readingList.Items.DefaultIfEmpty().Max(rli => rli.Order);
}
var existingChapterExists = readingList.Items.Select(rli => rli.ChapterId).ToHashSet();
var chaptersForSeries = (await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds))
.OrderBy(c => Tasks.Scanner.Parser.Parser.MinNumberFromRange(c.Volume.Name))
.ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting);
var index = lastOrder + 1;
foreach (var chapter in chaptersForSeries)
{
if (existingChapterExists.Contains(chapter.Id)) continue;
readingList.Items.Add(DbFactory.ReadingListItem(index, seriesId, chapter.VolumeId, chapter.Id));
index += 1;
}
return index > lastOrder + 1;
}
}
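The return value signals whether anything new was appended: index only moves past lastOrder + 1 when at least one chapter was not already on the list. An illustrative call, with made-up ids:
var added = await readingListService.AddChaptersToReadingList(seriesId: 5,
    chapterIds: new List<int> { 10, 11 }, readingList);
// added == false when both chapters were already present
if (added && unitOfWork.HasChanges()) await unitOfWork.CommitAsync();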

View File

@ -8,7 +8,6 @@ using API.Data;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Metadata;
using API.DTOs.Reader;
using API.DTOs.SeriesDetail;
using API.Entities;
using API.Entities.Enums;
@ -51,8 +50,8 @@ public class SeriesService : ISeriesService
/// <returns></returns>
public static Chapter GetFirstChapterForMetadata(Series series, bool isBookLibrary)
{
return series.Volumes.OrderBy(v => v.Number, ChapterSortComparer.Default)
.SelectMany(v => v.Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default))
.FirstOrDefault();
}
@ -255,7 +254,7 @@ public class SeriesService : ISeriesService
// At this point, all tags that aren't in dto have been removed.
foreach (var tagTitle in tags.Select(t => t.Title))
{
var normalizedTitle = Tasks.Scanner.Parser.Parser.Normalize(tagTitle);
var existingTag = allTags.SingleOrDefault(t => t.NormalizedTitle == normalizedTitle);
if (existingTag != null)
{
@ -296,7 +295,7 @@ public class SeriesService : ISeriesService
// At this point, all tags that aren't in dto have been removed.
foreach (var tagTitle in tags.Select(t => t.Title))
{
var normalizedTitle = Tasks.Scanner.Parser.Parser.Normalize(tagTitle);
var existingTag = allTags.SingleOrDefault(t => t.NormalizedTitle.Equals(normalizedTitle));
if (existingTag != null)
{
@ -422,8 +421,17 @@ public class SeriesService : ISeriesService
}
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(seriesIds);
var libraryIds = series.Select(s => s.LibraryId);
var libraries = await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(libraryIds);
foreach (var library in libraries)
{
library.LastModified = DateTime.Now;
_unitOfWork.LibraryRepository.Update(library);
}
_unitOfWork.SeriesRepository.Remove(series);
if (!_unitOfWork.HasChanges() || !await _unitOfWork.CommitAsync()) return true;
foreach (var s in series)
@ -457,7 +465,7 @@ public class SeriesService : ISeriesService
var libraryType = await _unitOfWork.LibraryRepository.GetLibraryTypeAsync(series.LibraryId);
var volumes = (await _unitOfWork.VolumeRepository.GetVolumesDtoAsync(seriesId, userId))
.OrderBy(v => Tasks.Scanner.Parser.Parser.MinNumberFromRange(v.Name))
.ToList();
// For books, the Name of the Volume is remapped to the actual name of the book, rather than Volume number.
@ -485,7 +493,7 @@ public class SeriesService : ISeriesService
if (v.Number == 0) return c;
c.VolumeTitle = v.Name;
return c;
}).OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default)).ToList();
foreach (var chapter in chapters)
{
@ -510,7 +518,13 @@ public class SeriesService : ISeriesService
var storylineChapters = volumes
.Where(v => v.Number == 0)
.SelectMany(v => v.Chapters.Where(c => !c.IsSpecial))
.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default)
.ToList();
// When there are chapters without a volume number, revert to chapter-only sorting as opposed to volume then chapter
if (storylineChapters.Any()) {
retChapters = retChapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default);
}
return new SeriesDetailDto()
{
@ -528,7 +542,7 @@ public class SeriesService : ISeriesService
/// <returns></returns>
private static bool ShouldIncludeChapter(ChapterDto chapter)
{
return !chapter.IsSpecial && !chapter.Number.Equals(Tasks.Scanner.Parser.Parser.DefaultChapter);
}
public static void RenameVolumeName(ChapterDto firstChapter, VolumeDto volume, LibraryType libraryType)
@ -537,7 +551,7 @@ public class SeriesService : ISeriesService
{
if (string.IsNullOrEmpty(firstChapter.TitleName))
{
if (firstChapter.Range.Equals(Tasks.Scanner.Parser.Parser.DefaultVolume)) return;
var title = Path.GetFileNameWithoutExtension(firstChapter.Range);
if (string.IsNullOrEmpty(title)) return;
volume.Name += $" - {title}";
@ -558,7 +572,7 @@ public class SeriesService : ISeriesService
{
if (isSpecial)
{
return Tasks.Scanner.Parser.Parser.CleanSpecialTitle(chapterTitle);
}
var hashSpot = withHash ? "#" : string.Empty;

View File

@ -8,8 +8,8 @@ using API.Entities.Enums;
using API.Helpers.Converters;
using API.Services.Tasks;
using API.Services.Tasks.Metadata;
using API.Services.Tasks.Scanner;
using Hangfire;
using Hangfire.Storage;
using Microsoft.Extensions.Logging;
namespace API.Services;
@ -19,7 +19,8 @@ public interface ITaskScheduler
Task ScheduleTasks();
Task ScheduleStatsTasks();
void ScheduleUpdaterTasks();
void ScanFolder(string folderPath);
void ScanLibrary(int libraryId, bool force = false);
void CleanupChapters(int[] chapterIds);
void RefreshMetadata(int libraryId, bool forceUpdate = true);
void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false);
@ -29,8 +30,6 @@ public interface ITaskScheduler
void CancelStatsTasks();
Task RunStatCollection();
void ScanSiteThemes();
}
public class TaskScheduler : ITaskScheduler
{
@ -48,6 +47,12 @@ public class TaskScheduler : ITaskScheduler
private readonly IWordCountAnalyzerService _wordCountAnalyzerService;
public static BackgroundJobServer Client => new BackgroundJobServer();
public const string ScanQueue = "scan";
public const string DefaultQueue = "default";
public static readonly IList<string> ScanTasks = new List<string>()
{"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"};
private static readonly Random Rnd = new Random();
@ -83,7 +88,7 @@ public class TaskScheduler : ITaskScheduler
}
else
{
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
RecurringJob.AddOrUpdate("scan-libraries", () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
}
setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value;
@ -149,6 +154,7 @@ public class TaskScheduler : ITaskScheduler
BackgroundJob.Enqueue(() => _themeService.Scan());
}
#endregion
#region UpdateTasks
@ -159,17 +165,44 @@ public class TaskScheduler : ITaskScheduler
// Schedule update check between noon and 6pm local time
RecurringJob.AddOrUpdate("check-updates", () => CheckForUpdate(), Cron.Daily(Rnd.Next(12, 18)), TimeZoneInfo.Local);
}
public void ScanFolder(string folderPath)
{
_scannerService.ScanFolder(Tasks.Scanner.Parser.Parser.NormalizePath(folderPath));
}
#endregion
public void ScanLibraries()
{
if (RunningAnyTasksByMethod(ScanTasks, ScanQueue))
{
_logger.LogInformation("A Scan is already running, rescheduling ScanLibraries in 3 hours");
BackgroundJob.Schedule(() => ScanLibraries(), TimeSpan.FromHours(3));
return;
}
_scannerService.ScanLibraries();
}
public void ScanLibrary(int libraryId, bool force = false)
{
var alreadyEnqueued =
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) ||
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue);
if (alreadyEnqueued)
{
_logger.LogInformation("A duplicate request to scan library for library occured. Skipping");
return;
}
if (RunningAnyTasksByMethod(ScanTasks, ScanQueue))
{
_logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours");
BackgroundJob.Schedule(() => ScanLibrary(libraryId, force), TimeSpan.FromHours(3));
return;
}
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId, force));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory());
}
@ -181,7 +214,11 @@ public class TaskScheduler : ITaskScheduler
public void RefreshMetadata(int libraryId, bool forceUpdate = true)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate}))
var alreadyEnqueued = HasAlreadyEnqueuedTask(MetadataService.Name, "GenerateCoversForLibrary",
new object[] {libraryId, true}) ||
HasAlreadyEnqueuedTask("MetadataService", "GenerateCoversForLibrary",
new object[] {libraryId, false});
if (alreadyEnqueued)
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
@ -193,7 +230,7 @@ public class TaskScheduler : ITaskScheduler
public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate}))
if (HasAlreadyEnqueuedTask(MetadataService.Name,"GenerateCoversForSeries", new object[] {libraryId, seriesId, forceUpdate}))
{
_logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping");
return;
@ -205,14 +242,20 @@ public class TaskScheduler : ITaskScheduler
public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
if (HasAlreadyEnqueuedTask(ScannerService.Name, "ScanSeries", new object[] {seriesId, forceUpdate}, ScanQueue))
{
_logger.LogInformation("A duplicate request to scan series occured. Skipping");
return;
}
if (RunningAnyTasksByMethod(ScanTasks, ScanQueue))
{
_logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 minutes");
BackgroundJob.Schedule(() => ScanSeries(libraryId, seriesId, forceUpdate), TimeSpan.FromMinutes(10));
return;
}
_logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _scannerService.ScanSeries(seriesId, forceUpdate));
}
public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false)
@ -242,6 +285,13 @@ public class TaskScheduler : ITaskScheduler
await _versionUpdaterService.PushUpdate(update);
}
public static bool HasScanTaskRunningForLibrary(int libraryId)
{
return
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) ||
HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue);
}
/// <summary>
/// Checks if this same invocation is already enqueued
/// </summary>
@ -250,7 +300,7 @@ public class TaskScheduler : ITaskScheduler
/// <param name="args">object[] of arguments in the order they are passed to enqueued job</param>
/// <param name="queue">Queue to check against. Defaults to "default"</param>
/// <returns></returns>
public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue)
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
@ -258,4 +308,11 @@ public class TaskScheduler : ITaskScheduler
j.Value.Job.Method.Name.Equals(methodName) &&
j.Value.Job.Method.DeclaringType.Name.Equals(className));
}
public static bool RunningAnyTasksByMethod(IEnumerable<string> classNames, string queue = DefaultQueue)
{
var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
return enqueuedJobs.Any(j => !j.Value.InEnqueuedState &&
classNames.Contains(j.Value.Job.Method.DeclaringType?.Name));
}
}
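Both helpers lean on Hangfire's monitoring API: HasAlreadyEnqueuedTask deduplicates an identical pending invocation, while RunningAnyTasksByMethod treats any job on the queue that has left the enqueued state as currently running. A sketch of the intended call pattern, mirroring ScanLibrary above (argument values are illustrative):
// Skip when the exact same call is already waiting on the scan queue
if (TaskScheduler.HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary",
        new object[] {libraryId, force}, TaskScheduler.ScanQueue)) return;
// Defer when any scan-family task is mid-flight, so scans never overlap
if (TaskScheduler.RunningAnyTasksByMethod(TaskScheduler.ScanTasks, TaskScheduler.ScanQueue))
{
    BackgroundJob.Schedule(() => ScanLibrary(libraryId, force), TimeSpan.FromHours(3));
    return;
}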

View File

@ -20,6 +20,7 @@ namespace API.Services.Tasks
Task DeleteChapterCoverImages();
Task DeleteTagCoverImages();
Task CleanupBackups();
void CleanupTemp();
}
/// <summary>
/// Cleans up after operations on a recurring basis
@ -127,16 +128,18 @@ namespace API.Services.Tasks
}
/// <summary>
/// Removes all files and directories in the cache and temp directory
/// </summary>
public void CleanupCacheDirectory()
{
_logger.LogInformation("Performing cleanup of Cache directory");
_directoryService.ExistOrCreate(_directoryService.CacheDirectory);
_directoryService.ExistOrCreate(_directoryService.TempDirectory);
try
{
_directoryService.ClearDirectory(_directoryService.CacheDirectory);
_directoryService.ClearDirectory(_directoryService.TempDirectory);
}
catch (Exception ex)
{
@ -175,5 +178,22 @@ namespace API.Services.Tasks
}
_logger.LogInformation("Finished cleanup of Database backups at {Time}", DateTime.Now);
}
public void CleanupTemp()
{
_logger.LogInformation("Performing cleanup of Temp directory");
_directoryService.ExistOrCreate(_directoryService.TempDirectory);
try
{
_directoryService.ClearDirectory(_directoryService.TempDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Temp directory purged");
}
}
}

View File

@ -142,7 +142,8 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService
_logger.LogInformation("[WordCountAnalyzerService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
}
public async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true)
{
var isEpub = series.Format == MangaFormat.Epub;
var existingWordCount = series.WordCount;
@ -208,6 +209,11 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService
chapter.MinHoursToRead = est.MinHours;
chapter.MaxHoursToRead = est.MaxHours;
chapter.AvgHoursToRead = est.AvgHours;
foreach (var file in chapter.Files)
{
file.LastFileAnalysis = DateTime.Now;
_unitOfWork.MangaFileRepository.Update(file);
}
_unitOfWork.ChapterRepository.Update(chapter);
}

View File

@ -0,0 +1,250 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using Hangfire;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner;
/// <summary>
/// Change information
/// </summary>
public class Change
{
/// <summary>
/// Gets or sets the type of the change.
/// </summary>
/// <value>
/// The type of the change.
/// </value>
public WatcherChangeTypes ChangeType { get; set; }
/// <summary>
/// Gets or sets the full path.
/// </summary>
/// <value>
/// The full path.
/// </value>
public string FullPath { get; set; }
/// <summary>
/// Gets or sets the name.
/// </summary>
/// <value>
/// The name.
/// </value>
public string Name { get; set; }
/// <summary>
/// Gets or sets the old full path.
/// </summary>
/// <value>
/// The old full path.
/// </value>
public string OldFullPath { get; set; }
/// <summary>
/// Gets or sets the old name.
/// </summary>
/// <value>
/// The old name.
/// </value>
public string OldName { get; set; }
}
public interface ILibraryWatcher
{
/// <summary>
/// Start watching all library folders
/// </summary>
/// <returns></returns>
Task StartWatching();
/// <summary>
/// Stop watching all folders
/// </summary>
void StopWatching();
/// <summary>
/// Essentially stops then starts watching. Useful if there is a change in folders or libraries
/// </summary>
/// <returns></returns>
Task RestartWatching();
}
/// <summary>
/// Responsible for watching the file system and processing change events. This is mainly responsible for invoking
/// Scanner to quickly pick up on changes.
/// </summary>
public class LibraryWatcher : ILibraryWatcher
{
private readonly IDirectoryService _directoryService;
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<LibraryWatcher> _logger;
private readonly IScannerService _scannerService;
private readonly Dictionary<string, IList<FileSystemWatcher>> _watcherDictionary = new ();
/// <summary>
/// This is just here to prevent GC from Disposing our watchers
/// </summary>
private readonly IList<FileSystemWatcher> _fileWatchers = new List<FileSystemWatcher>();
private IList<string> _libraryFolders = new List<string>();
private readonly TimeSpan _queueWaitTime;
public LibraryWatcher(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger<LibraryWatcher> logger, IScannerService scannerService, IHostEnvironment environment)
{
_directoryService = directoryService;
_unitOfWork = unitOfWork;
_logger = logger;
_scannerService = scannerService;
_queueWaitTime = environment.IsDevelopment() ? TimeSpan.FromSeconds(30) : TimeSpan.FromMinutes(5);
}
public async Task StartWatching()
{
_logger.LogInformation("Starting file watchers");
_libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
.SelectMany(l => l.Folders)
.Distinct()
.Select(Parser.Parser.NormalizePath)
.Where(_directoryService.Exists)
.ToList();
foreach (var libraryFolder in _libraryFolders)
{
_logger.LogDebug("Watching {FolderPath}", libraryFolder);
var watcher = new FileSystemWatcher(libraryFolder);
watcher.Changed += OnChanged;
watcher.Created += OnCreated;
watcher.Deleted += OnDeleted;
watcher.Error += OnError;
watcher.Filter = "*.*";
watcher.IncludeSubdirectories = true;
watcher.EnableRaisingEvents = true;
_fileWatchers.Add(watcher);
if (!_watcherDictionary.ContainsKey(libraryFolder))
{
_watcherDictionary.Add(libraryFolder, new List<FileSystemWatcher>());
}
_watcherDictionary[libraryFolder].Add(watcher);
}
}
public void StopWatching()
{
_logger.LogInformation("Stopping watching folders");
foreach (var fileSystemWatcher in _watcherDictionary.Values.SelectMany(watcher => watcher))
{
fileSystemWatcher.EnableRaisingEvents = false;
fileSystemWatcher.Changed -= OnChanged;
fileSystemWatcher.Created -= OnCreated;
fileSystemWatcher.Deleted -= OnDeleted;
fileSystemWatcher.Dispose();
}
_fileWatchers.Clear();
_watcherDictionary.Clear();
}
public async Task RestartWatching()
{
StopWatching();
await StartWatching();
}
private void OnChanged(object sender, FileSystemEventArgs e)
{
if (e.ChangeType != WatcherChangeTypes.Changed) return;
_logger.LogDebug("[LibraryWatcher] Changed: {FullPath}, {Name}", e.FullPath, e.Name);
ProcessChange(e.FullPath, string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name)));
}
private void OnCreated(object sender, FileSystemEventArgs e)
{
_logger.LogDebug("[LibraryWatcher] Created: {FullPath}, {Name}", e.FullPath, e.Name);
ProcessChange(e.FullPath, !_directoryService.FileSystem.File.Exists(e.Name));
}
/// <summary>
/// From testing, OnDeleted only needs to pass the event through when a folder is deleted. If a file is deleted, OnChanged handles it automatically.
/// </summary>
/// <param name="sender"></param>
/// <param name="e"></param>
private void OnDeleted(object sender, FileSystemEventArgs e) {
var isDirectory = string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name));
if (!isDirectory) return;
_logger.LogDebug("[LibraryWatcher] Deleted: {FullPath}, {Name}", e.FullPath, e.Name);
ProcessChange(e.FullPath, true);
}
private void OnError(object sender, ErrorEventArgs e)
{
_logger.LogError(e.GetException(), "[LibraryWatcher] An error occurred, likely too many watch events occurred at once. Restarting watchers");
Task.Run(RestartWatching);
}
/// <summary>
/// Processes the file or folder change. If the change is a file change and not from a supported extension, it will be ignored.
/// </summary>
/// <remarks>This will ignore image files that are added to the system. However, they may still trigger scans due to folder changes.</remarks>
/// <param name="filePath">File or folder that changed</param>
/// <param name="isDirectoryChange">If the change is on a directory and not a file</param>
private void ProcessChange(string filePath, bool isDirectoryChange = false)
{
var sw = Stopwatch.StartNew();
try
{
// We need to check whether this change is for a directory or a file
if (!isDirectoryChange &&
!(Parser.Parser.IsArchive(filePath) || Parser.Parser.IsBook(filePath))) return;
var parentDirectory = _directoryService.GetParentDirectoryName(filePath);
if (string.IsNullOrEmpty(parentDirectory)) return;
// We need to find the library this creation belongs to
// Multiple libraries can point to the same base folder. In this case, we need to use FirstOrDefault
var libraryFolder = _libraryFolders.FirstOrDefault(f => parentDirectory.Contains(f));
if (string.IsNullOrEmpty(libraryFolder)) return;
var rootFolder = _directoryService.GetFoldersTillRoot(libraryFolder, filePath).ToList();
if (!rootFolder.Any()) return;
// Select the first folder and join with library folder, this should give us the folder to scan.
var fullPath =
Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(libraryFolder, rootFolder.First()));
var alreadyScheduled =
TaskScheduler.HasAlreadyEnqueuedTask(ScannerService.Name, "ScanFolder", new object[] {fullPath});
_logger.LogDebug("{FullPath} already enqueued: {Value}", fullPath, alreadyScheduled);
if (!alreadyScheduled)
{
_logger.LogDebug("[LibraryWatcher] Scheduling ScanFolder for {Folder}", fullPath);
BackgroundJob.Schedule(() => _scannerService.ScanFolder(fullPath), _queueWaitTime);
}
else
{
_logger.LogDebug("[LibraryWatcher] Skipped scheduling ScanFolder for {Folder} as a job already queued",
fullPath);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "[LibraryWatcher] An error occured when processing a watch event");
}
_logger.LogDebug("ProcessChange occured in {ElapsedMilliseconds}ms", sw.ElapsedMilliseconds);
}
}
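A minimal sketch of driving the watcher, assuming it is resolved from DI at startup and restarted whenever library folders change (serviceProvider is a stand-in):

// Hypothetical startup wiring: begin watching once libraries exist, and
// restart after a library's folders are edited so new roots are picked up.
var watcher = serviceProvider.GetRequiredService<ILibraryWatcher>();
await watcher.StartWatching();
// ... later, after a user adds or removes a library folder:
await watcher.RestartWatching();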

View File

@@ -1,37 +1,53 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Extensions;
using API.Parser;
using API.SignalR;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner
{
public class ParsedSeries
{
/// <summary>
/// Name of the Series
/// </summary>
public string Name { get; init; }
/// <summary>
/// Normalized Name of the Series
/// </summary>
public string NormalizedName { get; init; }
/// <summary>
/// Format of the Series
/// </summary>
public MangaFormat Format { get; init; }
}
public enum Modified
{
Modified = 1,
NotModified = 2
}
public class SeriesModified
{
public string FolderPath { get; set; }
public string SeriesName { get; set; }
public DateTime LastScanned { get; set; }
public MangaFormat Format { get; set; }
}
public class ParseScannedFiles
{
private readonly ConcurrentDictionary<ParsedSeries, List<ParserInfo>> _scannedSeries;
private readonly ILogger _logger;
private readonly IDirectoryService _directoryService;
private readonly IReadingItemService _readingItemService;
private readonly IEventHub _eventHub;
private readonly DefaultParser _defaultParser;
/// <summary>
/// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
@@ -47,108 +63,51 @@ namespace API.Services.Tasks.Scanner
_logger = logger;
_directoryService = directoryService;
_readingItemService = readingItemService;
_scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
_defaultParser = new DefaultParser(_directoryService);
_eventHub = eventHub;
}
/// <summary>
/// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list.
/// </summary>
/// <param name="parsedSeries"></param>
/// <param name="series"></param>
/// <returns></returns>
public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries, Series series)
{
var allKeys = parsedSeries.Keys.Where(ps =>
SeriesHelper.FindSeries(series, ps));
var infos = new List<ParserInfo>();
foreach (var key in allKeys)
{
infos.AddRange(parsedSeries[key]);
}
return infos;
}
/// <summary>
/// Processes files found during a library scan.
/// Populates a collection of <see cref="ParserInfo"/> for DB updates later.
/// This will scan all files in a folder path. For each folder within the folderPath, folderAction will be invoked for all files contained within it
/// </summary>
/// <param name="path">Path of a file</param>
/// <param name="rootPath"></param>
/// <param name="type">Library type to determine parsing to perform</param>
private void ProcessFile(string path, string rootPath, LibraryType type)
/// <param name="scanDirectoryByDirectory">Scan directory by directory and for each, call folderAction</param>
/// <param name="folderPath">A library folder or series folder</param>
/// <param name="folderAction">A callback async Task to be called once all files for each folder path are found</param>
/// <param name="forceCheck">If we should bypass any folder last write time checks on the scan and force I/O</param>
public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory,
IDictionary<string, IList<SeriesModified>> seriesPaths, Func<IList<string>, string,Task> folderAction, bool forceCheck = false)
{
var info = _readingItemService.Parse(path, rootPath, type);
if (info == null)
string normalizedPath;
if (scanDirectoryByDirectory)
{
// If the file is an image and literally a cover image, skip processing.
if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
// This is used in a library scan, so we should first check for an ignore file and use that here as well
var potentialIgnoreFile = _directoryService.FileSystem.Path.Join(folderPath, DirectoryService.KavitaIgnoreFile);
var directories = _directoryService.GetDirectories(folderPath, _directoryService.CreateMatcherFromFile(potentialIgnoreFile)).ToList();
foreach (var directory in directories)
{
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
normalizedPath = Parser.Parser.NormalizePath(directory);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
await folderAction(new List<string>(), directory);
}
else
{
// For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication
await folderAction(_directoryService.ScanFiles(directory), directory);
}
}
return;
}
// This catches when original library type is Manga/Comic and when parsing with non
if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume?
normalizedPath = Parser.Parser.NormalizePath(folderPath);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
{
info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
var info2 = _readingItemService.Parse(path, rootPath, type);
info.Merge(info2);
}
info.ComicInfo = _readingItemService.GetComicInfo(path);
if (info.ComicInfo != null)
{
if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
{
info.Volumes = info.ComicInfo.Volume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Series))
{
info.Series = info.ComicInfo.Series.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.Number))
{
info.Chapters = info.ComicInfo.Number;
}
// Patch in SeriesSort from ComicInfo
if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort))
{
info.SeriesSort = info.ComicInfo.TitleSort.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format))
{
info.IsSpecial = true;
info.Chapters = Parser.Parser.DefaultChapter;
info.Volumes = Parser.Parser.DefaultVolume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort))
{
info.SeriesSort = info.ComicInfo.SeriesSort.Trim();
}
if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries))
{
info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim();
}
}
try
{
TrackSeries(info);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath);
await folderAction(new List<string>(), folderPath);
return;
}
await folderAction(_directoryService.ScanFiles(folderPath), folderPath);
}
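ProcessFiles hands all I/O results to the folderAction callback, honoring a .kavitaignore at the folder root when scanning directory by directory (glob-style rules like *.pdf or extras/ are an assumption about the matcher, not confirmed here). A sketch of a caller, where parseScannedFiles, libraryFolder, seriesPaths, and logger are stand-ins:

// Hypothetical caller: log how many files each folder yields.
await parseScannedFiles.ProcessFiles(libraryFolder, true, seriesPaths,
    async (files, folder) =>
    {
        logger.LogDebug("{Folder} yielded {Count} files", folder, files.Count);
        await Task.CompletedTask;
    });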
@@ -156,13 +115,14 @@ namespace API.Services.Tasks.Scanner
/// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
/// </summary>
/// <param name="scannedSeries">A localized list of a series' parsed infos</param>
/// <param name="info"></param>
private void TrackSeries(ParserInfo info)
private void TrackSeries(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
if (info.Series == string.Empty) return;
// Check if normalized info.Series already exists and if so, update info to use that name instead
info.Series = MergeName(info);
info.Series = MergeName(scannedSeries, info);
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort);
@@ -170,7 +130,7 @@ namespace API.Services.Tasks.Scanner
try
{
var existingKey = _scannedSeries.Keys.SingleOrDefault(ps =>
var existingKey = scannedSeries.Keys.SingleOrDefault(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries)));
@@ -181,7 +141,7 @@ namespace API.Services.Tasks.Scanner
NormalizedName = normalizedSeries
};
_scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
@@ -195,7 +155,7 @@ namespace API.Services.Tasks.Scanner
catch (Exception ex)
{
_logger.LogCritical(ex, "{SeriesName} matches against multiple series in the parsed series. This indicates a critical kavita issue. Key will be skipped", info.Series);
foreach (var seriesKey in _scannedSeries.Keys.Where(ps =>
foreach (var seriesKey in scannedSeries.Keys.Where(ps =>
ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
|| ps.NormalizedName.Equals(normalizedLocalizedSeries)
|| ps.NormalizedName.Equals(normalizedSortSeries))))
@@ -205,23 +165,24 @@ namespace API.Services.Tasks.Scanner
}
}
/// <summary>
/// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with
/// same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization.
/// </summary>
/// <param name="info"></param>
/// <returns>Series Name to group this info into</returns>
public string MergeName(ParserInfo info)
private string MergeName(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries);
// NOTE: SingleOrDefault can throw here because localized series support was introduced late in development and users might have 2 series with both names; the catch below handles that
try
{
var existingName =
_scannedSeries.SingleOrDefault(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
scannedSeries.SingleOrDefault(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) ||
Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) &&
p.Key.Format == info.Format)
.Key;
@@ -233,7 +194,7 @@ namespace API.Services.Tasks.Scanner
catch (Exception ex)
{
_logger.LogCritical(ex, "Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath);
var values = _scannedSeries.Where(p =>
var values = scannedSeries.Where(p =>
(Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
p.Key.Format == info.Format);
@@ -247,34 +208,77 @@ namespace API.Services.Tasks.Scanner
return info.Series;
}
/// <summary>
///
/// This will process series by folder groups.
/// </summary>
/// <param name="libraryType">Type of library. Used for selecting the correct file extensions to search for and parsing files</param>
/// <param name="folders">The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders</param>
/// <param name="libraryName">Name of the Library</param>
/// <param name="libraryType"></param>
/// <param name="folders"></param>
/// <param name="libraryName"></param>
/// <returns></returns>
public async Task<Dictionary<ParsedSeries, List<ParserInfo>>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable<string> folders, string libraryName)
public async Task ScanLibrariesForSeries(LibraryType libraryType,
IEnumerable<string> folders, string libraryName, bool isLibraryScan,
IDictionary<string, IList<SeriesModified>> seriesPaths, Action<Tuple<bool, IList<ParserInfo>>> processSeriesInfos, bool forceCheck = false)
{
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Started));
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Starting", libraryName, ProgressEventType.Started));
foreach (var folderPath in folders)
{
try
{
async void Action(string f)
await ProcessFiles(folderPath, isLibraryScan, seriesPaths, async (files, folder) =>
{
try
var normalizedFolder = Parser.Parser.NormalizePath(folder);
if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck))
{
ProcessFile(f, folderPath, libraryType);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(f, libraryName, ProgressEventType.Updated));
var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo()
{
Series = fp.SeriesName,
Format = fp.Format,
}).ToList();
processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(true, parsedInfos));
_logger.LogDebug("Skipped File Scan for {Folder} as it hasn't changed since last scan", folder);
return;
}
catch (FileNotFoundException exception)
_logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder);
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated));
if (files.Count == 0)
{
_logger.LogError(exception, "The file {Filename} could not be found", f);
_logger.LogInformation("[ScannerService] {Folder} is empty", folder);
return;
}
}
var scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
var infos = files
.Select(file => _readingItemService.ParseFile(file, folderPath, libraryType))
.Where(info => info != null)
.ToList();
_directoryService.TraverseTreeParallelForEach(folderPath, Action, Parser.Parser.SupportedExtensions, _logger);
MergeLocalizedSeriesWithSeries(infos);
foreach (var info in infos)
{
try
{
TrackSeries(scannedSeries, info);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath);
}
}
// It would be really cool if we could emit an event when a folder hasn't changed so we don't parse everything but the first item, to ensure we don't delete it
// Otherwise, we can do a last step in the DB where we validate all files on disk exist and if not, delete them. (easy but slow)
foreach (var series in scannedSeries.Keys)
{
if (scannedSeries[series].Count > 0 && processSeriesInfos != null)
{
processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(false, scannedSeries[series]));
}
}
}, forceCheck);
}
catch (ArgumentException ex)
{
@@ -282,20 +286,76 @@ namespace API.Services.Tasks.Scanner
}
}
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Ended));
return SeriesWithInfos();
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Done", libraryName, ProgressEventType.Ended));
}
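The processSeriesInfos callback receives a Tuple whose boolean marks the unchanged-folder fast path; a sketch of a consumer, assuming the signature above and a hypothetical workQueue:

// Hypothetical consumer: Item1 == true means the folder was skipped and
// Item2 holds stub infos so the existing series isn't removed; false means
// freshly parsed infos for one series, ready for processing.
Action<Tuple<bool, IList<ParserInfo>>> onSeries = tuple =>
{
    if (tuple.Item1) return;
    workQueue.Enqueue(tuple.Item2);
};
await parseScannedFiles.ScanLibrariesForSeries(library.Type,
    library.Folders.Select(f => f.Path), library.Name, true, seriesPaths, onSeries);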
/// <summary>
/// Returns any series where there were parsed infos
/// Checks, for all folder paths on file, whether the last scanned time is >= the directory's last write time, down to the second
/// </summary>
/// <param name="seriesPaths"></param>
/// <param name="normalizedFolder"></param>
/// <param name="forceCheck"></param>
/// <returns></returns>
private Dictionary<ParsedSeries, List<ParserInfo>> SeriesWithInfos()
private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary<string, IList<SeriesModified>> seriesPaths, string normalizedFolder, bool forceCheck = false)
{
var filtered = _scannedSeries.Where(kvp => kvp.Value.Count > 0);
var series = filtered.ToDictionary(v => v.Key, v => v.Value);
return series;
if (forceCheck) return false;
return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerSecond) >=
_directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerSecond));
}
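The comparison is deliberately second-granular, so a write and a scan landing within the same second compare equal; the truncation is equivalent to the following, assuming the Truncate extension simply drops sub-resolution ticks:

// Equivalent of dt.Truncate(TimeSpan.TicksPerSecond), written out:
static DateTime TruncateToSecond(DateTime dt) =>
    new DateTime(dt.Ticks - (dt.Ticks % TimeSpan.TicksPerSecond), dt.Kind);
// 10:15:30.900 and 10:15:30.100 both truncate to 10:15:30, so no rescan.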
/// <summary>
/// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so,
/// rewrites the infos with series name instead of the localized name, so they stack.
/// </summary>
/// <example>
/// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration"
/// World of Acceleration v02.cbz has Series "World of Acceleration"
/// After running this code, we'd have:
/// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration"
/// </example>
/// <param name="infos">A collection of ParserInfos</param>
private void MergeLocalizedSeriesWithSeries(IReadOnlyCollection<ParserInfo> infos)
{
var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries));
if (!hasLocalizedSeries) return;
var localizedSeries = infos
.Where(i => !i.IsSpecial)
.Select(i => i.LocalizedSeries)
.Distinct()
.FirstOrDefault(i => !string.IsNullOrEmpty(i));
if (string.IsNullOrEmpty(localizedSeries)) return;
// NOTE: If we have multiple series in a folder with a localized title, then this will fail. It will group into one series. User needs to fix this themselves.
string nonLocalizedSeries;
// Normalize this, as many of the cases are just capitalization differences
var nonLocalizedSeriesFound = infos
.Where(i => !i.IsSpecial)
.Select(i => i.Series).DistinctBy(Parser.Parser.Normalize).ToList();
if (nonLocalizedSeriesFound.Count == 1)
{
nonLocalizedSeries = nonLocalizedSeriesFound.First();
}
else
{
// There can be a case where there are multiple series in a folder that causes merging.
if (nonLocalizedSeriesFound.Count > 2)
{
_logger.LogError("[ScannerService] There are multiple series within one folder that contain localized series. This will cause them to group incorrectly. Please separate series into their own dedicated folder or ensure there is only 2 potential series (localized and series): {LocalizedSeries}", string.Join(", ", nonLocalizedSeriesFound));
}
nonLocalizedSeries = nonLocalizedSeriesFound.FirstOrDefault(s => !s.Equals(localizedSeries));
}
if (string.IsNullOrEmpty(nonLocalizedSeries)) return;
var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries);
foreach (var infoNeedingMapping in infos.Where(i =>
!Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries)))
{
infoNeedingMapping.Series = nonLocalizedSeries;
infoNeedingMapping.LocalizedSeries = localizedSeries;
}
}
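Putting the doc example into code, a tiny before/after sketch (the private helper is called directly for illustration, and the object initializers assume ParserInfo's settable Series/LocalizedSeries properties):

var infos = new List<ParserInfo>
{
    new ParserInfo {Series = "Accel World", LocalizedSeries = "World of Acceleration"},
    new ParserInfo {Series = "World of Acceleration"}
};
MergeLocalizedSeriesWithSeries(infos);
// Both infos now carry Series = "Accel World" and
// LocalizedSeries = "World of Acceleration", so they stack as one series.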
}
}

View File

@@ -5,10 +5,16 @@ using API.Services;
namespace API.Parser;
public interface IDefaultParser
{
ParserInfo Parse(string filePath, string rootPath, LibraryType type = LibraryType.Manga);
void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret);
}
/// <summary>
/// This is an implementation of the Parser that is the basis for everything
/// </summary>
public class DefaultParser
public class DefaultParser : IDefaultParser
{
private readonly IDirectoryService _directoryService;
@@ -30,15 +36,15 @@ public class DefaultParser
var fileName = _directoryService.FileSystem.Path.GetFileNameWithoutExtension(filePath);
ParserInfo ret;
if (Parser.IsEpub(filePath))
if (Services.Tasks.Scanner.Parser.Parser.IsEpub(filePath))
{
ret = new ParserInfo()
{
Chapters = Parser.ParseChapter(fileName) ?? Parser.ParseComicChapter(fileName),
Series = Parser.ParseSeries(fileName) ?? Parser.ParseComicSeries(fileName),
Volumes = Parser.ParseVolume(fileName) ?? Parser.ParseComicVolume(fileName),
Chapters = Services.Tasks.Scanner.Parser.Parser.ParseChapter(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(fileName),
Series = Services.Tasks.Scanner.Parser.Parser.ParseSeries(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(fileName),
Volumes = Services.Tasks.Scanner.Parser.Parser.ParseVolume(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(fileName),
Filename = Path.GetFileName(filePath),
Format = Parser.ParseFormat(filePath),
Format = Services.Tasks.Scanner.Parser.Parser.ParseFormat(filePath),
FullFilePath = filePath
};
}
@@ -46,65 +52,65 @@ public class DefaultParser
{
ret = new ParserInfo()
{
Chapters = type == LibraryType.Manga ? Parser.ParseChapter(fileName) : Parser.ParseComicChapter(fileName),
Series = type == LibraryType.Manga ? Parser.ParseSeries(fileName) : Parser.ParseComicSeries(fileName),
Volumes = type == LibraryType.Manga ? Parser.ParseVolume(fileName) : Parser.ParseComicVolume(fileName),
Chapters = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseChapter(fileName),
Series = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseSeries(fileName),
Volumes = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseVolume(fileName),
Filename = Path.GetFileName(filePath),
Format = Parser.ParseFormat(filePath),
Format = Services.Tasks.Scanner.Parser.Parser.ParseFormat(filePath),
Title = Path.GetFileNameWithoutExtension(fileName),
FullFilePath = filePath
};
}
if (Parser.IsImage(filePath) && Parser.IsCoverImage(filePath)) return null;
if (Services.Tasks.Scanner.Parser.Parser.IsImage(filePath) && Services.Tasks.Scanner.Parser.Parser.IsCoverImage(filePath)) return null;
if (Parser.IsImage(filePath))
if (Services.Tasks.Scanner.Parser.Parser.IsImage(filePath))
{
// Reset Chapters, Volumes, and Series as images are not good to parse information out of. Better to use folders.
ret.Volumes = Parser.DefaultVolume;
ret.Chapters = Parser.DefaultChapter;
ret.Volumes = Services.Tasks.Scanner.Parser.Parser.DefaultVolume;
ret.Chapters = Services.Tasks.Scanner.Parser.Parser.DefaultChapter;
ret.Series = string.Empty;
}
if (ret.Series == string.Empty || Parser.IsImage(filePath))
if (ret.Series == string.Empty || Services.Tasks.Scanner.Parser.Parser.IsImage(filePath))
{
// Try to parse information out of each folder all the way to rootPath
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
var edition = Parser.ParseEdition(fileName);
var edition = Services.Tasks.Scanner.Parser.Parser.ParseEdition(fileName);
if (!string.IsNullOrEmpty(edition))
{
ret.Series = Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic);
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic);
ret.Edition = edition;
}
var isSpecial = type == LibraryType.Comic ? Parser.ParseComicSpecial(fileName) : Parser.ParseMangaSpecial(fileName);
var isSpecial = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicSpecial(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(fileName);
// We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that
// could cause a problem as Omake is a special term, but there is valid volume/chapter information.
if (ret.Chapters == Parser.DefaultChapter && ret.Volumes == Parser.DefaultVolume && !string.IsNullOrEmpty(isSpecial))
if (ret.Chapters == Services.Tasks.Scanner.Parser.Parser.DefaultChapter && ret.Volumes == Services.Tasks.Scanner.Parser.Parser.DefaultVolume && !string.IsNullOrEmpty(isSpecial))
{
ret.IsSpecial = true;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret); // NOTE: This can cause some complications; we should try to be less aggressive about falling back to the folder
}
// If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name
if (Parser.HasSpecialMarker(fileName))
if (Services.Tasks.Scanner.Parser.Parser.HasSpecialMarker(fileName))
{
ret.IsSpecial = true;
ret.Chapters = Parser.DefaultChapter;
ret.Volumes = Parser.DefaultVolume;
ret.Chapters = Services.Tasks.Scanner.Parser.Parser.DefaultChapter;
ret.Volumes = Services.Tasks.Scanner.Parser.Parser.DefaultVolume;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
if (string.IsNullOrEmpty(ret.Series))
{
ret.Series = Parser.CleanTitle(fileName, type is LibraryType.Comic);
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(fileName, type is LibraryType.Comic);
}
// Pdfs may have .pdf in the series name, remove that
if (Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf"))
if (Services.Tasks.Scanner.Parser.Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf"))
{
ret.Series = ret.Series.Substring(0, ret.Series.Length - ".pdf".Length);
}
@@ -125,18 +131,18 @@ public class DefaultParser
for (var i = 0; i < fallbackFolders.Count; i++)
{
var folder = fallbackFolders[i];
if (!string.IsNullOrEmpty(Parser.ParseMangaSpecial(folder))) continue;
if (!string.IsNullOrEmpty(Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(folder))) continue;
var parsedVolume = type is LibraryType.Manga ? Parser.ParseVolume(folder) : Parser.ParseComicVolume(folder);
var parsedChapter = type is LibraryType.Manga ? Parser.ParseChapter(folder) : Parser.ParseComicChapter(folder);
var parsedVolume = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseVolume(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(folder);
var parsedChapter = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseChapter(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(folder);
if (!parsedVolume.Equals(Parser.DefaultVolume) || !parsedChapter.Equals(Parser.DefaultChapter))
if (!parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume) || !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter))
{
if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Parser.DefaultVolume)) && !parsedVolume.Equals(Parser.DefaultVolume))
if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) && !parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume))
{
ret.Volumes = parsedVolume;
}
if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Parser.DefaultChapter)) && !parsedChapter.Equals(Parser.DefaultChapter))
if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) && !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter))
{
ret.Chapters = parsedChapter;
}
@@ -145,11 +151,11 @@ public class DefaultParser
// Generally users group in series folders. Let's try to parse series from the top folder
if (!folder.Equals(ret.Series) && i == fallbackFolders.Count - 1)
{
var series = Parser.ParseSeries(folder);
var series = Services.Tasks.Scanner.Parser.Parser.ParseSeries(folder);
if (string.IsNullOrEmpty(series))
{
ret.Series = Parser.CleanTitle(folder, type is LibraryType.Comic);
ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(folder, type is LibraryType.Comic);
break;
}

View File

@@ -5,7 +5,7 @@ using System.Linq;
using System.Text.RegularExpressions;
using API.Entities.Enums;
namespace API.Parser
namespace API.Services.Tasks.Scanner.Parser
{
public static class Parser
{
@@ -15,7 +15,7 @@ namespace API.Parser
public const string ImageFileExtensions = @"^(\.png|\.jpeg|\.jpg|\.webp|\.gif)";
public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|\.cb7|\.cbt";
public const string BookFileExtensions = @"\.epub|\.pdf";
private const string BookFileExtensions = @"\.epub|\.pdf";
public const string MacOsMetadataFileStartsWith = @"._";
public const string SupportedExtensions =
@@ -1031,9 +1031,15 @@ namespace API.Parser
return IsImage(filename) && CoverImageRegex.IsMatch(filename);
}
/// <summary>
/// Validates that a path doesn't start with or contain certain blacklisted folders, like __MACOSX, @Recently-Snapshot, etc., and that, if a full path, the filename
/// doesn't start with ._, which is a metadata file on macOS.
/// </summary>
/// <param name="path"></param>
/// <returns></returns>
public static bool HasBlacklistedFolderInPath(string path)
{
return path.Contains("__MACOSX") || path.StartsWith("@Recently-Snapshot") || path.StartsWith("@recycle") || path.StartsWith("._") || path.Contains(".qpkg");
return path.Contains("__MACOSX") || path.StartsWith("@Recently-Snapshot") || path.StartsWith("@recycle") || path.StartsWith("._") || Path.GetFileName(path).StartsWith("._") || path.Contains(".qpkg");
}
@@ -1066,7 +1072,8 @@ namespace API.Parser
/// <returns></returns>
public static string NormalizePath(string path)
{
return path.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
return path.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar)
.Replace(@"//", Path.AltDirectorySeparatorChar + string.Empty);
}
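A quick illustration of the new behavior, assuming Windows separator values ('\' primary, '/' alternate):

NormalizePath(@"C:\Library\Manga");   // "C:/Library/Manga"
NormalizePath(@"C:\\Library\Manga");  // "C:/Library/Manga" (the doubled separator collapses)
NormalizePath("manga//series");       // "manga/series" on any OS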
/// <summary>

View File

@@ -1,5 +1,6 @@ using API.Data.Metadata;
using API.Data.Metadata;
using API.Entities.Enums;
using API.Services.Tasks.Scanner.Parser;
namespace API.Parser
{

View File

@@ -0,0 +1,819 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Parser;
using API.Services.Tasks.Metadata;
using API.SignalR;
using Hangfire;
using Microsoft.Extensions.Logging;
namespace API.Services.Tasks.Scanner;
public interface IProcessSeries
{
/// <summary>
/// Do not allow this Prime to be invoked by multiple threads. It will break the DB.
/// </summary>
/// <returns></returns>
Task Prime();
Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library);
void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false);
}
/// <summary>
/// All code needed to Update a Series from a Scan action
/// </summary>
public class ProcessSeries : IProcessSeries
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<ProcessSeries> _logger;
private readonly IEventHub _eventHub;
private readonly IDirectoryService _directoryService;
private readonly ICacheHelper _cacheHelper;
private readonly IReadingItemService _readingItemService;
private readonly IFileService _fileService;
private readonly IMetadataService _metadataService;
private readonly IWordCountAnalyzerService _wordCountAnalyzerService;
private IList<Genre> _genres;
private IList<Person> _people;
private IList<Tag> _tags;
public ProcessSeries(IUnitOfWork unitOfWork, ILogger<ProcessSeries> logger, IEventHub eventHub,
IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService,
IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService)
{
_unitOfWork = unitOfWork;
_logger = logger;
_eventHub = eventHub;
_directoryService = directoryService;
_cacheHelper = cacheHelper;
_readingItemService = readingItemService;
_fileService = fileService;
_metadataService = metadataService;
_wordCountAnalyzerService = wordCountAnalyzerService;
}
/// <summary>
/// Invoke this before processing any series, just once to prime all the needed data during a scan
/// </summary>
public async Task Prime()
{
_genres = await _unitOfWork.GenreRepository.GetAllGenresAsync();
_people = await _unitOfWork.PersonRepository.GetAllPeople();
_tags = await _unitOfWork.TagRepository.GetAllTagsAsync();
}
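A sketch of the intended call order during a scan, per the single-threaded warning on Prime (processSeries, seriesToProcess, and library are stand-ins):

// Hypothetical scan driver: Prime exactly once (never concurrently),
// then process each parsed series against the library.
await processSeries.Prime();
foreach (var parsedInfos in seriesToProcess)
{
    await processSeries.ProcessSeriesAsync(parsedInfos, library);
}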
public async Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library)
{
if (!parsedInfos.Any()) return;
var seriesAdded = false;
var scanWatch = Stopwatch.StartNew();
var seriesName = parsedInfos.First().Series;
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Updated, seriesName));
_logger.LogInformation("[ScannerService] Beginning series update on {SeriesName}", seriesName);
// Check if there is a Series
var firstInfo = parsedInfos.First();
Series series;
try
{
series =
await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(firstInfo.Series, firstInfo.LocalizedSeries,
library.Id, firstInfo.Format);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an exception finding existing series for {SeriesName} with Localized name of {LocalizedName} for library {LibraryId}. This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan", firstInfo.Series, firstInfo.LocalizedSeries, library.Id);
await _eventHub.SendMessageAsync(MessageFactory.Error,
MessageFactory.ErrorEvent($"There was an exception finding existing series for {firstInfo.Series} with Localized name of {firstInfo.LocalizedSeries} for library {library.Id}",
"This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan."));
return;
}
if (series == null)
{
seriesAdded = true;
series = DbFactory.Series(firstInfo.Series, firstInfo.LocalizedSeries);
}
if (series.LibraryId == 0) series.LibraryId = library.Id;
try
{
_logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName);
var firstParsedInfo = parsedInfos[0];
UpdateVolumes(series, parsedInfos);
series.Pages = series.Volumes.Sum(v => v.Pages);
series.NormalizedName = Parser.Parser.Normalize(series.Name);
series.OriginalName ??= firstParsedInfo.Series;
if (series.Format == MangaFormat.Unknown)
{
series.Format = firstParsedInfo.Format;
}
if (string.IsNullOrEmpty(series.SortName))
{
series.SortName = series.Name;
}
if (!series.SortNameLocked)
{
series.SortName = series.Name;
if (!string.IsNullOrEmpty(firstParsedInfo.SeriesSort))
{
series.SortName = firstParsedInfo.SeriesSort;
}
}
// parsedInfos[0] is not the first volume or chapter. We need to find it
var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p));
if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries))
{
series.LocalizedName = localizedSeries;
series.NormalizedLocalizedName = Parser.Parser.Normalize(series.LocalizedName);
}
UpdateSeriesMetadata(series, library.Type);
// Update series FolderPath here
await UpdateSeriesFolderPath(parsedInfos, library, series);
series.LastFolderScanned = DateTime.Now;
_unitOfWork.SeriesRepository.Attach(series);
if (_unitOfWork.HasChanges())
{
try
{
await _unitOfWork.CommitAsync();
}
catch (Exception ex)
{
await _unitOfWork.RollbackAsync();
_logger.LogCritical(ex, "[ScannerService] There was an issue writing to the for series {@SeriesName}", series);
await _eventHub.SendMessageAsync(MessageFactory.Error,
MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}",
ex.Message));
return;
}
if (seriesAdded)
{
await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded,
MessageFactory.SeriesAddedEvent(series.Id, series.Name, series.LibraryId), false);
}
_logger.LogInformation("[ScannerService] Finished series update on {SeriesName} in {Milliseconds} ms", seriesName, scanWatch.ElapsedMilliseconds);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "[ScannerService] There was an exception updating series for {SeriesName}", series.Name);
}
await _metadataService.GenerateCoversForSeries(series, false);
EnqueuePostSeriesProcessTasks(series.LibraryId, series.Id);
}
private async Task UpdateSeriesFolderPath(IEnumerable<ParserInfo> parsedInfos, Library library, Series series)
{
var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path),
parsedInfos.Select(f => f.FullFilePath).ToList());
if (seriesDirs.Keys.Count == 0)
{
_logger.LogCritical(
"Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are under a single folder from library");
await _eventHub.SendMessageAsync(MessageFactory.Info,
MessageFactory.InfoEvent($"{series.Name} has files spread outside a single series folder",
"This has negative performance effects. Please ensure all series are under a single folder from library"));
}
else
{
// Don't save FolderPath if it's a library Folder
if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First()))
{
series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First());
}
}
}
public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false)
{
//BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate));
BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
}
private static void UpdateSeriesMetadata(Series series, LibraryType libraryType)
{
series.Metadata ??= DbFactory.SeriesMetadata(new List<CollectionTag>());
var isBook = libraryType == LibraryType.Book;
var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook);
var firstFile = firstChapter?.Files.FirstOrDefault();
if (firstFile == null) return;
if (Parser.Parser.IsPdf(firstFile.FilePath)) return;
var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList();
// Update Metadata based on Chapter metadata
series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year);
if (series.Metadata.ReleaseYear < 1000)
{
// Not a valid year, default to 0
series.Metadata.ReleaseYear = 0;
}
// Set the AgeRating as highest in all the comicInfos
if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating);
series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount);
series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count);
// To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well.
if (series.Metadata.MaxCount != series.Metadata.TotalCount)
{
var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name));
var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range));
if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume;
else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter;
}
if (!series.Metadata.PublicationStatusLocked)
{
series.Metadata.PublicationStatus = PublicationStatus.OnGoing;
if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0)
{
series.Metadata.PublicationStatus = PublicationStatus.Completed;
} else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0)
{
series.Metadata.PublicationStatus = PublicationStatus.Ended;
}
}
if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked)
{
series.Metadata.Summary = firstChapter.Summary;
}
if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked)
{
series.Metadata.Language = firstChapter.Language;
}
// Handle People
foreach (var chapter in chapters)
{
if (!series.Metadata.WriterLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Writer))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.CoverArtistLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.CoverArtist))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.PublisherLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Publisher))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.CharacterLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Character))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.ColoristLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Colorist))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.EditorLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Editor))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.InkerLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Inker))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.LettererLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Letterer))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.PencillerLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Penciller))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.TranslatorLocked)
{
foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Translator))
{
PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
}
}
if (!series.Metadata.TagsLocked)
{
foreach (var tag in chapter.Tags)
{
TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag);
}
}
if (!series.Metadata.GenresLocked)
{
foreach (var genre in chapter.Genres)
{
GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre);
}
}
}
var genres = chapters.SelectMany(c => c.Genres).ToList();
GenreHelper.KeepOnlySameGenreBetweenLists(series.Metadata.Genres.ToList(), genres, genre =>
{
if (series.Metadata.GenresLocked) return;
series.Metadata.Genres.Remove(genre);
});
// NOTE: The issue here is that the people list is built just from chapters, but series metadata might already have some people on it
// I might be able to filter out people that are in locked fields?
var people = chapters.SelectMany(c => c.People).ToList();
PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People.ToList(),
people, person =>
{
switch (person.Role)
{
case PersonRole.Writer:
if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Penciller:
if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Inker:
if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Colorist:
if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Letterer:
if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.CoverArtist:
if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Editor:
if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Publisher:
if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Character:
if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person);
break;
case PersonRole.Translator:
if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person);
break;
default:
series.Metadata.People.Remove(person);
break;
}
});
}
private void UpdateVolumes(Series series, IList<ParserInfo> parsedInfos)
{
var startingVolumeCount = series.Volumes.Count;
// Add new volumes and update chapters per volume
var distinctVolumes = parsedInfos.DistinctVolumes();
_logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name);
foreach (var volumeNumber in distinctVolumes)
{
var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber);
if (volume == null)
{
volume = DbFactory.Volume(volumeNumber);
volume.SeriesId = series.Id;
series.Volumes.Add(volume);
}
volume.Name = volumeNumber;
_logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name);
var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray();
UpdateChapters(series, volume, infos);
volume.Pages = volume.Chapters.Sum(c => c.Pages);
// Update all the metadata on the Chapters
foreach (var chapter in volume.Chapters)
{
var firstFile = chapter.Files.MinBy(x => x.Chapter);
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue;
try
{
var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath));
UpdateChapterFromComicInfo(chapter, firstChapterInfo?.ComicInfo);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was some issue when updating chapter's metadata");
}
}
}
// Remove existing volumes that aren't in parsedInfos
var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList();
if (series.Volumes.Count != nonDeletedVolumes.Count)
{
_logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not mapping with volume name",
(series.Volumes.Count - nonDeletedVolumes.Count), series.Name);
var deletedVolumes = series.Volumes.Except(nonDeletedVolumes);
foreach (var volume in deletedVolumes)
{
var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? "";
if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file))
{
_logger.LogError(
"[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}",
file);
}
_logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file);
}
series.Volumes = nonDeletedVolumes;
}
_logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}",
series.Name, startingVolumeCount, series.Volumes.Count);
}
private void UpdateChapters(Series series, Volume volume, IList<ParserInfo> parsedInfos)
{
// Add new chapters
foreach (var info in parsedInfos)
{
// Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0
// are also treated like specials for UI grouping.
Chapter chapter;
try
{
chapter = volume.Chapters.GetChapterByRange(info);
}
catch (Exception ex)
{
_logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters);
continue;
}
if (chapter == null)
{
_logger.LogDebug(
"[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters);
chapter = DbFactory.Chapter(info);
volume.Chapters.Add(chapter);
series.LastChapterAdded = DateTime.Now;
}
else
{
chapter.UpdateFrom(info);
}
if (chapter == null) continue;
// Add files
var specialTreatment = info.IsSpecialInfo();
AddOrUpdateFileForChapter(chapter, info);
chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty;
chapter.Range = specialTreatment ? info.Filename : info.Chapters;
}
// Remove chapters that aren't in parsedInfos or have no files linked
var existingChapters = volume.Chapters.ToList();
foreach (var existingChapter in existingChapters)
{
if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter))
{
_logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series);
volume.Chapters.Remove(existingChapter);
}
else
{
// Ensure we remove any files that no longer exist AND order
existingChapter.Files = existingChapter.Files
.Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath))
.OrderByNatural(f => f.FilePath).ToList();
existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages);
}
}
}
private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info)
{
chapter.Files ??= new List<MangaFile>();
var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath);
if (existingFile != null)
{
existingFile.Format = info.Format;
if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return;
existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format);
// We skip updating DB here with last modified time so that metadata refresh can do it
}
else
{
var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format));
if (file == null) return;
chapter.Files.Add(file);
}
}
#nullable enable
private void UpdateChapterFromComicInfo(Chapter chapter, ComicInfo? info)
{
var firstFile = chapter.Files.MinBy(x => x.Chapter);
if (firstFile == null ||
_cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return;
var comicInfo = info;
if (info == null)
{
comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath);
}
if (comicInfo == null) return;
_logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath);
chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating);
if (!string.IsNullOrEmpty(comicInfo.Title))
{
chapter.TitleName = comicInfo.Title.Trim();
}
if (!string.IsNullOrEmpty(comicInfo.Summary))
{
chapter.Summary = comicInfo.Summary;
}
if (!string.IsNullOrEmpty(comicInfo.LanguageISO))
{
chapter.Language = comicInfo.LanguageISO;
}
if (comicInfo.Count > 0)
{
chapter.TotalCount = comicInfo.Count;
}
// This needs to check against both Number and Volume to calculate Count
if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0)
{
chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number));
}
if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0)
{
chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume)));
}
void AddPerson(Person person)
{
PersonHelper.AddPersonIfNotExists(chapter.People, person);
}
void AddGenre(Genre genre)
{
//chapter.Genres.Add(genre);
GenreHelper.AddGenreIfNotExists(chapter.Genres, genre);
}
void AddTag(Tag tag, bool added)
{
//chapter.Tags.Add(tag);
TagHelper.AddTagIfNotExists(chapter.Tags, tag);
}
if (comicInfo.Year > 0)
{
var day = Math.Max(comicInfo.Day, 1);
var month = Math.Max(comicInfo.Month, 1);
chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}");
}
var people = GetTagValues(comicInfo.Colorist);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist);
UpdatePeople(people, PersonRole.Colorist, AddPerson);
people = GetTagValues(comicInfo.Characters);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character);
UpdatePeople(people, PersonRole.Character, AddPerson);
people = GetTagValues(comicInfo.Translator);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator);
UpdatePeople(people, PersonRole.Translator, AddPerson);
people = GetTagValues(comicInfo.Writer);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer);
UpdatePeople(people, PersonRole.Writer, AddPerson);
people = GetTagValues(comicInfo.Editor);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor);
UpdatePeople(people, PersonRole.Editor, AddPerson);
people = GetTagValues(comicInfo.Inker);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker);
UpdatePeople(people, PersonRole.Inker, AddPerson);
people = GetTagValues(comicInfo.Letterer);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer);
UpdatePeople(people, PersonRole.Letterer, AddPerson);
people = GetTagValues(comicInfo.Penciller);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller);
UpdatePeople(people, PersonRole.Penciller, AddPerson);
people = GetTagValues(comicInfo.CoverArtist);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist);
UpdatePeople(people, PersonRole.CoverArtist, AddPerson);
people = GetTagValues(comicInfo.Publisher);
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher);
UpdatePeople(people, PersonRole.Publisher, AddPerson);
var genres = GetTagValues(comicInfo.Genre);
GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList());
UpdateGenre(genres, false, AddGenre);
var tags = GetTagValues(comicInfo.Tags);
TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList());
UpdateTag(tags, false, AddTag);
}
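/// <summary>
/// Splits a comma-separated ComicInfo tag value into trimmed individual values.
/// </summary>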
private static IList<string> GetTagValues(string comicInfoTagSeparatedByComma)
{
if (string.IsNullOrEmpty(comicInfoTagSeparatedByComma)) return ImmutableList<string>.Empty;
return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList();
}
#nullable disable
/// <summary>
/// Given a list of all existing people, this checks the new names and roles and, if a person doesn't exist in allPeople,
/// creates and adds an entry. For each name, the callback will be executed.
/// </summary>
/// <remarks>This does not remove people if an empty list is passed into names.
/// It is used to add new people to a list without duplicating rows in the DB.</remarks>
/// <param name="names"></param>
/// <param name="role"></param>
/// <param name="action"></param>
private void UpdatePeople(IEnumerable<string> names, PersonRole role, Action<Person> action)
{
var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList();
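// Note: _people is enumerated without the lock here; only the mutation below is synchronized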
foreach (var name in names)
{
var normalizedName = Parser.Parser.Normalize(name);
var person = allPeopleTypeRole.FirstOrDefault(p =>
p.NormalizedName.Equals(normalizedName));
if (person == null)
{
person = DbFactory.Person(name, role);
lock (_people)
{
_people.Add(person);
}
}
action(person);
}
}
/// <summary>
/// Ensures a Genre exists for each name, creating and caching one when missing, then executes the callback for it.
/// </summary>
/// <param name="names"></param>
/// <param name="isExternal"></param>
/// <param name="action"></param>
private void UpdateGenre(IEnumerable<string> names, bool isExternal, Action<Genre> action)
{
foreach (var name in names)
{
if (string.IsNullOrEmpty(name.Trim())) continue;
var normalizedName = Parser.Parser.Normalize(name);
var genre = _genres.FirstOrDefault(p =>
p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
if (genre == null)
{
genre = DbFactory.Genre(name, isExternal); // create with the same external flag used in the lookup
lock (_genres)
{
_genres.Add(genre);
}
}
action(genre);
}
}
/// <summary>
/// Ensures a Tag exists for each name, creating and caching one when missing, then executes the callback for it.
/// </summary>
/// <param name="names"></param>
/// <param name="isExternal"></param>
/// <param name="action">Callback for every item. Will give said item back and a bool if item was added</param>
private void UpdateTag(IEnumerable<string> names, bool isExternal, Action<Tag, bool> action)
{
foreach (var name in names)
{
if (string.IsNullOrEmpty(name.Trim())) continue;
var added = false;
var normalizedName = Parser.Parser.Normalize(name);
var tag = _tags.FirstOrDefault(p =>
p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
if (tag == null)
{
added = true;
tag = DbFactory.Tag(name, isExternal); // create with the same external flag used in the lookup
lock (_tags)
{
_tags.Add(tag);
}
}
action(tag, added);
}
}
}
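The three Update* helpers above share one normalize-then-find-or-create pattern: normalize the incoming name, look it up in a shared in-memory cache, create and cache a new entity only on a miss, and hand the result to a callback. A minimal standalone sketch of that pattern follows; the Entity type and Normalize rule here are hypothetical stand-ins, not the service's actual classes.

using System;
using System.Collections.Generic;
using System.Linq;

public class Entity
{
    public string Name { get; set; } = string.Empty;
    public string NormalizedName { get; set; } = string.Empty;
}

public static class FindOrCreateDemo
{
    private static readonly List<Entity> Cache = new();

    // Stand-in for the real normalizer: lower-case and strip spaces
    private static string Normalize(string name) =>
        name.ToLowerInvariant().Replace(" ", string.Empty);

    public static void Update(IEnumerable<string> names, Action<Entity> action)
    {
        foreach (var name in names.Where(n => !string.IsNullOrWhiteSpace(n)))
        {
            var normalized = Normalize(name);
            var entity = Cache.FirstOrDefault(e => e.NormalizedName == normalized);
            if (entity == null)
            {
                entity = new Entity { Name = name.Trim(), NormalizedName = normalized };
                lock (Cache) // mirror the service: only the mutation is synchronized
                {
                    Cache.Add(entity);
                }
            }
            // The callback fires once per incoming name, even when names collapse to one entity
            action(entity);
        }
    }

    public static void Main()
    {
        // "John Smith" and "john smith" normalize identically, so only one entity is created
        Update(new[] { "John Smith", "john smith" }, e => Console.WriteLine(e.Name));
        Console.WriteLine(Cache.Count); // 1
    }
}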
