From 0eac19324831fdcc07d0496f9bb2742d4482692a Mon Sep 17 00:00:00 2001
From: Joseph Milazzo
Date: Fri, 19 Aug 2022 07:42:38 -0500
Subject: [PATCH 001/134] New Scan Loop (#1447)

* Staging the code for the new scan loop.
* Implemented a basic idea of changes on drives triggering the scan loop. Issues: (1) scan by folder does not work; (2) the queuing system is very hacky and needs a separate thread; (3) performance degradation could be very real.
* Started writing unit tests for the new loop code.
* Implemented a basic method to scan a folder path with ignore support (not implemented, code in place).
* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.
* Scan Series now uses the new code (folder-based parsing) and handles the LocalizedSeries issue.
* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).
* Wrote some notes on the updated library scan loop.
* Removed migration for merge.
* Reapplied the SeriesFolder migration after merge.
* Refactored a check that used multiple DB calls into one.
* Made lots of progress on ignore support, but there is some confusion in the underlying library. Ticket created. On hold till then.
* Updated Scan Library and Scan Series to exit early if there are no changes in the underlying folders that need to be scanned.
* Implemented the ability to have .kavitaignore files within your directories; Kavita will parse them and ignore files and directories based on the rules within them (see the glob sketch after the DirectoryService tests below).
* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores.
* Wrote out some basic code that showcases how we can scan a series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking (see the watcher sketch below the diffstat).
* Things are kinda working. I'm getting lost in my own code and complexity. I'm not sure it's worth it.
* Refactored ScanFiles out to DirectoryService.
* Refactored more code out to keep the code clean.
* More unit tests.
* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked UpdateLibrary to work how it used to with the new scan loop code (note: using async update library/series does not work).
* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work), but with folder-based scanning.
* Prep for unit tests (updating broken ones with new implementations).
* Just some notes. Not sure I want to finish this work.
* Refactored the LibraryWatcher with some comments and state variables.
* Undid the migrations in case I don't move forward with this branch.
* Started to clean the code and prepare for finishing this work.
* Fixed a bad merge.
* Updated signatures to clean up the code and commit to the new strategy for scanning.
* Swapped out the code with async processing of series on a small library.
* The new scan loop is working in both sync and async methods. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.
* Refactored UpdateSeries out of Scanner and into a dedicated file.
* Refactored how ProcessTasks are awaited to allow more async.
* Fixed an issue where the side nav item wouldn't show the correct highlight and migrated to OnPush.
* Moved where we start the stopwatch to encapsulate the full scan.
* Cleaned up SignalR events to report correctly (still needs a redesign).
* Removed the "remove" code until I figure it out.
* Put in extremely expensive series-deletion code for library scan.
* Have Genre and Tag update the DB immediately to avoid duplicate issues.
* Taking a break.
* Moving to a lock with People was successful. Need to apply it to the others (see the locking sketch below the diffstat).
* Refactored code for the series level and Tag and Genre with the new locking strategy.
* The new scan loop works. Next up: optimization.
* Swapped out the Kavita logo with an SVG for faster load.
* Refactored metadata updates to occur while the series are being updated.
* Code cleanup.
* Added a new type of generic message (Info) to inform the user.
* Code cleanup.
* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds. Fixed a bug where File Analysis was running every time for each non-epub file.
* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.
* Some code cleanup.
* Added experimental SignalR update code to have a more natural refresh of the library-detail page.
* Hooked in the ability to send new-series events to the UI.
* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check whether an existing task is running and reschedule for 3 hours (10 minutes for Scan Series).
* Implemented the info event in the events widget and added a clear-all button to dismiss all infos and errors. Added --event-widget-info-bg-color.
* Removed --drawer-background-color since it's not used.
* When a new series is added, inject it directly into the view.
* Some debug code cleanup.
* Fixed up the unit tests.
* Ensure all config directories exist on startup.
* Disabled Library Watching (that will go in the next build).
* Ensure updating a series is admin-only.
* Lots of code changes; Scan Series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.
* Removed the SeriesFolder migration.
* Added the SeriesFolder migration.
* Added a new pipe for dates so we can provide some nicer defaults. Added the folder path to the series detail.
* The scan optimizations now work for NTFS systems.
* Removed a TODO.
* Migrated all the times to use DateTime.Now and not UTC.
* Refactored some repo calls to use the includes-flag pattern.
* Implemented a check for the library scan optimization that validates whether the library was updated (type change, library rename, folder change, or series deleted) and, if so, bypasses the optimization.
* Added another optimization which uses just the folder's last-write-time attribute if the drive is not NTFS (see the optimization sketch below the diffstat).
* Fixed a unit test.
* Some code cleanup.
---
 API.Benchmark/ParseScannedFilesBenchmarks.cs  |   69 -
 .../ParserInfoListExtensionsTests.cs          |    2 +-
 API.Tests/Helpers/ParserInfoFactory.cs        |    4 +-
 API.Tests/Helpers/ParserInfoHelperTests.cs    |    4 +-
 API.Tests/Parser/MangaParserTests.cs          |    1 +
 API.Tests/Services/BookmarkServiceTests.cs    |   70 +
 API.Tests/Services/CacheServiceTests.cs       |    5 +
 API.Tests/Services/DirectoryServiceTests.cs   |  122 ++
 API.Tests/Services/ParseScannedFilesTests.cs  |  259 ++-
 API.Tests/Services/ScannerServiceTests.cs     |    6 +-
 API/Controllers/AccountController.cs          |    6 +-
 API/Controllers/ReaderController.cs           |    3 +-
 API/DTOs/SeriesDto.cs                         |    4 +
 API/Data/DataContext.cs                       |    1 +
 .../20220817173731_SeriesFolder.Designer.cs   | 1605 +++++++++++++++++
 .../Migrations/20220817173731_SeriesFolder.cs |   37 +
 .../Migrations/DataContextModelSnapshot.cs    |    6 +
 .../Repositories/CollectionTagRepository.cs   |    1 +
 API/Data/Repositories/LibraryRepository.cs    |   51 +-
 API/Data/Repositories/SeriesRepository.cs     |  148 +-
 API/Entities/FolderPath.cs                    |    3 +-
 API/Entities/Library.cs                       |   18 +
 API/Entities/Series.cs                        |   10 +-
 .../ApplicationServiceExtensions.cs           |    3 +
 API/Helpers/GenreHelper.cs                    |   12 +
 API/Helpers/ParserInfoHelpers.cs              |    2 +-
 API/Helpers/PersonHelper.cs                   |   16 +
 API/Helpers/TagHelper.cs                      |   11 +
 API/Parser/DefaultParser.cs                   |    8 +-
 API/Parser/Parser.cs                          |    4 +-
 API/Services/ArchiveService.cs                |    3 +-
 API/Services/CacheService.cs                  |   13 +-
 API/Services/DirectoryService.cs              |  208 ++-
 .../StartupTasksHostedService.cs              |    6 +
 API/Services/MetadataService.cs               |   37 +-
 API/Services/ReadingItemService.cs            |   68 +-
 API/Services/SeriesService.cs                 |    9 +
 API/Services/TaskScheduler.cs                 |   53 +-
 .../Metadata/WordCountAnalyzerService.cs      |    8 +-
 API/Services/Tasks/Scanner/LibraryWatcher.cs  |  212 +++
 .../Tasks/Scanner/ParseScannedFiles.cs        |  263 +--
 API/Services/Tasks/Scanner/ProcessSeries.cs   |  776 ++++++++
 API/Services/Tasks/Scanner/ScanLibrary.cs     |  111 ++
 API/Services/Tasks/ScannerService.cs          | 1122 ++++--------
 API/SignalR/MessageFactory.cs                 |   26 +-
 API/Startup.cs                                |    6 +-
 API/config/appsettings.Development.json       |    2 +-
 Kavita.Common/Helpers/GlobMatcher.cs          |   64 +
 Kavita.Common/Kavita.Common.csproj            |    1 +
 UI/Web/src/app/_models/events/info-event.ts   |   32 +
 UI/Web/src/app/_models/series.ts              |    4 +
 .../src/app/_services/message-hub.service.ts  |   13 +-
 .../edit-series-modal.component.html          |    6 +-
 .../card-detail-layout.component.ts           |    9 +-
 .../src/app/dashboard/dashboard.component.ts  |    7 +-
 .../library-detail.component.html             |    1 +
 .../library-detail.component.ts               |   32 +-
 .../manga-reader/manga-reader.component.ts    |    5 -
 .../events-widget.component.html              |   45 +-
 .../events-widget.component.scss              |   24 +
 .../events-widget/events-widget.component.ts  |   49 +-
 .../nav/nav-header/nav-header.component.html  |  180 +-
 UI/Web/src/app/pipe/default-date.pipe.ts      |   13 +
 UI/Web/src/app/pipe/pipe.module.ts            |    7 +-
 .../confirm-email/confirm-email.component.ts  |    2 +-
 UI/Web/src/app/shared/_models/download.ts     |    2 +-
 .../side-nav-item/side-nav-item.component.ts  |    1 +
 .../sidenav/side-nav/side-nav.component.ts    |    1 -
 UI/Web/src/theme/themes/dark.scss             |    2 +-
 UI/Web/src/theme/themes/e-ink.scss            |    1 -
 70 files changed, 4728 insertions(+), 1187 deletions(-)
 delete mode 100644 API.Benchmark/ParseScannedFilesBenchmarks.cs
 create mode 100644 API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs
 create mode 100644 API/Data/Migrations/20220817173731_SeriesFolder.cs
 create mode 100644 API/Services/Tasks/Scanner/LibraryWatcher.cs
 create mode 100644 API/Services/Tasks/Scanner/ProcessSeries.cs
 create mode 100644 API/Services/Tasks/Scanner/ScanLibrary.cs
 create mode 100644 Kavita.Common/Helpers/GlobMatcher.cs
 create mode 100644 UI/Web/src/app/_models/events/info-event.ts
 create mode 100644 UI/Web/src/app/pipe/default-date.pipe.ts
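The commit notes above describe reacting to file events on the underlying system (the new API/Services/Tasks/Scanner/LibraryWatcher.cs, disabled in this build). A minimal sketch of that debounce-and-queue idea, assuming hypothetical names (`LibraryWatcherSketch`, `_pending`); the shipped class differs:

```csharp
// Sketch only: buffer filesystem events per folder so a burst of changes
// produces one queued scan. Hypothetical types; not the shipped LibraryWatcher.
using System;
using System.Collections.Concurrent;
using System.IO;

public sealed class LibraryWatcherSketch : IDisposable
{
    private readonly FileSystemWatcher _watcher;
    // Folder path -> time of last observed change (drained by a background task).
    private readonly ConcurrentDictionary<string, DateTime> _pending = new();

    public LibraryWatcherSketch(string libraryFolder)
    {
        _watcher = new FileSystemWatcher(libraryFolder)
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.LastWrite
        };
        _watcher.Created += (_, e) => QueueScan(e.FullPath);
        _watcher.Changed += (_, e) => QueueScan(e.FullPath);
        _watcher.Deleted += (_, e) => QueueScan(e.FullPath);
        _watcher.Renamed += (_, e) => QueueScan(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    private void QueueScan(string path)
    {
        // Track the containing folder, not the file, so duplicate events collapse.
        var folder = Directory.Exists(path) ? path : Path.GetDirectoryName(path);
        if (folder is not null) _pending[folder] = DateTime.Now;
    }

    public void Dispose() => _watcher.Dispose();
}
```

Coalescing on the folder rather than the file matches the folder-based scanning above: a whole series folder is rescanned at most once per drain interval.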
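The scan-skipping optimization called out in the notes reduces a library scan to a single attribute lookup per series folder: if the folder's last write time hasn't moved past the last scan, the series is skipped. A sketch under that assumption (`HasFolderChangedSince` is an illustrative helper, not the shipped API; the real check also handles the library-update bypass and the non-NTFS fallback mentioned above):

```csharp
// Sketch: decide whether a series folder needs rescanning using only a
// directory attribute read. Illustrative helper, not Kavita's actual API.
using System;
using System.IO;

public static class ScanOptimizationSketch
{
    public static bool HasFolderChangedSince(string folderPath, DateTime lastScanned)
    {
        // NTFS bumps a directory's LastWriteTime when its direct children change;
        // on other filesystems this folder attribute is the cheap signal used.
        return Directory.GetLastWriteTime(folderPath) > lastScanned;
    }
}

// Illustrative usage against the FolderPath/last-scanned fields this patch adds:
// if (!ScanOptimizationSketch.HasFolderChangedSince(series.FolderPath, series.LastFolderScanned))
//     continue; // skip all file I/O for this series
```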
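The locking strategy for People (and later Tag and Genre) exists because the new loop processes series concurrently, so two series could race to insert the same person; committing the new entity to the DB immediately closes the window. A sketch of the idea with illustrative names (the shipped logic lives in API/Services/Tasks/Scanner/ProcessSeries.cs and the helper classes in API/Helpers):

```csharp
// Sketch: serialize check-then-add on a shared name cache so concurrent series
// processing can't create duplicate Person/Genre/Tag rows. Illustrative only.
using System.Collections.Generic;

public sealed class MetadataCacheSketch
{
    private readonly object _lock = new();
    private readonly HashSet<string> _normalizedNames = new();

    // Returns true when the caller won the race and should persist the new row
    // immediately (so other threads find it on their next lookup).
    public bool TryReserve(string normalizedName)
    {
        lock (_lock)
        {
            return _normalizedNames.Add(normalizedName);
        }
    }
}
```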
diff --git a/API.Benchmark/ParseScannedFilesBenchmarks.cs b/API.Benchmark/ParseScannedFilesBenchmarks.cs
deleted file mode 100644
index 1dcca79b9..000000000
--- a/API.Benchmark/ParseScannedFilesBenchmarks.cs
+++ /dev/null
@@ -1,69 +0,0 @@
-using System.IO;
-using System.IO.Abstractions;
-using System.Threading.Tasks;
-using API.Entities.Enums;
-using API.Parser;
-using API.Services;
-using API.Services.Tasks.Scanner;
-using API.SignalR;
-using BenchmarkDotNet.Attributes;
-using BenchmarkDotNet.Order;
-using Microsoft.Extensions.Logging;
-using NSubstitute;
-
-namespace API.Benchmark
-{
-    [MemoryDiagnoser]
-    [Orderer(SummaryOrderPolicy.FastestToSlowest)]
-    [RankColumn]
-    //[SimpleJob(launchCount: 1, warmupCount: 3, targetCount: 5, invocationCount: 100, id: "Test"), ShortRunJob]
-    public class ParseScannedFilesBenchmarks
-    {
-        private readonly ParseScannedFiles _parseScannedFiles;
-        private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
-        private readonly ILogger<BookService> _bookLogger = Substitute.For<ILogger<BookService>>();
-        private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
-
-        public ParseScannedFilesBenchmarks()
-        {
-            var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());
-            _parseScannedFiles = new ParseScannedFiles(
-                Substitute.For<ILogger>(),
-                directoryService,
-                new ReadingItemService(_archiveService, new BookService(_bookLogger, directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), directoryService)), Substitute.For<IImageService>(), directoryService),
-                Substitute.For<IEventHub>());
-        }
-
-        // [Benchmark]
-        // public void Test()
-        // {
-        //     var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
-        //         "../../../Services/Test Data/ScannerService/Manga");
-        //     var parsedSeries = _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new string[] {libraryPath},
-        //         out var totalFiles, out var scanElapsedTime);
-        // }
-
-        /// <summary>
-        /// Generate a list of Series and another list with
-        /// </summary>
-        [Benchmark]
-        public async Task MergeName()
-        {
-            var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
-                "../../../Services/Test Data/ScannerService/Manga");
-            var p1 = new ParserInfo()
-            {
-                Chapters = "0",
-                Edition = "",
-                Format = MangaFormat.Archive,
-                FullFilePath = Path.Join(libraryPath, "A Town Where You Live", "A_Town_Where_You_Live_v01.zip"),
-                IsSpecial = false,
-                Series = "A Town Where You Live",
-                Title = "A Town Where You Live",
-                Volumes = "1"
-            };
-            await _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new [] {libraryPath}, "Manga");
-            _parseScannedFiles.MergeName(p1);
-        }
-    }
-}
diff --git a/API.Tests/Extensions/ParserInfoListExtensionsTests.cs b/API.Tests/Extensions/ParserInfoListExtensionsTests.cs
index e7c8e9994..ff20403b1 100644
--- a/API.Tests/Extensions/ParserInfoListExtensionsTests.cs
+++ b/API.Tests/Extensions/ParserInfoListExtensionsTests.cs
@@ -14,7 +14,7 @@ namespace API.Tests.Extensions
 {
     public class ParserInfoListExtensions
     {
-        private readonly DefaultParser _defaultParser;
+        private readonly IDefaultParser _defaultParser;
         public ParserInfoListExtensions()
         {
             _defaultParser =
diff --git a/API.Tests/Helpers/ParserInfoFactory.cs b/API.Tests/Helpers/ParserInfoFactory.cs
index 2dc2f2869..84847dca2 100644
--- a/API.Tests/Helpers/ParserInfoFactory.cs
+++ b/API.Tests/Helpers/ParserInfoFactory.cs
@@ -26,7 +26,7 @@ namespace API.Tests.Helpers
            };
        }
-        public static void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
+        public static void AddToParsedInfo(IDictionary<ParsedSeries, IList<ParserInfo>> collectedSeries, ParserInfo info)
        {
            var existingKey = collectedSeries.Keys.FirstOrDefault(ps =>
                ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series));
@@ -38,7 +38,7 @@ namespace API.Tests.Helpers
            };
            if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>))
            {
-                ((ConcurrentDictionary<ParsedSeries, List<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
+                ((ConcurrentDictionary<ParsedSeries, IList<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
                {
                    oldValue ??= new List<ParserInfo>();
                    if (!oldValue.Contains(info))
diff --git a/API.Tests/Helpers/ParserInfoHelperTests.cs b/API.Tests/Helpers/ParserInfoHelperTests.cs
index d3b58d96b..d81e100c0 100644
--- a/API.Tests/Helpers/ParserInfoHelperTests.cs
+++ b/API.Tests/Helpers/ParserInfoHelperTests.cs
@@ -16,7 +16,7 @@ public class ParserInfoHelperTests
    [Fact]
    public void SeriesHasMatchingParserInfoFormat_ShouldBeFalse()
    {
-        var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
+        var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
        ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
        //AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -45,7 +45,7 @@ public class ParserInfoHelperTests
    [Fact]
    public void SeriesHasMatchingParserInfoFormat_ShouldBeTrue()
    {
-        var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
+        var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();
        ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
        ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
diff --git a/API.Tests/Parser/MangaParserTests.cs b/API.Tests/Parser/MangaParserTests.cs
index 546837fd1..1ee94807c 100644
--- a/API.Tests/Parser/MangaParserTests.cs
+++ b/API.Tests/Parser/MangaParserTests.cs
@@ -180,6 +180,7 @@ namespace API.Tests.Parser
        [InlineData("Highschool of the Dead - Full Color Edition v02 [Uasaha] (Yen Press)", "Highschool of the Dead - Full Color Edition")]
        [InlineData("諌山創] 進撃の巨人 第23巻", "諌山創] 進撃の巨人")]
        [InlineData("(一般コミック) [奥浩哉] いぬやしき 第09巻", "いぬやしき")]
+        [InlineData("Highschool of the Dead - 02", "Highschool of the Dead")]
        public void ParseSeriesTest(string filename, string expected)
        {
            Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
diff --git a/API.Tests/Services/BookmarkServiceTests.cs b/API.Tests/Services/BookmarkServiceTests.cs
index 0083a047d..e878e5eb5 100644
--- a/API.Tests/Services/BookmarkServiceTests.cs
+++ b/API.Tests/Services/BookmarkServiceTests.cs
@@ -405,5 +405,75 @@ public class BookmarkServiceTests
    }
+    #endregion
+
+    #region Misc
+
+    [Fact]
+    public async Task ShouldNotDeleteBookmarkOnChapterDeletion()
+    {
+        var filesystem = CreateFileSystem();
+        filesystem.AddFile($"{CacheDirectory}1/0001.jpg", new MockFileData("123"));
+        filesystem.AddFile($"{BookmarkDirectory}1/1/0001.jpg", new MockFileData("123"));
+
+        // Delete all Series to reset state
+        await ResetDB();
+
+        _context.Series.Add(new Series()
+        {
+            Name = "Test",
+            Library = new Library() {
+                Name = "Test LIb",
+                Type = LibraryType.Manga,
+            },
+            Volumes = new List<Volume>()
+            {
+                new Volume()
+                {
+                    Chapters = new List<Chapter>()
+                    {
+                        new Chapter()
+                        {
+
+                        }
+                    }
+                }
+            }
+        });
+
+
+        _context.AppUser.Add(new AppUser()
+        {
+            UserName = "Joe",
+            Bookmarks = new List<AppUserBookmark>()
+            {
+                new AppUserBookmark()
+                {
+                    Page = 1,
+                    ChapterId = 1,
+                    FileName = $"1/1/0001.jpg",
+                    SeriesId = 1,
+                    VolumeId = 1
+                }
+            }
+        });
+
+        await _context.SaveChangesAsync();
+
+
+        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
+        var bookmarkService = Create(ds);
+        var user = await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Bookmarks);
+
+        var vol = await _unitOfWork.VolumeRepository.GetVolumeAsync(1);
+        vol.Chapters = new List<Chapter>();
+        _unitOfWork.VolumeRepository.Update(vol);
+        await _unitOfWork.CommitAsync();
+
+
+        Assert.Equal(1, ds.GetFiles(BookmarkDirectory, searchOption:SearchOption.AllDirectories).Count());
+        Assert.NotNull(await _unitOfWork.UserRepository.GetBookmarkAsync(1));
+    }
    #endregion
}
diff --git a/API.Tests/Services/CacheServiceTests.cs b/API.Tests/Services/CacheServiceTests.cs
index c29a78036..a812e5bdd 100644
--- a/API.Tests/Services/CacheServiceTests.cs
+++ b/API.Tests/Services/CacheServiceTests.cs
@@ -55,6 +55,11 @@ namespace API.Tests.Services
        {
            throw new System.NotImplementedException();
        }
+
+        public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
+        {
+            throw new System.NotImplementedException();
+        }
    }
    public class CacheServiceTests
    {
diff --git a/API.Tests/Services/DirectoryServiceTests.cs b/API.Tests/Services/DirectoryServiceTests.cs
index 23a7dfad1..fac04bf9e 100644
--- a/API.Tests/Services/DirectoryServiceTests.cs
+++ b/API.Tests/Services/DirectoryServiceTests.cs
@@ -841,5 +841,127 @@ namespace API.Tests.Services
            Assert.Equal(expected, DirectoryService.GetHumanReadableBytes(bytes));
        }
        #endregion
+
+        #region ScanFiles
+
+        [Fact]
+        public Task ScanFiles_ShouldFindNoFiles_AllAreIgnored()
+        {
+            var fileSystem = new MockFileSystem();
+            fileSystem.AddDirectory("C:/Data/");
+            fileSystem.AddDirectory("C:/Data/Accel World");
+            fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("*.*"));
+
+            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+
+
+            var allFiles = ds.ScanFiles("C:/Data/");
+
+            Assert.Equal(0, allFiles.Count);
+
+            return Task.CompletedTask;
+        }
+
+
+        [Fact]
+        public Task ScanFiles_ShouldFindNoNestedFiles_IgnoreNestedFiles()
+        {
+            var fileSystem = new MockFileSystem();
+            fileSystem.AddDirectory("C:/Data/");
+            fileSystem.AddDirectory("C:/Data/Accel World");
+            fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
+            fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));
+
+            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+
+            var allFiles = ds.ScanFiles("C:/Data/");
+
+            Assert.Equal(1, allFiles.Count); // Ignore files are not counted in files, only valid extensions
+
+            return Task.CompletedTask;
+        }
+
+
+        [Fact]
+        public Task ScanFiles_NestedIgnore_IgnoreNestedFilesInOneDirectoryOnly()
+        {
+            var fileSystem = new MockFileSystem();
+            fileSystem.AddDirectory("C:/Data/");
+            fileSystem.AddDirectory("C:/Data/Accel World");
+            fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+            fileSystem.AddDirectory("C:/Data/Specials/");
+            fileSystem.AddDirectory("C:/Data/Specials/ArtBooks/");
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
+            fileSystem.AddFile("C:/Data/Specials/.kavitaignore", new MockFileData("**/ArtBooks/*"));
+            fileSystem.AddFile("C:/Data/Specials/Hi.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Specials/ArtBooks/art book 01.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));
+
+            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+
+            var allFiles = ds.ScanFiles("C:/Data/");
+
+            Assert.Equal(2, allFiles.Count); // Ignore files are not counted in files, only valid extensions
+
+            return Task.CompletedTask;
+        }
+
+
+        [Fact]
+        public Task ScanFiles_ShouldFindAllFiles()
+        {
+            var fileSystem = new MockFileSystem();
+            fileSystem.AddDirectory("C:/Data/");
+            fileSystem.AddDirectory("C:/Data/Accel World");
+            fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.txt", new MockFileData(string.Empty));
+            fileSystem.AddFile("C:/Data/Nothing.pdf", new MockFileData(string.Empty));
+
+            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+
+            var allFiles = ds.ScanFiles("C:/Data/");
+
+            Assert.Equal(5, allFiles.Count);
+
+            return Task.CompletedTask;
+        }
+
+        #endregion
+
+        #region GetAllDirectories
+
+        [Fact]
+        public void GetAllDirectories_ShouldFindAllNestedDirectories()
+        {
+            const string testDirectory = "C:/manga/base/";
+            var fileSystem = new MockFileSystem();
+            fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1"));
+            fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 2"));
+            fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "A"));
+            fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "B"));
+
+            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+            Assert.Equal(2, ds.GetAllDirectories(fileSystem.Path.Join(testDirectory, "folder 1")).Count());
+        }
+
+        #endregion
    }
}
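The ScanFiles tests above exercise the new .kavitaignore handling, which the diffstat shows is backed by Kavita.Common/Helpers/GlobMatcher.cs on top of the DotNet.Globbing package (imported in the test file below). A minimal sketch of how stacked ignore rules can be merged and applied while walking down the tree; the class shape is illustrative, not the shipped GlobMatcher:

```csharp
// Sketch: collect glob patterns from each .kavitaignore on the way down so
// nested ignores stack with higher-level ones; a path is excluded when any
// pattern matches. Illustrative shape, not the shipped GlobMatcher.
using System.Collections.Generic;
using System.Linq;
using DotNet.Globbing;

public sealed class IgnoreMatcherSketch
{
    private readonly List<Glob> _globs = new();

    // Call once per .kavitaignore encountered, from the root folder downward.
    public void Merge(IEnumerable<string> patterns)
    {
        _globs.AddRange(patterns
            .Where(p => !string.IsNullOrWhiteSpace(p))
            .Select(p => Glob.Parse(p.Trim())));
    }

    public bool IsIgnored(string relativePath) => _globs.Any(g => g.IsMatch(relativePath));
}
```

With this shape, the `*.*` rule in the first test excludes everything, while `**/Accel World/*` only prunes that subtree, matching the counts the tests assert.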
diff --git a/API.Tests/Services/ParseScannedFilesTests.cs b/API.Tests/Services/ParseScannedFilesTests.cs
index 39f990bbf..2bcbb1271 100644
--- a/API.Tests/Services/ParseScannedFilesTests.cs
+++ b/API.Tests/Services/ParseScannedFilesTests.cs
@@ -1,4 +1,5 @@
 using System;
+using System.Collections.Concurrent;
 using System.Collections.Generic;
 using System.Data.Common;
 using System.IO.Abstractions.TestingHelpers;
@@ -14,6 +15,8 @@
 using API.Services.Tasks.Scanner;
 using API.SignalR;
 using API.Tests.Helpers;
 using AutoMapper;
+using DotNet.Globbing;
+using Flurl.Util;
 using Microsoft.Data.Sqlite;
 using Microsoft.EntityFrameworkCore;
 using Microsoft.EntityFrameworkCore.Infrastructure;
@@ -25,9 +28,9 @@ namespace API.Tests.Services;
 internal class MockReadingItemService : IReadingItemService
 {
-    private readonly DefaultParser _defaultParser;
+    private readonly IDefaultParser _defaultParser;
-    public MockReadingItemService(DefaultParser defaultParser)
+    public MockReadingItemService(IDefaultParser defaultParser)
     {
         _defaultParser = defaultParser;
     }
@@ -56,6 +59,11 @@ internal class MockReadingItemService : IReadingItemService
     {
         return _defaultParser.Parse(path, rootPath, type);
     }
+
+    public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
+    {
+        return _defaultParser.Parse(path, rootPath, type);
+    }
 }
 public class ParseScannedFilesTests
@@ -163,7 +171,7 @@ public class ParseScannedFilesTests
            ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
            ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
        };
-        var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
+        var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>
        {
            {
                new ParsedSeries()
@@ -208,7 +216,7 @@ public class ParseScannedFilesTests
            ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
            ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
        };
-        var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
+        var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>
        {
            {
                new ParsedSeries()
@@ -240,46 +248,71 @@
    #region MergeName
-    [Fact]
-    public async Task MergeName_ShouldMergeMatchingFormatAndName()
-    {
-        var fileSystem = new MockFileSystem();
-        fileSystem.AddDirectory("C:/Data/");
-        fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
-        fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
-        fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
-
-        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
-        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
-            new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
-
-
-        await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
-
-        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
-        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
-        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
-    }
-
-    [Fact]
-    public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
-    {
-        var fileSystem = new MockFileSystem();
-        fileSystem.AddDirectory("C:/Data/");
-        fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
-        fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
-        fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
-
-        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
-        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
-            new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
-
-
-        await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
-
-        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
-        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
-    }
+    // NOTE: I don't think I can test MergeName as it relies on Tracking Files, which is more complicated than I need
+    // [Fact]
+    // public async Task MergeName_ShouldMergeMatchingFormatAndName()
+    // {
+    //     var fileSystem = new MockFileSystem();
+    //     fileSystem.AddDirectory("C:/Data/");
+    //     fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
+    //     fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
+    //     fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
+    //
+    //     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+    //     var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
+    //         new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
+    //
+    //     var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
+    //     var parsedFiles = new ConcurrentDictionary<ParsedSeries, IList<ParserInfo>>();
+    //
+    //     void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
+    //     {
+    //         var skippedScan = parsedInfo.Item1;
+    //         var parsedFiles = parsedInfo.Item2;
+    //         if (parsedFiles.Count == 0) return;
+    //
+    //         var foundParsedSeries = new ParsedSeries()
+    //         {
+    //             Name = parsedFiles.First().Series,
+    //             NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series),
+    //             Format = parsedFiles.First().Format
+    //         };
+    //
+    //         parsedSeries.Add(foundParsedSeries, parsedFiles);
+    //     }
+    //
+    //     await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName",
+    //         false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);
+    //
+    //     Assert.Equal("Accel World",
+    //         psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
+    //     Assert.Equal("Accel World",
+    //         psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
+    //     Assert.Equal("Accel World",
+    //         psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
+    // }
+    //
+    // [Fact]
+    // public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
+    // {
+    //     var fileSystem = new MockFileSystem();
+    //     fileSystem.AddDirectory("C:/Data/");
+    //     fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
+    //     fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
+    //     fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
+    //
+    //     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+    //     var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
+    //         new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
+    //
+    //
+    //     await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
+    //
+    //     Assert.Equal("Accel World",
+    //         psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
+    //     Assert.Equal("Accel World",
+    //         psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
+    // }
    #endregion
@@ -299,14 +332,150 @@ public class ParseScannedFilesTests
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
+        var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
+
+        void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
+        {
+            var skippedScan = parsedInfo.Item1;
+            var parsedFiles = parsedInfo.Item2;
+            if (parsedFiles.Count == 0) return;
+
+            var foundParsedSeries = new ParsedSeries()
+            {
+                Name = parsedFiles.First().Series,
+                NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series),
+                Format = parsedFiles.First().Format
+            };
+
+            parsedSeries.Add(foundParsedSeries, parsedFiles);
+        }
+
+
+        await psf.ScanLibrariesForSeries(LibraryType.Manga,
+            new List<string>() {"C:/Data/"}, "libraryName", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);
-        var parsedSeries = await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
        Assert.Equal(3, parsedSeries.Values.Count);
        Assert.NotEmpty(parsedSeries.Keys.Where(p => p.Format == MangaFormat.Archive && p.Name.Equals("Accel World")));
+    }
    #endregion
+
+
+    #region ProcessFiles
+
+    private static MockFileSystem CreateTestFilesystem()
+    {
+        var fileSystem = new MockFileSystem();
+        fileSystem.AddDirectory("C:/Data/");
+        fileSystem.AddDirectory("C:/Data/Accel World");
+        fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+        fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
+        fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
+        fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
+        fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
+        fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));
+
+        return fileSystem;
+    }
+
+    [Fact]
+    public async Task ProcessFiles_ForLibraryMode_OnlyCallsFolderActionForEachTopLevelFolder()
+    {
+        var fileSystem = CreateTestFilesystem();
+        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
+            new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
+
+        var directoriesSeen = new HashSet<string>();
+        await psf.ProcessFiles("C:/Data/", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),
+            (files, directoryPath) =>
+            {
+                directoriesSeen.Add(directoryPath);
+                return Task.CompletedTask;
+            });
+
+        Assert.Equal(2, directoriesSeen.Count);
+    }
+
+    [Fact]
+    public async Task ProcessFiles_ForNonLibraryMode_CallsFolderActionOnce()
+    {
+        var fileSystem = CreateTestFilesystem();
+        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
+        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
+            new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
+
+        var directoriesSeen = new HashSet<string>();
+        await psf.ProcessFiles("C:/Data/", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, directoryPath) =>
+        {
+            directoriesSeen.Add(directoryPath);
+            return Task.CompletedTask;
+        });
+
+        Assert.Single(directoriesSeen);
+        directoriesSeen.TryGetValue("C:/Data/", out var actual);
+        Assert.Equal("C:/Data/", actual);
+    }
+
+    [Fact]
+    public async Task ProcessFiles_ShouldCallFolderActionTwice()
+    {
+        var fileSystem = new MockFileSystem();
+        fileSystem.AddDirectory("C:/Data/");
+        fileSystem.AddDirectory("C:/Data/Accel World");
+        fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
+        fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var callCount = 0; + await psf.ProcessFiles("C:/Data", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) => + { + callCount++; + + return Task.CompletedTask; + }); + + Assert.Equal(2, callCount); } + /// + /// Due to this not being a library, it's going to consider everything under C:/Data as being one folder aka a series folder + /// + [Fact] + public async Task ProcessFiles_ShouldCallFolderActionOnce() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var callCount = 0; + await psf.ProcessFiles("C:/Data", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) => + { + callCount++; + return Task.CompletedTask; + }); + + Assert.Equal(1, callCount); + } + #endregion } diff --git a/API.Tests/Services/ScannerServiceTests.cs b/API.Tests/Services/ScannerServiceTests.cs index e3331bf6d..5b806e96b 100644 --- a/API.Tests/Services/ScannerServiceTests.cs +++ b/API.Tests/Services/ScannerServiceTests.cs @@ -16,7 +16,7 @@ namespace API.Tests.Services [Fact] public void FindSeriesNotOnDisk_Should_Remove1() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive}); //AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub}); @@ -48,7 +48,7 @@ namespace API.Tests.Services [Fact] public void FindSeriesNotOnDisk_Should_RemoveNothing_Test() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive}); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive}); @@ -125,6 +125,8 @@ namespace API.Tests.Services // } + // TODO: I want a test for UpdateSeries where if I have chapter 10 and now it's mapping into Vol 2 Chapter 10, + // if I can do it without deleting the underlying chapter (aka id change) } } diff --git a/API/Controllers/AccountController.cs b/API/Controllers/AccountController.cs index 
d5336917c..2b06a74b8 100644 --- a/API/Controllers/AccountController.cs +++ b/API/Controllers/AccountController.cs @@ -354,7 +354,7 @@ namespace API.Controllers lib.AppUsers.Remove(user); } - libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList(); } foreach (var lib in libraries) @@ -458,11 +458,11 @@ namespace API.Controllers { _logger.LogInformation("{UserName} is being registered as admin. Granting access to all libraries", user.UserName); - libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync()).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync(LibraryIncludes.AppUser)).ToList(); } else { - libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList(); } foreach (var lib in libraries) diff --git a/API/Controllers/ReaderController.cs b/API/Controllers/ReaderController.cs index bafac20d2..232b02f24 100644 --- a/API/Controllers/ReaderController.cs +++ b/API/Controllers/ReaderController.cs @@ -60,6 +60,7 @@ namespace API.Controllers try { + var path = _cacheService.GetCachedFile(chapter); if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"Pdf doesn't exist when it should."); @@ -90,7 +91,7 @@ namespace API.Controllers try { var path = _cacheService.GetCachedPagePath(chapter, page); - if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}"); + if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}. 
Try refreshing to allow re-cache."); var format = Path.GetExtension(path).Replace(".", ""); return PhysicalFile(path, "image/" + format, Path.GetFileName(path), true); diff --git a/API/DTOs/SeriesDto.cs b/API/DTOs/SeriesDto.cs index b5fc63473..2904bf57c 100644 --- a/API/DTOs/SeriesDto.cs +++ b/API/DTOs/SeriesDto.cs @@ -54,5 +54,9 @@ namespace API.DTOs public int MaxHoursToRead { get; set; } /// public int AvgHoursToRead { get; set; } + /// + /// The highest level folder for this Series + /// + public string FolderPath { get; set; } } } diff --git a/API/Data/DataContext.cs b/API/Data/DataContext.cs index 7b2ca2654..567375134 100644 --- a/API/Data/DataContext.cs +++ b/API/Data/DataContext.cs @@ -43,6 +43,7 @@ namespace API.Data public DbSet Tag { get; set; } public DbSet SiteTheme { get; set; } public DbSet SeriesRelation { get; set; } + public DbSet FolderPath { get; set; } protected override void OnModelCreating(ModelBuilder builder) diff --git a/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs b/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs new file mode 100644 index 000000000..96fed7004 --- /dev/null +++ b/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs @@ -0,0 +1,1605 @@ +// +using System; +using API.Data; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; +using Microsoft.EntityFrameworkCore.Migrations; +using Microsoft.EntityFrameworkCore.Storage.ValueConversion; + +#nullable disable + +namespace API.Data.Migrations +{ + [DbContext(typeof(DataContext))] + [Migration("20220817173731_SeriesFolder")] + partial class SeriesFolder + { + protected override void BuildTargetModel(ModelBuilder modelBuilder) + { +#pragma warning disable 612, 618 + modelBuilder.HasAnnotation("ProductVersion", "6.0.7"); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("Name") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedName") + .IsUnique() + .HasDatabaseName("RoleNameIndex"); + + b.ToTable("AspNetRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AccessFailedCount") + .HasColumnType("INTEGER"); + + b.Property("ApiKey") + .HasColumnType("TEXT"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("ConfirmationToken") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("Email") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("EmailConfirmed") + .HasColumnType("INTEGER"); + + b.Property("LastActive") + .HasColumnType("TEXT"); + + b.Property("LockoutEnabled") + .HasColumnType("INTEGER"); + + b.Property("LockoutEnd") + .HasColumnType("TEXT"); + + b.Property("NormalizedEmail") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedUserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("PasswordHash") + .HasColumnType("TEXT"); + + b.Property("PhoneNumber") + .HasColumnType("TEXT"); + + b.Property("PhoneNumberConfirmed") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SecurityStamp") + 
.HasColumnType("TEXT"); + + b.Property("TwoFactorEnabled") + .HasColumnType("INTEGER"); + + b.Property("UserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedEmail") + .HasDatabaseName("EmailIndex"); + + b.HasIndex("NormalizedUserName") + .IsUnique() + .HasDatabaseName("UserNameIndex"); + + b.ToTable("AspNetUsers", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Page") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("AppUserBookmark"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AutoCloseMenu") + .HasColumnType("INTEGER"); + + b.Property("BackgroundColor") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("#000000"); + + b.Property("BlurUnreadSummaries") + .HasColumnType("INTEGER"); + + b.Property("BookReaderFontFamily") + .HasColumnType("TEXT"); + + b.Property("BookReaderFontSize") + .HasColumnType("INTEGER"); + + b.Property("BookReaderImmersiveMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLayoutMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLineSpacing") + .HasColumnType("INTEGER"); + + b.Property("BookReaderMargin") + .HasColumnType("INTEGER"); + + b.Property("BookReaderReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("BookReaderTapToPaginate") + .HasColumnType("INTEGER"); + + b.Property("BookThemeName") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("Dark"); + + b.Property("GlobalPageLayoutMode") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER") + .HasDefaultValue(0); + + b.Property("LayoutMode") + .HasColumnType("INTEGER"); + + b.Property("PageSplitOption") + .HasColumnType("INTEGER"); + + b.Property("PromptForDownloadSize") + .HasColumnType("INTEGER"); + + b.Property("ReaderMode") + .HasColumnType("INTEGER"); + + b.Property("ReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("ScalingOption") + .HasColumnType("INTEGER"); + + b.Property("ShowScreenHints") + .HasColumnType("INTEGER"); + + b.Property("ThemeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId") + .IsUnique(); + + b.HasIndex("ThemeId"); + + b.ToTable("AppUserPreferences"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("BookScrollId") + .HasColumnType("TEXT"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("PagesRead") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + 
b.HasIndex("SeriesId"); + + b.ToTable("AppUserProgresses"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("Rating") + .HasColumnType("INTEGER"); + + b.Property("Review") + .HasColumnType("TEXT"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("SeriesId"); + + b.ToTable("AppUserRating"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("UserId", "RoleId"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetUserRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Count") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("IsSpecial") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Number") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("Range") + .HasColumnType("TEXT"); + + b.Property("ReleaseDate") + .HasColumnType("TEXT"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.Property("TitleName") + .HasColumnType("TEXT"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("VolumeId"); + + b.ToTable("Chapter"); + }); + + modelBuilder.Entity("API.Entities.CollectionTag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("Id", "Promoted") + .IsUnique(); + + b.ToTable("CollectionTag"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("Path") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("LibraryId"); + + b.ToTable("FolderPath"); + }); + + modelBuilder.Entity("API.Entities.Genre", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + 
b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Genre"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Type") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FilePath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastFileAnalysis") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.ToTable("MangaFile"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AgeRatingLocked") + .HasColumnType("INTEGER"); + + b.Property("CharacterLocked") + .HasColumnType("INTEGER"); + + b.Property("ColoristLocked") + .HasColumnType("INTEGER"); + + b.Property("CoverArtistLocked") + .HasColumnType("INTEGER"); + + b.Property("EditorLocked") + .HasColumnType("INTEGER"); + + b.Property("GenresLocked") + .HasColumnType("INTEGER"); + + b.Property("InkerLocked") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LanguageLocked") + .HasColumnType("INTEGER"); + + b.Property("LettererLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxCount") + .HasColumnType("INTEGER"); + + b.Property("PencillerLocked") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatus") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatusLocked") + .HasColumnType("INTEGER"); + + b.Property("PublisherLocked") + .HasColumnType("INTEGER"); + + b.Property("ReleaseYear") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("SummaryLocked") + .HasColumnType("INTEGER"); + + b.Property("TagsLocked") + .HasColumnType("INTEGER"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("TranslatorLocked") + .HasColumnType("INTEGER"); + + b.Property("WriterLocked") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId") + .IsUnique(); + + b.HasIndex("Id", "SeriesId") + .IsUnique(); + + b.ToTable("SeriesMetadata"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("RelationKind") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("TargetSeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.HasIndex("TargetSeriesId"); + + b.ToTable("SeriesRelation"); + }); + + modelBuilder.Entity("API.Entities.Person", b => + { + b.Property("Id") + 
.ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Role") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Person"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("ReadingList"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Order") + .HasColumnType("INTEGER"); + + b.Property("ReadingListId") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.HasIndex("ReadingListId"); + + b.HasIndex("SeriesId"); + + b.HasIndex("VolumeId"); + + b.ToTable("ReadingListItem"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FolderPath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastChapterAdded") + .HasColumnType("TEXT"); + + b.Property("LastFolderScanned") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("LocalizedName") + .HasColumnType("TEXT"); + + b.Property("LocalizedNameLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NameLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("OriginalName") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SortName") + .HasColumnType("TEXT"); + + b.Property("SortNameLocked") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("LibraryId"); + + b.ToTable("Series"); + }); + + modelBuilder.Entity("API.Entities.ServerSetting", b => + { + b.Property("Key") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("Key"); + + b.ToTable("ServerSetting"); + }); + + modelBuilder.Entity("API.Entities.SiteTheme", b => + { + b.Property("Id") + 
.ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("IsDefault") + .HasColumnType("INTEGER"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Provider") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("SiteTheme"); + }); + + modelBuilder.Entity("API.Entities.Tag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Tag"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Number") + .HasColumnType("INTEGER"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.ToTable("Volume"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.Property("AppUsersId") + .HasColumnType("INTEGER"); + + b.Property("LibrariesId") + .HasColumnType("INTEGER"); + + b.HasKey("AppUsersId", "LibrariesId"); + + b.HasIndex("LibrariesId"); + + b.ToTable("AppUserLibrary"); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "GenresId"); + + b.HasIndex("GenresId"); + + b.ToTable("ChapterGenre"); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.Property("ChapterMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.HasKey("ChapterMetadatasId", "PeopleId"); + + b.HasIndex("PeopleId"); + + b.ToTable("ChapterPerson"); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("ChapterTag"); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.Property("CollectionTagsId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("CollectionTagsId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("CollectionTagSeriesMetadata"); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("GenresId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("GenreSeriesMetadata"); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.Property("Id") + 
.ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetRoleClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("ProviderKey") + .HasColumnType("TEXT"); + + b.Property("ProviderDisplayName") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("LoginProvider", "ProviderKey"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserLogins", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("UserId", "LoginProvider", "Name"); + + b.ToTable("AspNetUserTokens", (string)null); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("PeopleId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("PersonSeriesMetadata"); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("SeriesMetadatasId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("SeriesMetadataTag"); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Bookmarks") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithOne("UserPreferences") + .HasForeignKey("API.Entities.AppUserPreferences", "AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.SiteTheme", "Theme") + .WithMany() + .HasForeignKey("ThemeId"); + + b.Navigation("AppUser"); + + b.Navigation("Theme"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Progresses") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Progress") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Ratings") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Ratings") + .HasForeignKey("SeriesId") + 
.OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.HasOne("API.Entities.AppRole", "Role") + .WithMany("UserRoles") + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.AppUser", "User") + .WithMany("UserRoles") + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Role"); + + b.Navigation("User"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.HasOne("API.Entities.Volume", "Volume") + .WithMany("Chapters") + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.HasOne("API.Entities.Library", "Library") + .WithMany("Folders") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany("Files") + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithOne("Metadata") + .HasForeignKey("API.Entities.Metadata.SeriesMetadata", "SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Relations") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.ClientCascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "TargetSeries") + .WithMany("RelationOf") + .HasForeignKey("TargetSeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + + b.Navigation("TargetSeries"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("ReadingLists") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany() + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.ReadingList", "ReadingList") + .WithMany("Items") + .HasForeignKey("ReadingListId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "Series") + .WithMany() + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Volume", "Volume") + .WithMany() + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + + b.Navigation("ReadingList"); + + b.Navigation("Series"); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany("WantToRead") + .HasForeignKey("AppUserId"); + + b.HasOne("API.Entities.Library", "Library") + .WithMany("Series") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Volumes") + .HasForeignKey("SeriesId") + 
.OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("AppUsersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Library", null) + .WithMany() + .HasForeignKey("LibrariesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChapterMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.HasOne("API.Entities.CollectionTag", null) + .WithMany() + .HasForeignKey("CollectionTagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.HasOne("API.Entities.AppRole", null) + .WithMany() + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() 
+ .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Navigation("UserRoles"); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Navigation("Bookmarks"); + + b.Navigation("Progresses"); + + b.Navigation("Ratings"); + + b.Navigation("ReadingLists"); + + b.Navigation("UserPreferences"); + + b.Navigation("UserRoles"); + + b.Navigation("WantToRead"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Navigation("Files"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Navigation("Folders"); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Navigation("Items"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Navigation("Metadata"); + + b.Navigation("Progress"); + + b.Navigation("Ratings"); + + b.Navigation("RelationOf"); + + b.Navigation("Relations"); + + b.Navigation("Volumes"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Navigation("Chapters"); + }); +#pragma warning restore 612, 618 + } + } +} diff --git a/API/Data/Migrations/20220817173731_SeriesFolder.cs b/API/Data/Migrations/20220817173731_SeriesFolder.cs new file mode 100644 index 000000000..33373c0c4 --- /dev/null +++ b/API/Data/Migrations/20220817173731_SeriesFolder.cs @@ -0,0 +1,37 @@ +using System; +using Microsoft.EntityFrameworkCore.Migrations; + +#nullable disable + +namespace API.Data.Migrations +{ + public partial class SeriesFolder : Migration + { + protected override void Up(MigrationBuilder migrationBuilder) + { + migrationBuilder.AddColumn( + name: "FolderPath", + table: "Series", + type: "TEXT", + nullable: true); + + migrationBuilder.AddColumn( + name: "LastFolderScanned", + table: "Series", + type: "TEXT", + nullable: false, + defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified)); + } + + protected override void Down(MigrationBuilder migrationBuilder) + { + migrationBuilder.DropColumn( + name: "FolderPath", + table: "Series"); + + migrationBuilder.DropColumn( + name: "LastFolderScanned", + table: "Series"); + } + } +} diff --git a/API/Data/Migrations/DataContextModelSnapshot.cs b/API/Data/Migrations/DataContextModelSnapshot.cs index 6a4eba753..a3cdf7f05 100644 --- a/API/Data/Migrations/DataContextModelSnapshot.cs +++ b/API/Data/Migrations/DataContextModelSnapshot.cs @@ -782,12 +782,18 @@ namespace API.Data.Migrations b.Property("Created") .HasColumnType("TEXT"); + b.Property("FolderPath") + .HasColumnType("TEXT"); + b.Property("Format") .HasColumnType("INTEGER"); b.Property("LastChapterAdded") .HasColumnType("TEXT"); + b.Property("LastFolderScanned") + .HasColumnType("TEXT"); + b.Property("LastModified") .HasColumnType("TEXT"); diff --git a/API/Data/Repositories/CollectionTagRepository.cs b/API/Data/Repositories/CollectionTagRepository.cs index da44d5e18..7b9398b85 100644 --- a/API/Data/Repositories/CollectionTagRepository.cs +++ b/API/Data/Repositories/CollectionTagRepository.cs @@ -56,6 +56,7 @@ public class CollectionTagRepository : ICollectionTagRepository /// public async Task RemoveTagsWithoutSeries() { + // TODO: Write a Unit test to validate this works var tagsToDelete = await _context.CollectionTag .Include(c => c.SeriesMetadatas) .Where(c => c.SeriesMetadatas.Count == 0) diff --git 
a/API/Data/Repositories/LibraryRepository.cs b/API/Data/Repositories/LibraryRepository.cs
index 782247a1a..b39a74e35 100644
--- a/API/Data/Repositories/LibraryRepository.cs
+++ b/API/Data/Repositories/LibraryRepository.cs
@@ -34,19 +34,19 @@ public interface ILibraryRepository
     Task<IEnumerable<LibraryDto>> GetLibraryDtosAsync();
     Task<bool> LibraryExists(string libraryName);
     Task<Library> GetLibraryForIdAsync(int libraryId, LibraryIncludes includes);
-    Task<Library> GetFullLibraryForIdAsync(int libraryId);
-    Task<Library> GetFullLibraryForIdAsync(int libraryId, int seriesId);
     Task<IEnumerable<LibraryDto>> GetLibraryDtosForUsernameAsync(string userName);
-    Task<IEnumerable<Library>> GetLibrariesAsync();
+    Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None);
     Task<bool> DeleteLibrary(int libraryId);
     Task<IEnumerable<Library>> GetLibrariesForUserIdAsync(int userId);
     Task<LibraryType> GetLibraryTypeAsync(int libraryId);
-    Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds);
+    Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None);
     Task<int> GetTotalFiles();
     IEnumerable<JumpKeyDto> GetJumpBarAsync(int libraryId);
     Task<IList<AgeRatingDto>> GetAllAgeRatingsDtosForLibrariesAsync(List<int> libraryIds);
     Task<IList<LanguageDto>> GetAllLanguagesForLibrariesAsync(List<int> libraryIds);
     IEnumerable<PublicationStatusDto> GetAllPublicationStatusesDtosForLibrariesAsync(List<int> libraryIds);
+    Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders);
+    Library GetLibraryByFolder(string folder);
 }
 
 public class LibraryRepository : ILibraryRepository
@@ -87,11 +87,19 @@ public class LibraryRepository : ILibraryRepository
             .ToListAsync();
     }
 
-    public async Task<IEnumerable<Library>> GetLibrariesAsync()
+    /// <summary>
+    /// Returns all libraries including their AppUsers + extra includes
+    /// </summary>
+    /// <param name="includes"></param>
+    /// <returns></returns>
+    public async Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None)
     {
-        return await _context.Library
+        var query = _context.Library
             .Include(l => l.AppUsers)
-            .ToListAsync();
+            .Select(l => l);
+
+        query = AddIncludesToQuery(query, includes);
+        return await query.ToListAsync();
     }
 
     public async Task<bool> DeleteLibrary(int libraryId)
@@ -120,11 +128,13 @@ public class LibraryRepository : ILibraryRepository
             .SingleAsync();
     }
 
-    public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds)
+    public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None)
    {
-        return await _context.Library
-            .Where(x => libraryIds.Contains(x.Id))
-            .ToListAsync();
+        var query = _context.Library
+            .Where(x => libraryIds.Contains(x.Id));
+
+        // Assign the result: AddIncludesToQuery returns a new IQueryable, so calling it
+        // without capturing the return value silently drops the requested includes
+        query = AddIncludesToQuery(query, includes);
+        return await query.ToListAsync();
     }
 
     public async Task<int> GetTotalFiles()
@@ -317,4 +327,23 @@ public class LibraryRepository : ILibraryRepository
             .OrderBy(s => s.Title);
     }
 
+    /// <summary>
+    /// Checks if any series folders match the folders passed in
+    /// </summary>
+    /// <param name="folders"></param>
+    /// <returns></returns>
+    public async Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders)
+    {
+        var normalized = folders.Select(Parser.Parser.NormalizePath);
+        return await _context.Series.AnyAsync(s => normalized.Contains(s.FolderPath));
+    }
+
+    public Library?
GetLibraryByFolder(string folder) + { + var normalized = Parser.Parser.NormalizePath(folder); + return _context.Library + .Include(l => l.Folders) + .AsSplitQuery() + .SingleOrDefault(l => l.Folders.Select(f => f.Path).Contains(normalized)); + } } diff --git a/API/Data/Repositories/SeriesRepository.cs b/API/Data/Repositories/SeriesRepository.cs index c859ed2de..3d0d4a5de 100644 --- a/API/Data/Repositories/SeriesRepository.cs +++ b/API/Data/Repositories/SeriesRepository.cs @@ -1,6 +1,5 @@ using System; using System.Collections.Generic; -using System.Globalization; using System.Linq; using System.Text.RegularExpressions; using System.Threading.Tasks; @@ -19,12 +18,11 @@ using API.Extensions; using API.Helpers; using API.Services; using API.Services.Tasks; +using API.Services.Tasks.Scanner; using AutoMapper; using AutoMapper.QueryableExtensions; -using Kavita.Common.Extensions; -using Microsoft.AspNetCore.Mvc; using Microsoft.EntityFrameworkCore; -using SQLitePCL; + namespace API.Data.Repositories; @@ -120,6 +118,11 @@ public interface ISeriesRepository Task GetSeriesForMangaFile(int mangaFileId, int userId); Task GetSeriesForChapter(int chapterId, int userId); Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter); + Task GetSeriesIdByFolder(string folder); + Task GetSeriesByFolderPath(string folder); + Task GetFullSeriesByName(string series, int libraryId); + Task RemoveSeriesNotInList(IList seenSeries, int libraryId); + Task>> GetFolderPathMap(int libraryId); } public class SeriesRepository : ISeriesRepository @@ -156,6 +159,7 @@ public class SeriesRepository : ISeriesRepository /// Returns if a series name and format exists already in a library /// /// Name of series + /// /// Format of series /// public async Task DoesSeriesNameExistInLibrary(string name, int libraryId, MangaFormat format) @@ -179,6 +183,7 @@ public class SeriesRepository : ISeriesRepository /// Used for to /// /// + /// /// public async Task> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams) { @@ -432,6 +437,7 @@ public class SeriesRepository : ISeriesRepository /// Returns Volumes, Metadata (Incl Genres and People), and Collection Tags /// /// + /// /// public async Task GetSeriesByIdAsync(int seriesId, SeriesIncludes includes = SeriesIncludes.Volumes | SeriesIncludes.Metadata) { @@ -1136,21 +1142,82 @@ public class SeriesRepository : ISeriesRepository .SingleOrDefaultAsync(); } - public async Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter) + /// + /// Given a folder path return a Series with the that matches. + /// + /// This will apply normalization on the path. + /// + /// + public async Task GetSeriesIdByFolder(string folder) { - var libraryIds = GetLibraryIdsForUser(userId); - var query = _context.AppUser - .Where(user => user.Id == userId) - .SelectMany(u => u.WantToRead) - .Where(s => libraryIds.Contains(s.LibraryId)) - .AsSplitQuery() - .AsNoTracking(); - - var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query); - - return await PagedList.CreateAsync(filteredQuery.ProjectTo(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize); + var normalized = Parser.Parser.NormalizePath(folder); + var series = await _context.Series + .Where(s => s.FolderPath.Equals(normalized)) + .SingleOrDefaultAsync(); + return series?.Id ?? 0; } + /// + /// Return a Series by Folder path. Null if not found. 
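// The folder-based lookups in this file (GetSeriesIdByFolder, GetSeriesByFolderPath,
// GetLibraryByFolder) only line up because every path is run through
// Parser.Parser.NormalizePath before it is stored or compared. The normalizer itself
// is not part of this diff; the sketch below only illustrates the kind of
// canonicalization these lookups rely on (unified separators, no trailing separator)
// and is not the project's implementation.
public static class PathNormalizerSketch
{
    public static string Normalize(string path)
    {
        if (string.IsNullOrEmpty(path)) return path;
        // "C:\Manga\A Series\" and "C:/Manga/A Series" should compare equal
        var unified = path.Replace('\\', '/');
        return unified.Length > 1 ? unified.TrimEnd('/') : unified;
    }
}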
+ /// + /// This will be normalized in the query + /// + public async Task GetSeriesByFolderPath(string folder) + { + var normalized = Parser.Parser.NormalizePath(folder); + return await _context.Series.SingleOrDefaultAsync(s => s.FolderPath.Equals(normalized)); + } + + public Task GetFullSeriesByName(string series, int libraryId) + { + return _context.Series + .Where(s => s.NormalizedName.Equals(Parser.Parser.Normalize(series)) && s.LibraryId == libraryId) + .Include(s => s.Metadata) + .ThenInclude(m => m.People) + .Include(s => s.Metadata) + .ThenInclude(m => m.Genres) + .Include(s => s.Library) + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(cm => cm.People) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Genres) + + + .Include(s => s.Metadata) + .ThenInclude(m => m.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Files) + .AsSplitQuery() + .SingleOrDefaultAsync(); + } + + public async Task RemoveSeriesNotInList(IList seenSeries, int libraryId) + { + if (seenSeries.Count == 0) return; + var ids = new List(); + foreach (var parsedSeries in seenSeries) + { + ids.Add(await _context.Series + .Where(s => s.Format == parsedSeries.Format && s.NormalizedName == parsedSeries.NormalizedName && s.LibraryId == libraryId) + .Select(s => s.Id).SingleAsync()); + } + + var seriesToRemove = await _context.Series + .Where(s => s.LibraryId == libraryId) + .Where(s => !ids.Contains(s.Id)) + .ToListAsync(); + + _context.Series.RemoveRange(seriesToRemove); + } public async Task> GetHighlyRated(int userId, int libraryId, UserParams userParams) { @@ -1320,4 +1387,53 @@ public class SeriesRepository : ISeriesRepository .AsEnumerable(); return ret; } + + public async Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter) + { + var libraryIds = GetLibraryIdsForUser(userId); + var query = _context.AppUser + .Where(user => user.Id == userId) + .SelectMany(u => u.WantToRead) + .Where(s => libraryIds.Contains(s.LibraryId)) + .AsSplitQuery() + .AsNoTracking(); + + var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query); + + return await PagedList.CreateAsync(filteredQuery.ProjectTo(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize); + } + + public async Task>> GetFolderPathMap(int libraryId) + { + var info = await _context.Series + .Where(s => s.LibraryId == libraryId) + .AsNoTracking() + .Where(s => s.FolderPath != null) + .Select(s => new SeriesModified() + { + LastScanned = s.LastFolderScanned, + SeriesName = s.Name, + FolderPath = s.FolderPath, + Format = s.Format + }).ToListAsync(); + + var map = new Dictionary>(); + foreach (var series in info) + { + if (!map.ContainsKey(series.FolderPath)) + { + map.Add(series.FolderPath, new List() + { + series + }); + } + else + { + map[series.FolderPath].Add(series); + } + + } + + return map; + } } diff --git a/API/Entities/FolderPath.cs b/API/Entities/FolderPath.cs index 267564fe8..20ba4f466 100644 --- a/API/Entities/FolderPath.cs +++ b/API/Entities/FolderPath.cs @@ -8,8 +8,9 @@ namespace API.Entities public int Id { get; set; } public string Path { get; set; } /// - /// Used when scanning to see if we can skip if nothing has changed. 
(not implemented)
+    /// Used when scanning to see if we can skip if nothing has changed
     /// </summary>
+    /// <remarks>Time stored in UTC</remarks>
     public DateTime LastScanned { get; set; }
 
     // Relationship
diff --git a/API/Entities/Library.cs b/API/Entities/Library.cs
index c77fb68dd..fd9956b1f 100644
--- a/API/Entities/Library.cs
+++ b/API/Entities/Library.cs
@@ -1,5 +1,7 @@
 using System;
 using System.Collections.Generic;
+using System.IO;
+using System.Linq;
 using API.Entities.Enums;
 using API.Entities.Interfaces;
@@ -9,6 +11,10 @@ namespace API.Entities
 {
     public int Id { get; set; }
     public string Name { get; set; }
+    /// <summary>
+    /// Update this summary with a way it's used, else let's remove it.
+    /// </summary>
+    [Obsolete("This has never been coded for. Likely we can remove it.")]
     public string CoverImage { get; set; }
     public LibraryType Type { get; set; }
     public DateTime Created { get; set; }
@@ -16,10 +22,22 @@ namespace API.Entities
     /// <summary>
     /// Last time Library was scanned
     /// </summary>
+    /// <remarks>Time stored in UTC</remarks>
     public DateTime LastScanned { get; set; }
     public ICollection<FolderPath> Folders { get; set; }
     public ICollection<AppUser> AppUsers { get; set; }
     public ICollection<Series> Series { get; set; }
 
+    // Methods
+    /// <summary>
+    /// Have there been any modifications to any of the FolderPaths' directories since their last scan
+    /// </summary>
+    /// <returns></returns>
+    public bool AnyModificationsSinceLastScan()
+    {
+        // NOTE: I don't think we can do this due to NTFS
+        // Any: a single modified folder is enough to require a rescan (All would only
+        // report a change when every folder was touched, contradicting the method name)
+        return Folders.Any(folder => File.GetLastWriteTimeUtc(folder.Path) > folder.LastScanned);
+    }
+
     }
}
diff --git a/API/Entities/Series.cs b/API/Entities/Series.cs
index f345386d3..00e7dd33d 100644
--- a/API/Entities/Series.cs
+++ b/API/Entities/Series.cs
@@ -50,7 +50,15 @@ public class Series : IEntityDate, IHasReadTimeEstimate
     /// Sum of all Volume page counts
     /// </summary>
     public int Pages { get; set; }
-
+    /// <summary>
+    /// Highest path (that is under library root) that contains the series.
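// A sketch of how a scan entry point could consume the optimization above; the
// ScannerService call site is outside this excerpt, so the helper below is an
// assumed illustration, not part of this diff. Note the NOTE in
// AnyModificationsSinceLastScan: on NTFS a write deep inside a folder does not
// update the root folder's last write time, so this check can produce false
// negatives there.
public static bool NeedsScan(Library library, ILogger logger)
{
    if (library.AnyModificationsSinceLastScan()) return true;
    logger.LogInformation("[ScannerService] {LibraryName} unchanged since last scan, skipping", library.Name);
    return false;
}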
+ /// + /// must be used before setting + public string FolderPath { get; set; } + /// + /// Last time the folder was scanned + /// + public DateTime LastFolderScanned { get; set; } /// /// The type of all the files attached to this series /// diff --git a/API/Extensions/ApplicationServiceExtensions.cs b/API/Extensions/ApplicationServiceExtensions.cs index 1b637b25f..b7f449aa5 100644 --- a/API/Extensions/ApplicationServiceExtensions.cs +++ b/API/Extensions/ApplicationServiceExtensions.cs @@ -4,6 +4,7 @@ using API.Helpers; using API.Services; using API.Services.Tasks; using API.Services.Tasks.Metadata; +using API.Services.Tasks.Scanner; using API.SignalR; using API.SignalR.Presence; using Kavita.Common; @@ -46,10 +47,12 @@ namespace API.Extensions services.AddScoped(); services.AddScoped(); services.AddScoped(); + services.AddScoped(); services.AddScoped(); services.AddScoped(); services.AddScoped(); + services.AddScoped(); diff --git a/API/Helpers/GenreHelper.cs b/API/Helpers/GenreHelper.cs index aa465f58e..6c74b3e4a 100644 --- a/API/Helpers/GenreHelper.cs +++ b/API/Helpers/GenreHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -34,6 +35,7 @@ public static class GenreHelper } } + public static void KeepOnlySameGenreBetweenLists(ICollection existingGenres, ICollection removeAllExcept, Action action = null) { var existing = existingGenres.ToList(); @@ -61,4 +63,14 @@ public static class GenreHelper metadataGenres.Add(genre); } } + + public static void AddGenreIfNotExists(BlockingCollection metadataGenres, Genre genre) + { + var existingGenre = metadataGenres.FirstOrDefault(p => + p.NormalizedTitle == Parser.Parser.Normalize(genre.Title)); + if (existingGenre == null) + { + metadataGenres.Add(genre); + } + } } diff --git a/API/Helpers/ParserInfoHelpers.cs b/API/Helpers/ParserInfoHelpers.cs index a97601a43..920361800 100644 --- a/API/Helpers/ParserInfoHelpers.cs +++ b/API/Helpers/ParserInfoHelpers.cs @@ -16,7 +16,7 @@ public static class ParserInfoHelpers /// /// public static bool SeriesHasMatchingParserInfoFormat(Series series, - Dictionary> parsedSeries) + Dictionary> parsedSeries) { var format = MangaFormat.Unknown; foreach (var pSeries in parsedSeries.Keys) diff --git a/API/Helpers/PersonHelper.cs b/API/Helpers/PersonHelper.cs index 18dbe1f2e..5aa3624e1 100644 --- a/API/Helpers/PersonHelper.cs +++ b/API/Helpers/PersonHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -103,4 +104,19 @@ public static class PersonHelper metadataPeople.Add(person); } } + + /// + /// Adds the person to the list if it's not already in there + /// + /// + /// + public static void AddPersonIfNotExists(BlockingCollection metadataPeople, Person person) + { + var existingPerson = metadataPeople.SingleOrDefault(p => + p.NormalizedName == Parser.Parser.Normalize(person.Name) && p.Role == person.Role); + if (existingPerson == null) + { + metadataPeople.Add(person); + } + } } diff --git a/API/Helpers/TagHelper.cs b/API/Helpers/TagHelper.cs index 4c230a053..b4d689f66 100644 --- a/API/Helpers/TagHelper.cs +++ b/API/Helpers/TagHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -65,6 +66,16 @@ public static class TagHelper } } + public static void AddTagIfNotExists(BlockingCollection metadataTags, Tag tag) + { + var existingGenre = 
metadataTags.FirstOrDefault(p => + p.NormalizedTitle == Parser.Parser.Normalize(tag.Title)); + if (existingGenre == null) + { + metadataTags.Add(tag); + } + } + /// /// Remove tags on a list /// diff --git a/API/Parser/DefaultParser.cs b/API/Parser/DefaultParser.cs index 161a1533b..942210532 100644 --- a/API/Parser/DefaultParser.cs +++ b/API/Parser/DefaultParser.cs @@ -5,10 +5,16 @@ using API.Services; namespace API.Parser; +public interface IDefaultParser +{ + ParserInfo Parse(string filePath, string rootPath, LibraryType type = LibraryType.Manga); + void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret); +} + /// /// This is an implementation of the Parser that is the basis for everything /// -public class DefaultParser +public class DefaultParser : IDefaultParser { private readonly IDirectoryService _directoryService; diff --git a/API/Parser/Parser.cs b/API/Parser/Parser.cs index b79ad0889..8c0cfec51 100644 --- a/API/Parser/Parser.cs +++ b/API/Parser/Parser.cs @@ -15,12 +15,14 @@ namespace API.Parser public const string ImageFileExtensions = @"^(\.png|\.jpeg|\.jpg|\.webp|\.gif)"; public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|\.cb7|\.cbt"; - public const string BookFileExtensions = @"\.epub|\.pdf"; + private const string BookFileExtensions = @"\.epub|\.pdf"; public const string MacOsMetadataFileStartsWith = @"._"; public const string SupportedExtensions = ArchiveFileExtensions + "|" + ImageFileExtensions + "|" + BookFileExtensions; + public static readonly string[] SupportedGlobExtensions = new [] {@"**/*.png", @"**/*.cbz", @"**/*.pdf"}; + private const RegexOptions MatchOptions = RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant; diff --git a/API/Services/ArchiveService.cs b/API/Services/ArchiveService.cs index f9f5b7588..25ff8365b 100644 --- a/API/Services/ArchiveService.cs +++ b/API/Services/ArchiveService.cs @@ -140,9 +140,10 @@ namespace API.Services } /// - /// Returns first entry that is an image and is not in a blacklisted folder path. Uses for ordering files + /// Returns first entry that is an image and is not in a blacklisted folder path. Uses for ordering files /// /// + /// /// Entry name of match, null if no match public static string? 
FirstFileEntry(IEnumerable entryFullNames, string archiveName) { diff --git a/API/Services/CacheService.cs b/API/Services/CacheService.cs index e9bb693eb..0a2edd07b 100644 --- a/API/Services/CacheService.cs +++ b/API/Services/CacheService.cs @@ -100,11 +100,9 @@ namespace API.Services var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId); var extractPath = GetCachePath(chapterId); - if (!_directoryService.Exists(extractPath)) - { - var files = chapter.Files.ToList(); - ExtractChapterFiles(extractPath, files); - } + if (_directoryService.Exists(extractPath)) return chapter; + var files = chapter.Files.ToList(); + ExtractChapterFiles(extractPath, files); return chapter; } @@ -215,9 +213,8 @@ namespace API.Services { // Calculate what chapter the page belongs to var path = GetCachePath(chapter.Id); - var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions); - files = files - .AsEnumerable() + // TODO: We can optimize this by extracting and renaming, so we don't need to scan for the files and can do a direct access + var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions) .OrderByNatural(Path.GetFileNameWithoutExtension) .ToArray(); diff --git a/API/Services/DirectoryService.cs b/API/Services/DirectoryService.cs index d3976da67..5dde7f750 100644 --- a/API/Services/DirectoryService.cs +++ b/API/Services/DirectoryService.cs @@ -9,6 +9,7 @@ using System.Threading.Tasks; using API.DTOs.System; using API.Entities.Enums; using API.Extensions; +using Kavita.Common.Helpers; using Microsoft.Extensions.Logging; namespace API.Services @@ -57,6 +58,17 @@ namespace API.Services void RemoveNonImages(string directoryName); void Flatten(string directoryName); Task CheckWriteAccess(string directoryName); + + IEnumerable GetFilesWithCertainExtensions(string path, + string searchPatternExpression = "", + SearchOption searchOption = SearchOption.TopDirectoryOnly); + + IEnumerable GetDirectories(string folderPath); + string GetParentDirectoryName(string fileOrFolder); + #nullable enable + IList ScanFiles(string folderPath, GlobMatcher? matcher = null); + DateTime GetLastWriteTime(string folderPath); +#nullable disable } public class DirectoryService : IDirectoryService { @@ -105,7 +117,7 @@ namespace API.Services /// Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files. /// SearchOption to use, defaults to TopDirectoryOnly /// List of file paths - private IEnumerable GetFilesWithCertainExtensions(string path, + public IEnumerable GetFilesWithCertainExtensions(string path, string searchPatternExpression = "", SearchOption searchOption = SearchOption.TopDirectoryOnly) { @@ -507,10 +519,175 @@ namespace API.Services return dirs; } + /// + /// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope. + /// + /// + /// List of directory paths, empty if path doesn't exist + public IEnumerable GetDirectories(string folderPath) + { + if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray.Empty; + return FileSystem.Directory.GetDirectories(folderPath) + .Where(path => ExcludeDirectories.Matches(path).Count == 0); + } + + /// + /// Returns all directories, including subdirectories. Automatically excludes directories that shouldn't be in scope. 
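// One way to realize the TODO above (a sketch of the idea, not code from this PR):
// rename pages to zero-padded indexes at extraction time so a request for page N
// becomes a direct file access instead of a directory scan plus sort on every page
// request. OrderByNatural is the project's natural-sort extension; plain ordinal
// ordering stands in for it here.
public static void RenameToPageIndexes(string extractPath)
{
    var pages = Directory.GetFiles(extractPath)
        .OrderBy(Path.GetFileName, StringComparer.OrdinalIgnoreCase)
        .ToArray();
    for (var i = 0; i < pages.Length; i++)
    {
        // Assumes extracted entry names never collide with the padded targets
        var target = Path.Combine(extractPath, i.ToString("D4") + Path.GetExtension(pages[i]));
        if (!string.Equals(pages[i], target, StringComparison.OrdinalIgnoreCase))
            File.Move(pages[i], target);
    }
}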
+    /// </summary>
+    /// <param name="folderPath"></param>
+    /// <returns></returns>
+    public IEnumerable<string> GetAllDirectories(string folderPath)
+    {
+        if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray<string>.Empty;
+        var directories = new List<string>();
+
+        var foundDirs = GetDirectories(folderPath);
+        foreach (var foundDir in foundDirs)
+        {
+            directories.Add(foundDir);
+            directories.AddRange(GetAllDirectories(foundDir));
+        }
+
+        return directories;
+    }
+
+    /// <summary>
+    /// Returns the parent directory's name for a file or folder. Empty string if the path is not valid.
+    /// </summary>
+    /// <remarks>This does touch I/O with an Attribute lookup</remarks>
+    /// <param name="fileOrFolder"></param>
+    /// <returns></returns>
+    public string GetParentDirectoryName(string fileOrFolder)
+    {
+        // TODO: Write Unit tests
+        try
+        {
+            var attr = File.GetAttributes(fileOrFolder);
+            var isDirectory = attr.HasFlag(FileAttributes.Directory);
+            if (isDirectory)
+            {
+                return Parser.Parser.NormalizePath(FileSystem.DirectoryInfo
+                    .FromDirectoryName(fileOrFolder).Parent
+                    .FullName);
+            }
+
+            return Parser.Parser.NormalizePath(FileSystem.FileInfo
+                .FromFileName(fileOrFolder).Directory.Parent
+                .FullName);
+        }
+        catch (Exception)
+        {
+            return string.Empty;
+        }
+    }
+
+    /// <summary>
+    /// Scans a directory by utilizing a recursive folder search. If a .kavitaignore file is found, files and directories matching its patterns will be ignored
+    /// </summary>
+    /// <param name="folderPath"></param>
+    /// <param name="matcher"></param>
+    /// <returns></returns>
+    public IList<string> ScanFiles(string folderPath, GlobMatcher? matcher = null)
+    {
+        _logger.LogDebug("[ScanFiles] called on {Path}", folderPath);
+        var files = new List<string>();
+        if (!Exists(folderPath)) return files;
+
+        var potentialIgnoreFile = FileSystem.Path.Join(folderPath, ".kavitaignore");
+        if (matcher == null)
+        {
+            matcher = CreateMatcherFromFile(potentialIgnoreFile);
+        }
+        else
+        {
+            matcher.Merge(CreateMatcherFromFile(potentialIgnoreFile));
+        }
+
+        IEnumerable<string> directories;
+        if (matcher == null)
+        {
+            directories = GetDirectories(folderPath);
+        }
+        else
+        {
+            directories = GetDirectories(folderPath)
+                .Where(folder => matcher != null &&
+                                 !matcher.ExcludeMatches($"{FileSystem.DirectoryInfo.FromDirectoryName(folder).Name}{FileSystem.Path.AltDirectorySeparatorChar}"));
+        }
+
+        foreach (var directory in directories)
+        {
+            files.AddRange(ScanFiles(directory, matcher));
+        }
+
+        // Get the matcher from either ignore or global (default setup)
+        if (matcher == null)
+        {
+            files.AddRange(GetFilesWithCertainExtensions(folderPath, Parser.Parser.SupportedExtensions));
+        }
+        else
+        {
+            var foundFiles = GetFilesWithCertainExtensions(folderPath,
+                    Parser.Parser.SupportedExtensions)
+                .Where(file => !matcher.ExcludeMatches(FileSystem.FileInfo.FromFileName(file).Name));
+            files.AddRange(foundFiles);
+        }
+
+        return files;
+    }
+
+    /// <summary>
+    /// Recursively scans a folder and returns the max last write time on any folders
+    /// </summary>
+    /// <remarks>This is required vs just an attribute check as NTFS does not bubble up certain events from nested folders.
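// Example of the stacking behavior ScanFiles implements: each folder's .kavitaignore
// is merged into the matcher inherited from its parent folders, so a file must pass
// every ignore file above it. The folder layout and glob patterns below are
// illustrative only; the exact pattern syntax is whatever GlobMatcher.AddExclude
// accepts, and the (logger, IFileSystem) constructor is assumed.
//
//   D:/Manga/.kavitaignore              ->  *.zip
//   D:/Manga/Some Series/.kavitaignore  ->  cover-src/
//
var directoryService = new DirectoryService(logger, new FileSystem());
var files = directoryService.ScanFiles("D:/Manga");
// files now excludes *.zip everywhere and cover-src/ under "Some Series"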
+ /// This will also ignore recursive nature if the device is not NTFS + /// + /// Max Last Write Time + public DateTime GetLastWriteTime(string folderPath) + { + if (!FileSystem.Directory.Exists(folderPath)) throw new IOException($"{folderPath} does not exist"); + if (new DriveInfo(FileSystem.Path.GetPathRoot(folderPath)).DriveFormat != "NTFS") + { + return FileSystem.Directory.GetLastWriteTime(folderPath); + } + + var directories = GetAllDirectories(folderPath).ToList(); + if (directories.Count == 0) return FileSystem.Directory.GetLastWriteTime(folderPath); + + return directories.Max(d => FileSystem.Directory.GetLastWriteTime(d)); + } + + + private GlobMatcher CreateMatcherFromFile(string filePath) + { + if (!FileSystem.File.Exists(filePath)) + { + return null; + } + + // Read file in and add each line to Matcher + var lines = FileSystem.File.ReadAllLines(filePath); + if (lines.Length == 0) + { + return null; + } + + GlobMatcher matcher = new(); + foreach (var line in lines) + { + matcher.AddExclude(line); + } + + return matcher; + } + /// /// Recursively scans files and applies an action on them. This uses as many cores the underlying PC has to speed /// up processing. + /// NOTE: This is no longer parallel due to user's machines locking up /// /// Directory to scan /// Action to apply on file path @@ -538,18 +715,16 @@ namespace API.Services string[] files; try { - subDirs = FileSystem.Directory.GetDirectories(currentDir).Where(path => ExcludeDirectories.Matches(path).Count == 0); + subDirs = GetDirectories(currentDir); } // Thrown if we do not have discovery permission on the directory. catch (UnauthorizedAccessException e) { - Console.WriteLine(e.Message); - logger.LogError(e, "Unauthorized access on {Directory}", currentDir); + logger.LogCritical(e, "Unauthorized access on {Directory}", currentDir); continue; } // Thrown if another process has deleted the directory after we retrieved its name. catch (DirectoryNotFoundException e) { - Console.WriteLine(e.Message); - logger.LogError(e, "Directory not found on {Directory}", currentDir); + logger.LogCritical(e, "Directory not found on {Directory}", currentDir); continue; } @@ -558,15 +733,15 @@ namespace API.Services .ToArray(); } catch (UnauthorizedAccessException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "Unauthorized access on a file in {Directory}", currentDir); continue; } catch (DirectoryNotFoundException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "Directory not found on a file in {Directory}", currentDir); continue; } catch (IOException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "IO exception on a file in {Directory}", currentDir); continue; } @@ -577,19 +752,16 @@ namespace API.Services foreach (var file in files) { action(file); fileCount++; - } + } } catch (AggregateException ae) { ae.Handle((ex) => { - if (ex is UnauthorizedAccessException) { - // Here we just output a message and go on. - Console.WriteLine(ex.Message); - _logger.LogError(ex, "Unauthorized access on file"); - return true; - } - // Handle other exceptions here if necessary... + if (ex is not UnauthorizedAccessException) return false; + // Here we just output a message and go on. + _logger.LogError(ex, "Unauthorized access on file"); + return true; + // Handle other exceptions here if necessary... 
-                return false;
             });
         }
 
diff --git a/API/Services/HostedServices/StartupTasksHostedService.cs b/API/Services/HostedServices/StartupTasksHostedService.cs
index 099c44cc8..7be79f7f8 100644
--- a/API/Services/HostedServices/StartupTasksHostedService.cs
+++ b/API/Services/HostedServices/StartupTasksHostedService.cs
@@ -1,6 +1,7 @@
 using System;
 using System.Threading;
 using System.Threading.Tasks;
+using API.Services.Tasks.Scanner;
 using Microsoft.Extensions.DependencyInjection;
 using Microsoft.Extensions.Hosting;
@@ -23,6 +24,8 @@
         await taskScheduler.ScheduleTasks();
         taskScheduler.ScheduleUpdaterTasks();
 
+
         try
         {
             // These methods will automatically check if stat collection is disabled to prevent sending any data regardless
@@ -34,6 +37,9 @@
         {
             // If stats startup fails, the user can keep using the app
         }
+
+        var libraryWatcher = scope.ServiceProvider.GetRequiredService<ILibraryWatcher>();
+        //await libraryWatcher.StartWatchingLibraries(); // TODO: Enable this in the next PR
     }
 
     public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;
diff --git a/API/Services/MetadataService.cs b/API/Services/MetadataService.cs
index 3c0df0ec7..9a36eb639 100644
--- a/API/Services/MetadataService.cs
+++ b/API/Services/MetadataService.cs
@@ -37,6 +37,9 @@ public interface IMetadataService
     /// </summary>
     /// <remarks>Overrides any cache logic and forces execution</remarks>
     Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true);
+
+    Task GenerateCoversForSeries(Series series, bool forceUpdate = false);
+    Task RemoveAbandonedMetadataKeys();
 }
 
 public class MetadataService : IMetadataService
@@ -77,10 +80,8 @@
         _logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile.FilePath);
         chapter.CoverImage = _readingItemService.GetCoverImage(firstFile.FilePath,
             ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId), firstFile.Format);
-
-        // await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate,
-        //     MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter), false);
-        _updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter));
+        _unitOfWork.ChapterRepository.Update(chapter); // BUG: CoverImage isn't saving for Monster Musume with new scan loop
+        _updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter)); // TODO: IDEA: Instead of firing here where it's not yet saved, maybe collect the ids and fire after save
 
         return Task.FromResult(true);
     }
@@ -271,17 +272,18 @@
         await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
             MessageFactory.CoverUpdateProgressEvent(library.Id, 1F, ProgressEventType.Ended, $"Complete"));
 
-        await RemoveAbandonedMetadataKeys();
-
         _logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
     }
 
-    private async Task RemoveAbandonedMetadataKeys()
+    public async Task RemoveAbandonedMetadataKeys()
     {
         await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated();
         await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
         await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
+        await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
+        await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
+
     }
 
     /// <summary>
@@
-292,7 +294,6 @@ public class MetadataService : IMetadataService /// Overrides any cache logic and forces execution public async Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true) { - var sw = Stopwatch.StartNew(); var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId); if (series == null) { @@ -300,8 +301,19 @@ public class MetadataService : IMetadataService return; } + await GenerateCoversForSeries(series, forceUpdate); + } + + /// + /// Generate Cover for a Series. This is used by Scan Loop and should not be invoked directly via User Interaction. + /// + /// A full Series, with metadata, chapters, etc + /// + public async Task GenerateCoversForSeries(Series series, bool forceUpdate = false) + { + var sw = Stopwatch.StartNew(); await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, - MessageFactory.CoverUpdateProgressEvent(libraryId, 0F, ProgressEventType.Started, series.Name)); + MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 0F, ProgressEventType.Started, series.Name)); await ProcessSeriesCoverGen(series, forceUpdate); @@ -309,17 +321,14 @@ public class MetadataService : IMetadataService if (_unitOfWork.HasChanges()) { await _unitOfWork.CommitAsync(); + _logger.LogInformation("[MetadataService] Updated cover images for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, - MessageFactory.CoverUpdateProgressEvent(libraryId, 1F, ProgressEventType.Ended, series.Name)); - - await RemoveAbandonedMetadataKeys(); + MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 1F, ProgressEventType.Ended, series.Name)); await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series), false); await FlushEvents(); - - _logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } private async Task FlushEvents() diff --git a/API/Services/ReadingItemService.cs b/API/Services/ReadingItemService.cs index 3b2e0bf4c..b29cb0efa 100644 --- a/API/Services/ReadingItemService.cs +++ b/API/Services/ReadingItemService.cs @@ -12,6 +12,7 @@ public interface IReadingItemService string GetCoverImage(string filePath, string fileName, MangaFormat format); void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1); ParserInfo Parse(string path, string rootPath, LibraryType type); + ParserInfo ParseFile(string path, string rootPath, LibraryType type); } public class ReadingItemService : IReadingItemService @@ -20,7 +21,7 @@ public class ReadingItemService : IReadingItemService private readonly IBookService _bookService; private readonly IImageService _imageService; private readonly IDirectoryService _directoryService; - private readonly DefaultParser _defaultParser; + private readonly IDefaultParser _defaultParser; public ReadingItemService(IArchiveService archiveService, IBookService bookService, IImageService imageService, IDirectoryService directoryService) { @@ -52,6 +53,71 @@ public class ReadingItemService : IReadingItemService return null; } + /// + /// Processes files found during a library scan. 
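// How the two GenerateCoversForSeries entry points are meant to divide work (a sketch;
// the scan-loop call site itself is outside this excerpt): user-triggered refreshes
// arrive with ids and load the entity, while the scan loop already holds a fully
// loaded Series and can call the entity overload directly.
public static Task RefreshCoversFromScanLoop(IMetadataService metadataService, Series series)
{
    // series must already include Volumes -> Chapters -> Files, e.g. loaded via
    // SeriesRepository.GetFullSeriesByName earlier in this diff
    return metadataService.GenerateCoversForSeries(series, forceUpdate: false);
}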
+    /// </summary>
+    /// <param name="path">Path of a file</param>
+    /// <param name="rootPath"></param>
+    /// <param name="type">Library type to determine parsing to perform</param>
+    public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
+    {
+        var info = Parse(path, rootPath, type);
+        if (info == null)
+        {
+            return null;
+        }
+
+        // This catches when the original library type is Manga/Comic and an epub was parsed
+        // with a non-Book parser, which can leave volume information baked into the series name
+        if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume?
+        {
+            info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
+            var info2 = Parse(path, rootPath, type);
+            info.Merge(info2);
+        }
+
+        info.ComicInfo = GetComicInfo(path);
+        if (info.ComicInfo == null) return info;
+
+        if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
+        {
+            info.Volumes = info.ComicInfo.Volume;
+        }
+        if (!string.IsNullOrEmpty(info.ComicInfo.Series))
+        {
+            info.Series = info.ComicInfo.Series.Trim();
+        }
+        if (!string.IsNullOrEmpty(info.ComicInfo.Number))
+        {
+            info.Chapters = info.ComicInfo.Number;
+        }
+
+        // Patch in SeriesSort from ComicInfo
+        if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort))
+        {
+            info.SeriesSort = info.ComicInfo.TitleSort.Trim();
+        }
+
+        if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format))
+        {
+            info.IsSpecial = true;
+            info.Chapters = Parser.Parser.DefaultChapter;
+            info.Volumes = Parser.Parser.DefaultVolume;
+        }
+
+        if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort))
+        {
+            info.SeriesSort = info.ComicInfo.SeriesSort.Trim();
+        }
+
+        if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries))
+        {
+            info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim();
+        }
+
+        return info;
+    }
+
     /// <summary>
     ///
     /// </summary>
diff --git a/API/Services/SeriesService.cs b/API/Services/SeriesService.cs
index f869ea12a..c51aa887d 100644
--- a/API/Services/SeriesService.cs
+++ b/API/Services/SeriesService.cs
@@ -422,8 +422,17 @@
         }
 
         var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(seriesIds);
+        var libraryIds = series.Select(s => s.LibraryId);
+        var libraries = await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(libraryIds);
+        foreach (var library in libraries)
+        {
+            library.LastModified = DateTime.Now;
+            _unitOfWork.LibraryRepository.Update(library);
+        }
+
         _unitOfWork.SeriesRepository.Remove(series);
+
         if (!_unitOfWork.HasChanges() || !await _unitOfWork.CommitAsync()) return true;
 
         foreach (var s in series)
diff --git a/API/Services/TaskScheduler.cs b/API/Services/TaskScheduler.cs
index e9030b969..d419a0fa8 100644
--- a/API/Services/TaskScheduler.cs
+++ b/API/Services/TaskScheduler.cs
@@ -8,8 +8,8 @@
 using API.Entities.Enums;
 using API.Helpers.Converters;
 using API.Services.Tasks;
 using API.Services.Tasks.Metadata;
+using API.Services.Tasks.Scanner;
 using Hangfire;
-using Hangfire.Storage;
 using Microsoft.Extensions.Logging;
 
 namespace API.Services;
@@ -29,8 +29,6 @@
     void CancelStatsTasks();
     Task RunStatCollection();
     void ScanSiteThemes();
-
-
 }
 
 public class TaskScheduler : ITaskScheduler
@@ -48,6 +46,9 @@
     private readonly IWordCountAnalyzerService _wordCountAnalyzerService;
 
     public static BackgroundJobServer Client => new BackgroundJobServer();
+    public const string ScanQueue = "scan";
+    public const string DefaultQueue = "default";
+
     private static readonly Random Rnd = new Random();
@@ -83,7 +84,7 @@
     }
     else
     {
-        RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
+        RecurringJob.AddOrUpdate("scan-libraries", () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
     }
 
     setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value;
@@ -149,6 +150,7 @@
         BackgroundJob.Enqueue(() => _themeService.Scan());
     }
 
+
     #endregion
 
     #region UpdateTasks
@@ -161,13 +163,31 @@
     }
     #endregion
 
+    public void ScanLibraries()
+    {
+        if (RunningAnyTasksByMethod(new List<string>() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue))
+        {
+            _logger.LogInformation("A Scan is already running, rescheduling ScanLibraries in 3 hours");
+            BackgroundJob.Schedule(() => ScanLibraries(), TimeSpan.FromHours(3));
+            return;
+        }
+        _scannerService.ScanLibraries();
+    }
+
     public void ScanLibrary(int libraryId)
     {
-        if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}))
+        if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}, ScanQueue))
         {
             _logger.LogInformation("A duplicate request to scan library occurred. Skipping");
             return;
         }
+        if (RunningAnyTasksByMethod(new List<string>() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue))
+        {
+            _logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours");
+            BackgroundJob.Schedule(() => ScanLibrary(libraryId), TimeSpan.FromHours(3));
+            return;
+        }
+
         _logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
         BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId));
         // When we do a scan, force cache to re-unpack in case page numbers change
@@ -181,7 +201,7 @@
 
     public void RefreshMetadata(int libraryId, bool forceUpdate = true)
     {
-        if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate}))
+        if (HasAlreadyEnqueuedTask("MetadataService","GenerateCoversForLibrary", new object[] {libraryId, forceUpdate}))
         {
             _logger.LogInformation("A duplicate request to refresh metadata for library occurred. Skipping");
             return;
@@ -193,7 +213,7 @@
 
     public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false)
     {
-        if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate}))
+        if (HasAlreadyEnqueuedTask("MetadataService","GenerateCoversForSeries", new object[] {libraryId, seriesId, forceUpdate}))
         {
             _logger.LogInformation("A duplicate request to refresh metadata for library occurred. Skipping");
             return;
@@ -205,14 +225,20 @@
 
     public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
     {
-        if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
+        if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {seriesId, forceUpdate}, ScanQueue))
         {
             _logger.LogInformation("A duplicate request to scan series occurred.
Skipping"); return; } + if (RunningAnyTasksByMethod(new List() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue)) + { + _logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 mins"); + BackgroundJob.Schedule(() => ScanSeries(libraryId, seriesId, forceUpdate), TimeSpan.FromMinutes(10)); + return; + } _logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId); - BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None)); + BackgroundJob.Enqueue(() => _scannerService.ScanSeries(seriesId, forceUpdate)); } public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false) @@ -250,7 +276,7 @@ public class TaskScheduler : ITaskScheduler /// object[] of arguments in the order they are passed to enqueued job /// Queue to check against. Defaults to "default" /// - private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = "default") + public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue) { var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue); return enqueuedJobs.Any(j => j.Value.InEnqueuedState && @@ -258,4 +284,11 @@ public class TaskScheduler : ITaskScheduler j.Value.Job.Method.Name.Equals(methodName) && j.Value.Job.Method.DeclaringType.Name.Equals(className)); } + + public static bool RunningAnyTasksByMethod(IEnumerable classNames, string queue = DefaultQueue) + { + var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue); + return enqueuedJobs.Any(j => !j.Value.InEnqueuedState && + classNames.Contains(j.Value.Job.Method.DeclaringType?.Name)); + } } diff --git a/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs b/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs index 8c71b92d3..1bc20a359 100644 --- a/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs +++ b/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs @@ -142,7 +142,8 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService _logger.LogInformation("[WordCountAnalyzerService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } - private async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true) + + public async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true) { var isEpub = series.Format == MangaFormat.Epub; var existingWordCount = series.WordCount; @@ -208,6 +209,11 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService chapter.MinHoursToRead = est.MinHours; chapter.MaxHoursToRead = est.MaxHours; chapter.AvgHoursToRead = est.AvgHours; + foreach (var file in chapter.Files) + { + file.LastFileAnalysis = DateTime.Now; + _unitOfWork.MangaFileRepository.Update(file); + } _unitOfWork.ChapterRepository.Update(chapter); } diff --git a/API/Services/Tasks/Scanner/LibraryWatcher.cs b/API/Services/Tasks/Scanner/LibraryWatcher.cs new file mode 100644 index 000000000..f4c2224ea --- /dev/null +++ b/API/Services/Tasks/Scanner/LibraryWatcher.cs @@ -0,0 +1,212 @@ +using System; +using System.Collections.Concurrent; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Text.RegularExpressions; +using System.Threading.Tasks; +using API.Data; +using Hangfire; +using Microsoft.Extensions.Hosting; +using 
Microsoft.Extensions.Logging;
+
+namespace API.Services.Tasks.Scanner;
+
+public interface ILibraryWatcher
+{
+    Task StartWatchingLibraries();
+}
+
+internal class FolderScanQueueable
+{
+    public DateTime QueueTime { get; set; }
+    public string FolderPath { get; set; }
+}
+
+internal class FolderScanQueueableComparer : IEqualityComparer<FolderScanQueueable>
+{
+    public bool Equals(FolderScanQueueable x, FolderScanQueueable y)
+    {
+        if (ReferenceEquals(x, y)) return true;
+        if (ReferenceEquals(x, null)) return false;
+        if (ReferenceEquals(y, null)) return false;
+        if (x.GetType() != y.GetType()) return false;
+        return x.FolderPath == y.FolderPath;
+    }
+
+    public int GetHashCode(FolderScanQueueable obj)
+    {
+        return HashCode.Combine(obj.FolderPath);
+    }
+}
+
+/// <summary>
+/// Responsible for watching the file system and processing change events. This is mainly responsible for invoking
+/// Scanner to quickly pick up on changes.
+/// </summary>
+public class LibraryWatcher : ILibraryWatcher
+{
+    private readonly IDirectoryService _directoryService;
+    private readonly IUnitOfWork _unitOfWork;
+    private readonly ILogger<LibraryWatcher> _logger;
+    private readonly IScannerService _scannerService;
+
+    private readonly IList<FileSystemWatcher> _watchers = new List<FileSystemWatcher>();
+
+    private readonly Dictionary<string, IList<FileSystemWatcher>> _watcherDictionary = new ();
+
+    private IList<string> _libraryFolders = new List<string>();
+
+    // TODO: This needs to be blocking so we can consume from another thread
+    private readonly Queue<FolderScanQueueable> _scanQueue = new Queue<FolderScanQueueable>();
+    //public readonly BlockingCollection<FolderScanQueueable> ScanQueue = new BlockingCollection<FolderScanQueueable>();
+    private readonly TimeSpan _queueWaitTime;
+
+
+
+    public LibraryWatcher(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger<LibraryWatcher> logger, IScannerService scannerService, IHostEnvironment environment)
+    {
+        _directoryService = directoryService;
+        _unitOfWork = unitOfWork;
+        _logger = logger;
+        _scannerService = scannerService;
+
+        _queueWaitTime = environment.IsDevelopment() ? TimeSpan.FromSeconds(10) : TimeSpan.FromMinutes(5);
+
+    }
+
+    public async Task StartWatchingLibraries()
+    {
+        _logger.LogInformation("Starting file watchers");
+        _libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()).SelectMany(l => l.Folders).ToList();
+
+        foreach (var library in await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
+        {
+            foreach (var libraryFolder in library.Folders)
+            {
+                _logger.LogInformation("Watching {FolderPath}", libraryFolder);
+                var watcher = new FileSystemWatcher(libraryFolder);
+                watcher.NotifyFilter = NotifyFilters.CreationTime
+                                       | NotifyFilters.DirectoryName
+                                       | NotifyFilters.FileName
+                                       | NotifyFilters.LastWrite
+                                       | NotifyFilters.Size;
+
+                watcher.Changed += OnChanged;
+                watcher.Created += OnCreated;
+                watcher.Deleted += OnDeleted;
+                watcher.Renamed += OnRenamed;
+
+                watcher.Filter = "*.*"; // TODO: Configure with Parser files
+                watcher.IncludeSubdirectories = true;
+                watcher.EnableRaisingEvents = true;
+                _logger.LogInformation("Watching {Folder}", libraryFolder);
+                _watchers.Add(watcher);
+                if (!_watcherDictionary.ContainsKey(libraryFolder))
+                {
+                    _watcherDictionary.Add(libraryFolder, new List<FileSystemWatcher>());
+                }
+
+                _watcherDictionary[libraryFolder].Add(watcher);
+            }
+        }
+    }
+
+    private void OnChanged(object sender, FileSystemEventArgs e)
+    {
+        if (e.ChangeType != WatcherChangeTypes.Changed) return;
+        Console.WriteLine($"Changed: {e.FullPath}, {e.Name}");
+        ProcessChange(e.FullPath);
+    }
+
+    private void OnCreated(object sender, FileSystemEventArgs e)
+    {
+        Console.WriteLine($"Created: {e.FullPath}, {e.Name}");
+        ProcessChange(e.FullPath);
+    }
+
+    private void OnDeleted(object sender, FileSystemEventArgs e) {
+        Console.WriteLine($"Deleted: {e.FullPath}, {e.Name}");
+        ProcessChange(e.FullPath);
+    }
+
+
+
+    private void OnRenamed(object sender, RenamedEventArgs e)
+    {
+        Console.WriteLine($"Renamed:");
+        Console.WriteLine($"    Old: {e.OldFullPath}");
+        Console.WriteLine($"    New: {e.FullPath}");
+        ProcessChange(e.FullPath);
+    }
+
+    private void ProcessChange(string filePath)
+    {
+        if (!new Regex(Parser.Parser.SupportedExtensions).IsMatch(new FileInfo(filePath).Extension)) return;
+        // Don't do anything if a Library Scan or ScanSeries is in progress
+        if (TaskScheduler.RunningAnyTasksByMethod(new[] {"MetadataService", "ScannerService"}))
+        {
+            _logger.LogDebug("Suppressing Change due to scan being in progress");
+            return;
+        }
+
+
+        var parentDirectory = _directoryService.GetParentDirectoryName(filePath);
+        if (string.IsNullOrEmpty(parentDirectory)) return;
+
+        // We need to find the library this creation belongs to
+        // Multiple libraries can point to the same base folder. In this case, we need to use FirstOrDefault
+        var libraryFolder = _libraryFolders.Select(Parser.Parser.NormalizePath).FirstOrDefault(f => f.Contains(parentDirectory));
+
+        if (string.IsNullOrEmpty(libraryFolder)) return;
+
+        var rootFolder = _directoryService.GetFoldersTillRoot(libraryFolder, filePath).ToList();
+        if (!rootFolder.Any()) return;
+
+        // Select the first folder and join with library folder, this should give us the folder to scan.
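+        // For example (hypothetical paths, assuming GetFoldersTillRoot yields the top-level
+        // folder first): with libraryFolder "C:/Library" and a change event at
+        // "C:/Library/Accel World/Accel World v01.cbz", rootFolder.First() is "Accel World",
+        // so the Join below yields "C:/Library/Accel World" - the series folder that gets queued.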
+ var fullPath = _directoryService.FileSystem.Path.Join(libraryFolder, rootFolder.First()); + var queueItem = new FolderScanQueueable() + { + FolderPath = fullPath, + QueueTime = DateTime.Now + }; + if (_scanQueue.Contains(queueItem, new FolderScanQueueableComparer())) + { + ProcessQueue(); + return; + } + + _scanQueue.Enqueue(queueItem); + + ProcessQueue(); + } + + /// + /// Instead of making things complicated with a separate thread, this service will process the queue whenever a change occurs + /// + private void ProcessQueue() + { + var i = 0; + while (i < _scanQueue.Count) + { + var item = _scanQueue.Peek(); + if (item.QueueTime < DateTime.Now.Subtract(_queueWaitTime)) + { + _logger.LogDebug("Scheduling ScanSeriesFolder for {Folder}", item.FolderPath); + BackgroundJob.Enqueue(() => _scannerService.ScanFolder(item.FolderPath)); + _scanQueue.Dequeue(); + i++; + } + else + { + break; + } + } + + if (_scanQueue.Count > 0) + { + Task.Delay(TimeSpan.FromSeconds(10)).ContinueWith(t=> ProcessQueue()); + } + + } +} diff --git a/API/Services/Tasks/Scanner/ParseScannedFiles.cs b/API/Services/Tasks/Scanner/ParseScannedFiles.cs index 785f9ad46..5b46f212c 100644 --- a/API/Services/Tasks/Scanner/ParseScannedFiles.cs +++ b/API/Services/Tasks/Scanner/ParseScannedFiles.cs @@ -1,37 +1,55 @@ using System; using System.Collections.Concurrent; using System.Collections.Generic; -using System.Diagnostics; -using System.IO; using System.Linq; using System.Threading.Tasks; -using API.Data.Metadata; using API.Entities; using API.Entities.Enums; +using API.Extensions; using API.Helpers; using API.Parser; using API.SignalR; -using Microsoft.AspNetCore.SignalR; using Microsoft.Extensions.Logging; namespace API.Services.Tasks.Scanner { public class ParsedSeries { + /// + /// Name of the Series + /// public string Name { get; init; } + /// + /// Normalized Name of the Series + /// public string NormalizedName { get; init; } + /// + /// Format of the Series + /// public MangaFormat Format { get; init; } } + public enum Modified + { + Modified = 1, + NotModified = 2 + } + + public class SeriesModified + { + public string FolderPath { get; set; } + public string SeriesName { get; set; } + public DateTime LastScanned { get; set; } + public MangaFormat Format { get; set; } + } + public class ParseScannedFiles { - private readonly ConcurrentDictionary> _scannedSeries; private readonly ILogger _logger; private readonly IDirectoryService _directoryService; private readonly IReadingItemService _readingItemService; private readonly IEventHub _eventHub; - private readonly DefaultParser _defaultParser; /// /// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos. @@ -47,8 +65,6 @@ namespace API.Services.Tasks.Scanner _logger = logger; _directoryService = directoryService; _readingItemService = readingItemService; - _scannedSeries = new ConcurrentDictionary>(); - _defaultParser = new DefaultParser(_directoryService); _eventHub = eventHub; } @@ -58,7 +74,7 @@ namespace API.Services.Tasks.Scanner /// /// /// - public static IList GetInfosByName(Dictionary> parsedSeries, Series series) + public static IList GetInfosByName(Dictionary> parsedSeries, Series series) { var allKeys = parsedSeries.Keys.Where(ps => SeriesHelper.FindSeries(series, ps)); @@ -72,83 +88,46 @@ namespace API.Services.Tasks.Scanner return infos; } + /// - /// Processes files found during a library scan. - /// Populates a collection of for DB updates later. + /// This will Scan all files in a folder path. 
For each folder within the folderPath, FolderAction will be invoked for all files contained /// - /// Path of a file - /// - /// Library type to determine parsing to perform - private void ProcessFile(string path, string rootPath, LibraryType type) + /// Scan directory by directory and for each, call folderAction + /// A library folder or series folder + /// A callback async Task to be called once all files for each folder path are found + /// If we should bypass any folder last write time checks on the scan and force I/O + public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory, + IDictionary> seriesPaths, Func, string,Task> folderAction, bool forceCheck = false) { - var info = _readingItemService.Parse(path, rootPath, type); - if (info == null) + string normalizedPath; + if (scanDirectoryByDirectory) { - // If the file is an image and literally a cover image, skip processing. - if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path))) + var directories = _directoryService.GetDirectories(folderPath).ToList(); + + foreach (var directory in directories) { - _logger.LogWarning("[Scanner] Could not parse series from {Path}", path); + normalizedPath = Parser.Parser.NormalizePath(directory); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck)) + { + await folderAction(new List(), directory); + } + else + { + // For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication + await folderAction(_directoryService.ScanFiles(directory), directory); + } } + return; } - - // This catches when original library type is Manga/Comic and when parsing with non - if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume? + normalizedPath = Parser.Parser.NormalizePath(folderPath); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck)) { - info = _defaultParser.Parse(path, rootPath, LibraryType.Book); - var info2 = _readingItemService.Parse(path, rootPath, type); - info.Merge(info2); - } - - info.ComicInfo = _readingItemService.GetComicInfo(path); - if (info.ComicInfo != null) - { - if (!string.IsNullOrEmpty(info.ComicInfo.Volume)) - { - info.Volumes = info.ComicInfo.Volume; - } - if (!string.IsNullOrEmpty(info.ComicInfo.Series)) - { - info.Series = info.ComicInfo.Series.Trim(); - } - if (!string.IsNullOrEmpty(info.ComicInfo.Number)) - { - info.Chapters = info.ComicInfo.Number; - } - - // Patch is SeriesSort from ComicInfo - if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort)) - { - info.SeriesSort = info.ComicInfo.TitleSort.Trim(); - } - - if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format)) - { - info.IsSpecial = true; - info.Chapters = Parser.Parser.DefaultChapter; - info.Volumes = Parser.Parser.DefaultVolume; - } - - if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort)) - { - info.SeriesSort = info.ComicInfo.SeriesSort.Trim(); - } - - if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries)) - { - info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim(); - } - } - - try - { - TrackSeries(info); - } - catch (Exception ex) - { - _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. 
Skipping this file", info.FullFilePath); + await folderAction(new List(), folderPath); + return; } + await folderAction(_directoryService.ScanFiles(folderPath), folderPath); } @@ -156,13 +135,14 @@ namespace API.Services.Tasks.Scanner /// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing. /// This will check if the name matches an existing series name (multiple fields) /// + /// A localized list of a series' parsed infos /// - private void TrackSeries(ParserInfo info) + private void TrackSeries(ConcurrentDictionary> scannedSeries, ParserInfo info) { if (info.Series == string.Empty) return; // Check if normalized info.Series already exists and if so, update info to use that name instead - info.Series = MergeName(info); + info.Series = MergeName(scannedSeries, info); var normalizedSeries = Parser.Parser.Normalize(info.Series); var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort); @@ -170,7 +150,7 @@ namespace API.Services.Tasks.Scanner try { - var existingKey = _scannedSeries.Keys.SingleOrDefault(ps => + var existingKey = scannedSeries.Keys.SingleOrDefault(ps => ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries) || ps.NormalizedName.Equals(normalizedLocalizedSeries) || ps.NormalizedName.Equals(normalizedSortSeries))); @@ -181,7 +161,7 @@ namespace API.Services.Tasks.Scanner NormalizedName = normalizedSeries }; - _scannedSeries.AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => + scannedSeries.AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => { oldValue ??= new List(); if (!oldValue.Contains(info)) @@ -195,7 +175,7 @@ namespace API.Services.Tasks.Scanner catch (Exception ex) { _logger.LogCritical(ex, "{SeriesName} matches against multiple series in the parsed series. This indicates a critical kavita issue. Key will be skipped", info.Series); - foreach (var seriesKey in _scannedSeries.Keys.Where(ps => + foreach (var seriesKey in scannedSeries.Keys.Where(ps => ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries) || ps.NormalizedName.Equals(normalizedLocalizedSeries) || ps.NormalizedName.Equals(normalizedSortSeries)))) @@ -205,23 +185,24 @@ namespace API.Services.Tasks.Scanner } } + /// /// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with /// same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization. 
/// /// /// Series Name to group this info into - public string MergeName(ParserInfo info) + public string MergeName(ConcurrentDictionary> scannedSeries, ParserInfo info) { var normalizedSeries = Parser.Parser.Normalize(info.Series); var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries); - // We use FirstOrDefault because this was introduced late in development and users might have 2 series with both names + try { var existingName = - _scannedSeries.SingleOrDefault(p => - (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries || - Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) && + scannedSeries.SingleOrDefault(p => + (Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) || + Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) && p.Key.Format == info.Format) .Key; @@ -233,7 +214,7 @@ namespace API.Services.Tasks.Scanner catch (Exception ex) { _logger.LogCritical(ex, "Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath); - var values = _scannedSeries.Where(p => + var values = scannedSeries.Where(p => (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries || Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) && p.Key.Format == info.Format); @@ -247,34 +228,69 @@ namespace API.Services.Tasks.Scanner return info.Series; } + /// - /// + /// This is a new version which will process series by folder groups. /// - /// Type of library. Used for selecting the correct file extensions to search for and parsing files - /// The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders - /// Name of the Library + /// + /// + /// /// - public async Task>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable folders, string libraryName) + public async Task ScanLibrariesForSeries(LibraryType libraryType, + IEnumerable folders, string libraryName, bool isLibraryScan, + IDictionary> seriesPaths, Action>> processSeriesInfos, bool forceCheck = false) { - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Started)); + + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("Starting file scan", libraryName, ProgressEventType.Started)); + foreach (var folderPath in folders) { try { - async void Action(string f) + await ProcessFiles(folderPath, isLibraryScan, seriesPaths, async (files, folder) => { - try + var normalizedFolder = Parser.Parser.NormalizePath(folder); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck)) { - ProcessFile(f, folderPath, libraryType); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(f, libraryName, ProgressEventType.Updated)); + var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo() + { + Series = fp.SeriesName, + Format = fp.Format, + }).ToList(); + processSeriesInfos.Invoke(new Tuple>(true, parsedInfos)); + _logger.LogDebug("Skipped File Scan for {Folder} as it hasn't changed since last scan", folder); + return; } - catch (FileNotFoundException exception) - { - _logger.LogError(exception, "The file {Filename} could not be found", f); - } - } + _logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder); + await 
_eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated)); + var scannedSeries = new ConcurrentDictionary>(); + var infos = files.Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)).Where(info => info != null).ToList(); - _directoryService.TraverseTreeParallelForEach(folderPath, Action, Parser.Parser.SupportedExtensions, _logger); + + MergeLocalizedSeriesWithSeries(infos); + + foreach (var info in infos) + { + try + { + TrackSeries(scannedSeries, info); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath); + } + } + + // It would be really cool if we can emit an event when a folder hasn't been changed so we don't parse everything, but the first item to ensure we don't delete it + // Otherwise, we can do a last step in the DB where we validate all files on disk exist and if not, delete them. (easy but slow) + foreach (var series in scannedSeries.Keys) + { + if (scannedSeries[series].Count > 0 && processSeriesInfos != null) + { + processSeriesInfos.Invoke(new Tuple>(false, scannedSeries[series])); + } + } + }, forceCheck); } catch (ArgumentException ex) { @@ -282,20 +298,47 @@ namespace API.Services.Tasks.Scanner } } - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Ended)); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, libraryName, ProgressEventType.Ended)); + } - return SeriesWithInfos(); + private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary> seriesPaths, string normalizedFolder, bool forceCheck = false) + { + if (forceCheck) return false; + + return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerMinute) >= + _directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerMinute)); } /// - /// Returns any series where there were parsed infos + /// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so, + /// rewrites the infos with series name instead of the localized name, so they stack. 
/// - /// - private Dictionary> SeriesWithInfos() + /// + /// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration" + /// World of Acceleration v02.cbz has Series "World of Acceleration" + /// After running this code, we'd have: + /// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration" + /// + /// A collection of ParserInfos + private static void MergeLocalizedSeriesWithSeries(IReadOnlyCollection infos) { - var filtered = _scannedSeries.Where(kvp => kvp.Value.Count > 0); - var series = filtered.ToDictionary(v => v.Key, v => v.Value); - return series; + var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries)); + if (!hasLocalizedSeries) return; + + var localizedSeries = infos.Select(i => i.LocalizedSeries).Distinct() + .FirstOrDefault(i => !string.IsNullOrEmpty(i)); + if (string.IsNullOrEmpty(localizedSeries)) return; + + var nonLocalizedSeries = infos.Select(i => i.Series).Distinct() + .FirstOrDefault(series => !series.Equals(localizedSeries)); + + var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries); + foreach (var infoNeedingMapping in infos.Where(i => + !Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries))) + { + infoNeedingMapping.Series = nonLocalizedSeries; + infoNeedingMapping.LocalizedSeries = localizedSeries; + } } } } diff --git a/API/Services/Tasks/Scanner/ProcessSeries.cs b/API/Services/Tasks/Scanner/ProcessSeries.cs new file mode 100644 index 000000000..5cb5e357f --- /dev/null +++ b/API/Services/Tasks/Scanner/ProcessSeries.cs @@ -0,0 +1,776 @@ +using System; +using System.Collections.Generic; +using System.Collections.Immutable; +using System.Diagnostics; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Data.Metadata; +using API.Entities; +using API.Entities.Enums; +using API.Extensions; +using API.Helpers; +using API.Parser; +using API.Services.Tasks.Metadata; +using API.SignalR; +using Hangfire; +using Microsoft.Extensions.Logging; + +namespace API.Services.Tasks.Scanner; + +public interface IProcessSeries +{ + /// + /// Do not allow this Prime to be invoked by multiple threads. It will break the DB. 
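+    /// Expected call order (a sketch of how ScannerService below uses this): await Prime() once before a
+    /// scan to load the Genre/People/Tag caches, then call ProcessSeriesAsync per parsed series group.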
+ /// + /// + Task Prime(); + Task ProcessSeriesAsync(IList parsedInfos, Library library); + void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false); +} + +/// +/// All code needed to Update a Series from a Scan action +/// +public class ProcessSeries : IProcessSeries +{ + private readonly IUnitOfWork _unitOfWork; + private readonly ILogger _logger; + private readonly IEventHub _eventHub; + private readonly IDirectoryService _directoryService; + private readonly ICacheHelper _cacheHelper; + private readonly IReadingItemService _readingItemService; + private readonly IFileService _fileService; + private readonly IMetadataService _metadataService; + private readonly IWordCountAnalyzerService _wordCountAnalyzerService; + + private IList _genres; + private IList _people; + private IList _tags; + + + + public ProcessSeries(IUnitOfWork unitOfWork, ILogger logger, IEventHub eventHub, + IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService, + IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService) + { + _unitOfWork = unitOfWork; + _logger = logger; + _eventHub = eventHub; + _directoryService = directoryService; + _cacheHelper = cacheHelper; + _readingItemService = readingItemService; + _fileService = fileService; + _metadataService = metadataService; + _wordCountAnalyzerService = wordCountAnalyzerService; + } + + /// + /// Invoke this before processing any series, just once to prime all the needed data during a scan + /// + public async Task Prime() + { + _genres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); + _people = await _unitOfWork.PersonRepository.GetAllPeople(); + _tags = await _unitOfWork.TagRepository.GetAllTagsAsync(); + } + + public async Task ProcessSeriesAsync(IList parsedInfos, Library library) + { + if (!parsedInfos.Any()) return; + + var scanWatch = Stopwatch.StartNew(); + var seriesName = parsedInfos.First().Series; + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Updated, seriesName)); + _logger.LogInformation("[ScannerService] Beginning series update on {SeriesName}", seriesName); + + // Check if there is a Series + var seriesAdded = false; + var series = await _unitOfWork.SeriesRepository.GetFullSeriesByName(parsedInfos.First().Series, library.Id); + if (series == null) + { + seriesAdded = true; + series = DbFactory.Series(parsedInfos.First().Series); + } + if (series.LibraryId == 0) series.LibraryId = library.Id; + + try + { + _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName); + + UpdateVolumes(series, parsedInfos); + series.Pages = series.Volumes.Sum(v => v.Pages); + + series.NormalizedName = Parser.Parser.Normalize(series.Name); + series.OriginalName ??= parsedInfos[0].Series; + if (series.Format == MangaFormat.Unknown) + { + series.Format = parsedInfos[0].Format; + } + + if (string.IsNullOrEmpty(series.SortName)) + { + series.SortName = series.Name; + } + if (!series.SortNameLocked) + { + series.SortName = series.Name; + if (!string.IsNullOrEmpty(parsedInfos[0].SeriesSort)) + { + series.SortName = parsedInfos[0].SeriesSort; + } + } + + // parsedInfos[0] is not the first volume or chapter. 
We need to find it
+            var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p));
+            if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries))
+            {
+                series.LocalizedName = localizedSeries;
+            }
+
+            // Update series FolderPath here (TODO: Move this into its own private method)
+            var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path), parsedInfos.Select(f => f.FullFilePath).ToList());
+            if (seriesDirs.Keys.Count == 0)
+            {
+                _logger.LogCritical("Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are in a folder");
+            }
+            else
+            {
+                // Don't save FolderPath if it's a library Folder
+                if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First()))
+                {
+                    series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First());
+                }
+            }
+
+            series.Metadata ??= DbFactory.SeriesMetadata(new List<CollectionTag>());
+            UpdateSeriesMetadata(series, library.Type);
+
+            series.LastFolderScanned = DateTime.Now;
+            _unitOfWork.SeriesRepository.Attach(series);
+
+            try
+            {
+                await _unitOfWork.CommitAsync();
+            }
+            catch (Exception ex)
+            {
+                await _unitOfWork.RollbackAsync();
+                _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB for series {@SeriesName}", series);
+
+                await _eventHub.SendMessageAsync(MessageFactory.Error,
+                    MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}",
+                        string.Empty));
+            }
+        }
+        catch (Exception ex)
+        {
+            _logger.LogError(ex, "[ScannerService] There was an exception updating series for {SeriesName}", series.Name);
+        }
+
+        if (seriesAdded)
+        {
+            await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded,
+                MessageFactory.SeriesAddedEvent(series.Id, series.Name, series.LibraryId));
+        }
+
+        _logger.LogInformation("[ScannerService] Finished series update on {SeriesName} in {Milliseconds} ms", seriesName, scanWatch.ElapsedMilliseconds);
+        EnqueuePostSeriesProcessTasks(series.LibraryId, series.Id, false);
+    }
+
+    public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false)
+    {
+        BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate));
+        BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
+    }
+
+    private static void UpdateSeriesMetadata(Series series, LibraryType libraryType)
+    {
+        var isBook = libraryType == LibraryType.Book;
+        var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook);
+
+        var firstFile = firstChapter?.Files.FirstOrDefault();
+        if (firstFile == null) return;
+        if (Parser.Parser.IsPdf(firstFile.FilePath)) return;
+
+        var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList();
+
+        // Update Metadata based on Chapter metadata
+        series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year);
+
+        if (series.Metadata.ReleaseYear < 1000)
+        {
+            // Not a valid year, default to 0
+            series.Metadata.ReleaseYear = 0;
+        }
+
+        // Set the AgeRating as highest in all the comicInfos
+        if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating);
+
+        series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount);
+        series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count);
+        // To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well.
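+        // For example (hypothetical values): ComicInfo reports TotalCount = 10 but the chapters
+        // only yield MaxCount = 3; if a parsed filename contains "v10", then maxVolume equals
+        // TotalCount and MaxCount is bumped to 10, which lets the publication-status check below
+        // flip the series to Completed.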
+ if (series.Metadata.MaxCount != series.Metadata.TotalCount) + { + var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name)); + var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range)); + if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume; + else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter; + } + + + if (!series.Metadata.PublicationStatusLocked) + { + series.Metadata.PublicationStatus = PublicationStatus.OnGoing; + if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0) + { + series.Metadata.PublicationStatus = PublicationStatus.Completed; + } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0) + { + series.Metadata.PublicationStatus = PublicationStatus.Ended; + } + } + + if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked) + { + series.Metadata.Summary = firstChapter.Summary; + } + + if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked) + { + series.Metadata.Language = firstChapter.Language; + } + + // Handle People + foreach (var chapter in chapters) + { + if (!series.Metadata.WriterLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Writer)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.CoverArtistLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.CoverArtist)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.PublisherLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Publisher)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.CharacterLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Character)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.ColoristLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Colorist)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.EditorLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Editor)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.InkerLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Inker)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.LettererLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Letterer)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.PencillerLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Penciller)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.TranslatorLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Translator)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.TagsLocked) + { + foreach (var tag in chapter.Tags) + { + TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag); + } + } + + if (!series.Metadata.GenresLocked) + { + foreach (var genre in chapter.Genres) + { + 
GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre); + } + } + } + + // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it + // I might be able to filter out people that are in locked fields? + var people = chapters.SelectMany(c => c.People).ToList(); + PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People, + people, person => + { + switch (person.Role) + { + case PersonRole.Writer: + if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Penciller: + if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Inker: + if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Colorist: + if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Letterer: + if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.CoverArtist: + if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Editor: + if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Publisher: + if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Character: + if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Translator: + if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person); + break; + default: + series.Metadata.People.Remove(person); + break; + } + }); + } + + private void UpdateVolumes(Series series, IList parsedInfos) + { + var startingVolumeCount = series.Volumes.Count; + // Add new volumes and update chapters per volume + var distinctVolumes = parsedInfos.DistinctVolumes(); + _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name); + foreach (var volumeNumber in distinctVolumes) + { + var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber); + if (volume == null) + { + volume = DbFactory.Volume(volumeNumber); + volume.SeriesId = series.Id; + series.Volumes.Add(volume); + _unitOfWork.VolumeRepository.Add(volume); + } + + volume.Name = volumeNumber; + + _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name); + var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray(); + UpdateChapters(series, volume, infos); + volume.Pages = volume.Chapters.Sum(c => c.Pages); + + // Update all the metadata on the Chapters + foreach (var chapter in volume.Chapters) + { + var firstFile = chapter.Files.MinBy(x => x.Chapter); + if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue; + try + { + var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath)); + UpdateChapterFromComicInfo(chapter, firstChapterInfo?.ComicInfo); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was some issue when updating chapter's metadata"); + } + } + } + + // Remove existing volumes that aren't in parsedInfos + var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList(); + if (series.Volumes.Count != nonDeletedVolumes.Count) + { + _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not 
mapping with volume name", + (series.Volumes.Count - nonDeletedVolumes.Count), series.Name); + var deletedVolumes = series.Volumes.Except(nonDeletedVolumes); + foreach (var volume in deletedVolumes) + { + var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? ""; + if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file)) + { + _logger.LogError( + "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}", + file); + } + + _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file); + } + + series.Volumes = nonDeletedVolumes; + } + + _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}", + series.Name, startingVolumeCount, series.Volumes.Count); + } + + private void UpdateChapters(Series series, Volume volume, IList parsedInfos) + { + // Add new chapters + foreach (var info in parsedInfos) + { + // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0 + // also are treated like specials for UI grouping. + Chapter chapter; + try + { + chapter = volume.Chapters.GetChapterByRange(info); + } + catch (Exception ex) + { + _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters); + continue; + } + + if (chapter == null) + { + _logger.LogDebug( + "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters); + chapter = DbFactory.Chapter(info); + volume.Chapters.Add(chapter); + series.LastChapterAdded = DateTime.Now; + } + else + { + chapter.UpdateFrom(info); + } + + if (chapter == null) continue; + // Add files + var specialTreatment = info.IsSpecialInfo(); + AddOrUpdateFileForChapter(chapter, info); + chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty; + chapter.Range = specialTreatment ? 
info.Filename : info.Chapters; + } + + + // Remove chapters that aren't in parsedInfos or have no files linked + var existingChapters = volume.Chapters.ToList(); + foreach (var existingChapter in existingChapters) + { + if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter)) + { + _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series); + volume.Chapters.Remove(existingChapter); + } + else + { + // Ensure we remove any files that no longer exist AND order + existingChapter.Files = existingChapter.Files + .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath)) + .OrderByNatural(f => f.FilePath).ToList(); + existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages); + } + } + } + + private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info) + { + chapter.Files ??= new List(); + var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath); + if (existingFile != null) + { + existingFile.Format = info.Format; + if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return; + existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format); + // We skip updating DB here with last modified time so that metadata refresh can do it + } + else + { + var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format)); + if (file == null) return; + + chapter.Files.Add(file); + } + } + + #nullable enable + private void UpdateChapterFromComicInfo(Chapter chapter, ComicInfo? info) + { + var firstFile = chapter.Files.MinBy(x => x.Chapter); + if (firstFile == null || + _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return; + + var comicInfo = info; + if (info == null) + { + comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath); + } + + if (comicInfo == null) return; + _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath); + + chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating); + + if (!string.IsNullOrEmpty(comicInfo.Title)) + { + chapter.TitleName = comicInfo.Title.Trim(); + } + + if (!string.IsNullOrEmpty(comicInfo.Summary)) + { + chapter.Summary = comicInfo.Summary; + } + + if (!string.IsNullOrEmpty(comicInfo.LanguageISO)) + { + chapter.Language = comicInfo.LanguageISO; + } + + if (comicInfo.Count > 0) + { + chapter.TotalCount = comicInfo.Count; + } + + // This needs to check against both Number and Volume to calculate Count + if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0) + { + chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number)); + } + if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0) + { + chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume))); + } + + void AddPerson(Person person) + { + PersonHelper.AddPersonIfNotExists(chapter.People, person); + } + + void AddGenre(Genre genre) + { + //chapter.Genres.Add(genre); + GenreHelper.AddGenreIfNotExists(chapter.Genres, genre); + } + + void AddTag(Tag tag, bool added) + { + //chapter.Tags.Add(tag); + TagHelper.AddTagIfNotExists(chapter.Tags, tag); + } + + + if (comicInfo.Year > 0) + { + var day = Math.Max(comicInfo.Day, 1); + var month = Math.Max(comicInfo.Month, 1); + chapter.ReleaseDate = 
DateTime.Parse($"{month}/{day}/{comicInfo.Year}"); + } + + var people = GetTagValues(comicInfo.Colorist); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist); + UpdatePeople(people, PersonRole.Colorist, + AddPerson); + + people = GetTagValues(comicInfo.Characters); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character); + UpdatePeople(people, PersonRole.Character, + AddPerson); + + + people = GetTagValues(comicInfo.Translator); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator); + UpdatePeople(people, PersonRole.Translator, + AddPerson); + + + people = GetTagValues(comicInfo.Writer); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer); + UpdatePeople(people, PersonRole.Writer, + AddPerson); + + people = GetTagValues(comicInfo.Editor); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor); + UpdatePeople(people, PersonRole.Editor, + AddPerson); + + people = GetTagValues(comicInfo.Inker); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker); + UpdatePeople(people, PersonRole.Inker, + AddPerson); + + people = GetTagValues(comicInfo.Letterer); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer); + UpdatePeople(people, PersonRole.Letterer, + AddPerson); + + + people = GetTagValues(comicInfo.Penciller); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller); + UpdatePeople(people, PersonRole.Penciller, + AddPerson); + + people = GetTagValues(comicInfo.CoverArtist); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist); + UpdatePeople(people, PersonRole.CoverArtist, + AddPerson); + + people = GetTagValues(comicInfo.Publisher); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher); + UpdatePeople(people, PersonRole.Publisher, + AddPerson); + + var genres = GetTagValues(comicInfo.Genre); + GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList()); + UpdateGenre(genres, false, + AddGenre); + + var tags = GetTagValues(comicInfo.Tags); + TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList()); + UpdateTag(tags, false, + AddTag); + } + + private static IList GetTagValues(string comicInfoTagSeparatedByComma) + { + + if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma)) + { + return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList(); + } + return ImmutableList.Empty; + } + #nullable disable + + /// + /// Given a list of all existing people, this will check the new names and roles and if it doesn't exist in allPeople, will create and + /// add an entry. For each person in name, the callback will be executed. 
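+    /// For example (hypothetical input): names = {"Joe Shmo"} with role Writer - if no Writer named
+    /// "Joe Shmo" is in the primed _people cache, one is created and added under a lock, and the
+    /// callback then receives it (typically to attach it to the chapter being updated).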
+ /// + /// This does not remove people if an empty list is passed into names + /// This is used to add new people to a list without worrying about duplicating rows in the DB + /// + /// + /// + private void UpdatePeople(IEnumerable names, PersonRole role, Action action) + { + + var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList(); + + foreach (var name in names) + { + var normalizedName = Parser.Parser.Normalize(name); + var person = allPeopleTypeRole.FirstOrDefault(p => + p.NormalizedName.Equals(normalizedName)); + if (person == null) + { + person = DbFactory.Person(name, role); + lock (_people) + { + _people.Add(person); + } + } + + action(person); + } + } + + /// + /// + /// + /// + /// + /// + private void UpdateGenre(IEnumerable names, bool isExternal, Action action) + { + foreach (var name in names) + { + if (string.IsNullOrEmpty(name.Trim())) continue; + + var normalizedName = Parser.Parser.Normalize(name); + var genre = _genres.FirstOrDefault(p => + p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); + if (genre == null) + { + genre = DbFactory.Genre(name, false); + lock (_genres) + { + _genres.Add(genre); + } + } + + action(genre); + } + } + + /// + /// + /// + /// + /// + /// Callback for every item. Will give said item back and a bool if item was added + private void UpdateTag(IEnumerable names, bool isExternal, Action action) + { + foreach (var name in names) + { + if (string.IsNullOrEmpty(name.Trim())) continue; + + var added = false; + var normalizedName = Parser.Parser.Normalize(name); + + var tag = _tags.FirstOrDefault(p => + p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); + if (tag == null) + { + added = true; + tag = DbFactory.Tag(name, false); + lock (_tags) + { + _tags.Add(tag); + } + } + + action(tag, added); + } + } + +} diff --git a/API/Services/Tasks/Scanner/ScanLibrary.cs b/API/Services/Tasks/Scanner/ScanLibrary.cs new file mode 100644 index 000000000..2aea6f34e --- /dev/null +++ b/API/Services/Tasks/Scanner/ScanLibrary.cs @@ -0,0 +1,111 @@ +using System; +using System.Collections.Generic; +using System.IO; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Entities; +using API.Helpers; +using API.Parser; +using Kavita.Common.Helpers; +using Microsoft.Extensions.Logging; + +namespace API.Services.Tasks.Scanner; + +/// +/// This is responsible for scanning and updating a Library +/// +public class ScanLibrary +{ + private readonly IDirectoryService _directoryService; + private readonly IUnitOfWork _unitOfWork; + private readonly ILogger _logger; + + public ScanLibrary(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger logger) + { + _directoryService = directoryService; + _unitOfWork = unitOfWork; + _logger = logger; + } + + + // public Task UpdateLibrary(Library library) + // { + // + // + // } + + + + + /// + /// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list. + /// + /// + /// + /// + public static IList GetInfosByName(Dictionary> parsedSeries, Series series) + { + var allKeys = parsedSeries.Keys.Where(ps => + SeriesHelper.FindSeries(series, ps)); + + var infos = new List(); + foreach (var key in allKeys) + { + infos.AddRange(parsedSeries[key]); + } + + return infos; + } + + + /// + /// This will Scan all files in a folder path. 
For each folder within the folderPath, FolderAction will be invoked for all files contained + /// + /// A library folder or series folder + /// A callback async Task to be called once all files for each folder path are found + public async Task ProcessFiles(string folderPath, bool isLibraryFolder, Func, string,Task> folderAction) + { + if (isLibraryFolder) + { + var directories = _directoryService.GetDirectories(folderPath).ToList(); + + foreach (var directory in directories) + { + // For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication + await folderAction(_directoryService.ScanFiles(directory), directory); + } + } + else + { + //folderAction(ScanFiles(folderPath)); + await folderAction(_directoryService.ScanFiles(folderPath), folderPath); + } + } + + + + private GlobMatcher CreateIgnoreMatcher(string ignoreFile) + { + if (!_directoryService.FileSystem.File.Exists(ignoreFile)) + { + return null; + } + + // Read file in and add each line to Matcher + var lines = _directoryService.FileSystem.File.ReadAllLines(ignoreFile); + if (lines.Length == 0) + { + _logger.LogError("Kavita Ignore file found but empty, ignoring: {IgnoreFile}", ignoreFile); + return null; + } + + GlobMatcher matcher = new(); + foreach (var line in lines) + { + matcher.AddExclude(line); + } + + return matcher; + } +} diff --git a/API/Services/Tasks/ScannerService.cs b/API/Services/Tasks/ScannerService.cs index d04290b11..e0d250fa4 100644 --- a/API/Services/Tasks/ScannerService.cs +++ b/API/Services/Tasks/ScannerService.cs @@ -1,16 +1,14 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; -using System.Collections.Immutable; using System.Diagnostics; using System.IO; using System.Linq; using System.Threading; using System.Threading.Tasks; using API.Data; -using API.Data.Metadata; using API.Data.Repositories; using API.Entities; -using API.Entities.Enums; using API.Extensions; using API.Helpers; using API.Parser; @@ -28,17 +26,47 @@ public interface IScannerService /// cover images if forceUpdate is true. /// /// Library to scan against + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] Task ScanLibrary(int libraryId); + + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] Task ScanLibraries(); + + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - Task ScanSeries(int libraryId, int seriesId, CancellationToken token); + Task ScanSeries(int seriesId, bool bypassFolderOptimizationChecks = true); + + [Queue(TaskScheduler.ScanQueue)] + [DisableConcurrentExecution(60 * 60 * 60)] + [AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)] + Task ScanFolder(string folder); + } +public enum ScanCancelReason +{ + /// + /// Don't cancel, everything is good + /// + NoCancel = 0, + /// + /// A folder is completely empty or missing + /// + FolderMount = 1, + /// + /// There has been no change to the filesystem since last scan + /// + NoChange = 2, +} + +/** + * Responsible for Scanning the disk and importing/updating/deleting files -> DB entities. 
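+ *
+ * High-level flow: LibraryWatcher queues changed folders -> ScanFolder resolves them to a series or a
+ * library -> ScanSeries/ScanLibrary walk files via ParseScannedFiles -> ProcessSeries persists the results.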
+ */
 public class ScannerService : IScannerService
 {
     private readonly IUnitOfWork _unitOfWork;
@@ -46,73 +74,142 @@ public class ScannerService : IScannerService
     private readonly IMetadataService _metadataService;
     private readonly ICacheService _cacheService;
     private readonly IEventHub _eventHub;
-    private readonly IFileService _fileService;
     private readonly IDirectoryService _directoryService;
     private readonly IReadingItemService _readingItemService;
-    private readonly ICacheHelper _cacheHelper;
-    private readonly IWordCountAnalyzerService _wordCountAnalyzerService;
+    private readonly IProcessSeries _processSeries;
 
     public ScannerService(IUnitOfWork unitOfWork, ILogger<ScannerService> logger,
         IMetadataService metadataService, ICacheService cacheService, IEventHub eventHub,
-        IFileService fileService, IDirectoryService directoryService, IReadingItemService readingItemService,
-        ICacheHelper cacheHelper, IWordCountAnalyzerService wordCountAnalyzerService)
+        IDirectoryService directoryService, IReadingItemService readingItemService,
+        IProcessSeries processSeries)
     {
         _unitOfWork = unitOfWork;
         _logger = logger;
         _metadataService = metadataService;
         _cacheService = cacheService;
         _eventHub = eventHub;
-        _fileService = fileService;
         _directoryService = directoryService;
         _readingItemService = readingItemService;
-        _cacheHelper = cacheHelper;
-        _wordCountAnalyzerService = wordCountAnalyzerService;
+        _processSeries = processSeries;
     }
 
-    [DisableConcurrentExecution(60 * 60 * 60)]
-    [AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
-    public async Task ScanSeries(int libraryId, int seriesId, CancellationToken token)
+    [Queue(TaskScheduler.ScanQueue)]
+    public async Task ScanFolder(string folder)
     {
-        var sw = new Stopwatch();
+        // NOTE: I might want to move a lot of this code to the LibraryWatcher or something and just pack libraryId and seriesId
+        // Validate if we are scanning a new series (that belongs to a library) or an existing series
+        var seriesId = await _unitOfWork.SeriesRepository.GetSeriesIdByFolder(folder);
+        if (seriesId > 0)
+        {
+            BackgroundJob.Enqueue(() => ScanSeries(seriesId, true));
+            return;
+        }
+
+        var parentDirectory = _directoryService.GetParentDirectoryName(folder);
+        if (string.IsNullOrEmpty(parentDirectory)) return; // This should never happen as it's calculated before enqueuing
+
+        var libraries = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()).ToList();
+        var libraryFolders = libraries.SelectMany(l => l.Folders);
+        var libraryFolder = libraryFolders.Select(Parser.Parser.NormalizePath).SingleOrDefault(f => f.Contains(parentDirectory));
+
+        if (string.IsNullOrEmpty(libraryFolder)) return;
+
+        var library = libraries.FirstOrDefault(l => l.Folders.Select(Parser.Parser.NormalizePath).Contains(libraryFolder));
+        if (library != null)
+        {
+            BackgroundJob.Enqueue(() => ScanLibrary(library.Id));
+        }
+    }
+
+    [Queue(TaskScheduler.ScanQueue)]
+    public async Task ScanSeries(int seriesId, bool bypassFolderOptimizationChecks = true)
+    {
+        var sw = Stopwatch.StartNew();
         var files = await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId);
         var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
         var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new[] {seriesId});
-        var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders);
-        var folderPaths = library.Folders.Select(f => f.Path).ToList();
-
-        var seriesFolderPaths = (await
_unitOfWork.SeriesRepository.GetFilesForSeries(seriesId)) - .Select(f => _directoryService.FileSystem.FileInfo.FromFileName(f.FilePath).Directory.FullName) - .ToList(); + var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(series.LibraryId, LibraryIncludes.Folders); var libraryPaths = library.Folders.Select(f => f.Path).ToList(); + if (await ShouldScanSeries(seriesId, library, libraryPaths, series, bypassFolderOptimizationChecks) != ScanCancelReason.NoCancel) return; - if (!await CheckMounts(library.Name, seriesFolderPaths)) + + var parsedSeries = new Dictionary>(); + var seenSeries = new List(); + var processTasks = new List(); + + var folderPath = series.FolderPath; + if (string.IsNullOrEmpty(folderPath) || !_directoryService.Exists(folderPath)) { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + // We don't care if it's multiple due to new scan loop enforcing all in one root directory + var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); + if (seriesDirs.Keys.Count == 0) + { + _logger.LogCritical("Scan Series has files spread outside a main series folder. Defaulting to library folder (this is expensive)"); + await _eventHub.SendMessageAsync(MessageFactory.Info, MessageFactory.InfoEvent($"{series.Name} is not organized well and scan series will be expensive!", "Scan Series has files spread outside a main series folder. Defaulting to library folder (this is expensive)")); + seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); + } + + folderPath = seriesDirs.Keys.FirstOrDefault(); + } + + if (string.IsNullOrEmpty(folderPath)) + { + _logger.LogCritical("Scan Series could not find a single, valid folder root for files"); + await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Scan Series could not find a single, valid folder root for files")); return; } - if (!await CheckMounts(library.Name, libraryPaths)) - { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); - return; - } - var allPeople = await _unitOfWork.PersonRepository.GetAllPeople(); - var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); - var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync(); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); - // Shouldn't this be libraryPath? - var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); - if (seriesDirs.Keys.Count == 0) + await _processSeries.Prime(); + void TrackFiles(Tuple> parsedInfo) { - _logger.LogDebug("Scan Series has files spread outside a main series folder. 
Defaulting to library folder"); - seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(folderPaths, files.Select(f => f.FilePath).ToList()); + var skippedScan = parsedInfo.Item1; + var parsedFiles = parsedInfo.Item2; + if (parsedFiles.Count == 0) return; + + var foundParsedSeries = new ParsedSeries() + { + Name = parsedFiles.First().Series, + NormalizedName = Parser.Parser.Normalize(parsedFiles.First().Series), + Format = parsedFiles.First().Format + }; + + if (skippedScan) + { + seenSeries.AddRange(parsedFiles.Select(pf => new ParsedSeries() + { + Name = pf.Series, + NormalizedName = Parser.Parser.Normalize(pf.Series), + Format = pf.Format + })); + return; + } + + seenSeries.Add(foundParsedSeries); + processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library)); + parsedSeries.Add(foundParsedSeries, parsedFiles); } _logger.LogInformation("Beginning file scan on {SeriesName}", series.Name); - var (totalFiles, scanElapsedTime, parsedSeries) = await ScanFiles(library, seriesDirs.Keys); + var scanElapsedTime = await ScanFiles(library, new []{folderPath}, false, TrackFiles, bypassFolderOptimizationChecks); + _logger.LogInformation("ScanFiles for {Series} took {Time}", series.Name, scanElapsedTime); + await Task.WhenAll(processTasks); + // We need to handle if parsedSeries is empty but seenSeries has our series + if (seenSeries.Any(s => s.NormalizedName.Equals(series.NormalizedName)) && parsedSeries.Keys.Count == 0) + { + // Nothing has changed + _logger.LogInformation("[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} scan has no work to do", + "All folders have not been changed since last scan. Scan will be aborted.")); + + _processSeries.EnqueuePostSeriesProcessTasks(series.LibraryId, seriesId, false); + return; + } // Remove any parsedSeries keys that don't belong to our series. This can occur when users store 2 series in the same folder RemoveParsedInfosNotForSeries(parsedSeries, series); @@ -128,59 +225,37 @@ public class ScannerService : IScannerService try { _unitOfWork.SeriesRepository.Remove(series); - await CommitAndSend(totalFiles, parsedSeries, sw, scanElapsedTime, series); + await CommitAndSend(1, sw, scanElapsedTime, series); } catch (Exception ex) { - _logger.LogCritical(ex, "There was an error during ScanSeries to delete the series"); + _logger.LogCritical(ex, "There was an error during ScanSeries to delete the series as no files could be found. Aborting scan"); await _unitOfWork.RollbackAsync(); + return; } - } else { - // We need to do an additional check for an edge case: If the scan ran and the files do not match the existing Series name, then it is very likely, - // the files have crap naming and if we don't correct, the series will get deleted due to the parser not being able to fallback onto folder parsing as the root - // is the series folder. - var existingFolder = seriesDirs.Keys.FirstOrDefault(key => key.Contains(series.OriginalName)); - if (seriesDirs.Keys.Count == 1 && !string.IsNullOrEmpty(existingFolder)) - { - seriesDirs = new Dictionary(); - var path = Directory.GetParent(existingFolder)?.FullName; - if (!folderPaths.Contains(path) || !folderPaths.Any(p => p.Contains(path ?? string.Empty))) - { - _logger.LogCritical("[ScanService] Aborted: {SeriesName} has bad naming convention and sits at root of library. Cannot scan series without deletion occuring. 
Correct file names to have Series Name within it or perform Scan Library", series.OriginalName); - await _eventHub.SendMessageAsync(MessageFactory.Error, - MessageFactory.ErrorEvent($"Scan of {series.Name} aborted", $"{series.OriginalName} has bad naming convention and sits at root of library. Cannot scan series without deletion occuring. Correct file names to have Series Name within it or perform Scan Library")); - return; - } - if (!string.IsNullOrEmpty(path)) - { - seriesDirs[path] = string.Empty; - } - } - - var (totalFiles2, scanElapsedTime2, parsedSeries2) = await ScanFiles(library, seriesDirs.Keys); - _logger.LogInformation("{SeriesName} has bad naming convention, forcing rescan at a higher directory", series.OriginalName); - totalFiles += totalFiles2; - scanElapsedTime += scanElapsedTime2; - parsedSeries = parsedSeries2; - RemoveParsedInfosNotForSeries(parsedSeries, series); + // I think we should just fail and tell user to fix their setup. This is extremely expensive for an edge case + _logger.LogCritical("We weren't able to find any files in the series scan, but there should be. Please correct your naming convention or put Series in a dedicated folder. Aborting scan"); + await _eventHub.SendMessageAsync(MessageFactory.Error, + MessageFactory.ErrorEvent($"Error scanning {series.Name}", "We weren't able to find any files in the series scan, but there should be. Please correct your naming convention or put Series in a dedicated folder. Aborting scan")); + await _unitOfWork.RollbackAsync(); + return; } + // At this point, parsedSeries will have at least one key and we can perform the update. If it still doesn't, just return and don't do anything + if (parsedSeries.Count == 0) return; } - // At this point, parsedSeries will have at least one key and we can perform the update. 
If it still doesn't, just return and don't do anything - if (parsedSeries.Count == 0) return; - // Merge any series together that might have different ParsedSeries but belong to another group of ParsedSeries try { await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); + var parsedInfos = ParseScannedFiles.GetInfosByName(parsedSeries, series); + await _processSeries.ProcessSeriesAsync(parsedInfos, library); await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - await CommitAndSend(totalFiles, parsedSeries, sw, scanElapsedTime, series); - await RemoveAbandonedMetadataKeys(); + await CommitAndSend(1, sw, scanElapsedTime, series); } catch (Exception ex) { @@ -189,15 +264,59 @@ public class ScannerService : IScannerService } // Tell UI that this series is done await _eventHub.SendMessageAsync(MessageFactory.ScanSeries, - MessageFactory.ScanSeriesEvent(libraryId, seriesId, series.Name)); - await CleanupDbEntities(); + MessageFactory.ScanSeriesEvent(library.Id, seriesId, series.Name)); + + + await _metadataService.RemoveAbandonedMetadataKeys(); BackgroundJob.Enqueue(() => _cacheService.CleanupChapters(chapterIds)); BackgroundJob.Enqueue(() => _directoryService.ClearDirectory(_directoryService.TempDirectory)); - BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, series.Id, false)); - BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, series.Id, false)); } - private static void RemoveParsedInfosNotForSeries(Dictionary> parsedSeries, Series series) + private async Task ShouldScanSeries(int seriesId, Library library, IList libraryPaths, Series series, bool bypassFolderChecks = false) + { + var seriesFolderPaths = (await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId)) + .Select(f => _directoryService.FileSystem.FileInfo.FromFileName(f.FilePath).Directory.FullName) + .Distinct() + .ToList(); + + if (!await CheckMounts(library.Name, seriesFolderPaths)) + { + _logger.LogCritical( + "Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + return ScanCancelReason.FolderMount; + } + + if (!await CheckMounts(library.Name, libraryPaths)) + { + _logger.LogCritical( + "Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + return ScanCancelReason.FolderMount; + } + + // If all series Folder paths haven't been modified since last scan, abort + // NOTE: On windows, the parent folder will not update LastWriteTime if a subfolder was updated with files. Need to do a bit of light I/O. + if (!bypassFolderChecks) + { + + var allFolders = seriesFolderPaths.SelectMany(path => _directoryService.GetDirectories(path)).ToList(); + allFolders.AddRange(seriesFolderPaths); + + if (allFolders.All(folder => _directoryService.GetLastWriteTime(folder) <= series.LastFolderScanned)) + { + _logger.LogInformation( + "[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", + series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} scan has no work to do", "All folders have not been changed since last scan. 
Scan will be aborted.")); + return ScanCancelReason.NoChange; + } + } + + + return ScanCancelReason.NoCancel; + } + + private static void RemoveParsedInfosNotForSeries(Dictionary> parsedSeries, Series series) { var keys = parsedSeries.Keys; foreach (var key in keys.Where(key => !SeriesHelper.FindSeries(series, key))) // series.Format != key.Format || @@ -206,18 +325,23 @@ public class ScannerService : IScannerService } } - private async Task CommitAndSend(int totalFiles, - Dictionary> parsedSeries, Stopwatch sw, long scanElapsedTime, Series series) + private async Task CommitAndSend(int seriesCount, Stopwatch sw, long scanElapsedTime, Series series) { if (_unitOfWork.HasChanges()) { await _unitOfWork.CommitAsync(); _logger.LogInformation( - "Processed {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {SeriesName}", - totalFiles, parsedSeries.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, series.Name); + "Processed files and {SeriesCount} series in {ElapsedScanTime} milliseconds for {SeriesName}", + seriesCount, sw.ElapsedMilliseconds + scanElapsedTime, series.Name); } } + /// + /// Ensure that all library folders are mounted. In the case that any are empty or non-existent, emit an event to the UI via EventHub and return false + /// + /// + /// + /// private async Task CheckMounts(string libraryName, IList folders) { // Check if any of the folder roots are not available (ie disconnected from network, etc) and fail if any of them are @@ -236,8 +360,6 @@ public class ScannerService : IScannerService // For Docker instances check if any of the folder roots are not available (ie disconnected volumes, etc) and fail if any of them are if (folders.Any(f => _directoryService.IsDirectoryEmpty(f))) { - // NOTE: Food for thought, move this to throw an exception and let a middleware inform the UI to keep the code clean. (We can throw a custom exception which - // will always propagate to the UI) // That way logging and UI informing is all in one place with full context _logger.LogError("Some of the root folders for the library are empty. " + "Either your mount has been disconnected or you are trying to delete all series in the library. " + @@ -255,13 +377,13 @@ public class ScannerService : IScannerService return true; } + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] public async Task ScanLibraries() { _logger.LogInformation("Starting Scan of All Libraries"); - var libraries = await _unitOfWork.LibraryRepository.GetLibrariesAsync(); - foreach (var lib in libraries) + foreach (var lib in await _unitOfWork.LibraryRepository.GetLibrariesAsync()) { await ScanLibrary(lib.Id); } @@ -275,50 +397,115 @@ public class ScannerService : IScannerService /// ie) all entities will be rechecked for new cover images and comicInfo.xml changes /// /// + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] public async Task ScanLibrary(int libraryId) { - Library library; - try - { - library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); - } - catch (Exception ex) - { - // This usually only fails if user is not authenticated. 
- _logger.LogError(ex, "[ScannerService] There was an issue fetching Library {LibraryId}", libraryId); - return; - } + var sw = Stopwatch.StartNew(); + var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); + var libraryFolderPaths = library.Folders.Select(fp => fp.Path).ToList(); + if (!await CheckMounts(library.Name, libraryFolderPaths)) return; - if (!await CheckMounts(library.Name, library.Folders.Select(f => f.Path).ToList())) + // If all library Folder paths haven't been modified since last scan, abort + // Unless the user did something on the library (delete series) and thus we can bypass this check + var wasLibraryUpdatedSinceLastScan = (library.LastModified.Truncate(TimeSpan.TicksPerMinute) > + library.LastScanned.Truncate(TimeSpan.TicksPerMinute)) + && library.LastScanned != DateTime.MinValue; + if (!wasLibraryUpdatedSinceLastScan) { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + var haveFoldersChangedSinceLastScan = library.Folders + .All(f => _directoryService.GetLastWriteTime(f.Path).Truncate(TimeSpan.TicksPerMinute) > f.LastScanned.Truncate(TimeSpan.TicksPerMinute)); - return; + // If nothing changed && library folders have all been scanned at least once + if (!haveFoldersChangedSinceLastScan && library.Folders.All(f => f.LastScanned > DateTime.MinValue)) + { + _logger.LogInformation("[ScannerService] {LibraryName} scan has no work to do. All folders have not been changed since last scan", library.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{library.Name} scan has no work to do", + "All folders have not been changed since last scan. Scan will be aborted.")); + return; + } } + // Validations are done, now we can start actual scan _logger.LogInformation("[ScannerService] Beginning file scan on {LibraryName}", library.Name); - var (totalFiles, scanElapsedTime, series) = await ScanFiles(library, library.Folders.Select(fp => fp.Path)); - _logger.LogInformation("[ScannerService] Finished file scan. 
Updating database"); + // This doesn't work for something like M:/Manga/ and a series has library folder as root + var shouldUseLibraryScan = !(await _unitOfWork.LibraryRepository.DoAnySeriesFoldersMatch(libraryFolderPaths)); + if (!shouldUseLibraryScan) + { + _logger.LogInformation("Library {LibraryName} consists of one or more Series folders, using series scan", library.Name); + } + + var totalFiles = 0; + var seenSeries = new List(); + + + await _processSeries.Prime(); + var processTasks = new List(); + void TrackFiles(Tuple> parsedInfo) + { + var skippedScan = parsedInfo.Item1; + var parsedFiles = parsedInfo.Item2; + if (parsedFiles.Count == 0) return; + + var foundParsedSeries = new ParsedSeries() + { + Name = parsedFiles.First().Series, + NormalizedName = Parser.Parser.Normalize(parsedFiles.First().Series), + Format = parsedFiles.First().Format + }; + + if (skippedScan) + { + seenSeries.AddRange(parsedFiles.Select(pf => new ParsedSeries() + { + Name = pf.Series, + NormalizedName = Parser.Parser.Normalize(pf.Series), + Format = pf.Format + })); + return; + } + + totalFiles += parsedFiles.Count; + + + seenSeries.Add(foundParsedSeries); + processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library)); + } + + + var scanElapsedTime = await ScanFiles(library, libraryFolderPaths, shouldUseLibraryScan, TrackFiles); + + + await Task.WhenAll(processTasks); + + //await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, string.Empty)); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, library.Name, ProgressEventType.Ended)); + + _logger.LogInformation("[ScannerService] Finished file scan in {ScanAndUpdateTime}. Updating database", scanElapsedTime); + + var time = DateTime.Now; foreach (var folderPath in library.Folders) { - folderPath.LastScanned = DateTime.Now; + folderPath.LastScanned = time; } - var sw = Stopwatch.StartNew(); - await UpdateLibrary(library, series); + library.LastScanned = time; + + // Could I delete anything in a Library's Series where the LastScan date is before scanStart? + // NOTE: This implementation is expensive + await _unitOfWork.SeriesRepository.RemoveSeriesNotInList(seenSeries, library.Id); - library.LastScanned = DateTime.Now; _unitOfWork.LibraryRepository.Update(library); if (await _unitOfWork.CommitAsync()) { _logger.LogInformation( "[ScannerService] Finished scan of {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}", - totalFiles, series.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, library.Name); + totalFiles, seenSeries.Count, sw.ElapsedMilliseconds, library.Name); } else { @@ -326,22 +513,24 @@ public class ScannerService : IScannerService _logger.LogCritical( "[ScannerService] There was a critical error that resulted in a failed scan. 
Please check logs and rescan"); } - await CleanupDbEntities(); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, string.Empty)); + await _metadataService.RemoveAbandonedMetadataKeys(); - BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForLibrary(libraryId, false)); - BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanLibrary(libraryId, false)); BackgroundJob.Enqueue(() => _directoryService.ClearDirectory(_directoryService.TempDirectory)); } - private async Task>>> ScanFiles(Library library, IEnumerable dirs) + private async Task ScanFiles(Library library, IEnumerable dirs, + bool isLibraryScan, Action>> processSeriesInfos = null, bool forceChecks = false) { var scanner = new ParseScannedFiles(_logger, _directoryService, _readingItemService, _eventHub); - var scanWatch = new Stopwatch(); - var parsedSeries = await scanner.ScanLibrariesForSeries(library.Type, dirs, library.Name); - var totalFiles = parsedSeries.Keys.Sum(key => parsedSeries[key].Count); + var scanWatch = Stopwatch.StartNew(); + + await scanner.ScanLibrariesForSeries(library.Type, dirs, library.Name, + isLibraryScan, await _unitOfWork.SeriesRepository.GetFolderPathMap(library.Id), processSeriesInfos, forceChecks); + var scanElapsedTime = scanWatch.ElapsedMilliseconds; - return new Tuple>>(totalFiles, scanElapsedTime, parsedSeries); + return scanElapsedTime; } /// @@ -364,707 +553,8 @@ public class ScannerService : IScannerService _logger.LogInformation("Removed {Count} abandoned collection tags", cleanedUp); } - private async Task UpdateLibrary(Library library, Dictionary> parsedSeries) - { - if (parsedSeries == null) return; - - // Library contains no Series, so we need to fetch series in groups of ChunkSize - var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id); - var stopwatch = Stopwatch.StartNew(); - var totalTime = 0L; - - var allPeople = await _unitOfWork.PersonRepository.GetAllPeople(); - var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); - var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync(); - - // Update existing series - _logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", - library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize); - for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++) - { - if (chunkInfo.TotalChunks == 0) continue; - totalTime += stopwatch.ElapsedMilliseconds; - stopwatch.Restart(); - _logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. 
Series ({SeriesStart} - {SeriesEnd}", - chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize); - var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams() - { - PageNumber = chunk, - PageSize = chunkInfo.ChunkSize - }); - - // First, remove any series that are not in parsedSeries list - var missingSeries = FindSeriesNotOnDisk(nonLibrarySeries, parsedSeries).ToList(); - - foreach (var missing in missingSeries) - { - _unitOfWork.SeriesRepository.Remove(missing); - } - - var cleanedSeries = SeriesHelper.RemoveMissingSeries(nonLibrarySeries, missingSeries, out var removeCount); - if (removeCount > 0) - { - _logger.LogInformation("[ScannerService] Removed {RemoveMissingSeries} series that are no longer on disk:", removeCount); - foreach (var s in missingSeries) - { - _logger.LogDebug("[ScannerService] Removed {SeriesName} ({Format})", s.Name, s.Format); - } - } - - // Now, we only have to deal with series that exist on disk. Let's recalculate the volumes for each series - var librarySeries = cleanedSeries.ToList(); - - foreach (var series in librarySeries) - { - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); - } - - try - { - await _unitOfWork.CommitAsync(); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB", chunk); - foreach (var series in nonLibrarySeries) - { - _logger.LogCritical("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName); - } - - await _eventHub.SendMessageAsync(MessageFactory.Error, - MessageFactory.ErrorEvent("There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB", - "The following series had constraint issues: " + string.Join(",", nonLibrarySeries.Select(s => s.OriginalName)))); - - continue; - } - _logger.LogInformation( - "[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}", - chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name); - - // Emit any series removed - foreach (var missing in missingSeries) - { - await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved, MessageFactory.SeriesRemovedEvent(missing.Id, missing.Name, library.Id)); - } - - foreach (var series in librarySeries) - { - // This is something more like, the series has finished updating in the backend. It may or may not have been modified. - await _eventHub.SendMessageAsync(MessageFactory.ScanSeries, MessageFactory.ScanSeriesEvent(library.Id, series.Id, series.Name)); - } - } - - - // Add new series that have parsedInfos - _logger.LogDebug("[ScannerService] Adding new series"); - var newSeries = new List(); - var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList(); - _logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. 
There should be {DeltaToParsedSeries} new series", - allSeries.Count, parsedSeries.Count - allSeries.Count); - // TODO: Once a parsedSeries is processed, remove the key to free up some memory - foreach (var (key, infos) in parsedSeries) - { - // Key is normalized already - Series existingSeries; - try - { - existingSeries = allSeries.SingleOrDefault(s => SeriesHelper.FindSeries(s, key)); - } - catch (Exception e) - { - // NOTE: If I ever want to put Duplicates table, this is where it can go - _logger.LogCritical(e, "[ScannerService] There are multiple series that map to normalized key {Key}. You can manually delete the entity via UI and rescan to fix it. This will be skipped", key.NormalizedName); - var duplicateSeries = allSeries.Where(s => SeriesHelper.FindSeries(s, key)); - foreach (var series in duplicateSeries) - { - _logger.LogCritical("[ScannerService] Duplicate Series Found: {Key} maps with {Series}", key.Name, series.OriginalName); - - } - - continue; - } - - if (existingSeries != null) continue; - - var s = DbFactory.Series(infos[0].Series); - if (!s.SortNameLocked && !string.IsNullOrEmpty(infos[0].SeriesSort)) - { - s.SortName = infos[0].SeriesSort; - } - if (!s.LocalizedNameLocked && !string.IsNullOrEmpty(infos[0].LocalizedSeries)) - { - s.LocalizedName = infos[0].LocalizedSeries; - } - s.Format = key.Format; - s.LibraryId = library.Id; // We have to manually set this since we aren't adding the series to the Library's series. - newSeries.Add(s); - } - - - foreach(var series in newSeries) - { - _logger.LogDebug("[ScannerService] Processing series {SeriesName}", series.OriginalName); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); - _unitOfWork.SeriesRepository.Attach(series); - try - { - await _unitOfWork.CommitAsync(); - _logger.LogInformation( - "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}", - newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name); - - // Inform UI of new series added - await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded, MessageFactory.SeriesAddedEvent(series.Id, series.Name, library.Id)); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "[ScannerService] There was a critical exception adding new series entry for {SeriesName} with a duplicate index key: {IndexKey} ", - series.Name, $"{series.Name}_{series.NormalizedName}_{series.LocalizedName}_{series.LibraryId}_{series.Format}"); - } - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended)); - - _logger.LogInformation( - "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}", - newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name); - } - - private async Task UpdateSeries(Series series, Dictionary> parsedSeries, - ICollection allPeople, ICollection allTags, ICollection allGenres, Library library) - { - try - { - _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - - // Get all associated ParsedInfos to the series. 
This includes infos that use a different filename that matches Series LocalizedName - var parsedInfos = ParseScannedFiles.GetInfosByName(parsedSeries, series); - UpdateVolumes(series, parsedInfos, allPeople, allTags, allGenres); - series.Pages = series.Volumes.Sum(v => v.Pages); - - series.NormalizedName = Parser.Parser.Normalize(series.Name); - series.Metadata ??= DbFactory.SeriesMetadata(new List()); - if (series.Format == MangaFormat.Unknown) - { - series.Format = parsedInfos[0].Format; - } - series.OriginalName ??= parsedInfos[0].Series; - if (string.IsNullOrEmpty(series.SortName)) - { - series.SortName = series.Name; - } - if (!series.SortNameLocked) - { - series.SortName = series.Name; - if (!string.IsNullOrEmpty(parsedInfos[0].SeriesSort)) - { - series.SortName = parsedInfos[0].SeriesSort; - } - } - - // parsedInfos[0] is not the first volume or chapter. We need to find it - var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p)); - if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries)) - { - series.LocalizedName = localizedSeries; - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - - UpdateSeriesMetadata(series, allPeople, allGenres, allTags, library.Type); - } - catch (Exception ex) - { - _logger.LogError(ex, "[ScannerService] There was an exception updating volumes for {SeriesName}", series.Name); - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - } - - public static IEnumerable FindSeriesNotOnDisk(IEnumerable existingSeries, Dictionary> parsedSeries) + public static IEnumerable FindSeriesNotOnDisk(IEnumerable existingSeries, Dictionary> parsedSeries) { return existingSeries.Where(es => !ParserInfoHelpers.SeriesHasMatchingParserInfoFormat(es, parsedSeries)); } - - private async Task RemoveAbandonedMetadataKeys() - { - await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated(); - await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated(); - await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated(); - } - - - private static void UpdateSeriesMetadata(Series series, ICollection allPeople, ICollection allGenres, ICollection allTags, LibraryType libraryType) - { - var isBook = libraryType == LibraryType.Book; - var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook); - - var firstFile = firstChapter?.Files.FirstOrDefault(); - if (firstFile == null) return; - if (Parser.Parser.IsPdf(firstFile.FilePath)) return; - - var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList(); - - // Update Metadata based on Chapter metadata - series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year); - - if (series.Metadata.ReleaseYear < 1000) - { - // Not a valid year, default to 0 - series.Metadata.ReleaseYear = 0; - } - - // Set the AgeRating as highest in all the comicInfos - if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating); - - series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount); - series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count); - // To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well. 
- if (series.Metadata.MaxCount != series.Metadata.TotalCount) - { - var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name)); - var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range)); - if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume; - else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter; - } - - - if (!series.Metadata.PublicationStatusLocked) - { - series.Metadata.PublicationStatus = PublicationStatus.OnGoing; - if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0) - { - series.Metadata.PublicationStatus = PublicationStatus.Completed; - } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0) - { - series.Metadata.PublicationStatus = PublicationStatus.Ended; - } - } - - if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked) - { - series.Metadata.Summary = firstChapter.Summary; - } - - if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked) - { - series.Metadata.Language = firstChapter.Language; - } - - - void HandleAddPerson(Person person) - { - PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); - allPeople.Add(person); - } - - // Handle People - foreach (var chapter in chapters) - { - if (!series.Metadata.WriterLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Writer).Select(p => p.Name), PersonRole.Writer, - HandleAddPerson); - } - - if (!series.Metadata.CoverArtistLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.CoverArtist).Select(p => p.Name), PersonRole.CoverArtist, - HandleAddPerson); - } - - if (!series.Metadata.PublisherLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Publisher).Select(p => p.Name), PersonRole.Publisher, - HandleAddPerson); - } - - if (!series.Metadata.CharacterLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Character).Select(p => p.Name), PersonRole.Character, - HandleAddPerson); - } - - if (!series.Metadata.ColoristLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Colorist).Select(p => p.Name), PersonRole.Colorist, - HandleAddPerson); - } - - if (!series.Metadata.EditorLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Editor).Select(p => p.Name), PersonRole.Editor, - HandleAddPerson); - } - - if (!series.Metadata.InkerLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Inker).Select(p => p.Name), PersonRole.Inker, - HandleAddPerson); - } - - if (!series.Metadata.LettererLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Letterer).Select(p => p.Name), PersonRole.Letterer, - HandleAddPerson); - } - - if (!series.Metadata.PencillerLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Penciller).Select(p => p.Name), PersonRole.Penciller, - HandleAddPerson); - } - - if (!series.Metadata.TranslatorLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Translator).Select(p => p.Name), PersonRole.Translator, - HandleAddPerson); - } - - if (!series.Metadata.TagsLocked) - { - TagHelper.UpdateTag(allTags, chapter.Tags.Select(t => t.Title), false, (tag, 
_) => - { - TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag); - allTags.Add(tag); - }); - } - - if (!series.Metadata.GenresLocked) - { - GenreHelper.UpdateGenre(allGenres, chapter.Genres.Select(t => t.Title), false, genre => - { - GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre); - allGenres.Add(genre); - }); - } - } - - // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it - // I might be able to filter out people that are in locked fields? - var people = chapters.SelectMany(c => c.People).ToList(); - PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People, - people, person => - { - switch (person.Role) - { - case PersonRole.Writer: - if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Penciller: - if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Inker: - if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Colorist: - if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Letterer: - if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.CoverArtist: - if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Editor: - if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Publisher: - if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Character: - if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Translator: - if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person); - break; - default: - series.Metadata.People.Remove(person); - break; - } - }); - } - - - - private void UpdateVolumes(Series series, IList parsedInfos, ICollection allPeople, ICollection allTags, ICollection allGenres) - { - var startingVolumeCount = series.Volumes.Count; - // Add new volumes and update chapters per volume - var distinctVolumes = parsedInfos.DistinctVolumes(); - _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name); - foreach (var volumeNumber in distinctVolumes) - { - var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber); - if (volume == null) - { - volume = DbFactory.Volume(volumeNumber); - series.Volumes.Add(volume); - _unitOfWork.VolumeRepository.Add(volume); - } - - volume.Name = volumeNumber; - - _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name); - var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray(); - UpdateChapters(series, volume, infos); - volume.Pages = volume.Chapters.Sum(c => c.Pages); - - // Update all the metadata on the Chapters - foreach (var chapter in volume.Chapters) - { - var firstFile = chapter.Files.MinBy(x => x.Chapter); - if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue; - try - { - var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath)); - UpdateChapterFromComicInfo(chapter, allPeople, allTags, allGenres, firstChapterInfo?.ComicInfo); - } - catch (Exception ex) - { - _logger.LogError(ex, "There was some issue when updating chapter's metadata"); - } - } - } - - // 
Remove existing volumes that aren't in parsedInfos - var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList(); - if (series.Volumes.Count != nonDeletedVolumes.Count) - { - _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not mapping with volume name", - (series.Volumes.Count - nonDeletedVolumes.Count), series.Name); - var deletedVolumes = series.Volumes.Except(nonDeletedVolumes); - foreach (var volume in deletedVolumes) - { - var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? ""; - if (!string.IsNullOrEmpty(file) && File.Exists(file)) - { - _logger.LogError( - "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}", - file); - } - - _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file); - } - - series.Volumes = nonDeletedVolumes; - } - - _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}", - series.Name, startingVolumeCount, series.Volumes.Count); - } - - private void UpdateChapters(Series series, Volume volume, IList parsedInfos) - { - // Add new chapters - foreach (var info in parsedInfos) - { - // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0 - // also are treated like specials for UI grouping. - Chapter chapter; - try - { - chapter = volume.Chapters.GetChapterByRange(info); - } - catch (Exception ex) - { - _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters); - continue; - } - - if (chapter == null) - { - _logger.LogDebug( - "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters); - chapter = DbFactory.Chapter(info); - volume.Chapters.Add(chapter); - series.LastChapterAdded = DateTime.Now; - } - else - { - chapter.UpdateFrom(info); - } - - if (chapter == null) continue; - // Add files - var specialTreatment = info.IsSpecialInfo(); - AddOrUpdateFileForChapter(chapter, info); - chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty; - chapter.Range = specialTreatment ? 
info.Filename : info.Chapters; - } - - - // Remove chapters that aren't in parsedInfos or have no files linked - var existingChapters = volume.Chapters.ToList(); - foreach (var existingChapter in existingChapters) - { - if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter)) - { - _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series); - volume.Chapters.Remove(existingChapter); - } - else - { - // Ensure we remove any files that no longer exist AND order - existingChapter.Files = existingChapter.Files - .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath)) - .OrderByNatural(f => f.FilePath).ToList(); - existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages); - } - } - } - - private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info) - { - chapter.Files ??= new List(); - var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath); - if (existingFile != null) - { - existingFile.Format = info.Format; - if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return; - existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format); - // We skip updating DB here with last modified time so that metadata refresh can do it - } - else - { - var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format)); - if (file == null) return; - - chapter.Files.Add(file); - } - } - - private void UpdateChapterFromComicInfo(Chapter chapter, ICollection allPeople, ICollection allTags, ICollection allGenres, ComicInfo? info) - { - var firstFile = chapter.Files.MinBy(x => x.Chapter); - if (firstFile == null || - _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return; - - var comicInfo = info; - if (info == null) - { - comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath); - } - - if (comicInfo == null) return; - _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath); - - chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating); - - if (!string.IsNullOrEmpty(comicInfo.Title)) - { - chapter.TitleName = comicInfo.Title.Trim(); - } - - if (!string.IsNullOrEmpty(comicInfo.Summary)) - { - chapter.Summary = comicInfo.Summary; - } - - if (!string.IsNullOrEmpty(comicInfo.LanguageISO)) - { - chapter.Language = comicInfo.LanguageISO; - } - - if (comicInfo.Count > 0) - { - chapter.TotalCount = comicInfo.Count; - } - - // This needs to check against both Number and Volume to calculate Count - if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0) - { - chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number)); - } - if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0) - { - chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume))); - } - - - if (comicInfo.Year > 0) - { - var day = Math.Max(comicInfo.Day, 1); - var month = Math.Max(comicInfo.Month, 1); - chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}"); - } - - var people = GetTagValues(comicInfo.Colorist); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Colorist, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = 
GetTagValues(comicInfo.Characters); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Character, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Translator); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Translator, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Writer); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Writer, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Editor); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Editor, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Inker); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Inker, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Letterer); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Letterer, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Penciller); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Penciller, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.CoverArtist); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.CoverArtist, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Publisher); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Publisher, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - var genres = GetTagValues(comicInfo.Genre); - GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList()); - GenreHelper.UpdateGenre(allGenres, genres, false, - genre => chapter.Genres.Add(genre)); - - var tags = GetTagValues(comicInfo.Tags); - TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList()); - TagHelper.UpdateTag(allTags, tags, false, - (tag, _) => - { - chapter.Tags.Add(tag); - }); - } - - private static IList GetTagValues(string comicInfoTagSeparatedByComma) - { - - if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma)) - { - return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList(); - } - return ImmutableList.Empty; - } } diff --git a/API/SignalR/MessageFactory.cs b/API/SignalR/MessageFactory.cs index 47aa07f02..f8a8de873 100644 --- a/API/SignalR/MessageFactory.cs +++ b/API/SignalR/MessageFactory.cs @@ -108,7 +108,10 @@ namespace API.SignalR /// When files are being scanned to calculate word count /// private const string WordCountAnalyzerProgress = "WordCountAnalyzerProgress"; - + /// + /// A generic message that can occur in background processing to 
inform user, but no direct action is needed + /// + public const string Info = "Info"; public static SignalRMessage ScanSeriesEvent(int libraryId, int seriesId, string seriesName) @@ -261,9 +264,7 @@ namespace API.SignalR }; } - /** - * A generic error that will show on events widget in the UI - */ + public static SignalRMessage ErrorEvent(string title, string subtitle) { return new SignalRMessage @@ -281,6 +282,23 @@ namespace API.SignalR }; } + public static SignalRMessage InfoEvent(string title, string subtitle) + { + return new SignalRMessage + { + Name = Info, + Title = title, + SubTitle = subtitle, + Progress = ProgressType.None, + EventType = ProgressEventType.Single, + Body = new + { + Title = title, + SubTitle = subtitle, + } + }; + } + public static SignalRMessage LibraryModifiedEvent(int libraryId, string action) { return new SignalRMessage diff --git a/API/Startup.cs b/API/Startup.cs index 31342e7d9..768e46f8d 100644 --- a/API/Startup.cs +++ b/API/Startup.cs @@ -152,8 +152,10 @@ namespace API .UseMemoryStorage()); // Add the processing server as IHostedService - services.AddHangfireServer(); - + services.AddHangfireServer(options => + { + options.Queues = new[] {TaskScheduler.ScanQueue, TaskScheduler.DefaultQueue}; + }); // Add IHostedService for startup tasks // Any services that should be bootstrapped go here services.AddHostedService(); diff --git a/API/config/appsettings.Development.json b/API/config/appsettings.Development.json index 78d892e05..0d7c12bda 100644 --- a/API/config/appsettings.Development.json +++ b/API/config/appsettings.Development.json @@ -5,7 +5,7 @@ "TokenKey": "super secret unguessable key", "Logging": { "LogLevel": { - "Default": "Critical", + "Default": "Debug", "Microsoft": "Information", "Microsoft.Hosting.Lifetime": "Error", "Hangfire": "Information", diff --git a/Kavita.Common/Helpers/GlobMatcher.cs b/Kavita.Common/Helpers/GlobMatcher.cs new file mode 100644 index 000000000..3abd06f22 --- /dev/null +++ b/Kavita.Common/Helpers/GlobMatcher.cs @@ -0,0 +1,64 @@ +using System.Collections.Generic; +using System.Linq; +using DotNet.Globbing; + +namespace Kavita.Common.Helpers; + +/** + * Matches against strings using Glob syntax + */ +public class GlobMatcher +{ + private readonly IList _includes = new List(); + private readonly IList _excludes = new List(); + + public void AddInclude(string pattern) + { + _includes.Add(Glob.Parse(pattern)); + } + + public void AddExclude(string pattern) + { + _excludes.Add(Glob.Parse(pattern)); + } + + public bool ExcludeMatches(string file) + { + // NOTE: Glob.IsMatch() returns the opposite of what you'd expect + return _excludes.Any(p => p.IsMatch(file)); + } + + + /// + /// + /// + /// + /// + /// True if any + public bool IsMatch(string file, bool mustMatchIncludes = false) + { + // NOTE: Glob.IsMatch() returns the opposite of what you'd expect + if (_excludes.Any(p => p.IsMatch(file))) return true; + if (mustMatchIncludes) + { + return _includes.Any(p => p.IsMatch(file)); + } + + return false; + } + + public void Merge(GlobMatcher matcher) + { + if (matcher == null) return; + foreach (var glob in matcher._excludes) + { + _excludes.Add(glob); + } + + foreach (var glob in matcher._includes) + { + _includes.Add(glob); + } + + } +} diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index c0db9f43f..bc15a8c6e 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -9,6 +9,7 @@ + diff --git a/UI/Web/src/app/_models/events/info-event.ts 
b/UI/Web/src/app/_models/events/info-event.ts new file mode 100644 index 000000000..9ad5a1616 --- /dev/null +++ b/UI/Web/src/app/_models/events/info-event.ts @@ -0,0 +1,32 @@ +import { EVENTS } from "src/app/_services/message-hub.service"; + +export interface InfoEvent { + /** + * Payload of the event subtype + */ + body: any; + /** + * Subtype event + */ + name: EVENTS.Info; + /** + * Title to display in events widget + */ + title: string; + /** + * Optional subtitle to display. Defaults to empty string + */ + subTitle: string; + /** + * Type of event. Helps events widget to understand how to handle said event + */ + eventType: 'single'; + /** + * Type of progress. Helps widget understand how to display spinner + */ + progress: 'none'; + /** + * When event was sent + */ + eventTime: string; +} \ No newline at end of file diff --git a/UI/Web/src/app/_models/series.ts b/UI/Web/src/app/_models/series.ts index 8ceda4fc3..ae52f902a 100644 --- a/UI/Web/src/app/_models/series.ts +++ b/UI/Web/src/app/_models/series.ts @@ -55,4 +55,8 @@ export interface Series { minHoursToRead: number; maxHoursToRead: number; avgHoursToRead: number; + /** + * Highest level folder containing this series + */ + folderPath: string; } diff --git a/UI/Web/src/app/_services/message-hub.service.ts b/UI/Web/src/app/_services/message-hub.service.ts index 5ceb31e50..961afd6cb 100644 --- a/UI/Web/src/app/_services/message-hub.service.ts +++ b/UI/Web/src/app/_services/message-hub.service.ts @@ -71,7 +71,11 @@ export enum EVENTS { /** * When files are being scanned to calculate word count */ - WordCountAnalyzerProgress = 'WordCountAnalyzerProgress' + WordCountAnalyzerProgress = 'WordCountAnalyzerProgress', + /** + * When the user needs to be informed, but it's not a big deal + */ + Info = 'Info', } export interface Message { @@ -217,6 +221,13 @@ export class MessageHubService { }); }); + this.hubConnection.on(EVENTS.Info, resp => { + this.messagesSource.next({ + event: EVENTS.Info, + payload: resp.body + }); + }); + this.hubConnection.on(EVENTS.SeriesAdded, resp => { this.messagesSource.next({ event: EVENTS.SeriesAdded, diff --git a/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html b/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html index c417bbed3..7194667f9 100644 --- a/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html +++ b/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html @@ -345,10 +345,12 @@
Created: {{series.created | date:'shortDate'}}
- Last Read: {{series.latestReadDate | date:'shortDate'}}
- Last Added To: {{series.lastChapterAdded | date:'shortDate'}}
+ Last Read: {{series.latestReadDate | date:'shortDate' | defaultDate}}
+ Last Added To: {{series.lastChapterAdded | date:'shortDate' | defaultDate}}
+ Folder Path: {{series.folderPath | defaultValue}}
+
Max Items: {{metadata.maxCount}}
Total Items: {{metadata.totalCount}}
Publication Status: {{metadata.publicationStatus | publicationStatus}}
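
The hunk above chains the dates through a defaultDate pipe and the folder path through a defaultValue pipe; those pipe implementations live outside this hunk. A minimal sketch of what they might look like — the pipe names come from the template, while the fallback strings ('Never', 'N/A') and the null-handling are assumptions, not the shipped implementation:

import { Pipe, PipeTransform } from '@angular/core';

// Hypothetical sketch only: substitutes a placeholder when the chained date pipe yields nothing.
@Pipe({ name: 'defaultDate' })
export class DefaultDatePipe implements PipeTransform {
  transform(value: string | null | undefined): string {
    // Unset dates arrive here as null/empty after passing through | date
    return value ? value : 'Never';
  }
}

// Hypothetical sketch only: generic fallback for empty values such as series.folderPath.
@Pipe({ name: 'defaultValue' })
export class DefaultValuePipe implements PipeTransform {
  transform(value: string | null | undefined): string {
    return value === null || value === undefined || value === '' ? 'N/A' : value;
  }
}
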
diff --git a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts index 2e31fff40..81676a474 100644 --- a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts +++ b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts @@ -1,8 +1,8 @@ import { CdkVirtualScrollViewport } from '@angular/cdk/scrolling'; import { DOCUMENT } from '@angular/common'; -import { AfterContentInit, AfterViewInit, ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core'; +import { AfterViewInit, ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core'; import { VirtualScrollerComponent } from '@iharbeck/ngx-virtual-scroller'; -import { first, Subject, takeUntil, takeWhile } from 'rxjs'; +import { Subject } from 'rxjs'; import { FilterSettings } from 'src/app/metadata-filter/filter-settings'; import { Breakpoint, UtilityService } from 'src/app/shared/_services/utility.service'; import { JumpKey } from 'src/app/_models/jumpbar/jump-key'; @@ -77,6 +77,7 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, private jumpbarService: JumpbarService) { this.filter = this.seriesService.createSeriesFilter(); this.changeDetectionRef.markForCheck(); + } @HostListener('window:resize', ['$event']) @@ -108,10 +109,11 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, this.virtualScroller.refresh(); }); } + } ngAfterViewInit(): void { - // NOTE: I can't seem to figure out a way to resume the JumpKey with the scroller. + // NOTE: I can't seem to figure out a way to resume the JumpKey with the scroller. 
// this.virtualScroller.vsUpdate.pipe(takeWhile(() => this.hasResumedJumpKey), takeUntil(this.onDestory)).subscribe(() => { // const resumeKey = this.jumpbarService.getResumeKey(this.header); // console.log('Resume key:', resumeKey); @@ -130,7 +132,6 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, ngOnChanges(): void { this.jumpBarKeysToRender = [...this.jumpBarKeys]; this.resizeJumpBar(); - } diff --git a/UI/Web/src/app/dashboard/dashboard.component.ts b/UI/Web/src/app/dashboard/dashboard.component.ts index a7d4426c0..7b7323608 100644 --- a/UI/Web/src/app/dashboard/dashboard.component.ts +++ b/UI/Web/src/app/dashboard/dashboard.component.ts @@ -79,7 +79,8 @@ export class DashboardComponent implements OnInit, OnDestroy { ); this.loadRecentlyAdded$.pipe(debounceTime(1000), takeUntil(this.onDestroy)).subscribe(() => { - this.loadRecentlyAdded(); + this.loadRecentlyUpdated(); + this.loadRecentlyAddedSeries(); this.cdRef.markForCheck(); }); } @@ -104,7 +105,7 @@ export class DashboardComponent implements OnInit, OnDestroy { reloadSeries() { this.loadOnDeck(); - this.loadRecentlyAdded(); + this.loadRecentlyUpdated(); this.loadRecentlyAddedSeries(); } @@ -144,7 +145,7 @@ export class DashboardComponent implements OnInit, OnDestroy { } - loadRecentlyAdded() { + loadRecentlyUpdated() { let api = this.seriesService.getRecentlyUpdatedSeries(); if (this.libraryId > 0) { api = this.seriesService.getRecentlyUpdatedSeries(); diff --git a/UI/Web/src/app/library-detail/library-detail.component.html b/UI/Web/src/app/library-detail/library-detail.component.html index 74a4ab8ac..c3e70a238 100644 --- a/UI/Web/src/app/library-detail/library-detail.component.html +++ b/UI/Web/src/app/library-detail/library-detail.component.html @@ -26,6 +26,7 @@ [trackByIdentity]="trackByIdentity" [filterOpen]="filterOpen" [jumpBarKeys]="jumpKeys" + [refresh]="refresh" (applyFilter)="updateFilter($event)" > diff --git a/UI/Web/src/app/library-detail/library-detail.component.ts b/UI/Web/src/app/library-detail/library-detail.component.ts index ab9946b28..af2f1d996 100644 --- a/UI/Web/src/app/library-detail/library-detail.component.ts +++ b/UI/Web/src/app/library-detail/library-detail.component.ts @@ -41,6 +41,7 @@ export class LibraryDetailComponent implements OnInit, OnDestroy { filterOpen: EventEmitter = new EventEmitter(); filterActive: boolean = false; filterActiveCheck!: SeriesFilter; + refresh: EventEmitter = new EventEmitter(); jumpKeys: Array = []; @@ -141,15 +142,38 @@ export class LibraryDetailComponent implements OnInit, OnDestroy { } ngOnInit(): void { - this.hubService.messages$.pipe(debounceTime(6000), takeUntil(this.onDestroy)).subscribe((event) => { + this.hubService.messages$.pipe(takeUntil(this.onDestroy)).subscribe((event) => { if (event.event === EVENTS.SeriesAdded) { const seriesAdded = event.payload as SeriesAddedEvent; if (seriesAdded.libraryId !== this.libraryId) return; - this.loadPage(); + if (!this.utilityService.deepEqual(this.filter, this.filterActiveCheck)) { + this.loadPage(); + return; + } + this.seriesService.getSeries(seriesAdded.seriesId).subscribe(s => { + this.series = [...this.series, s].sort((s1: Series, s2: Series) => { + if (s1.sortName < s2.sortName) return -1; + if (s1.sortName > s2.sortName) return 1; + return 0; + }); + this.pagination.totalItems++; + this.cdRef.markForCheck(); + this.refresh.emit(); + }); + + } else if (event.event === EVENTS.SeriesRemoved) { const seriesRemoved = event.payload as SeriesRemovedEvent; if (seriesRemoved.libraryId !== 
this.libraryId) return; - this.loadPage(); + if (!this.utilityService.deepEqual(this.filter, this.filterActiveCheck)) { + this.loadPage(); + return; + } + + this.series = this.series.filter(s => s.id != seriesRemoved.seriesId); + this.pagination.totalItems--; + this.cdRef.markForCheck(); + this.refresh.emit(); } }); } @@ -228,5 +252,5 @@ export class LibraryDetailComponent implements OnInit, OnDestroy { this.router.navigate(['library', this.libraryId, 'series', series.id]); } - trackByIdentity = (index: number, item: Series) => `${item.name}_${item.localizedName}_${item.pagesRead}`; + trackByIdentity = (index: number, item: Series) => `${item.id}_${item.name}_${item.localizedName}_${item.pagesRead}`; } diff --git a/UI/Web/src/app/manga-reader/manga-reader.component.ts b/UI/Web/src/app/manga-reader/manga-reader.component.ts index 4da92b8de..5a564021d 100644 --- a/UI/Web/src/app/manga-reader/manga-reader.component.ts +++ b/UI/Web/src/app/manga-reader/manga-reader.component.ts @@ -486,11 +486,6 @@ export class MangaReaderComponent implements OnInit, AfterViewInit, OnDestroy { this.updateForm(); - this.generalSettingsForm.get('darkness')?.valueChanges.pipe(takeUntil(this.onDestroy)).subscribe(val => { - console.log('brightness: ', val); - //this.cdRef.markForCheck(); - }); - this.generalSettingsForm.get('layoutMode')?.valueChanges.pipe(takeUntil(this.onDestroy)).subscribe(val => { const changeOccurred = parseInt(val, 10) !== this.layoutMode; diff --git a/UI/Web/src/app/nav/events-widget/events-widget.component.html b/UI/Web/src/app/nav/events-widget/events-widget.component.html index 5675eba01..051b55d78 100644 --- a/UI/Web/src/app/nav/events-widget/events-widget.component.html +++ b/UI/Web/src/app/nav/events-widget/events-widget.component.html @@ -1,17 +1,27 @@ - + + + -
    +
      + + +
    • + Dismiss All +
    • +
      +
    • Title goes here
      @@ -46,6 +56,13 @@
+
  • +
    +
    Scan didn't run because there was nothing to do
    +
    Click for more information
    +
    + +
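For reference, an Info payload as the widget consumes it lines up with the InfoEvent interface introduced earlier in this patch; the values below are illustrative only, not a real server message:

import { EVENTS } from 'src/app/_services/message-hub.service';
import { InfoEvent } from 'src/app/_models/events/info-event';

// Illustrative payload matching the InfoEvent interface; the wording
// mirrors the template text above and is an assumption.
const exampleInfo: InfoEvent = {
  name: EVENTS.Info,
  title: "Scan didn't run because there was nothing to do",
  subTitle: 'Click for more information',
  body: null,
  eventType: 'single',
  progress: 'none',
  eventTime: new Date().toISOString(),
};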
  • @@ -59,6 +76,7 @@
    PDFs
  • + @@ -119,12 +137,25 @@ - + + + + + + + diff --git a/UI/Web/src/app/nav/events-widget/events-widget.component.scss b/UI/Web/src/app/nav/events-widget/events-widget.component.scss index 4f10fafe8..acdbae134 100644 --- a/UI/Web/src/app/nav/events-widget/events-widget.component.scss +++ b/UI/Web/src/app/nav/events-widget/events-widget.component.scss @@ -69,6 +69,11 @@ border-radius: 60px; } +.colored-info { + background-color: var(--event-widget-info-bg-color) !important; + border-radius: 60px; +} + .update-available { cursor: pointer; @@ -95,4 +100,23 @@ font-size: 11px; position: absolute; } +} + +.info { + cursor: pointer; + position: relative; + .h6 { + color: var(--event-widget-info-bg-color); + } + + i.fa { + color: var(--primary-color) !important; + } + + .btn-close { + top: 10px; + right: 10px; + font-size: 11px; + position: absolute; + } } \ No newline at end of file diff --git a/UI/Web/src/app/nav/events-widget/events-widget.component.ts b/UI/Web/src/app/nav/events-widget/events-widget.component.ts index 4c6a76148..a86d0dc93 100644 --- a/UI/Web/src/app/nav/events-widget/events-widget.component.ts +++ b/UI/Web/src/app/nav/events-widget/events-widget.component.ts @@ -7,6 +7,7 @@ import { ConfirmService } from 'src/app/shared/confirm.service'; import { UpdateNotificationModalComponent } from 'src/app/shared/update-notification/update-notification-modal.component'; import { DownloadService } from 'src/app/shared/_services/download.service'; import { ErrorEvent } from 'src/app/_models/events/error-event'; +import { InfoEvent } from 'src/app/_models/events/info-event'; import { NotificationProgressEvent } from 'src/app/_models/events/notification-progress-event'; import { UpdateVersionEvent } from 'src/app/_models/events/update-version-event'; import { User } from 'src/app/_models/user'; @@ -38,6 +39,9 @@ export class EventsWidgetComponent implements OnInit, OnDestroy { errorSource = new BehaviorSubject([]); errors$ = this.errorSource.asObservable(); + infoSource = new BehaviorSubject([]); + infos$ = this.infoSource.asObservable(); + private updateNotificationModalRef: NgbModalRef | null = null; activeEvents: number = 0; @@ -64,6 +68,7 @@ export class EventsWidgetComponent implements OnInit, OnDestroy { ngOnInit(): void { this.messageHub.messages$.pipe(takeUntil(this.onDestroy)).subscribe(event => { if (event.event === EVENTS.NotificationProgress) { + console.log('[Event Widget]: Event came in ', event.payload); this.processNotificationProgressEvent(event); } else if (event.event === EVENTS.Error) { const values = this.errorSource.getValue(); @@ -71,6 +76,12 @@ export class EventsWidgetComponent implements OnInit, OnDestroy { this.errorSource.next(values); this.activeEvents += 1; this.cdRef.markForCheck(); + } else if (event.event === EVENTS.Info) { + const values = this.infoSource.getValue(); + values.push(event.payload as InfoEvent); + this.infoSource.next(values); + this.activeEvents += 1; + this.cdRef.markForCheck(); } }); @@ -139,28 +150,46 @@ export class EventsWidgetComponent implements OnInit, OnDestroy { }); } - async seeMoreError(error: ErrorEvent) { + async seeMore(event: ErrorEvent | InfoEvent) { const config = new ConfirmConfig(); config.buttons = [ - {text: 'Dismiss', type: 'primary'}, {text: 'Ok', type: 'secondary'}, ]; - config.header = error.title; - config.content = error.subTitle; - var result = await this.confirmService.alert(error.subTitle || error.title, config); + if (event.name === EVENTS.Error) { + config.buttons = [{text: 'Dismiss', type: 'primary'}, 
...config.buttons]; + } + config.header = event.title; + config.content = event.subTitle; + var result = await this.confirmService.alert(event.subTitle || event.title, config); if (result) { - this.removeError(error); + this.removeErrorOrInfo(event); } } - removeError(error: ErrorEvent, event?: MouseEvent) { + clearAllErrorOrInfos() { + const infoCount = this.infoSource.getValue().length; + const errorCount = this.errorSource.getValue().length; + this.infoSource.next([]); + this.errorSource.next([]); + this.activeEvents -= Math.max(infoCount + errorCount, 0); + this.cdRef.markForCheck(); + } + + removeErrorOrInfo(messageEvent: ErrorEvent | InfoEvent, event?: MouseEvent) { if (event) { event.stopPropagation(); event.preventDefault(); } - let data = this.errorSource.getValue(); - data = data.filter(m => m !== error); - this.errorSource.next(data); + let data = []; + if (messageEvent.name === EVENTS.Info) { + data = this.infoSource.getValue(); + data = data.filter(m => m !== messageEvent); + this.infoSource.next(data); + } else { + data = this.errorSource.getValue(); + data = data.filter(m => m !== messageEvent); + this.errorSource.next(data); + } this.activeEvents = Math.max(this.activeEvents - 1, 0); this.cdRef.markForCheck(); } diff --git a/UI/Web/src/app/nav/nav-header/nav-header.component.html b/UI/Web/src/app/nav/nav-header/nav-header.component.html index c233e0cfc..c2ba5e7c8 100644 --- a/UI/Web/src/app/nav/nav-header/nav-header.component.html +++ b/UI/Web/src/app/nav/nav-header/nav-header.component.html @@ -2,7 +2,185 @@
    Skip to main content - Kavita + + + + + Kavita + From 66a998425bbf6edc4643004075dbb24735455f39 Mon Sep 17 00:00:00 2001 From: majora2007 Date: Sat, 20 Aug 2022 16:48:23 +0000 Subject: [PATCH 006/134] Bump versions by dotnet-bump-version. --- Kavita.Common/Kavita.Common.csproj | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index 620ee909d..d3d79f460 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -4,7 +4,7 @@ net6.0 kavitareader.com Kavita - 0.5.5.3 + 0.5.5.4 en From 329970f01b638af2ba42c635ac3c770624a09318 Mon Sep 17 00:00:00 2001 From: tjarls Date: Sat, 20 Aug 2022 21:31:00 +0100 Subject: [PATCH 007/134] Simplify parent lookup with Directory.GetParent (#1455) * Simplify parent lookup with Directory.GetParent * Address comments --- API.Tests/Services/DirectoryServiceTests.cs | 31 +++++++++++++++++++++ API/Services/DirectoryService.cs | 31 ++++++--------------- 2 files changed, 40 insertions(+), 22 deletions(-) diff --git a/API.Tests/Services/DirectoryServiceTests.cs b/API.Tests/Services/DirectoryServiceTests.cs index fac04bf9e..2e02641b9 100644 --- a/API.Tests/Services/DirectoryServiceTests.cs +++ b/API.Tests/Services/DirectoryServiceTests.cs @@ -963,5 +963,36 @@ namespace API.Tests.Services } #endregion + + #region GetParentDirectory + + [Theory] + [InlineData(@"C:/file.txt", "C:/")] + [InlineData(@"C:/folder/file.txt", "C:/folder")] + [InlineData(@"C:/folder/subfolder/file.txt", "C:/folder/subfolder")] + public void GetParentDirectoryName_ShouldFindParentOfFiles(string path, string expected) + { + var fileSystem = new MockFileSystem(new Dictionary + { + { path, new MockFileData(string.Empty)} + }); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + Assert.Equal(expected, ds.GetParentDirectoryName(path)); + } + [Theory] + [InlineData(@"C:/folder", "C:/")] + [InlineData(@"C:/folder/subfolder", "C:/folder")] + [InlineData(@"C:/folder/subfolder/another", "C:/folder/subfolder")] + public void GetParentDirectoryName_ShouldFindParentOfDirectories(string path, string expected) + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory(path); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + Assert.Equal(expected, ds.GetParentDirectoryName(path)); + } + + #endregion } } diff --git a/API/Services/DirectoryService.cs b/API/Services/DirectoryService.cs index 8c7d29603..5de343ea4 100644 --- a/API/Services/DirectoryService.cs +++ b/API/Services/DirectoryService.cs @@ -1,4 +1,4 @@ -using System; +using System; using System.Collections.Generic; using System.Collections.Immutable; using System.IO; @@ -554,31 +554,18 @@ namespace API.Services /// /// Returns the parent directories name for a file or folder. Empty string is path is not valid. 
/// - /// This does touch I/O with an Attribute lookup /// /// public string GetParentDirectoryName(string fileOrFolder) { - // TODO: Write Unit tests - try - { - var attr = File.GetAttributes(fileOrFolder); - var isDirectory = attr.HasFlag(FileAttributes.Directory); - if (isDirectory) - { - return Parser.Parser.NormalizePath(FileSystem.DirectoryInfo - .FromDirectoryName(fileOrFolder).Parent - .FullName); - } - - return Parser.Parser.NormalizePath(FileSystem.FileInfo - .FromFileName(fileOrFolder).Directory.Parent - .FullName); - } - catch (Exception) - { - return string.Empty; - } + try + { + return Parser.Parser.NormalizePath(Directory.GetParent(fileOrFolder).FullName); + } + catch (Exception) + { + return string.Empty; + } } /// From 354be09c4c878820a5e401e788902c690466c72b Mon Sep 17 00:00:00 2001 From: majora2007 Date: Sat, 20 Aug 2022 20:43:40 +0000 Subject: [PATCH 008/134] Bump versions by dotnet-bump-version. --- Kavita.Common/Kavita.Common.csproj | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index d3d79f460..203b1ff96 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -4,7 +4,7 @@ net6.0 kavitareader.com Kavita - 0.5.5.4 + 0.5.5.5 en From 1c9544fc47af772f2a0efe00bc0a5b037bb14ef0 Mon Sep 17 00:00:00 2001 From: Joseph Milazzo Date: Mon, 22 Aug 2022 12:14:31 -0500 Subject: [PATCH 009/134] Scan Loop Fixes (#1459) * Added Last Folder Scanned time to series info modal. Tweaked the info event detail modal to have a primary and thus be auto-dismissable * Added an error event when multiple series are found in processing a series. * Fixed a bug where a series could get stuck with other series due to a bad select query. Started adding the force flag hook for the UI and designing the confirm. Confirm service now also has ability to hide the close button. Updated error events and logging in the loop, to be more informative * Fixed a bug where confirm service wasn't showing the proper body content. * Hooked up force scan series * refresh metadata now has force update * Fixed up the messaging with the prompt on scan, hooked it up properly in the scan library to avoid the check if the whole library needs to even be scanned. Fixed a bug where NormalizedLocalizedName wasn't being calculated on new entities. Started adding unit tests for this problematic repo method. * Fixed a bug where we updated NormalizedLocalizedName before we set it. * Send an info to the UI when series are spread between multiple library level folders. * Added some logger output when there are no files found in a folder. Return early if there are no files found, so we can avoid some small loops of code. * Fixed an issue where multiple series in a folder with localized series would cause unintended grouping. This is not supported and hence we will warn them and allow the bad grouping. * Added a case where scan series fails due to the folder being removed. We will now log an error * Normalize paths when finding the highest directory till root. * Fixed an issue with Scan Series where changing a series' folder to a different path but the original series folder existed with another series in it, would cause the series to not be deleted. * Fixed some bugs around specials causing a series merge issue on scan series. * Removed a bug marker * Cleaned up some of the scan loop and removed a test I don't need. * Remove any prompts for force flow, it doesn't work well. Leave the API as is though. 
* Fixed up a check for duplicate ScanLibrary calls --- API.Tests/Repository/SeriesRepositoryTests.cs | 156 ++++++++++++++++++ API.Tests/Services/ParseScannedFilesTests.cs | 90 ---------- API/Controllers/LibraryController.cs | 8 +- API/DTOs/SeriesDto.cs | 4 + API/Data/DbFactory.cs | 20 +++ API/Data/Repositories/SeriesRepository.cs | 18 +- API/Services/DirectoryService.cs | 4 +- API/Services/TaskScheduler.cs | 23 ++- .../Tasks/Scanner/ParseScannedFiles.cs | 63 +++---- API/Services/Tasks/Scanner/ProcessSeries.cs | 18 +- API/Services/Tasks/ScannerService.cs | 94 ++++++----- UI/Web/src/app/_models/series.ts | 4 + .../app/_services/action-factory.service.ts | 8 +- UI/Web/src/app/_services/action.service.ts | 26 ++- UI/Web/src/app/_services/library.service.ts | 8 +- UI/Web/src/app/_services/series.service.ts | 4 +- .../edit-series-modal.component.html | 5 +- .../card-actionables.component.html | 2 +- .../series-card/series-card.component.ts | 2 +- .../library-detail.component.ts | 2 +- .../events-widget/events-widget.component.ts | 12 +- .../series-detail/series-detail.component.ts | 2 +- .../confirm-dialog/_models/confirm-button.ts | 2 +- .../confirm-dialog/_models/confirm-config.ts | 4 + .../confirm-dialog.component.html | 5 +- UI/Web/src/app/shared/confirm.service.ts | 3 + .../sidenav/side-nav/side-nav.component.ts | 2 +- 27 files changed, 367 insertions(+), 222 deletions(-) create mode 100644 API.Tests/Repository/SeriesRepositoryTests.cs diff --git a/API.Tests/Repository/SeriesRepositoryTests.cs b/API.Tests/Repository/SeriesRepositoryTests.cs new file mode 100644 index 000000000..16f365d88 --- /dev/null +++ b/API.Tests/Repository/SeriesRepositoryTests.cs @@ -0,0 +1,156 @@ +using System.Collections.Generic; +using System.Data.Common; +using System.IO.Abstractions.TestingHelpers; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Entities; +using API.Entities.Enums; +using API.Helpers; +using API.Services; +using AutoMapper; +using Microsoft.Data.Sqlite; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; +using Microsoft.Extensions.Logging; +using NSubstitute; +using Xunit; + +namespace API.Tests.Repository; + +public class SeriesRepositoryTests +{ + private readonly IUnitOfWork _unitOfWork; + + private readonly DbConnection _connection; + private readonly DataContext _context; + + private const string CacheDirectory = "C:/kavita/config/cache/"; + private const string CoverImageDirectory = "C:/kavita/config/covers/"; + private const string BackupDirectory = "C:/kavita/config/backups/"; + private const string DataDirectory = "C:/data/"; + + public SeriesRepositoryTests() + { + var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options; + _connection = RelationalOptionsExtension.Extract(contextOptions).Connection; + + _context = new DataContext(contextOptions); + Task.Run(SeedDb).GetAwaiter().GetResult(); + + var config = new MapperConfiguration(cfg => cfg.AddProfile()); + var mapper = config.CreateMapper(); + _unitOfWork = new UnitOfWork(_context, mapper, null); + } + + #region Setup + + private static DbConnection CreateInMemoryDatabase() + { + var connection = new SqliteConnection("Filename=:memory:"); + + connection.Open(); + + return connection; + } + + private async Task SeedDb() + { + await _context.Database.MigrateAsync(); + var filesystem = CreateFileSystem(); + + await Seed.SeedSettings(_context, + new DirectoryService(Substitute.For>(), filesystem)); + + var setting = await 
_context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync(); + setting.Value = CacheDirectory; + + setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync(); + setting.Value = BackupDirectory; + + _context.ServerSetting.Update(setting); + + var lib = new Library() + { + Name = "Manga", Folders = new List() {new FolderPath() {Path = "C:/data/"}} + }; + + _context.AppUser.Add(new AppUser() + { + UserName = "majora2007", + Libraries = new List() + { + lib + } + }); + + return await _context.SaveChangesAsync() > 0; + } + + private async Task ResetDb() + { + _context.Series.RemoveRange(_context.Series.ToList()); + _context.AppUserRating.RemoveRange(_context.AppUserRating.ToList()); + _context.Genre.RemoveRange(_context.Genre.ToList()); + _context.CollectionTag.RemoveRange(_context.CollectionTag.ToList()); + _context.Person.RemoveRange(_context.Person.ToList()); + + await _context.SaveChangesAsync(); + } + + private static MockFileSystem CreateFileSystem() + { + var fileSystem = new MockFileSystem(); + fileSystem.Directory.SetCurrentDirectory("C:/kavita/"); + fileSystem.AddDirectory("C:/kavita/config/"); + fileSystem.AddDirectory(CacheDirectory); + fileSystem.AddDirectory(CoverImageDirectory); + fileSystem.AddDirectory(BackupDirectory); + fileSystem.AddDirectory(DataDirectory); + + return fileSystem; + } + + #endregion + + private async Task SetupSeriesData() + { + var library = new Library() + { + Name = "Manga", + Type = LibraryType.Manga, + Folders = new List() + { + new FolderPath() {Path = "C:/data/manga/"} + } + }; + + library.Series = new List() + { + DbFactory.Series("The Idaten Deities Know Only Peace", "Heion Sedai no Idaten-tachi"), + }; + + _unitOfWork.LibraryRepository.Add(library); + await _unitOfWork.CommitAsync(); + } + + + [InlineData("Heion Sedai no Idaten-tachi", "", "The Idaten Deities Know Only Peace")] // Matching on localized name in DB + public async Task GetFullSeriesByAnyName_Should(string seriesName, string localizedName, string? 
expected) + { + var firstSeries = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(1); + var series = + await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(seriesName, localizedName, + 1); + if (expected == null) + { + Assert.Null(series); + } + else + { + Assert.NotNull(series); + Assert.Equal(expected, series.Name); + } + } + +} diff --git a/API.Tests/Services/ParseScannedFilesTests.cs b/API.Tests/Services/ParseScannedFilesTests.cs index 2bcbb1271..b63205b7a 100644 --- a/API.Tests/Services/ParseScannedFilesTests.cs +++ b/API.Tests/Services/ParseScannedFilesTests.cs @@ -156,96 +156,6 @@ public class ParseScannedFilesTests #endregion - #region GetInfosByName - - [Fact] - public void GetInfosByName_ShouldReturnGivenMatchingSeriesName() - { - var fileSystem = new MockFileSystem(); - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - var infos = new List() - { - ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false), - ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false) - }; - var parsedSeries = new Dictionary> - { - { - new ParsedSeries() - { - Format = MangaFormat.Archive, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - infos - }, - { - new ParsedSeries() - { - Format = MangaFormat.Pdf, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - new List() - } - }; - - var series = DbFactory.Series("Accel World"); - series.Format = MangaFormat.Pdf; - - Assert.Empty(ParseScannedFiles.GetInfosByName(parsedSeries, series)); - - series.Format = MangaFormat.Archive; - Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count()); - - } - - [Fact] - public void GetInfosByName_ShouldReturnGivenMatchingNormalizedSeriesName() - { - var fileSystem = new MockFileSystem(); - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - var infos = new List() - { - ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false), - ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false) - }; - var parsedSeries = new Dictionary> - { - { - new ParsedSeries() - { - Format = MangaFormat.Archive, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - infos - }, - { - new ParsedSeries() - { - Format = MangaFormat.Pdf, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - new List() - } - }; - - var series = DbFactory.Series("accel world"); - series.Format = MangaFormat.Archive; - Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count()); - - } - - #endregion - #region MergeName // NOTE: I don't think I can test MergeName as it relies on Tracking Files, which is more complicated than I need diff --git a/API/Controllers/LibraryController.cs b/API/Controllers/LibraryController.cs index 7b99763a2..321d0d06f 100644 --- a/API/Controllers/LibraryController.cs +++ b/API/Controllers/LibraryController.cs @@ -168,17 +168,17 @@ namespace API.Controllers [Authorize(Policy = "RequireAdminRole")] [HttpPost("scan")] - public ActionResult Scan(int libraryId) + public ActionResult Scan(int libraryId, bool force = false) { - 
_taskScheduler.ScanLibrary(libraryId); + _taskScheduler.ScanLibrary(libraryId, force); return Ok(); } [Authorize(Policy = "RequireAdminRole")] [HttpPost("refresh-metadata")] - public ActionResult RefreshMetadata(int libraryId) + public ActionResult RefreshMetadata(int libraryId, bool force = true) { - _taskScheduler.RefreshMetadata(libraryId); + _taskScheduler.RefreshMetadata(libraryId, force); return Ok(); } diff --git a/API/DTOs/SeriesDto.cs b/API/DTOs/SeriesDto.cs index 2904bf57c..bbf65e9fb 100644 --- a/API/DTOs/SeriesDto.cs +++ b/API/DTOs/SeriesDto.cs @@ -58,5 +58,9 @@ namespace API.DTOs /// The highest level folder for this Series /// public string FolderPath { get; set; } + /// + /// The last time the folder for this series was scanned + /// + public DateTime LastFolderScanned { get; set; } } } diff --git a/API/Data/DbFactory.cs b/API/Data/DbFactory.cs index ad97958da..58cd834ef 100644 --- a/API/Data/DbFactory.cs +++ b/API/Data/DbFactory.cs @@ -24,6 +24,26 @@ namespace API.Data OriginalName = name, LocalizedName = name, NormalizedName = Parser.Parser.Normalize(name), + NormalizedLocalizedName = Parser.Parser.Normalize(name), + SortName = name, + Volumes = new List(), + Metadata = SeriesMetadata(Array.Empty()) + }; + } + + public static Series Series(string name, string localizedName) + { + if (string.IsNullOrEmpty(localizedName)) + { + localizedName = name; + } + return new Series + { + Name = name, + OriginalName = name, + LocalizedName = localizedName, + NormalizedName = Parser.Parser.Normalize(name), + NormalizedLocalizedName = Parser.Parser.Normalize(localizedName), SortName = name, Volumes = new List(), Metadata = SeriesMetadata(Array.Empty()) diff --git a/API/Data/Repositories/SeriesRepository.cs b/API/Data/Repositories/SeriesRepository.cs index 528d46902..3324d8713 100644 --- a/API/Data/Repositories/SeriesRepository.cs +++ b/API/Data/Repositories/SeriesRepository.cs @@ -1220,15 +1220,19 @@ public class SeriesRepository : ISeriesRepository /// public Task GetFullSeriesByAnyName(string seriesName, string localizedName, int libraryId) { - var localizedSeries = Parser.Parser.Normalize(seriesName); + var normalizedSeries = Parser.Parser.Normalize(seriesName); var normalizedLocalized = Parser.Parser.Normalize(localizedName); - return _context.Series - .Where(s => s.NormalizedName.Equals(localizedSeries) - || s.NormalizedName.Equals(normalizedLocalized) - || s.NormalizedLocalizedName.Equals(localizedSeries) - || s.NormalizedLocalizedName.Equals(normalizedLocalized)) + var query = _context.Series .Where(s => s.LibraryId == libraryId) - .Include(s => s.Metadata) + .Where(s => s.NormalizedName.Equals(normalizedSeries) + || (s.NormalizedLocalizedName.Equals(normalizedSeries) && s.NormalizedLocalizedName != string.Empty)); + if (!string.IsNullOrEmpty(normalizedLocalized)) + { + query = query.Where(s => + s.NormalizedName.Equals(normalizedLocalized) || s.NormalizedLocalizedName.Equals(normalizedLocalized)); + } + + return query.Include(s => s.Metadata) .ThenInclude(m => m.People) .Include(s => s.Metadata) .ThenInclude(m => m.Genres) diff --git a/API/Services/DirectoryService.cs b/API/Services/DirectoryService.cs index 5de343ea4..3c064dc11 100644 --- a/API/Services/DirectoryService.cs +++ b/API/Services/DirectoryService.cs @@ -492,10 +492,10 @@ namespace API.Services { var stopLookingForDirectories = false; var dirs = new Dictionary(); - foreach (var folder in libraryFolders) + foreach (var folder in libraryFolders.Select(Parser.Parser.NormalizePath)) { if (stopLookingForDirectories) break; 
- foreach (var file in filePaths) + foreach (var file in filePaths.Select(Parser.Parser.NormalizePath)) { if (!file.Contains(folder)) continue; diff --git a/API/Services/TaskScheduler.cs b/API/Services/TaskScheduler.cs index d419a0fa8..df7f20152 100644 --- a/API/Services/TaskScheduler.cs +++ b/API/Services/TaskScheduler.cs @@ -19,7 +19,7 @@ public interface ITaskScheduler Task ScheduleTasks(); Task ScheduleStatsTasks(); void ScheduleUpdaterTasks(); - void ScanLibrary(int libraryId); + void ScanLibrary(int libraryId, bool force = false); void CleanupChapters(int[] chapterIds); void RefreshMetadata(int libraryId, bool forceUpdate = true); void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false); @@ -174,9 +174,12 @@ public class TaskScheduler : ITaskScheduler _scannerService.ScanLibraries(); } - public void ScanLibrary(int libraryId) + public void ScanLibrary(int libraryId, bool force = false) { - if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}, ScanQueue)) + var alreadyEnqueued = + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) || + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue); + if (alreadyEnqueued) { _logger.LogInformation("A duplicate request to scan library for library occured. Skipping"); return; @@ -184,12 +187,12 @@ public class TaskScheduler : ITaskScheduler if (RunningAnyTasksByMethod(new List() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue)) { _logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours"); - BackgroundJob.Schedule(() => ScanLibrary(libraryId), TimeSpan.FromHours(3)); + BackgroundJob.Schedule(() => ScanLibrary(libraryId, force), TimeSpan.FromHours(3)); return; } _logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId); - BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId)); + BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId, force)); // When we do a scan, force cache to re-unpack in case page numbers change BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory()); } @@ -201,7 +204,11 @@ public class TaskScheduler : ITaskScheduler public void RefreshMetadata(int libraryId, bool forceUpdate = true) { - if (HasAlreadyEnqueuedTask("MetadataService","GenerateCoversForLibrary", new object[] {libraryId, forceUpdate})) + var alreadyEnqueued = HasAlreadyEnqueuedTask("MetadataService", "GenerateCoversForLibrary", + new object[] {libraryId, true}) || + HasAlreadyEnqueuedTask("MetadataService", "GenerateCoversForLibrary", + new object[] {libraryId, false}); + if (alreadyEnqueued) { _logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping"); return; @@ -232,7 +239,7 @@ public class TaskScheduler : ITaskScheduler } if (RunningAnyTasksByMethod(new List() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue)) { - _logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 mins"); + _logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 minutes"); BackgroundJob.Schedule(() => ScanSeries(libraryId, seriesId, forceUpdate), TimeSpan.FromMinutes(10)); return; } @@ -276,7 +283,7 @@ public class TaskScheduler : ITaskScheduler /// object[] of arguments in the order they are passed to enqueued job /// Queue to check against. 
Defaults to "default" /// - public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue) + private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue) { var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue); return enqueuedJobs.Any(j => j.Value.InEnqueuedState && diff --git a/API/Services/Tasks/Scanner/ParseScannedFiles.cs b/API/Services/Tasks/Scanner/ParseScannedFiles.cs index 5b46f212c..d993c2c9e 100644 --- a/API/Services/Tasks/Scanner/ParseScannedFiles.cs +++ b/API/Services/Tasks/Scanner/ParseScannedFiles.cs @@ -3,10 +3,8 @@ using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using System.Threading.Tasks; -using API.Entities; using API.Entities.Enums; using API.Extensions; -using API.Helpers; using API.Parser; using API.SignalR; using Microsoft.Extensions.Logging; @@ -68,26 +66,6 @@ namespace API.Services.Tasks.Scanner _eventHub = eventHub; } - /// - /// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list. - /// - /// - /// - /// - public static IList GetInfosByName(Dictionary> parsedSeries, Series series) - { - var allKeys = parsedSeries.Keys.Where(ps => - SeriesHelper.FindSeries(series, ps)); - - var infos = new List(); - foreach (var key in allKeys) - { - infos.AddRange(parsedSeries[key]); - } - - return infos; - } - /// /// This will Scan all files in a folder path. For each folder within the folderPath, FolderAction will be invoked for all files contained @@ -192,7 +170,7 @@ namespace API.Services.Tasks.Scanner /// /// /// Series Name to group this info into - public string MergeName(ConcurrentDictionary> scannedSeries, ParserInfo info) + private string MergeName(ConcurrentDictionary> scannedSeries, ParserInfo info) { var normalizedSeries = Parser.Parser.Normalize(info.Series); var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries); @@ -230,7 +208,7 @@ namespace API.Services.Tasks.Scanner /// - /// This is a new version which will process series by folder groups. + /// This will process series by folder groups. 
/// /// /// @@ -263,8 +241,16 @@ namespace API.Services.Tasks.Scanner } _logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder); await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated)); + if (files.Count == 0) + { + _logger.LogInformation("[ScannerService] {Folder} is empty", folder); + return; + } var scannedSeries = new ConcurrentDictionary>(); - var infos = files.Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)).Where(info => info != null).ToList(); + var infos = files + .Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)) + .Where(info => info != null) + .ToList(); MergeLocalizedSeriesWithSeries(infos); @@ -320,17 +306,36 @@ namespace API.Services.Tasks.Scanner /// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration" /// /// A collection of ParserInfos - private static void MergeLocalizedSeriesWithSeries(IReadOnlyCollection infos) + private void MergeLocalizedSeriesWithSeries(IReadOnlyCollection infos) { var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries)); if (!hasLocalizedSeries) return; - var localizedSeries = infos.Select(i => i.LocalizedSeries).Distinct() + var localizedSeries = infos + .Where(i => !i.IsSpecial) + .Select(i => i.LocalizedSeries) + .Distinct() .FirstOrDefault(i => !string.IsNullOrEmpty(i)); if (string.IsNullOrEmpty(localizedSeries)) return; - var nonLocalizedSeries = infos.Select(i => i.Series).Distinct() - .FirstOrDefault(series => !series.Equals(localizedSeries)); + // NOTE: If we have multiple series in a folder with a localized title, then this will fail. It will group into one series. User needs to fix this themselves. + string nonLocalizedSeries; + var nonLocalizedSeriesFound = infos.Where(i => !i.IsSpecial).Select(i => i.Series).Distinct().ToList(); + if (nonLocalizedSeriesFound.Count == 1) + { + nonLocalizedSeries = nonLocalizedSeriesFound.First(); + } + else + { + // There can be a case where there are multiple series in a folder that causes merging. + if (nonLocalizedSeriesFound.Count > 2) + { + _logger.LogError("[ScannerService] There are multiple series within one folder that contain localized series. This will cause them to group incorrectly. Please separate series into their own dedicated folder: {LocalizedSeries}", string.Join(", ", nonLocalizedSeriesFound)); + } + nonLocalizedSeries = nonLocalizedSeriesFound.FirstOrDefault(s => !s.Equals(localizedSeries)); + } + + if (string.IsNullOrEmpty(nonLocalizedSeries)) return; var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries); foreach (var infoNeedingMapping in infos.Where(i => diff --git a/API/Services/Tasks/Scanner/ProcessSeries.cs b/API/Services/Tasks/Scanner/ProcessSeries.cs index 29b9cab1d..bebfae4ea 100644 --- a/API/Services/Tasks/Scanner/ProcessSeries.cs +++ b/API/Services/Tasks/Scanner/ProcessSeries.cs @@ -88,7 +88,7 @@ public class ProcessSeries : IProcessSeries // Check if there is a Series var firstInfo = parsedInfos.First(); - Series series = null; + Series series; try { series = @@ -97,29 +97,29 @@ public class ProcessSeries : IProcessSeries } catch (Exception ex) { - _logger.LogError(ex, "There was an exception finding existing series for {SeriesName} with Localized name of {LocalizedName}. This indicates you have duplicate series with same name or localized name in the library. 
Correct this and rescan", firstInfo.Series, firstInfo.LocalizedSeries); + _logger.LogError(ex, "There was an exception finding existing series for {SeriesName} with Localized name of {LocalizedName} for library {LibraryId}. This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan", firstInfo.Series, firstInfo.LocalizedSeries, library.Id); + await _eventHub.SendMessageAsync(MessageFactory.Error, + MessageFactory.ErrorEvent($"There was an exception finding existing series for {firstInfo.Series} with Localized name of {firstInfo.LocalizedSeries} for library {library.Id}", + "This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan.")); return; } if (series == null) { seriesAdded = true; - series = DbFactory.Series(firstInfo.Series); - series.LocalizedName = firstInfo.LocalizedSeries; + series = DbFactory.Series(firstInfo.Series, firstInfo.LocalizedSeries); } if (series.LibraryId == 0) series.LibraryId = library.Id; try { - _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName); UpdateVolumes(series, parsedInfos); series.Pages = series.Volumes.Sum(v => v.Pages); series.NormalizedName = Parser.Parser.Normalize(series.Name); - series.NormalizedLocalizedName = Parser.Parser.Normalize(series.LocalizedName); series.OriginalName ??= parsedInfos[0].Series; if (series.Format == MangaFormat.Unknown) { @@ -144,13 +144,17 @@ public class ProcessSeries : IProcessSeries if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries)) { series.LocalizedName = localizedSeries; + series.NormalizedLocalizedName = Parser.Parser.Normalize(series.LocalizedName); } // Update series FolderPath here (TODO: Move this into it's own private method) var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path), parsedInfos.Select(f => f.FullFilePath).ToList()); if (seriesDirs.Keys.Count == 0) { - _logger.LogCritical("Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are in a folder"); + _logger.LogCritical("Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are under a single folder from library"); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} has files spread outside a single series folder", + "This has negative performance effects. 
Please ensure all series are under a single folder from library")); } else { diff --git a/API/Services/Tasks/ScannerService.cs b/API/Services/Tasks/ScannerService.cs index 332270fef..54953cbdf 100644 --- a/API/Services/Tasks/ScannerService.cs +++ b/API/Services/Tasks/ScannerService.cs @@ -29,7 +29,7 @@ public interface IScannerService [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - Task ScanLibrary(int libraryId); + Task ScanLibrary(int libraryId, bool forceUpdate = false); [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] @@ -62,6 +62,10 @@ public enum ScanCancelReason /// There has been no change to the filesystem since last scan /// NoChange = 2, + /// + /// The underlying folder is missing + /// + FolderMissing = 3 } /** @@ -117,10 +121,15 @@ public class ScannerService : IScannerService var library = libraries.FirstOrDefault(l => l.Folders.Select(Parser.Parser.NormalizePath).Contains(libraryFolder)); if (library != null) { - BackgroundJob.Enqueue(() => ScanLibrary(library.Id)); + BackgroundJob.Enqueue(() => ScanLibrary(library.Id, false)); } } + /// + /// + /// + /// + /// Not Used. Scan series will always force [Queue(TaskScheduler.ScanQueue)] public async Task ScanSeries(int seriesId, bool bypassFolderOptimizationChecks = true) { @@ -130,12 +139,7 @@ public class ScannerService : IScannerService var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new[] {seriesId}); var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(series.LibraryId, LibraryIncludes.Folders); var libraryPaths = library.Folders.Select(f => f.Path).ToList(); - if (await ShouldScanSeries(seriesId, library, libraryPaths, series, bypassFolderOptimizationChecks) != ScanCancelReason.NoCancel) return; - - - var parsedSeries = new Dictionary>(); - var seenSeries = new List(); - var processTasks = new List(); + if (await ShouldScanSeries(seriesId, library, libraryPaths, series, true) != ScanCancelReason.NoCancel) return; var folderPath = series.FolderPath; if (string.IsNullOrEmpty(folderPath) || !_directoryService.Exists(folderPath)) @@ -150,22 +154,32 @@ public class ScannerService : IScannerService } folderPath = seriesDirs.Keys.FirstOrDefault(); + + // We should check if folderPath is a library folder path and if so, return early and tell user to correct their setup. + if (libraryPaths.Contains(folderPath)) + { + _logger.LogCritical("[ScannerSeries] {SeriesName} scan aborted. Files for series are not in a nested folder under library path. Correct this and rescan", series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Files for series are not in a nested folder under library path. 
Correct this and rescan.")); + return; + } } if (string.IsNullOrEmpty(folderPath)) { - _logger.LogCritical("Scan Series could not find a single, valid folder root for files"); + _logger.LogCritical("[ScannerSeries] Scan Series could not find a single, valid folder root for files"); await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Scan Series could not find a single, valid folder root for files")); return; } + var parsedSeries = new Dictionary>(); + var processTasks = new List(); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); await _processSeries.Prime(); void TrackFiles(Tuple> parsedInfo) { - var skippedScan = parsedInfo.Item1; var parsedFiles = parsedInfo.Item2; if (parsedFiles.Count == 0) return; @@ -176,44 +190,21 @@ public class ScannerService : IScannerService Format = parsedFiles.First().Format }; - if (skippedScan) + if (!foundParsedSeries.NormalizedName.Equals(series.NormalizedName)) { - seenSeries.AddRange(parsedFiles.Select(pf => new ParsedSeries() - { - Name = pf.Series, - NormalizedName = Parser.Parser.Normalize(pf.Series), - Format = pf.Format - })); return; } - seenSeries.Add(foundParsedSeries); processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library)); parsedSeries.Add(foundParsedSeries, parsedFiles); } _logger.LogInformation("Beginning file scan on {SeriesName}", series.Name); - var scanElapsedTime = await ScanFiles(library, new []{folderPath}, false, TrackFiles, bypassFolderOptimizationChecks); + var scanElapsedTime = await ScanFiles(library, new []{folderPath}, false, TrackFiles, true); _logger.LogInformation("ScanFiles for {Series} took {Time}", series.Name, scanElapsedTime); await Task.WhenAll(processTasks); - // At this point, we've already inserted the series into the DB OR we haven't and seenSeries has our series - // We now need to do any leftover work, like removing - // We need to handle if parsedSeries is empty but seenSeries has our series - if (seenSeries.Any(s => s.NormalizedName.Equals(series.NormalizedName)) && parsedSeries.Keys.Count == 0) - { - // Nothing has changed - _logger.LogInformation("[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", series.Name); - await _eventHub.SendMessageAsync(MessageFactory.Info, - MessageFactory.InfoEvent($"{series.Name} scan has no work to do", - "All folders have not been changed since last scan. Scan will be aborted.")); - - _processSeries.EnqueuePostSeriesProcessTasks(series.LibraryId, seriesId, false); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - return; - } - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); // Remove any parsedSeries keys that don't belong to our series. This can occur when users store 2 series in the same folder @@ -222,8 +213,8 @@ public class ScannerService : IScannerService // If nothing was found, first validate any of the files still exist. 
If they don't then we have a deletion and can skip the rest of the logic flow if (parsedSeries.Count == 0) { - var anyFilesExist = - (await _unitOfWork.SeriesRepository.GetFilesForSeries(series.Id)).Any(m => File.Exists(m.FilePath)); + var seriesFiles = (await _unitOfWork.SeriesRepository.GetFilesForSeries(series.Id)); + var anyFilesExist = seriesFiles.Where(f => f.FilePath.Contains(series.FolderPath)).Any(m => File.Exists(m.FilePath)); if (!anyFilesExist) { @@ -287,21 +278,34 @@ public class ScannerService : IScannerService } // If all series Folder paths haven't been modified since last scan, abort - // NOTE: On windows, the parent folder will not update LastWriteTime if a subfolder was updated with files. Need to do a bit of light I/O. if (!bypassFolderChecks) { var allFolders = seriesFolderPaths.SelectMany(path => _directoryService.GetDirectories(path)).ToList(); allFolders.AddRange(seriesFolderPaths); - if (allFolders.All(folder => _directoryService.GetLastWriteTime(folder) <= series.LastFolderScanned)) + try { - _logger.LogInformation( - "[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", + if (allFolders.All(folder => _directoryService.GetLastWriteTime(folder) <= series.LastFolderScanned)) + { + _logger.LogInformation( + "[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", + series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} scan has no work to do", + "All folders have not been changed since last scan. Scan will be aborted.")); + return ScanCancelReason.NoChange; + } + } + catch (IOException ex) + { + // If there is an exception it means that the folder doesn't exist. So we should delete the series + _logger.LogError(ex, "[ScannerService] Scan series for {SeriesName} found the folder path no longer exists", series.Name); await _eventHub.SendMessageAsync(MessageFactory.Info, - MessageFactory.InfoEvent($"{series.Name} scan has no work to do", "All folders have not been changed since last scan. Scan will be aborted.")); - return ScanCancelReason.NoChange; + MessageFactory.ErrorEvent($"{series.Name} scan has no work to do", + "The folder the series is in is missing. 
Delete series manually or perform a library scan.")); + return ScanCancelReason.NoCancel; } } @@ -393,7 +397,7 @@ public class ScannerService : IScannerService [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - public async Task ScanLibrary(int libraryId) + public async Task ScanLibrary(int libraryId, bool forceUpdate = false) { var sw = Stopwatch.StartNew(); var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); @@ -405,7 +409,7 @@ public class ScannerService : IScannerService var wasLibraryUpdatedSinceLastScan = (library.LastModified.Truncate(TimeSpan.TicksPerMinute) > library.LastScanned.Truncate(TimeSpan.TicksPerMinute)) && library.LastScanned != DateTime.MinValue; - if (!wasLibraryUpdatedSinceLastScan) + if (!forceUpdate && !wasLibraryUpdatedSinceLastScan) { var haveFoldersChangedSinceLastScan = library.Folders .All(f => _directoryService.GetLastWriteTime(f.Path).Truncate(TimeSpan.TicksPerMinute) > f.LastScanned.Truncate(TimeSpan.TicksPerMinute)); diff --git a/UI/Web/src/app/_models/series.ts b/UI/Web/src/app/_models/series.ts index ae52f902a..9c3c9bd7e 100644 --- a/UI/Web/src/app/_models/series.ts +++ b/UI/Web/src/app/_models/series.ts @@ -48,6 +48,10 @@ export interface Series { * DateTime representing last time a chapter was added to the Series */ lastChapterAdded: string; + /** + * DateTime representing last time the series folder was scanned + */ + lastFolderScanned: string; /** * Number of words in the series */ diff --git a/UI/Web/src/app/_services/action-factory.service.ts b/UI/Web/src/app/_services/action-factory.service.ts index 9223c57ac..6b38dbaa4 100644 --- a/UI/Web/src/app/_services/action-factory.service.ts +++ b/UI/Web/src/app/_services/action-factory.service.ts @@ -18,9 +18,9 @@ export enum Action { */ MarkAsUnread = 1, /** - * Invoke a Scan Library + * Invoke a Scan on Series/Library */ - ScanLibrary = 2, + Scan = 2, /** * Delete the entity */ @@ -129,7 +129,7 @@ export class ActionFactoryService { }); this.seriesActions.push({ - action: Action.ScanLibrary, + action: Action.Scan, title: 'Scan Series', callback: this.dummyCallback, requiresAdmin: true @@ -171,7 +171,7 @@ export class ActionFactoryService { }); this.libraryActions.push({ - action: Action.ScanLibrary, + action: Action.Scan, title: 'Scan Library', callback: this.dummyCallback, requiresAdmin: true diff --git a/UI/Web/src/app/_services/action.service.ts b/UI/Web/src/app/_services/action.service.ts index d863887a4..ba905174c 100644 --- a/UI/Web/src/app/_services/action.service.ts +++ b/UI/Web/src/app/_services/action.service.ts @@ -52,11 +52,15 @@ export class ActionService implements OnDestroy { * @param callback Optional callback to perform actions after API completes * @returns */ - scanLibrary(library: Partial, callback?: LibraryActionCallback) { + async scanLibrary(library: Partial, callback?: LibraryActionCallback) { if (!library.hasOwnProperty('id') || library.id === undefined) { return; } - this.libraryService.scan(library?.id).pipe(take(1)).subscribe((res: any) => { + + // Prompt user if we should do a force or not + const force = false; // await this.promptIfForce(); + + this.libraryService.scan(library.id, force).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + library.name); if (callback) { callback(library); @@ -83,7 +87,9 @@ export class ActionService implements OnDestroy { return; } - 
this.libraryService.refreshMetadata(library?.id).pipe(take(1)).subscribe((res: any) => { + const forceUpdate = true; //await this.promptIfForce(); + + this.libraryService.refreshMetadata(library?.id, forceUpdate).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + library.name); if (callback) { callback(library); @@ -152,7 +158,7 @@ export class ActionService implements OnDestroy { * @param series Series, must have libraryId and name populated * @param callback Optional callback to perform actions after API completes */ - scanSeries(series: Series, callback?: SeriesActionCallback) { + async scanSeries(series: Series, callback?: SeriesActionCallback) { this.seriesService.scan(series.libraryId, series.id).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + series.name); if (callback) { @@ -545,4 +551,16 @@ export class ActionService implements OnDestroy { } }); } + + private async promptIfForce(extraContent: string = '') { + // Prompt user if we should do a force or not + const config = this.confirmService.defaultConfirm; + config.header = 'Force Scan'; + config.buttons = [ + {text: 'Yes', type: 'secondary'}, + {text: 'No', type: 'primary'}, + ]; + const msg = 'Do you want to force this scan? This is will ignore optimizations that reduce processing and I/O. ' + extraContent; + return !await this.confirmService.confirm(msg, config); // Not because primary is the false state + } } diff --git a/UI/Web/src/app/_services/library.service.ts b/UI/Web/src/app/_services/library.service.ts index ce03c2666..5aac12cfd 100644 --- a/UI/Web/src/app/_services/library.service.ts +++ b/UI/Web/src/app/_services/library.service.ts @@ -76,16 +76,16 @@ export class LibraryService { return this.httpClient.post(this.baseUrl + 'library/grant-access', {username, selectedLibraries}); } - scan(libraryId: number) { - return this.httpClient.post(this.baseUrl + 'library/scan?libraryId=' + libraryId, {}); + scan(libraryId: number, force = false) { + return this.httpClient.post(this.baseUrl + 'library/scan?libraryId=' + libraryId + '&force=' + force, {}); } analyze(libraryId: number) { return this.httpClient.post(this.baseUrl + 'library/analyze?libraryId=' + libraryId, {}); } - refreshMetadata(libraryId: number) { - return this.httpClient.post(this.baseUrl + 'library/refresh-metadata?libraryId=' + libraryId, {}); + refreshMetadata(libraryId: number, forceUpdate = false) { + return this.httpClient.post(this.baseUrl + 'library/refresh-metadata?libraryId=' + libraryId + '&force=' + forceUpdate, {}); } create(model: {name: string, type: number, folders: string[]}) { diff --git a/UI/Web/src/app/_services/series.service.ts b/UI/Web/src/app/_services/series.service.ts index 2c7cbe71c..cc9c4ef60 100644 --- a/UI/Web/src/app/_services/series.service.ts +++ b/UI/Web/src/app/_services/series.service.ts @@ -153,8 +153,8 @@ export class SeriesService { return this.httpClient.post(this.baseUrl + 'series/refresh-metadata', {libraryId: series.libraryId, seriesId: series.id}); } - scan(libraryId: number, seriesId: number) { - return this.httpClient.post(this.baseUrl + 'series/scan', {libraryId: libraryId, seriesId: seriesId}); + scan(libraryId: number, seriesId: number, force = false) { + return this.httpClient.post(this.baseUrl + 'series/scan', {libraryId: libraryId, seriesId: seriesId, forceUpdate: force}); } analyzeFiles(libraryId: number, seriesId: number) { diff --git a/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html 
b/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html index 7194667f9..cb60bfb40 100644 --- a/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html +++ b/UI/Web/src/app/cards/_modals/edit-series-modal/edit-series-modal.component.html @@ -344,9 +344,10 @@
    Format: {{series.format | mangaFormat}}
    -
    Created: {{series.created | date:'shortDate'}}
    +
    Created: {{series.created | date:'shortDate'}}
    Last Read: {{series.latestReadDate | date:'shortDate' | defaultDate}}
    -
    Last Added To: {{series.lastChapterAdded | date:'shortDate' | defaultDate}}
    +
    Last Added To: {{series.lastChapterAdded | date:'short' | defaultDate}}
    +
    Last Scanned: {{series.lastFolderScanned | date:'short' | defaultDate}}
    Folder Path: {{series.folderPath | defaultValue}}
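The rows above pipe their dates through defaultDate so that a missing or unparseable date renders as a friendly placeholder instead of a blank. A minimal sketch of what such a pipe could look like, assuming the pipe name from the template, a 'Never' placeholder, and simple empty-value checks (the real implementation may differ):

```typescript
import { Pipe, PipeTransform } from '@angular/core';

// Sketch only: this runs after Angular's built-in date pipe, so the input is
// an already-formatted string, or null/undefined when the source date was absent.
@Pipe({ name: 'defaultDate' })
export class DefaultDatePipe implements PipeTransform {
  transform(value: string | null | undefined, replacement: string = 'Never'): string {
    // Treat null, undefined, empty, and unparseable dates as "no value"
    if (value === null || value === undefined || value === '' || value === 'Invalid Date') {
      return replacement;
    }
    return value;
  }
}
```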
    diff --git a/UI/Web/src/app/cards/card-item/card-actionables/card-actionables.component.html b/UI/Web/src/app/cards/card-item/card-actionables/card-actionables.component.html index 504366f5d..cd07213e9 100644 --- a/UI/Web/src/app/cards/card-item/card-actionables/card-actionables.component.html +++ b/UI/Web/src/app/cards/card-item/card-actionables/card-actionables.component.html @@ -7,5 +7,5 @@
    - +
    \ No newline at end of file diff --git a/UI/Web/src/app/cards/series-card/series-card.component.ts b/UI/Web/src/app/cards/series-card/series-card.component.ts index d7b1c98bd..40767c661 100644 --- a/UI/Web/src/app/cards/series-card/series-card.component.ts +++ b/UI/Web/src/app/cards/series-card/series-card.component.ts @@ -82,7 +82,7 @@ export class SeriesCardComponent implements OnInit, OnChanges, OnDestroy { case(Action.MarkAsUnread): this.markAsUnread(series); break; - case(Action.ScanLibrary): + case(Action.Scan): this.scanLibrary(series); break; case(Action.RefreshMetadata): diff --git a/UI/Web/src/app/library-detail/library-detail.component.ts b/UI/Web/src/app/library-detail/library-detail.component.ts index af2f1d996..eb62bacae 100644 --- a/UI/Web/src/app/library-detail/library-detail.component.ts +++ b/UI/Web/src/app/library-detail/library-detail.component.ts @@ -203,7 +203,7 @@ export class LibraryDetailComponent implements OnInit, OnDestroy { lib = {id: this.libraryId, name: this.libraryName}; } switch (action) { - case(Action.ScanLibrary): + case(Action.Scan): this.actionService.scanLibrary(lib); break; case(Action.RefreshMetadata): diff --git a/UI/Web/src/app/nav/events-widget/events-widget.component.ts b/UI/Web/src/app/nav/events-widget/events-widget.component.ts index a86d0dc93..2f7bfef62 100644 --- a/UI/Web/src/app/nav/events-widget/events-widget.component.ts +++ b/UI/Web/src/app/nav/events-widget/events-widget.component.ts @@ -152,11 +152,15 @@ export class EventsWidgetComponent implements OnInit, OnDestroy { async seeMore(event: ErrorEvent | InfoEvent) { const config = new ConfirmConfig(); - config.buttons = [ - {text: 'Ok', type: 'secondary'}, - ]; if (event.name === EVENTS.Error) { - config.buttons = [{text: 'Dismiss', type: 'primary'}, ...config.buttons]; + config.buttons = [ + {text: 'Ok', type: 'secondary'}, + {text: 'Dismiss', type: 'primary'} + ]; + } else { + config.buttons = [ + {text: 'Ok', type: 'primary'}, + ]; } config.header = event.title; config.content = event.subTitle; diff --git a/UI/Web/src/app/series-detail/series-detail.component.ts b/UI/Web/src/app/series-detail/series-detail.component.ts index 5c0c81232..bec4cb6b4 100644 --- a/UI/Web/src/app/series-detail/series-detail.component.ts +++ b/UI/Web/src/app/series-detail/series-detail.component.ts @@ -345,7 +345,7 @@ export class SeriesDetailComponent implements OnInit, OnDestroy, AfterContentChe this.loadSeries(series.id); }); break; - case(Action.ScanLibrary): + case(Action.Scan): this.actionService.scanSeries(series, () => { this.actionInProgress = false; this.changeDetectionRef.markForCheck(); diff --git a/UI/Web/src/app/shared/confirm-dialog/_models/confirm-button.ts b/UI/Web/src/app/shared/confirm-dialog/_models/confirm-button.ts index a54ace910..12352ad58 100644 --- a/UI/Web/src/app/shared/confirm-dialog/_models/confirm-button.ts +++ b/UI/Web/src/app/shared/confirm-dialog/_models/confirm-button.ts @@ -3,5 +3,5 @@ export interface ConfirmButton { /** * Type for css class. 
ie) primary, secondary */ - type: string; + type: 'secondary' | 'primary'; } \ No newline at end of file diff --git a/UI/Web/src/app/shared/confirm-dialog/_models/confirm-config.ts b/UI/Web/src/app/shared/confirm-dialog/_models/confirm-config.ts index fe8d989af..9f0d2db8e 100644 --- a/UI/Web/src/app/shared/confirm-dialog/_models/confirm-config.ts +++ b/UI/Web/src/app/shared/confirm-dialog/_models/confirm-config.ts @@ -5,4 +5,8 @@ export class ConfirmConfig { header: string = 'Confirm'; content: string = ''; buttons: Array<ConfirmButton> = []; + /** + * If the close button shouldn't be rendered */ + disableEscape: boolean = false; } diff --git a/UI/Web/src/app/shared/confirm-dialog/confirm-dialog.component.html b/UI/Web/src/app/shared/confirm-dialog/confirm-dialog.component.html index 30d38d2f9..2f4754513 100644 --- a/UI/Web/src/app/shared/confirm-dialog/confirm-dialog.component.html +++ b/UI/Web/src/app/shared/confirm-dialog/confirm-dialog.component.html @@ -2,9 +2,7 @@ @@ -12,5 +10,4 @@
    - diff --git a/UI/Web/src/app/shared/confirm.service.ts b/UI/Web/src/app/shared/confirm.service.ts index f1cbbb881..48b7dbc2a 100644 --- a/UI/Web/src/app/shared/confirm.service.ts +++ b/UI/Web/src/app/shared/confirm.service.ts @@ -34,6 +34,9 @@ export class ConfirmService { config = this.defaultConfirm; config.content = content; } + if (content !== undefined && content !== '' && config!.content === '') { + config!.content = content; + } const modalRef = this.modalService.open(ConfirmDialogComponent); modalRef.componentInstance.config = config; diff --git a/UI/Web/src/app/sidenav/side-nav/side-nav.component.ts b/UI/Web/src/app/sidenav/side-nav/side-nav.component.ts index 1403e007e..0635ec55a 100644 --- a/UI/Web/src/app/sidenav/side-nav/side-nav.component.ts +++ b/UI/Web/src/app/sidenav/side-nav/side-nav.component.ts @@ -78,7 +78,7 @@ export class SideNavComponent implements OnInit, OnDestroy { handleAction(action: Action, library: Library) { switch (action) { - case(Action.ScanLibrary): + case(Action.Scan): this.actionService.scanLibrary(library); break; case(Action.RefreshMetadata): From ac9f1c722e7fb1c727432f86b50ebfcda8d225eb Mon Sep 17 00:00:00 2001 From: majora2007 Date: Mon, 22 Aug 2022 17:30:48 +0000 Subject: [PATCH 010/134] Bump versions by dotnet-bump-version. --- Kavita.Common/Kavita.Common.csproj | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index 203b1ff96..549377d66 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -4,7 +4,7 @@ net6.0 kavitareader.com Kavita - 0.5.5.5 + 0.5.5.6 en From 268f4368fac482f980b348e88226fc8c2d1145fa Mon Sep 17 00:00:00 2001 From: Joseph Milazzo Date: Mon, 22 Aug 2022 15:27:36 -0500 Subject: [PATCH 011/134] Scroll Resume (#1460) * When we navigate from a page then back, resume back on the last scroll key (if clicked) * Resume jump key position when navigating back to a page. Removed some extra blank space on collection detail when a collection doesn't have a summary or cover image. 
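The scroll-resume mechanism described above reduces to a small key-value store: when a jump key is clicked, the key is saved against the current page URL, and the next visit to that URL scrolls back to it (see the saveResumeKey/getResumeKey calls in the card-detail-layout diff below). A minimal sketch of a service with that shape, assuming an in-memory map (the real JumpbarService may persist state differently):

```typescript
import { Injectable } from '@angular/core';

// Sketch only: resume keys held in memory, keyed by router URL, so they
// survive in-app navigation but not a full page reload.
@Injectable({ providedIn: 'root' })
export class JumpbarService {
  private resumeKeys = new Map<string, string>();

  saveResumeKey(url: string, key: string): void {
    this.resumeKeys.set(url, key);
  }

  // An empty string signals "nothing to resume", matching the caller's check below
  getResumeKey(url: string): string {
    return this.resumeKeys.get(url) ?? '';
  }
}
```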
* Ignore progress events on series cards * Added a URL to Swagger for /, which could be the reverse proxy URL --- API/Startup.cs | 6 +++ .../card-detail-layout.component.scss | 4 ++ .../card-detail-layout.component.ts | 39 ++++++++----------- .../cards/card-item/card-item.component.ts | 31 ++++++++------- .../collection-detail.component.html | 4 +- 5 files changed, 44 insertions(+), 40 deletions(-) diff --git a/API/Startup.cs b/API/Startup.cs index cb9e14aca..b4ce9b179 100644 --- a/API/Startup.cs +++ b/API/Startup.cs @@ -111,6 +111,12 @@ namespace API } }); + c.AddServer(new OpenApiServer() + { + Description = "Custom Url", + Url = "/" + }); + c.AddServer(new OpenApiServer() { Description = "Local Server", diff --git a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss index a4c9473d2..936610164 100644 --- a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss +++ b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss @@ -75,6 +75,10 @@ &:hover { color: var(--primary-color); } + + .active { + font-weight: bold; + } } } diff --git a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts index 81676a474..f931879de 100644 --- a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts +++ b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.ts @@ -1,6 +1,7 @@ import { CdkVirtualScrollViewport } from '@angular/cdk/scrolling'; import { DOCUMENT } from '@angular/common'; -import { AfterViewInit, ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core'; +import { ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core'; +import { Router } from '@angular/router'; import { VirtualScrollerComponent } from '@iharbeck/ngx-virtual-scroller'; import { Subject } from 'rxjs'; import { FilterSettings } from 'src/app/metadata-filter/filter-settings'; @@ -13,15 +14,13 @@ import { ActionItem } from 'src/app/_services/action-factory.service'; import { JumpbarService } from 'src/app/_services/jumpbar.service'; import { SeriesService } from 'src/app/_services/series.service'; -const keySize = 25; // Height of the JumpBar button - @Component({ selector: 'app-card-detail-layout', templateUrl: './card-detail-layout.component.html', styleUrls: ['./card-detail-layout.component.scss'], changeDetection: ChangeDetectionStrategy.OnPush }) -export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, AfterViewInit { +export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges { @Input() header: string = ''; @Input() isLoading: boolean = false; @@ -74,7 +73,7 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, constructor(private seriesService: SeriesService, public utilityService: UtilityService, @Inject(DOCUMENT) private document: Document, private changeDetectionRef: ChangeDetectorRef, - private jumpbarService: JumpbarService) { + private jumpbarService: JumpbarService, private router: Router) { this.filter = this.seriesService.createSeriesFilter();
this.changeDetectionRef.markForCheck(); @@ -109,29 +108,23 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, this.virtualScroller.refresh(); }); } - } - ngAfterViewInit(): void { - // NOTE: I can't seem to figure out a way to resume the JumpKey with the scroller. - // this.virtualScroller.vsUpdate.pipe(takeWhile(() => this.hasResumedJumpKey), takeUntil(this.onDestory)).subscribe(() => { - // const resumeKey = this.jumpbarService.getResumeKey(this.header); - // console.log('Resume key:', resumeKey); - // if (resumeKey !== '') { - // const keys = this.jumpBarKeys.filter(k => k.key === resumeKey); - // if (keys.length >= 1) { - // console.log('Scrolling to ', keys[0].key); - // this.scrollTo(keys[0]); - // this.hasResumedJumpKey = true; - // } - // } - // this.hasResumedJumpKey = true; - // }); - } ngOnChanges(): void { this.jumpBarKeysToRender = [...this.jumpBarKeys]; this.resizeJumpBar(); + + + if (!this.hasResumedJumpKey && this.jumpBarKeysToRender.length > 0) { + const resumeKey = this.jumpbarService.getResumeKey(this.router.url); + if (resumeKey === '') return; + const keys = this.jumpBarKeysToRender.filter(k => k.key === resumeKey); + if (keys.length < 1) return; + + this.hasResumedJumpKey = true; + setTimeout(() => this.scrollTo(keys[0]), 100); + } } @@ -161,7 +154,7 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges, } this.virtualScroller.scrollToIndex(targetIndex, true, 0, 1000); - this.jumpbarService.saveResumeKey(this.header, jumpKey.key); + this.jumpbarService.saveResumeKey(this.router.url, jumpKey.key); this.changeDetectionRef.markForCheck(); return; } diff --git a/UI/Web/src/app/cards/card-item/card-item.component.ts b/UI/Web/src/app/cards/card-item/card-item.component.ts index dd31d7f91..585a263fe 100644 --- a/UI/Web/src/app/cards/card-item/card-item.component.ts +++ b/UI/Web/src/app/cards/card-item/card-item.component.ts @@ -1,5 +1,4 @@ import { ChangeDetectionStrategy, ChangeDetectorRef, Component, EventEmitter, HostListener, Input, OnDestroy, OnInit, Output } from '@angular/core'; -import { ToastrService } from 'ngx-toastr'; import { Observable, Subject } from 'rxjs'; import { filter, map, takeUntil } from 'rxjs/operators'; import { DownloadEvent, DownloadService } from 'src/app/shared/_services/download.service'; @@ -125,7 +124,7 @@ export class CardItemComponent implements OnInit, OnDestroy { constructor(public imageService: ImageService, private libraryService: LibraryService, public utilityService: UtilityService, private downloadService: DownloadService, - private toastr: ToastrService, public bulkSelectionService: BulkSelectionService, + public bulkSelectionService: BulkSelectionService, private messageHub: MessageHubService, private accountService: AccountService, private scrollService: ScrollService, private readonly cdRef: ChangeDetectorRef) {} @@ -188,20 +187,22 @@ export class CardItemComponent implements OnInit, OnDestroy { chapter.pagesRead = updateEvent.pagesRead; } } else { + // Ignore + return; // re-request progress for the series - const s = this.utilityService.asSeries(this.entity); - let pagesRead = 0; - if (s.hasOwnProperty('volumes')) { - s.volumes.forEach(v => { - v.chapters.forEach(c => { - if (c.id === updateEvent.chapterId) { - c.pagesRead = updateEvent.pagesRead; - } - pagesRead += c.pagesRead; - }); - }); - s.pagesRead = pagesRead; - } + // const s = this.utilityService.asSeries(this.entity); + // let pagesRead = 0; + // if (s.hasOwnProperty('volumes')) { + // s.volumes.forEach(v => 
{ + // v.chapters.forEach(c => { + // if (c.id === updateEvent.chapterId) { + // c.pagesRead = updateEvent.pagesRead; + // } + // pagesRead += c.pagesRead; + // }); + // }); + // s.pagesRead = pagesRead; + // } } } diff --git a/UI/Web/src/app/collections/collection-detail/collection-detail.component.html b/UI/Web/src/app/collections/collection-detail/collection-detail.component.html index 8a0f42c07..9c580472d 100644 --- a/UI/Web/src/app/collections/collection-detail/collection-detail.component.html +++ b/UI/Web/src/app/collections/collection-detail/collection-detail.component.html @@ -10,15 +10,15 @@
    -
    +
    +
    -
Date: Mon, 22 Aug 2022 20:41:43 +0000 Subject: [PATCH 012/134] Bump versions by dotnet-bump-version. --- Kavita.Common/Kavita.Common.csproj | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index 549377d66..c27c7a398 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -4,7 +4,7 @@ net6.0 kavitareader.com Kavita - 0.5.5.6 + 0.5.5.7 en From ff26a45c9badc5f4483bd8fb08543f40e834f335 Mon Sep 17 00:00:00 2001 From: Robbie Davis Date: Tue, 23 Aug 2022 10:14:37 -0400 Subject: [PATCH 013/134] Misc UI fixes (#1461) * Misc fixes - Fixed modal being stretched when not needed. - Fixed Logo vertical align. - Fixed drawer content scroll and kept it from being squished by Bootstrap overrides. * series detail cover image stretch fix - Fixes series detail cover image being stretched on larger resolutions * fixing empty lists scrollbar * Fixing want to read error * fixing unnecessary scrollbar * Fixing recently updated tooltip --- .../bulk-add-to-collection.component.scss | 4 ++-- .../card-detail-drawer.component.scss | 2 +- .../card-detail-layout.component.html | 6 +++--- .../card-detail-layout.component.scss | 12 ++++++++++-- .../app/cards/card-item/card-item.component.ts | 9 +++++++-- .../all-collections.component.html | 4 ++++ .../app/nav/nav-header/nav-header.component.html | 2 +- .../app/nav/nav-header/nav-header.component.scss | 2 +- .../reading-lists/reading-lists.component.html | 6 +++++- .../series-detail/series-detail.component.html | 4 ++-- .../series-detail/series-detail.component.scss | 4 ++++ UI/Web/src/app/shared/image/image.component.ts | 16 ++++++++++++++++ .../want-to-read/want-to-read.component.html | 15 ++++++++++++--- UI/Web/src/theme/components/_offcanvas.scss | 2 +- 14 files changed, 69 insertions(+), 19 deletions(-) diff --git a/UI/Web/src/app/cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component.scss b/UI/Web/src/app/cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component.scss index 1cfe40c07..29342b14b 100644 --- a/UI/Web/src/app/cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component.scss +++ b/UI/Web/src/app/cards/_modals/bulk-add-to-collection/bulk-add-to-collection.component.scss @@ -9,11 +9,11 @@ .collection { overflow: auto; .modal-body { - height: calc(100vh - 235px); + max-height: calc(100vh - 235px); min-height: 150px; .list-group { overflow: auto; - height: calc(100vh - 355px); + max-height: calc(100vh - 355px); min-height: 32px; } } diff --git a/UI/Web/src/app/cards/card-detail-drawer/card-detail-drawer.component.scss b/UI/Web/src/app/cards/card-detail-drawer/card-detail-drawer.component.scss index 8873c06ac..8296d7865 100644 --- a/UI/Web/src/app/cards/card-detail-drawer/card-detail-drawer.component.scss +++ b/UI/Web/src/app/cards/card-detail-drawer/card-detail-drawer.component.scss @@ -12,7 +12,7 @@ .tab-content { overflow: auto; - height: calc(40vh - 63px); // drawer height - offcanvas heading height + height: calc(40vh - (46px + 1rem)); // drawer height - offcanvas heading height } .h6 { diff --git a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.html b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.html index 1eb306484..4839f00e8 100644 --- a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.html +++ b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.html @@ -13,13 +13,13 @@
    -
    +

    - +
    @@ -29,7 +29,7 @@
    - +
    diff --git a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss index 936610164..89727299c 100644 --- a/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss +++ b/UI/Web/src/app/cards/card-detail-layout/card-detail-layout.component.scss @@ -2,8 +2,12 @@ display: flex; flex-direction: row; width: 100%; - height: calc((var(--vh) *100) - 162px); + height: calc((var(--vh) *100) - 173px); margin-bottom: 10px; + + &.empty { + height: auto; + } } .content-container { @@ -85,12 +89,16 @@ .virtual-scroller, virtual-scroller { width: 100%; //height: calc(100vh - 160px); // 64 is a random number, 523 for me. - height: calc(var(--vh) * 100 - 160px); + height: calc(var(--vh) * 100 - 173px); //height: calc(100vh - 160px); //background-color: red; //max-height: calc(var(--vh)*100 - 170px); } +virtual-scroller.empty { + display: none; + } + h2 { display: inline-block; word-break: break-all; diff --git a/UI/Web/src/app/cards/card-item/card-item.component.ts b/UI/Web/src/app/cards/card-item/card-item.component.ts index 585a263fe..a564fa4b4 100644 --- a/UI/Web/src/app/cards/card-item/card-item.component.ts +++ b/UI/Web/src/app/cards/card-item/card-item.component.ts @@ -151,8 +151,13 @@ export class CardItemComponent implements OnInit, OnDestroy { if (this.utilityService.isChapter(this.entity)) { const chapterTitle = this.utilityService.asChapter(this.entity).titleName; - if (chapterTitle === '' || chapterTitle === null) { - this.tooltipTitle = (this.utilityService.asChapter(this.entity).volumeTitle + ' ' + this.title).trim(); + if (chapterTitle === '' || chapterTitle === null || chapterTitle === undefined) { + const volumeTitle = this.utilityService.asChapter(this.entity).volumeTitle + if (volumeTitle === '' || volumeTitle === null || volumeTitle === undefined) { + this.tooltipTitle = (this.title).trim(); + } else { + this.tooltipTitle = (this.utilityService.asChapter(this.entity).volumeTitle + ' ' + this.title).trim(); + } } else { this.tooltipTitle = chapterTitle; } diff --git a/UI/Web/src/app/collections/all-collections/all-collections.component.html b/UI/Web/src/app/collections/all-collections/all-collections.component.html index eb2c29a60..40f5c928e 100644 --- a/UI/Web/src/app/collections/all-collections/all-collections.component.html +++ b/UI/Web/src/app/collections/all-collections/all-collections.component.html @@ -13,4 +13,8 @@ + + + There are no collections. Try creating one . + \ No newline at end of file diff --git a/UI/Web/src/app/nav/nav-header/nav-header.component.html b/UI/Web/src/app/nav/nav-header/nav-header.component.html index c2ba5e7c8..fbc803bc3 100644 --- a/UI/Web/src/app/nav/nav-header/nav-header.component.html +++ b/UI/Web/src/app/nav/nav-header/nav-header.component.html @@ -179,7 +179,7 @@ NZ0ZV4zm4/L1dfnYNCrjTFq9G03rmj5D+Y4i0OHuL3GFPJytaM54AAAAAElFTkSuQmCC id="image71" x="34.013355" y="59.091763"/> - Kavita +
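Circling back to the force-scan plumbing introduced at the start of this patch set: the flag travels as a plain query-string parameter from the Angular services down to the queued ScanLibrary task, where forceUpdate bypasses the last-scanned timestamp optimization. A hedged usage sketch built on the LibraryService.scan() signature from the diff above (the component and its wiring are hypothetical):

```typescript
import { Component, Input } from '@angular/core';
import { take } from 'rxjs/operators';
import { LibraryService } from 'src/app/_services/library.service';

// Sketch only: a button that queues a forced scan, skipping the
// folder-timestamp short-circuit in ScannerService.ScanLibrary().
@Component({
  selector: 'app-force-scan-button',
  template: '<button class="btn btn-secondary" (click)="forceScan()">Force Scan</button>'
})
export class ForceScanButtonComponent {
  @Input() libraryId!: number;

  constructor(private libraryService: LibraryService) {}

  forceScan() {
    // force=true becomes library/scan?libraryId={id}&force=true on the wire
    this.libraryService.scan(this.libraryId, true).pipe(take(1)).subscribe();
  }
}
```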