Mirror of https://github.com/Kareadita/Kavita.git, synced 2025-07-09 03:04:19 -04:00
New Scan Loop (#1447)
* Staging the code for the new scan loop.
* Implemented a basic idea of changes on drives triggering the scan loop. Issues: (1) scan by folder does not work, (2) the queuing system is very hacky and needs a separate thread, (3) performance degradation could be very real.
* Started writing unit tests for the new loop code.
* Implemented a basic method to scan a folder path with ignore support (not implemented, code in place).
* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.
* Scan Series now uses the new code (folder-based parsing) and handles the LocalizedSeries issue.
* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).
* Wrote some notes on the updated library scan loop.
* Removed migration for merge.
* Reapplied the SeriesFolder migration after merge.
* Refactored a check that used multiple DB calls into one.
* Made lots of progress on ignore support, but hit some confusion in the underlying library. Ticket created; on hold till then.
* Updated Scan Library and Scan Series to exit early if there are no changes on the underlying folders that need to be scanned.
* Implemented the ability to place .kavitaignore files within your directories; Kavita parses them and ignores files and directories based on the rules within them (glob matching is sketched below).
* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores.
* Wrote out some basic code that showcases how we can scan series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking.
* Things are kinda working. I'm getting lost in my own code and complexity. I'm not sure it's worth it.
* Refactored ScanFiles out to DirectoryService.
* Refactored more code out to keep the code clean.
* More unit tests.
* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked UpdateLibrary to work how it used to with the new scan loop code (note: using async update library/series does not work).
* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work) but with folder-based scanning.
* Prep for unit tests (updating broken ones with new implementations).
* Just some notes. Not sure I want to finish this work.
* Refactored the LibraryWatcher with some comments and state variables.
* Undid the migrations in case I don't move forward with this branch.
* Started to clean the code and prepare for finishing this work.
* Fixed a bad merge.
* Updated signatures to clean up the code and commit to the new strategy for scanning.
* Swapped out the code with async processing of series on a small library.
* The new scan loop is working in both sync and async methods. The code is slow and not optimized; this represents a good point to start profiling and applying optimizations.
* Refactored UpdateSeries out of Scanner and into a dedicated file.
* Refactored how ProcessTasks are awaited to allow more async.
* Fixed an issue where the side nav item wouldn't show the correct highlight, and migrated to OnPush.
* Moved where we start the stopwatch to encapsulate the full scan.
* Cleaned up SignalR events to report correctly (still needs a redesign).
* Removed the "remove" code until I figure it out.
* Put in extremely expensive series deletion code for library scan.
* Have Genre and Tag update the DB immediately to avoid duplicate issues.
* Taking a break.
* Moving to a lock with People was successful. Need to apply it to the others.
* Refactored code for series level and Tag and Genre with the new locking strategy.
* New scan loop works. Next up: optimization.
* Swapped out the Kavita logo with an SVG for faster load.
* Refactored metadata updates to occur when the series are being updated.
* Code cleanup.
* Added a new type of generic message (Info) to inform the user.
* Code cleanup.
* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds. Fixed a bug where File Analysis was running every time for each non-epub file.
* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.
* Some code cleanup.
* Added experimental SignalR update code to have a more natural refresh of the library-detail page.
* Hooked in the ability to send new series events to the UI.
* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check if any existing task is being run and reschedule for 3 hours (10 minutes for scan series); the rescheduling is sketched below.
* Implemented the info event in the events widget and added a clear-all button to dismiss all infos and errors. Added --event-widget-info-bg-color.
* Removed --drawer-background-color since it's not used.
* When a new series is added, inject it directly into the view.
* Some debug code cleanup.
* Fixed up the unit tests.
* Ensured all config directories exist on startup.
* Disabled Library Watching (that will go in the next build).
* Ensured update for series is admin-only.
* Lots of code changes; scan series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.
* Removed the SeriesFolder migration.
* Added the SeriesFolder migration.
* Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail.
* The scan optimizations now work for NTFS systems.
* Removed a TODO.
* Migrated all the times to use DateTime.Now and not UTC.
* Refactored some repo calls to use the includes flag pattern.
* Implemented a check for the library scan optimization to validate whether the library was updated (type change, library rename, folder change, or series deleted) and, if so, let the optimization be bypassed.
* Added another optimization which uses just the folder's last-write-time attribute if the drive is not NTFS (the skip check is sketched below).
* Fixed a unit test.
* Some code cleanup.
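As a rough illustration of the .kavitaignore idea: the test project in this diff pulls in DotNet.Globbing, so a minimal sketch of rule matching could look like the following. Glob.Parse and IsMatch are that library's real API; the KavitaIgnoreMatcher class itself is hypothetical, not Kavita's actual implementation.

using System.Collections.Generic;
using System.Linq;
using DotNet.Globbing;

// Hypothetical sketch of .kavitaignore rule matching using DotNet.Globbing.
public class KavitaIgnoreMatcher
{
    private readonly List<Glob> _globs;

    public KavitaIgnoreMatcher(IEnumerable<string> rules)
    {
        // Each non-empty line of a .kavitaignore file becomes one glob rule
        _globs = rules
            .Where(r => !string.IsNullOrWhiteSpace(r))
            .Select(r => Glob.Parse(r.Trim()))
            .ToList();
    }

    // A path is ignored when any rule matches, e.g. "**/Accel World/*" from the tests below
    public bool IsIgnored(string path) => _globs.Any(g => g.IsMatch(path));
}

The nested-stacking fix above would then amount to evaluating a path against the matchers of every .kavitaignore from the scan root down to the file's own directory.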
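Kavita runs background work through Hangfire, so the reschedule-if-busy behavior described above could be sketched like this. ScanScheduler and IsScanRunning are hypothetical names; BackgroundJob.Schedule and Enqueue are the real Hangfire API.

using System;
using Hangfire;

// Hedged sketch of the Scan Queue reschedule behavior; not Kavita's actual code.
public class ScanScheduler
{
    public void ScanLibrary(int libraryId)
    {
        if (IsScanRunning())
        {
            // A scan task already holds the queue: retry the library scan in 3 hours
            BackgroundJob.Schedule<ScanScheduler>(s => s.ScanLibrary(libraryId), TimeSpan.FromHours(3));
            return;
        }
        // ... perform the actual library scan
    }

    public void ScanSeries(int seriesId)
    {
        if (IsScanRunning())
        {
            // Series scans are cheaper, so retry sooner: 10 minutes
            BackgroundJob.Schedule<ScanScheduler>(s => s.ScanSeries(seriesId), TimeSpan.FromMinutes(10));
            return;
        }
        // ... perform the actual series scan
    }

    // Hypothetical: in Kavita this would inspect the scan queue for running tasks
    private static bool IsScanRunning() => false;
}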
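A minimal sketch of the attribute-only skip check, assuming the System.IO.Abstractions file system used throughout the tests in this diff. HasFolderChangedSince is a hypothetical helper, not Kavita's actual method; per the notes above, the real check also bypasses the optimization when the library itself changed and falls back to the folder attribute alone on non-NTFS drives.

using System;
using System.IO.Abstractions;

// Hedged sketch of the "no I/O beyond an attribute lookup" optimization.
public static class ScanOptimization
{
    public static bool HasFolderChangedSince(IFileSystem fileSystem, string folderPath, DateTime lastFolderScanned)
    {
        // A single attribute lookup -- no directory enumeration, no file reads
        return fileSystem.Directory.GetLastWriteTime(folderPath) > lastFolderScanned;
    }
}

When this returns false for a series folder, the scan loop can skip that series entirely, which is what lets a 650-series library on network storage finish in about 2 seconds.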
Parent: 8708b9ced5
Commit: 0eac193248
@@ -1,69 +0,0 @@
using System.IO;
using System.IO.Abstractions;
using System.Threading.Tasks;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using API.Services.Tasks.Scanner;
using API.SignalR;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Order;
using Microsoft.Extensions.Logging;
using NSubstitute;

namespace API.Benchmark
{
[MemoryDiagnoser]
[Orderer(SummaryOrderPolicy.FastestToSlowest)]
[RankColumn]
//[SimpleJob(launchCount: 1, warmupCount: 3, targetCount: 5, invocationCount: 100, id: "Test"), ShortRunJob]
public class ParseScannedFilesBenchmarks
{
private readonly ParseScannedFiles _parseScannedFiles;
private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
private readonly ILogger<BookService> _bookLogger = Substitute.For<ILogger<BookService>>();
private readonly IArchiveService _archiveService = Substitute.For<ArchiveService>();

public ParseScannedFilesBenchmarks()
{
var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());
_parseScannedFiles = new ParseScannedFiles(
Substitute.For<ILogger>(),
directoryService,
new ReadingItemService(_archiveService, new BookService(_bookLogger, directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), directoryService)), Substitute.For<ImageService>(), directoryService),
Substitute.For<IEventHub>());
}

// [Benchmark]
// public void Test()
// {
// var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
// "../../../Services/Test Data/ScannerService/Manga");
// var parsedSeries = _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new string[] {libraryPath},
// out var totalFiles, out var scanElapsedTime);
// }

/// <summary>
/// Generate a list of Series and another list with
/// </summary>
[Benchmark]
public async Task MergeName()
{
var libraryPath = Path.Join(Directory.GetCurrentDirectory(),
"../../../Services/Test Data/ScannerService/Manga");
var p1 = new ParserInfo()
{
Chapters = "0",
Edition = "",
Format = MangaFormat.Archive,
FullFilePath = Path.Join(libraryPath, "A Town Where You Live", "A_Town_Where_You_Live_v01.zip"),
IsSpecial = false,
Series = "A Town Where You Live",
Title = "A Town Where You Live",
Volumes = "1"
};
await _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new [] {libraryPath}, "Manga");
_parseScannedFiles.MergeName(p1);
}
}
}
@@ -14,7 +14,7 @@ namespace API.Tests.Extensions
{
public class ParserInfoListExtensions
{
private readonly DefaultParser _defaultParser;
private readonly IDefaultParser _defaultParser;
public ParserInfoListExtensions()
{
_defaultParser =
@@ -26,7 +26,7 @@ namespace API.Tests.Helpers
};
}

public static void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
public static void AddToParsedInfo(IDictionary<ParsedSeries, IList<ParserInfo>> collectedSeries, ParserInfo info)
{
var existingKey = collectedSeries.Keys.FirstOrDefault(ps =>
ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series));
@@ -38,7 +38,7 @@ namespace API.Tests.Helpers
};
if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>))
{
((ConcurrentDictionary<ParsedSeries, List<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
((ConcurrentDictionary<ParsedSeries, IList<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
@@ -16,7 +16,7 @@ public class ParserInfoHelperTests
[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeFalse()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -45,7 +45,7 @@ public class ParserInfoHelperTests
[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeTrue()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -180,6 +180,7 @@ namespace API.Tests.Parser
[InlineData("Highschool of the Dead - Full Color Edition v02 [Uasaha] (Yen Press)", "Highschool of the Dead - Full Color Edition")]
[InlineData("諌山創] 進撃の巨人 第23巻", "諌山創] 進撃の巨人")]
[InlineData("(一般コミック) [奥浩哉] いぬやしき 第09巻", "いぬやしき")]
[InlineData("Highschool of the Dead - 02", "Highschool of the Dead")]
public void ParseSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
@@ -405,5 +405,75 @@ public class BookmarkServiceTests
}


#endregion

#region Misc

[Fact]
public async Task ShouldNotDeleteBookmarkOnChapterDeletion()
{
var filesystem = CreateFileSystem();
filesystem.AddFile($"{CacheDirectory}1/0001.jpg", new MockFileData("123"));
filesystem.AddFile($"{BookmarkDirectory}1/1/0001.jpg", new MockFileData("123"));

// Delete all Series to reset state
await ResetDB();

_context.Series.Add(new Series()
{
Name = "Test",
Library = new Library() {
Name = "Test LIb",
Type = LibraryType.Manga,
},
Volumes = new List<Volume>()
{
new Volume()
{
Chapters = new List<Chapter>()
{
new Chapter()
{

}
}
}
}
});


_context.AppUser.Add(new AppUser()
{
UserName = "Joe",
Bookmarks = new List<AppUserBookmark>()
{
new AppUserBookmark()
{
Page = 1,
ChapterId = 1,
FileName = $"1/1/0001.jpg",
SeriesId = 1,
VolumeId = 1
}
}
});

await _context.SaveChangesAsync();


var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
var bookmarkService = Create(ds);
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Bookmarks);

var vol = await _unitOfWork.VolumeRepository.GetVolumeAsync(1);
vol.Chapters = new List<Chapter>();
_unitOfWork.VolumeRepository.Update(vol);
await _unitOfWork.CommitAsync();


Assert.Equal(1, ds.GetFiles(BookmarkDirectory, searchOption:SearchOption.AllDirectories).Count());
Assert.NotNull(await _unitOfWork.UserRepository.GetBookmarkAsync(1));
}

#endregion
}
@@ -55,6 +55,11 @@ namespace API.Tests.Services
{
throw new System.NotImplementedException();
}

public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
{
throw new System.NotImplementedException();
}
}
public class CacheServiceTests
{
@@ -841,5 +841,127 @@ namespace API.Tests.Services
Assert.Equal(expected, DirectoryService.GetHumanReadableBytes(bytes));
}
#endregion

#region ScanFiles

[Fact]
public Task ScanFiles_ShouldFindNoFiles_AllAreIgnored()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("*.*"));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);


var allFiles = ds.ScanFiles("C:/Data/");

Assert.Equal(0, allFiles.Count);

return Task.CompletedTask;
}


[Fact]
public Task ScanFiles_ShouldFindNoNestedFiles_IgnoreNestedFiles()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);

var allFiles = ds.ScanFiles("C:/Data/");

Assert.Equal(1, allFiles.Count); // Ignore files are not counted in files, only valid extensions

return Task.CompletedTask;
}


[Fact]
public Task ScanFiles_NestedIgnore_IgnoreNestedFilesInOneDirectoryOnly()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddDirectory("C:/Data/Specials/");
fileSystem.AddDirectory("C:/Data/Specials/ArtBooks/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*"));
fileSystem.AddFile("C:/Data/Specials/.kavitaignore", new MockFileData("**/ArtBooks/*"));
fileSystem.AddFile("C:/Data/Specials/Hi.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Specials/ArtBooks/art book 01.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);

var allFiles = ds.ScanFiles("C:/Data/");

Assert.Equal(2, allFiles.Count); // Ignore files are not counted in files, only valid extensions

return Task.CompletedTask;
}


[Fact]
public Task ScanFiles_ShouldFindAllFiles()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.txt", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Nothing.pdf", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);

var allFiles = ds.ScanFiles("C:/Data/");

Assert.Equal(5, allFiles.Count);

return Task.CompletedTask;
}

#endregion

#region GetAllDirectories

[Fact]
public void GetAllDirectories_ShouldFindAllNestedDirectories()
{
const string testDirectory = "C:/manga/base/";
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 2"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "A"));
fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "B"));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
Assert.Equal(2, ds.GetAllDirectories(fileSystem.Path.Join(testDirectory, "folder 1")).Count());
}

#endregion
}
}
@@ -1,4 +1,5 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Data.Common;
using System.IO.Abstractions.TestingHelpers;
@@ -14,6 +15,8 @@ using API.Services.Tasks.Scanner;
using API.SignalR;
using API.Tests.Helpers;
using AutoMapper;
using DotNet.Globbing;
using Flurl.Util;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
@@ -25,9 +28,9 @@ namespace API.Tests.Services;

internal class MockReadingItemService : IReadingItemService
{
private readonly DefaultParser _defaultParser;
private readonly IDefaultParser _defaultParser;

public MockReadingItemService(DefaultParser defaultParser)
public MockReadingItemService(IDefaultParser defaultParser)
{
_defaultParser = defaultParser;
}
@@ -56,6 +59,11 @@ internal class MockReadingItemService : IReadingItemService
{
return _defaultParser.Parse(path, rootPath, type);
}

public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
{
return _defaultParser.Parse(path, rootPath, type);
}
}

public class ParseScannedFilesTests
@@ -163,7 +171,7 @@ public class ParseScannedFilesTests
ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
};
var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>
{
{
new ParsedSeries()
@@ -208,7 +216,7 @@ public class ParseScannedFilesTests
ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
};
var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>
{
{
new ParsedSeries()
@@ -240,46 +248,71 @@ public class ParseScannedFilesTests

#region MergeName

[Fact]
public async Task MergeName_ShouldMergeMatchingFormatAndName()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());


await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");

Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
}

[Fact]
public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());


await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");

Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
}
// NOTE: I don't think I can test MergeName as it relies on Tracking Files, which is more complicated than I need
// [Fact]
// public async Task MergeName_ShouldMergeMatchingFormatAndName()
// {
// var fileSystem = new MockFileSystem();
// fileSystem.AddDirectory("C:/Data/");
// fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
//
// var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
// var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
// new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
//
// var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();
// var parsedFiles = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
//
// void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
// {
// var skippedScan = parsedInfo.Item1;
// var parsedFiles = parsedInfo.Item2;
// if (parsedFiles.Count == 0) return;
//
// var foundParsedSeries = new ParsedSeries()
// {
// Name = parsedFiles.First().Series,
// NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series),
// Format = parsedFiles.First().Format
// };
//
// parsedSeries.Add(foundParsedSeries, parsedFiles);
// }
//
// await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName",
// false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);
//
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
// Assert.Equal("Accel World",
// psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
// }
//
// [Fact]
// public async Task MergeName_ShouldMerge_MismatchedFormatSameName()
// {
// var fileSystem = new MockFileSystem();
// fileSystem.AddDirectory("C:/Data/");
// fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
// fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
//
// var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
// var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
// new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());
//
//
// await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");
//
// Assert.Equal("Accel World",
// psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
// Assert.Equal("Accel World",
// psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
// }

#endregion

@@ -299,14 +332,150 @@ public class ParseScannedFilesTests
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());

var parsedSeries = new Dictionary<ParsedSeries, IList<ParserInfo>>();

void TrackFiles(Tuple<bool, IList<ParserInfo>> parsedInfo)
{
var skippedScan = parsedInfo.Item1;
var parsedFiles = parsedInfo.Item2;
if (parsedFiles.Count == 0) return;

var foundParsedSeries = new ParsedSeries()
{
Name = parsedFiles.First().Series,
NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series),
Format = parsedFiles.First().Format
};

parsedSeries.Add(foundParsedSeries, parsedFiles);
}


await psf.ScanLibrariesForSeries(LibraryType.Manga,
new List<string>() {"C:/Data/"}, "libraryName", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles);

var parsedSeries = await psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, "libraryName");

Assert.Equal(3, parsedSeries.Values.Count);
Assert.NotEmpty(parsedSeries.Keys.Where(p => p.Format == MangaFormat.Archive && p.Name.Equals("Accel World")));
}

#endregion


#region ProcessFiles

private static MockFileSystem CreateTestFilesystem()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));

return fileSystem;
}

[Fact]
public async Task ProcessFiles_ForLibraryMode_OnlyCallsFolderActionForEachTopLevelFolder()
{
var fileSystem = CreateTestFilesystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());

var directoriesSeen = new HashSet<string>();
await psf.ProcessFiles("C:/Data/", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),
(files, directoryPath) =>
{
directoriesSeen.Add(directoryPath);
return Task.CompletedTask;
});

Assert.Equal(2, directoriesSeen.Count);
}

[Fact]
public async Task ProcessFiles_ForNonLibraryMode_CallsFolderActionOnce()
{
var fileSystem = CreateTestFilesystem();
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());

var directoriesSeen = new HashSet<string>();
await psf.ProcessFiles("C:/Data/", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, directoryPath) =>
{
directoriesSeen.Add(directoryPath);
return Task.CompletedTask;
});

Assert.Single(directoriesSeen);
directoriesSeen.TryGetValue("C:/Data/", out var actual);
Assert.Equal("C:/Data/", actual);
}

[Fact]
public async Task ProcessFiles_ShouldCallFolderActionTwice()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());

var callCount = 0;
await psf.ProcessFiles("C:/Data", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) =>
{
callCount++;

return Task.CompletedTask;
});

Assert.Equal(2, callCount);
}


/// <summary>
/// Due to this not being a library, it's going to consider everything under C:/Data as being one folder aka a series folder
/// </summary>
[Fact]
public async Task ProcessFiles_ShouldCallFolderActionOnce()
{
var fileSystem = new MockFileSystem();
fileSystem.AddDirectory("C:/Data/");
fileSystem.AddDirectory("C:/Data/Accel World");
fileSystem.AddDirectory("C:/Data/Accel World/Specials/");
fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty));
fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty));

var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
new MockReadingItemService(new DefaultParser(ds)), Substitute.For<IEventHub>());

var callCount = 0;
await psf.ProcessFiles("C:/Data", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) =>
{
callCount++;
return Task.CompletedTask;
});

Assert.Equal(1, callCount);
}

#endregion
}
@@ -16,7 +16,7 @@ namespace API.Tests.Services
[Fact]
public void FindSeriesNotOnDisk_Should_Remove1()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
@@ -48,7 +48,7 @@ namespace API.Tests.Services
[Fact]
public void FindSeriesNotOnDisk_Should_RemoveNothing_Test()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var infos = new Dictionary<ParsedSeries, IList<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive});
@@ -125,6 +125,8 @@ namespace API.Tests.Services
// }


// TODO: I want a test for UpdateSeries where if I have chapter 10 and now it's mapping into Vol 2 Chapter 10,
// if I can do it without deleting the underlying chapter (aka id change)

}
}
@@ -354,7 +354,7 @@ namespace API.Controllers
lib.AppUsers.Remove(user);
}

libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList();
}

foreach (var lib in libraries)
@@ -458,11 +458,11 @@ namespace API.Controllers
{
_logger.LogInformation("{UserName} is being registered as admin. Granting access to all libraries",
user.UserName);
libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync()).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync(LibraryIncludes.AppUser)).ToList();
}
else
{
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList();
libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList();
}

foreach (var lib in libraries)
@@ -60,6 +60,7 @@ namespace API.Controllers
try
{

var path = _cacheService.GetCachedFile(chapter);
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"Pdf doesn't exist when it should.");

@@ -90,7 +91,7 @@ namespace API.Controllers
try
{
var path = _cacheService.GetCachedPagePath(chapter, page);
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}");
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}. Try refreshing to allow re-cache.");
var format = Path.GetExtension(path).Replace(".", "");

return PhysicalFile(path, "image/" + format, Path.GetFileName(path), true);
@@ -54,5 +54,9 @@ namespace API.DTOs
public int MaxHoursToRead { get; set; }
/// <inheritdoc cref="IHasReadTimeEstimate.AvgHoursToRead"/>
public int AvgHoursToRead { get; set; }
/// <summary>
/// The highest level folder for this Series
/// </summary>
public string FolderPath { get; set; }
}
}
@@ -43,6 +43,7 @@ namespace API.Data
public DbSet<Tag> Tag { get; set; }
public DbSet<SiteTheme> SiteTheme { get; set; }
public DbSet<SeriesRelation> SeriesRelation { get; set; }
public DbSet<FolderPath> FolderPath { get; set; }


protected override void OnModelCreating(ModelBuilder builder)
API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs (1605 lines, generated, normal file; file diff suppressed because it is too large)
API/Data/Migrations/20220817173731_SeriesFolder.cs (37 lines, normal file)
@@ -0,0 +1,37 @@
using System;
using Microsoft.EntityFrameworkCore.Migrations;

#nullable disable

namespace API.Data.Migrations
{
public partial class SeriesFolder : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<string>(
name: "FolderPath",
table: "Series",
type: "TEXT",
nullable: true);

migrationBuilder.AddColumn<DateTime>(
name: "LastFolderScanned",
table: "Series",
type: "TEXT",
nullable: false,
defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified));
}

protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "FolderPath",
table: "Series");

migrationBuilder.DropColumn(
name: "LastFolderScanned",
table: "Series");
}
}
}
@@ -782,12 +782,18 @@ namespace API.Data.Migrations
b.Property<DateTime>("Created")
.HasColumnType("TEXT");

b.Property<string>("FolderPath")
.HasColumnType("TEXT");

b.Property<int>("Format")
.HasColumnType("INTEGER");

b.Property<DateTime>("LastChapterAdded")
.HasColumnType("TEXT");

b.Property<DateTime>("LastFolderScanned")
.HasColumnType("TEXT");

b.Property<DateTime>("LastModified")
.HasColumnType("TEXT");
@@ -56,6 +56,7 @@ public class CollectionTagRepository : ICollectionTagRepository
/// </summary>
public async Task<int> RemoveTagsWithoutSeries()
{
// TODO: Write a Unit test to validate this works
var tagsToDelete = await _context.CollectionTag
.Include(c => c.SeriesMetadatas)
.Where(c => c.SeriesMetadatas.Count == 0)
@@ -34,19 +34,19 @@ public interface ILibraryRepository
Task<IEnumerable<LibraryDto>> GetLibraryDtosAsync();
Task<bool> LibraryExists(string libraryName);
Task<Library> GetLibraryForIdAsync(int libraryId, LibraryIncludes includes);
Task<Library> GetFullLibraryForIdAsync(int libraryId);
Task<Library> GetFullLibraryForIdAsync(int libraryId, int seriesId);
Task<IEnumerable<LibraryDto>> GetLibraryDtosForUsernameAsync(string userName);
Task<IEnumerable<Library>> GetLibrariesAsync();
Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None);
Task<bool> DeleteLibrary(int libraryId);
Task<IEnumerable<Library>> GetLibrariesForUserIdAsync(int userId);
Task<LibraryType> GetLibraryTypeAsync(int libraryId);
Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds);
Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None);
Task<int> GetTotalFiles();
IEnumerable<JumpKeyDto> GetJumpBarAsync(int libraryId);
Task<IList<AgeRatingDto>> GetAllAgeRatingsDtosForLibrariesAsync(List<int> libraryIds);
Task<IList<LanguageDto>> GetAllLanguagesForLibrariesAsync(List<int> libraryIds);
IEnumerable<PublicationStatusDto> GetAllPublicationStatusesDtosForLibrariesAsync(List<int> libraryIds);
Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders);
Library GetLibraryByFolder(string folder);
}

public class LibraryRepository : ILibraryRepository
@@ -87,11 +87,19 @@ public class LibraryRepository : ILibraryRepository
.ToListAsync();
}

public async Task<IEnumerable<Library>> GetLibrariesAsync()
/// <summary>
/// Returns all libraries including their AppUsers + extra includes
/// </summary>
/// <param name="includes"></param>
/// <returns></returns>
public async Task<IEnumerable<Library>> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None)
{
return await _context.Library
var query = _context.Library
.Include(l => l.AppUsers)
.ToListAsync();
.Select(l => l);

query = AddIncludesToQuery(query, includes);
return await query.ToListAsync();
}

public async Task<bool> DeleteLibrary(int libraryId)
@@ -120,11 +128,13 @@ public class LibraryRepository : ILibraryRepository
.SingleAsync();
}

public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IList<int> libraryIds)
public async Task<IEnumerable<Library>> GetLibraryForIdsAsync(IEnumerable<int> libraryIds, LibraryIncludes includes = LibraryIncludes.None)
{
return await _context.Library
.Where(x => libraryIds.Contains(x.Id))
.ToListAsync();
var query = _context.Library
.Where(x => libraryIds.Contains(x.Id));

AddIncludesToQuery(query, includes);
return await query.ToListAsync();
}

public async Task<int> GetTotalFiles()
@@ -317,4 +327,23 @@ public class LibraryRepository : ILibraryRepository
.OrderBy(s => s.Title);
}

/// <summary>
/// Checks if any series folders match the folders passed in
/// </summary>
/// <param name="folders"></param>
/// <returns></returns>
public async Task<bool> DoAnySeriesFoldersMatch(IEnumerable<string> folders)
{
var normalized = folders.Select(Parser.Parser.NormalizePath);
return await _context.Series.AnyAsync(s => normalized.Contains(s.FolderPath));
}

public Library? GetLibraryByFolder(string folder)
{
var normalized = Parser.Parser.NormalizePath(folder);
return _context.Library
.Include(l => l.Folders)
.AsSplitQuery()
.SingleOrDefault(l => l.Folders.Select(f => f.Path).Contains(normalized));
}
}
@@ -1,6 +1,5 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
@@ -19,12 +18,11 @@ using API.Extensions;
using API.Helpers;
using API.Services;
using API.Services.Tasks;
using API.Services.Tasks.Scanner;
using AutoMapper;
using AutoMapper.QueryableExtensions;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using SQLitePCL;


namespace API.Data.Repositories;

@@ -120,6 +118,11 @@ public interface ISeriesRepository
Task<SeriesDto> GetSeriesForMangaFile(int mangaFileId, int userId);
Task<SeriesDto> GetSeriesForChapter(int chapterId, int userId);
Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter);
Task<int> GetSeriesIdByFolder(string folder);
Task<Series> GetSeriesByFolderPath(string folder);
Task<Series> GetFullSeriesByName(string series, int libraryId);
Task RemoveSeriesNotInList(IList<ParsedSeries> seenSeries, int libraryId);
Task<IDictionary<string, IList<SeriesModified>>> GetFolderPathMap(int libraryId);
}

public class SeriesRepository : ISeriesRepository
@@ -156,6 +159,7 @@ public class SeriesRepository : ISeriesRepository
/// Returns if a series name and format exists already in a library
/// </summary>
/// <param name="name">Name of series</param>
/// <param name="libraryId"></param>
/// <param name="format">Format of series</param>
/// <returns></returns>
public async Task<bool> DoesSeriesNameExistInLibrary(string name, int libraryId, MangaFormat format)
@@ -179,6 +183,7 @@ public class SeriesRepository : ISeriesRepository
/// Used for <see cref="ScannerService"/> to
/// </summary>
/// <param name="libraryId"></param>
/// <param name="userParams"></param>
/// <returns></returns>
public async Task<PagedList<Series>> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams)
{
@@ -432,6 +437,7 @@ public class SeriesRepository : ISeriesRepository
/// Returns Volumes, Metadata (Incl Genres and People), and Collection Tags
/// </summary>
/// <param name="seriesId"></param>
/// <param name="includes"></param>
/// <returns></returns>
public async Task<Series> GetSeriesByIdAsync(int seriesId, SeriesIncludes includes = SeriesIncludes.Volumes | SeriesIncludes.Metadata)
{
@@ -1136,21 +1142,82 @@ public class SeriesRepository : ISeriesRepository
.SingleOrDefaultAsync();
}

public async Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter)
/// <summary>
/// Given a folder path return a Series with the <see cref="Series.FolderPath"/> that matches.
/// </summary>
/// <remarks>This will apply normalization on the path.</remarks>
/// <param name="folder"></param>
/// <returns></returns>
public async Task<int> GetSeriesIdByFolder(string folder)
{
var libraryIds = GetLibraryIdsForUser(userId);
var query = _context.AppUser
.Where(user => user.Id == userId)
.SelectMany(u => u.WantToRead)
.Where(s => libraryIds.Contains(s.LibraryId))
.AsSplitQuery()
.AsNoTracking();

var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query);

return await PagedList<SeriesDto>.CreateAsync(filteredQuery.ProjectTo<SeriesDto>(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize);
var normalized = Parser.Parser.NormalizePath(folder);
var series = await _context.Series
.Where(s => s.FolderPath.Equals(normalized))
.SingleOrDefaultAsync();
return series?.Id ?? 0;
}

/// <summary>
/// Return a Series by Folder path. Null if not found.
/// </summary>
/// <param name="folder">This will be normalized in the query</param>
/// <returns></returns>
public async Task<Series> GetSeriesByFolderPath(string folder)
{
var normalized = Parser.Parser.NormalizePath(folder);
return await _context.Series.SingleOrDefaultAsync(s => s.FolderPath.Equals(normalized));
}

public Task<Series> GetFullSeriesByName(string series, int libraryId)
{
return _context.Series
.Where(s => s.NormalizedName.Equals(Parser.Parser.Normalize(series)) && s.LibraryId == libraryId)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Library)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(cm => cm.People)

.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Tags)

.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Genres)


.Include(s => s.Metadata)
.ThenInclude(m => m.Tags)

.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
.AsSplitQuery()
.SingleOrDefaultAsync();
}

public async Task RemoveSeriesNotInList(IList<ParsedSeries> seenSeries, int libraryId)
{
if (seenSeries.Count == 0) return;
var ids = new List<int>();
foreach (var parsedSeries in seenSeries)
{
ids.Add(await _context.Series
.Where(s => s.Format == parsedSeries.Format && s.NormalizedName == parsedSeries.NormalizedName && s.LibraryId == libraryId)
.Select(s => s.Id).SingleAsync());
}

var seriesToRemove = await _context.Series
.Where(s => s.LibraryId == libraryId)
.Where(s => !ids.Contains(s.Id))
.ToListAsync();

_context.Series.RemoveRange(seriesToRemove);
}

public async Task<PagedList<SeriesDto>> GetHighlyRated(int userId, int libraryId, UserParams userParams)
{
@@ -1320,4 +1387,53 @@ public class SeriesRepository : ISeriesRepository
.AsEnumerable();
return ret;
}

public async Task<PagedList<SeriesDto>> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter)
{
var libraryIds = GetLibraryIdsForUser(userId);
var query = _context.AppUser
.Where(user => user.Id == userId)
.SelectMany(u => u.WantToRead)
.Where(s => libraryIds.Contains(s.LibraryId))
.AsSplitQuery()
.AsNoTracking();

var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query);

return await PagedList<SeriesDto>.CreateAsync(filteredQuery.ProjectTo<SeriesDto>(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize);
}

public async Task<IDictionary<string, IList<SeriesModified>>> GetFolderPathMap(int libraryId)
{
var info = await _context.Series
.Where(s => s.LibraryId == libraryId)
.AsNoTracking()
.Where(s => s.FolderPath != null)
.Select(s => new SeriesModified()
{
LastScanned = s.LastFolderScanned,
SeriesName = s.Name,
FolderPath = s.FolderPath,
Format = s.Format
}).ToListAsync();

var map = new Dictionary<string, IList<SeriesModified>>();
foreach (var series in info)
{
if (!map.ContainsKey(series.FolderPath))
{
map.Add(series.FolderPath, new List<SeriesModified>()
{
series
});
}
else
{
map[series.FolderPath].Add(series);
}

}

return map;
}
}
@@ -8,8 +8,9 @@ namespace API.Entities
public int Id { get; set; }
public string Path { get; set; }
/// <summary>
/// Used when scanning to see if we can skip if nothing has changed. (not implemented)
/// Used when scanning to see if we can skip if nothing has changed
/// </summary>
/// <remarks>Time stored in UTC</remarks>
public DateTime LastScanned { get; set; }

// Relationship
@@ -1,5 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Entities.Enums;
using API.Entities.Interfaces;

@@ -9,6 +11,10 @@ namespace API.Entities
{
public int Id { get; set; }
public string Name { get; set; }
/// <summary>
/// Update this summary with a way it's used, else let's remove it.
/// </summary>
[Obsolete("This has never been coded for. Likely we can remove it.")]
public string CoverImage { get; set; }
public LibraryType Type { get; set; }
public DateTime Created { get; set; }
@@ -16,10 +22,22 @@ namespace API.Entities
/// <summary>
/// Last time Library was scanned
/// </summary>
/// <remarks>Time stored in UTC</remarks>
public DateTime LastScanned { get; set; }
public ICollection<FolderPath> Folders { get; set; }
public ICollection<AppUser> AppUsers { get; set; }
public ICollection<Series> Series { get; set; }

// Methods
/// <summary>
/// Has there been any modifications to the FolderPath's directory since the <see cref="FolderPath.LastScanned"/> date
/// </summary>
/// <returns></returns>
public bool AnyModificationsSinceLastScan()
{
// NOTE: I don't think we can do this due to NTFS
return Folders.All(folder => File.GetLastWriteTimeUtc(folder.Path) > folder.LastScanned);
}

}
}
@@ -50,7 +50,15 @@ public class Series : IEntityDate, IHasReadTimeEstimate
    /// Sum of all Volume page counts
    /// </summary>
    public int Pages { get; set; }

+   /// <summary>
+   /// Highest folder path (under the library root) that contains the series.
+   /// </summary>
+   /// <remarks><see cref="Parser.Parser.NormalizePath"/> must be used before setting</remarks>
+   public string FolderPath { get; set; }
+   /// <summary>
+   /// Last time the folder was scanned
+   /// </summary>
+   public DateTime LastFolderScanned { get; set; }
    /// <summary>
    /// The type of all the files attached to this series
    /// </summary>
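Note: per the remark above, callers are expected to normalize before assignment. A two-line sketch of an assumed call site during a scan:

    // Hypothetical: recording the series folder after a scan pass.
    series.FolderPath = Parser.Parser.NormalizePath(seriesDirectory); // seriesDirectory is a stand-in
    series.LastFolderScanned = DateTime.UtcNow;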
@@ -4,6 +4,7 @@ using API.Helpers;
 using API.Services;
 using API.Services.Tasks;
 using API.Services.Tasks.Metadata;
+using API.Services.Tasks.Scanner;
 using API.SignalR;
 using API.SignalR.Presence;
 using Kavita.Common;
@@ -46,10 +47,12 @@ namespace API.Extensions
        services.AddScoped<IBookmarkService, BookmarkService>();
        services.AddScoped<IThemeService, ThemeService>();
        services.AddScoped<ISeriesService, SeriesService>();
+       services.AddScoped<IProcessSeries, ProcessSeries>();

        services.AddScoped<IScannerService, ScannerService>();
        services.AddScoped<IMetadataService, MetadataService>();
        services.AddScoped<IWordCountAnalyzerService, WordCountAnalyzerService>();
+       services.AddScoped<ILibraryWatcher, LibraryWatcher>();
@@ -1,4 +1,5 @@
 using System;
+using System.Collections.Concurrent;
 using System.Collections.Generic;
 using System.Linq;
 using API.Data;
@@ -34,6 +35,7 @@ public static class GenreHelper
        }
    }

    public static void KeepOnlySameGenreBetweenLists(ICollection<Genre> existingGenres, ICollection<Genre> removeAllExcept, Action<Genre> action = null)
    {
        var existing = existingGenres.ToList();
@@ -61,4 +63,14 @@ public static class GenreHelper
            metadataGenres.Add(genre);
        }
    }

+   public static void AddGenreIfNotExists(BlockingCollection<Genre> metadataGenres, Genre genre)
+   {
+       var existingGenre = metadataGenres.FirstOrDefault(p =>
+           p.NormalizedTitle == Parser.Parser.Normalize(genre.Title));
+       if (existingGenre == null)
+       {
+           metadataGenres.Add(genre);
+       }
+   }
}
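Note: these BlockingCollection overloads (mirrored in PersonHelper and TagHelper below) exist so metadata gathered while several series are processed concurrently can be added from multiple threads. The check-then-add is not one atomic step, so a duplicate can still slip in under contention. A minimal sketch of the intended concurrent use; ParseGenres and parsedInfos are hypothetical stand-ins:

    // Hypothetical: genres collected while series are processed in parallel.
    var metadataGenres = new BlockingCollection<Genre>();
    Parallel.ForEach(parsedInfos, info =>
    {
        foreach (var genre in ParseGenres(info))
        {
            // Safe to call from any thread; duplicates are filtered by normalized title,
            // best effort only, since FirstOrDefault + Add is not atomic.
            GenreHelper.AddGenreIfNotExists(metadataGenres, genre);
        }
    });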
@@ -16,7 +16,7 @@ public static class ParserInfoHelpers
    /// <param name="parsedSeries"></param>
    /// <returns></returns>
    public static bool SeriesHasMatchingParserInfoFormat(Series series,
-       Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries)
+       Dictionary<ParsedSeries, IList<ParserInfo>> parsedSeries)
    {
        var format = MangaFormat.Unknown;
        foreach (var pSeries in parsedSeries.Keys)
@@ -1,4 +1,5 @@
 using System;
+using System.Collections.Concurrent;
 using System.Collections.Generic;
 using System.Linq;
 using API.Data;
@@ -103,4 +104,19 @@ public static class PersonHelper
            metadataPeople.Add(person);
        }
    }

+   /// <summary>
+   /// Adds the person to the list if it's not already in there
+   /// </summary>
+   /// <param name="metadataPeople"></param>
+   /// <param name="person"></param>
+   public static void AddPersonIfNotExists(BlockingCollection<Person> metadataPeople, Person person)
+   {
+       var existingPerson = metadataPeople.SingleOrDefault(p =>
+           p.NormalizedName == Parser.Parser.Normalize(person.Name) && p.Role == person.Role);
+       if (existingPerson == null)
+       {
+           metadataPeople.Add(person);
+       }
+   }
}
@@ -1,4 +1,5 @@
 using System;
+using System.Collections.Concurrent;
 using System.Collections.Generic;
 using System.Linq;
 using API.Data;
@@ -65,6 +66,16 @@ public static class TagHelper
        }
    }

+   public static void AddTagIfNotExists(BlockingCollection<Tag> metadataTags, Tag tag)
+   {
+       var existingTag = metadataTags.FirstOrDefault(p =>
+           p.NormalizedTitle == Parser.Parser.Normalize(tag.Title));
+       if (existingTag == null)
+       {
+           metadataTags.Add(tag);
+       }
+   }

    /// <summary>
    /// Remove tags on a list
    /// </summary>
@@ -5,10 +5,16 @@ using API.Services;

namespace API.Parser;

+public interface IDefaultParser
+{
+    ParserInfo Parse(string filePath, string rootPath, LibraryType type = LibraryType.Manga);
+    void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret);
+}

/// <summary>
/// This is an implementation of the Parser that is the basis for everything
/// </summary>
-public class DefaultParser
+public class DefaultParser : IDefaultParser
{
    private readonly IDirectoryService _directoryService;
@@ -15,12 +15,14 @@ namespace API.Parser

        public const string ImageFileExtensions = @"^(\.png|\.jpeg|\.jpg|\.webp|\.gif)";
        public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|\.cb7|\.cbt";
-       public const string BookFileExtensions = @"\.epub|\.pdf";
+       private const string BookFileExtensions = @"\.epub|\.pdf";
        public const string MacOsMetadataFileStartsWith = @"._";

        public const string SupportedExtensions =
            ArchiveFileExtensions + "|" + ImageFileExtensions + "|" + BookFileExtensions;

+       public static readonly string[] SupportedGlobExtensions = new [] {@"**/*.png", @"**/*.cbz", @"**/*.pdf"};

        private const RegexOptions MatchOptions =
            RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant;
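Note: SupportedExtensions is a regex alternation, so consumers match it against a file's extension. A small sketch of the pattern in use, mirroring how LibraryWatcher filters change events further down (folder is an assumed variable):

    // Hypothetical: filter a directory listing down to supported files.
    var supported = new Regex(Parser.Parser.SupportedExtensions,
        RegexOptions.IgnoreCase | RegexOptions.Compiled | RegexOptions.CultureInvariant);
    var candidates = Directory.EnumerateFiles(folder)
        .Where(f => supported.IsMatch(Path.GetExtension(f)));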
@@ -140,9 +140,10 @@ namespace API.Services
        }

        /// <summary>
-       /// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="OrderByNatural"/> for ordering files
+       /// Returns first entry that is an image and is not in a blacklisted folder path. Uses <see cref="EnumerableExtensions.OrderByNatural"/> for ordering files
        /// </summary>
        /// <param name="entryFullNames"></param>
+       /// <param name="archiveName"></param>
        /// <returns>Entry name of match, null if no match</returns>
        public static string? FirstFileEntry(IEnumerable<string> entryFullNames, string archiveName)
        {
@@ -100,11 +100,9 @@ namespace API.Services
        var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId);
        var extractPath = GetCachePath(chapterId);

-       if (!_directoryService.Exists(extractPath))
-       {
-           var files = chapter.Files.ToList();
-           ExtractChapterFiles(extractPath, files);
-       }
+       if (_directoryService.Exists(extractPath)) return chapter;
+       var files = chapter.Files.ToList();
+       ExtractChapterFiles(extractPath, files);

        return chapter;
    }
@@ -215,9 +213,8 @@ namespace API.Services
    {
        // Calculate what chapter the page belongs to
        var path = GetCachePath(chapter.Id);
-       var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
-       files = files
-           .AsEnumerable()
+       // TODO: We can optimize this by extracting and renaming, so we don't need to scan for the files and can do a direct access
+       var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions)
            .OrderByNatural(Path.GetFileNameWithoutExtension)
            .ToArray();
@@ -9,6 +9,7 @@ using System.Threading.Tasks;
 using API.DTOs.System;
 using API.Entities.Enums;
 using API.Extensions;
+using Kavita.Common.Helpers;
 using Microsoft.Extensions.Logging;

 namespace API.Services
@@ -57,6 +58,17 @@ namespace API.Services
        void RemoveNonImages(string directoryName);
        void Flatten(string directoryName);
        Task<bool> CheckWriteAccess(string directoryName);

+       IEnumerable<string> GetFilesWithCertainExtensions(string path,
+           string searchPatternExpression = "",
+           SearchOption searchOption = SearchOption.TopDirectoryOnly);
+
+       IEnumerable<string> GetDirectories(string folderPath);
+       string GetParentDirectoryName(string fileOrFolder);
+#nullable enable
+       IList<string> ScanFiles(string folderPath, GlobMatcher? matcher = null);
+       DateTime GetLastWriteTime(string folderPath);
+#nullable disable
    }
    public class DirectoryService : IDirectoryService
    {
@@ -105,7 +117,7 @@ namespace API.Services
        /// <param name="searchPatternExpression">Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files.</param>
        /// <param name="searchOption">SearchOption to use, defaults to TopDirectoryOnly</param>
        /// <returns>List of file paths</returns>
-       private IEnumerable<string> GetFilesWithCertainExtensions(string path,
+       public IEnumerable<string> GetFilesWithCertainExtensions(string path,
            string searchPatternExpression = "",
            SearchOption searchOption = SearchOption.TopDirectoryOnly)
        {
@@ -507,10 +519,175 @@ namespace API.Services
            return dirs;
        }

        /// <summary>
        /// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope.
        /// </summary>
        /// <param name="folderPath"></param>
        /// <returns>List of directory paths, empty if path doesn't exist</returns>
        public IEnumerable<string> GetDirectories(string folderPath)
        {
            if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray<string>.Empty;
            return FileSystem.Directory.GetDirectories(folderPath)
                .Where(path => ExcludeDirectories.Matches(path).Count == 0);
        }

        /// <summary>
        /// Returns all directories, including subdirectories. Automatically excludes directories that shouldn't be in scope.
        /// </summary>
        /// <param name="folderPath"></param>
        /// <returns></returns>
        public IEnumerable<string> GetAllDirectories(string folderPath)
        {
            if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray<string>.Empty;
            var directories = new List<string>();

            var foundDirs = GetDirectories(folderPath);
            foreach (var foundDir in foundDirs)
            {
                directories.Add(foundDir);
                directories.AddRange(GetAllDirectories(foundDir));
            }

            return directories;
        }

        /// <summary>
        /// Returns the parent directory's name for a file or folder. Empty string if the path is not valid.
        /// </summary>
        /// <remarks>This does touch I/O with an Attribute lookup</remarks>
        /// <param name="fileOrFolder"></param>
        /// <returns></returns>
        public string GetParentDirectoryName(string fileOrFolder)
        {
            // TODO: Write Unit tests
            try
            {
                var attr = File.GetAttributes(fileOrFolder);
                var isDirectory = attr.HasFlag(FileAttributes.Directory);
                if (isDirectory)
                {
                    return Parser.Parser.NormalizePath(FileSystem.DirectoryInfo
                        .FromDirectoryName(fileOrFolder).Parent
                        .FullName);
                }

                return Parser.Parser.NormalizePath(FileSystem.FileInfo
                    .FromFileName(fileOrFolder).Directory.Parent
                    .FullName);
            }
            catch (Exception)
            {
                return string.Empty;
            }
        }
        /// <summary>
        /// Scans a directory by utilizing a recursive folder search. If a .kavitaignore file is found, will ignore matching patterns
        /// </summary>
        /// <param name="folderPath"></param>
        /// <param name="matcher"></param>
        /// <returns></returns>
        public IList<string> ScanFiles(string folderPath, GlobMatcher? matcher = null)
        {
            _logger.LogDebug("[ScanFiles] called on {Path}", folderPath);
            var files = new List<string>();
            if (!Exists(folderPath)) return files;

            var potentialIgnoreFile = FileSystem.Path.Join(folderPath, ".kavitaignore");
            if (matcher == null)
            {
                matcher = CreateMatcherFromFile(potentialIgnoreFile);
            }
            else
            {
                matcher.Merge(CreateMatcherFromFile(potentialIgnoreFile));
            }

            IEnumerable<string> directories;
            if (matcher == null)
            {
                directories = GetDirectories(folderPath);
            }
            else
            {
                directories = GetDirectories(folderPath)
                    .Where(folder => matcher != null &&
                                     !matcher.ExcludeMatches($"{FileSystem.DirectoryInfo.FromDirectoryName(folder).Name}{FileSystem.Path.AltDirectorySeparatorChar}"));
            }

            foreach (var directory in directories)
            {
                files.AddRange(ScanFiles(directory, matcher));
            }

            // Get the matcher from either ignore or global (default setup)
            if (matcher == null)
            {
                files.AddRange(GetFilesWithCertainExtensions(folderPath, Parser.Parser.SupportedExtensions));
            }
            else
            {
                var foundFiles = GetFilesWithCertainExtensions(folderPath,
                        Parser.Parser.SupportedExtensions)
                    .Where(file => !matcher.ExcludeMatches(FileSystem.FileInfo.FromFileName(file).Name));
                files.AddRange(foundFiles);
            }

            return files;
        }
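Note: to make the ignore semantics concrete, here is a hypothetical layout showing how nested .kavitaignore files stack as ScanFiles recurses (the matcher merged at a parent folder is passed down to each child):

    // Assumed library layout:
    //   /manga/.kavitaignore            contains:  Raws/
    //   /manga/Raws/...                 -> whole folder excluded at the top level
    //   /manga/One Piece/.kavitaignore  contains:  *.png
    //   /manga/One Piece/cover.png      -> excluded by the nested ignore
    //   /manga/One Piece/Vol 1.cbz      -> returned
    var files = directoryService.ScanFiles("/manga"); // directoryService: an injected IDirectoryService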
        /// <summary>
        /// Recursively scans a folder and returns the max last write time on any folders
        /// </summary>
        /// <remarks>This is required vs just an attribute check as NTFS does not bubble up certain events from nested folders.
        /// This will also skip the recursive check if the drive is not NTFS</remarks>
        /// <param name="folderPath"></param>
        /// <returns>Max Last Write Time</returns>
        public DateTime GetLastWriteTime(string folderPath)
        {
            if (!FileSystem.Directory.Exists(folderPath)) throw new IOException($"{folderPath} does not exist");
            if (new DriveInfo(FileSystem.Path.GetPathRoot(folderPath)).DriveFormat != "NTFS")
            {
                return FileSystem.Directory.GetLastWriteTime(folderPath);
            }

            var directories = GetAllDirectories(folderPath).ToList();
            if (directories.Count == 0) return FileSystem.Directory.GetLastWriteTime(folderPath);

            return directories.Max(d => FileSystem.Directory.GetLastWriteTime(d));
        }
        private GlobMatcher CreateMatcherFromFile(string filePath)
        {
            if (!FileSystem.File.Exists(filePath))
            {
                return null;
            }

            // Read file in and add each line to Matcher
            var lines = FileSystem.File.ReadAllLines(filePath);
            if (lines.Length == 0)
            {
                return null;
            }

            GlobMatcher matcher = new();
            foreach (var line in lines)
            {
                matcher.AddExclude(line);
            }

            return matcher;
        }
        /// <summary>
        /// Recursively scans files and applies an action on them. This uses as many cores as the underlying PC has to speed
        /// up processing.
        /// NOTE: This is no longer parallel due to user's machines locking up
        /// </summary>
        /// <param name="root">Directory to scan</param>
        /// <param name="action">Action to apply on file path</param>
@@ -538,18 +715,16 @@ namespace API.Services
            string[] files;

            try {
-               subDirs = FileSystem.Directory.GetDirectories(currentDir).Where(path => ExcludeDirectories.Matches(path).Count == 0);
+               subDirs = GetDirectories(currentDir);
            }
            // Thrown if we do not have discovery permission on the directory.
            catch (UnauthorizedAccessException e) {
-               Console.WriteLine(e.Message);
-               logger.LogError(e, "Unauthorized access on {Directory}", currentDir);
+               logger.LogCritical(e, "Unauthorized access on {Directory}", currentDir);
                continue;
            }
            // Thrown if another process has deleted the directory after we retrieved its name.
            catch (DirectoryNotFoundException e) {
-               Console.WriteLine(e.Message);
-               logger.LogError(e, "Directory not found on {Directory}", currentDir);
+               logger.LogCritical(e, "Directory not found on {Directory}", currentDir);
                continue;
            }

@@ -558,15 +733,15 @@ namespace API.Services
                    .ToArray();
            }
            catch (UnauthorizedAccessException e) {
-               Console.WriteLine(e.Message);
+               logger.LogCritical(e, "Unauthorized access on a file in {Directory}", currentDir);
                continue;
            }
            catch (DirectoryNotFoundException e) {
-               Console.WriteLine(e.Message);
+               logger.LogCritical(e, "Directory not found on a file in {Directory}", currentDir);
                continue;
            }
            catch (IOException e) {
-               Console.WriteLine(e.Message);
+               logger.LogCritical(e, "IO exception on a file in {Directory}", currentDir);
                continue;
            }

@@ -577,19 +752,16 @@ namespace API.Services
                foreach (var file in files) {
                    action(file);
                    fileCount++;
                }
            }
        }
        catch (AggregateException ae) {
            ae.Handle((ex) => {
-               if (ex is UnauthorizedAccessException) {
-                   // Here we just output a message and go on.
-                   Console.WriteLine(ex.Message);
-                   _logger.LogError(ex, "Unauthorized access on file");
-                   return true;
-               }
-               // Handle other exceptions here if necessary...
-
-               return false;
+               if (ex is not UnauthorizedAccessException) return false;
+               // Here we just output a message and go on.
+               _logger.LogError(ex, "Unauthorized access on file");
+               return true;
+               // Handle other exceptions here if necessary...
            });
        }
@@ -1,6 +1,7 @@
 using System;
 using System.Threading;
 using System.Threading.Tasks;
+using API.Services.Tasks.Scanner;
 using Microsoft.Extensions.DependencyInjection;
 using Microsoft.Extensions.Hosting;

@@ -23,6 +24,8 @@ namespace API.Services.HostedServices
            await taskScheduler.ScheduleTasks();
            taskScheduler.ScheduleUpdaterTasks();

            try
            {
                // These methods will automatically check if stat collection is disabled to prevent sending any data regardless
@@ -34,6 +37,9 @@ namespace API.Services.HostedServices
            {
                // If stats startup fails, the user can keep using the app
            }

+           var libraryWatcher = scope.ServiceProvider.GetRequiredService<ILibraryWatcher>();
+           //await libraryWatcher.StartWatchingLibraries(); // TODO: Enable this in the next PR
        }

        public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;
|
@ -37,6 +37,9 @@ public interface IMetadataService
|
||||
/// <param name="seriesId"></param>
|
||||
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
|
||||
Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true);
|
||||
|
||||
Task GenerateCoversForSeries(Series series, bool forceUpdate = false);
|
||||
Task RemoveAbandonedMetadataKeys();
|
||||
}
|
||||
|
||||
public class MetadataService : IMetadataService
|
||||
@ -77,10 +80,8 @@ public class MetadataService : IMetadataService
|
||||
|
||||
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile.FilePath);
|
||||
chapter.CoverImage = _readingItemService.GetCoverImage(firstFile.FilePath, ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId), firstFile.Format);
|
||||
|
||||
// await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate,
|
||||
// MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter), false);
|
||||
_updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter));
|
||||
_unitOfWork.ChapterRepository.Update(chapter); // BUG: CoverImage isn't saving for Monter Masume with new scan loop
|
||||
_updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter)); // TODO: IDEA: Instead of firing here where it's not yet saved, maybe collect the ids and fire after save
|
||||
return Task.FromResult(true);
|
||||
}
|
||||
|
||||
@ -271,17 +272,18 @@ public class MetadataService : IMetadataService
|
||||
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
|
||||
MessageFactory.CoverUpdateProgressEvent(library.Id, 1F, ProgressEventType.Ended, $"Complete"));
|
||||
|
||||
await RemoveAbandonedMetadataKeys();
|
||||
|
||||
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
|
||||
}
|
||||
|
||||
|
||||
private async Task RemoveAbandonedMetadataKeys()
|
||||
public async Task RemoveAbandonedMetadataKeys()
|
||||
{
|
||||
await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated();
|
||||
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
|
||||
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
|
||||
await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
|
||||
await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters();
|
||||
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
@ -292,7 +294,6 @@ public class MetadataService : IMetadataService
|
||||
/// <param name="forceUpdate">Overrides any cache logic and forces execution</param>
|
||||
public async Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true)
|
||||
{
|
||||
var sw = Stopwatch.StartNew();
|
||||
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
|
||||
if (series == null)
|
||||
{
|
||||
@ -300,8 +301,19 @@ public class MetadataService : IMetadataService
|
||||
return;
|
||||
}
|
||||
|
||||
await GenerateCoversForSeries(series, forceUpdate);
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
/// Generate Cover for a Series. This is used by Scan Loop and should not be invoked directly via User Interaction.
|
||||
/// </summary>
|
||||
/// <param name="series">A full Series, with metadata, chapters, etc</param>
|
||||
/// <param name="forceUpdate"></param>
|
||||
public async Task GenerateCoversForSeries(Series series, bool forceUpdate = false)
|
||||
{
|
||||
var sw = Stopwatch.StartNew();
|
||||
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
|
||||
MessageFactory.CoverUpdateProgressEvent(libraryId, 0F, ProgressEventType.Started, series.Name));
|
||||
MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 0F, ProgressEventType.Started, series.Name));
|
||||
|
||||
await ProcessSeriesCoverGen(series, forceUpdate);
|
||||
|
||||
@ -309,17 +321,14 @@ public class MetadataService : IMetadataService
|
||||
if (_unitOfWork.HasChanges())
|
||||
{
|
||||
await _unitOfWork.CommitAsync();
|
||||
_logger.LogInformation("[MetadataService] Updated cover images for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
|
||||
}
|
||||
|
||||
await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
|
||||
MessageFactory.CoverUpdateProgressEvent(libraryId, 1F, ProgressEventType.Ended, series.Name));
|
||||
|
||||
await RemoveAbandonedMetadataKeys();
|
||||
MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 1F, ProgressEventType.Ended, series.Name));
|
||||
|
||||
await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series), false);
|
||||
await FlushEvents();
|
||||
|
||||
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
|
||||
}
|
||||
|
||||
private async Task FlushEvents()
|
||||
|
@@ -12,6 +12,7 @@ public interface IReadingItemService
    string GetCoverImage(string filePath, string fileName, MangaFormat format);
    void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1);
    ParserInfo Parse(string path, string rootPath, LibraryType type);
+   ParserInfo ParseFile(string path, string rootPath, LibraryType type);
}

public class ReadingItemService : IReadingItemService
@@ -20,7 +21,7 @@ public class ReadingItemService : IReadingItemService
    private readonly IBookService _bookService;
    private readonly IImageService _imageService;
    private readonly IDirectoryService _directoryService;
-   private readonly DefaultParser _defaultParser;
+   private readonly IDefaultParser _defaultParser;

    public ReadingItemService(IArchiveService archiveService, IBookService bookService, IImageService imageService, IDirectoryService directoryService)
    {
@@ -52,6 +53,71 @@ public class ReadingItemService : IReadingItemService
        return null;
    }

+   /// <summary>
+   /// Processes files found during a library scan.
+   /// </summary>
+   /// <param name="path">Path of a file</param>
+   /// <param name="rootPath"></param>
+   /// <param name="type">Library type to determine parsing to perform</param>
+   public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
+   {
+       var info = Parse(path, rootPath, type);
+       if (info == null)
+       {
+           return null;
+       }
+
+       // This catches when the original library type is Manga/Comic and an epub was parsed with a non-Book parser
+       if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volumes != DefaultVolume?
+       {
+           info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
+           var info2 = Parse(path, rootPath, type);
+           info.Merge(info2);
+       }
+
+       info.ComicInfo = GetComicInfo(path);
+       if (info.ComicInfo == null) return info;
+
+       if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
+       {
+           info.Volumes = info.ComicInfo.Volume;
+       }
+       if (!string.IsNullOrEmpty(info.ComicInfo.Series))
+       {
+           info.Series = info.ComicInfo.Series.Trim();
+       }
+       if (!string.IsNullOrEmpty(info.ComicInfo.Number))
+       {
+           info.Chapters = info.ComicInfo.Number;
+       }
+
+       // Patch in SeriesSort from ComicInfo
+       if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort))
+       {
+           info.SeriesSort = info.ComicInfo.TitleSort.Trim();
+       }
+
+       if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format))
+       {
+           info.IsSpecial = true;
+           info.Chapters = Parser.Parser.DefaultChapter;
+           info.Volumes = Parser.Parser.DefaultVolume;
+       }
+
+       if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort))
+       {
+           info.SeriesSort = info.ComicInfo.SeriesSort.Trim();
+       }
+
+       if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries))
+       {
+           info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim();
+       }
+
+       return info;
+   }

    /// <summary>
    ///
    /// </summary>
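Note: the net effect of ParseFile is that ComicInfo fields, when present, win over whatever was parsed from the filename. A hypothetical before/after for a single file:

    // Assumed file "One Piece v03.cbz" whose ComicInfo.xml carries Series="ONE PIECE", Volume="3", Number="21"
    var info = readingItemService.ParseFile("/manga/One Piece/One Piece v03.cbz", "/manga", LibraryType.Manga);
    // info.Series   == "ONE PIECE"  (ComicInfo overrides the filename parse)
    // info.Volumes  == "3"
    // info.Chapters == "21"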
@@ -422,8 +422,17 @@ public class SeriesService : ISeriesService
        }

        var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(seriesIds);
+       var libraryIds = series.Select(s => s.LibraryId);
+       var libraries = await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(libraryIds);
+       foreach (var library in libraries)
+       {
+           library.LastModified = DateTime.Now;
+           _unitOfWork.LibraryRepository.Update(library);
+       }

        _unitOfWork.SeriesRepository.Remove(series);

        if (!_unitOfWork.HasChanges() || !await _unitOfWork.CommitAsync()) return true;

        foreach (var s in series)
@@ -8,8 +8,8 @@ using API.Entities.Enums;
 using API.Helpers.Converters;
 using API.Services.Tasks;
 using API.Services.Tasks.Metadata;
+using API.Services.Tasks.Scanner;
 using Hangfire;
 using Hangfire.Storage;
 using Microsoft.Extensions.Logging;

 namespace API.Services;
@@ -29,8 +29,6 @@ public interface ITaskScheduler
    void CancelStatsTasks();
    Task RunStatCollection();
    void ScanSiteThemes();

}
public class TaskScheduler : ITaskScheduler
{
@@ -48,6 +46,9 @@ public class TaskScheduler : ITaskScheduler
    private readonly IWordCountAnalyzerService _wordCountAnalyzerService;

    public static BackgroundJobServer Client => new BackgroundJobServer();
+   public const string ScanQueue = "scan";
+   public const string DefaultQueue = "default";

    private static readonly Random Rnd = new Random();

@@ -83,7 +84,7 @@ public class TaskScheduler : ITaskScheduler
    }
    else
    {
-       RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
+       RecurringJob.AddOrUpdate("scan-libraries", () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local);
    }

    setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value;
@@ -149,6 +150,7 @@ public class TaskScheduler : ITaskScheduler
        BackgroundJob.Enqueue(() => _themeService.Scan());
    }

    #endregion

    #region UpdateTasks
@@ -161,13 +163,31 @@ public class TaskScheduler : ITaskScheduler
    }
    #endregion
+   public void ScanLibraries()
+   {
+       if (RunningAnyTasksByMethod(new List<string>() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue))
+       {
+           _logger.LogInformation("A Scan is already running, rescheduling ScanLibraries in 3 hours");
+           BackgroundJob.Schedule(() => ScanLibraries(), TimeSpan.FromHours(3));
+           return;
+       }
+       _scannerService.ScanLibraries();
+   }

    public void ScanLibrary(int libraryId)
    {
-       if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}))
+       if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId}, ScanQueue))
        {
            _logger.LogInformation("A duplicate request to scan library for library occurred. Skipping");
            return;
        }
+       if (RunningAnyTasksByMethod(new List<string>() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue))
+       {
+           _logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours");
+           BackgroundJob.Schedule(() => ScanLibrary(libraryId), TimeSpan.FromHours(3));
+           return;
+       }

        _logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
        BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId));
        // When we do a scan, force cache to re-unpack in case page numbers change
@@ -181,7 +201,7 @@ public class TaskScheduler : ITaskScheduler

    public void RefreshMetadata(int libraryId, bool forceUpdate = true)
    {
-       if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate}))
+       if (HasAlreadyEnqueuedTask("MetadataService","GenerateCoversForLibrary", new object[] {libraryId, forceUpdate}))
        {
            _logger.LogInformation("A duplicate request to refresh metadata for library occurred. Skipping");
            return;
@@ -193,7 +213,7 @@ public class TaskScheduler : ITaskScheduler

    public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false)
    {
-       if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate}))
+       if (HasAlreadyEnqueuedTask("MetadataService","GenerateCoversForSeries", new object[] {libraryId, seriesId, forceUpdate}))
        {
            _logger.LogInformation("A duplicate request to refresh metadata for library occurred. Skipping");
            return;
@@ -205,14 +225,20 @@ public class TaskScheduler : ITaskScheduler

    public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
    {
-       if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate}))
+       if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {seriesId, forceUpdate}, ScanQueue))
        {
            _logger.LogInformation("A duplicate request to scan series occurred. Skipping");
            return;
        }
+       if (RunningAnyTasksByMethod(new List<string>() {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}, ScanQueue))
+       {
+           _logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 mins");
+           BackgroundJob.Schedule(() => ScanSeries(libraryId, seriesId, forceUpdate), TimeSpan.FromMinutes(10));
+           return;
+       }

        _logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
-       BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None));
+       BackgroundJob.Enqueue(() => _scannerService.ScanSeries(seriesId, forceUpdate));
    }

    public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false)
@@ -250,7 +276,7 @@ public class TaskScheduler : ITaskScheduler
    /// <param name="args">object[] of arguments in the order they are passed to enqueued job</param>
    /// <param name="queue">Queue to check against. Defaults to "default"</param>
    /// <returns></returns>
-   private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = "default")
+   public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue)
    {
        var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
        return enqueuedJobs.Any(j => j.Value.InEnqueuedState &&
@@ -258,4 +284,11 @@ public class TaskScheduler : ITaskScheduler
            j.Value.Job.Method.Name.Equals(methodName) &&
            j.Value.Job.Method.DeclaringType.Name.Equals(className));
    }

+   public static bool RunningAnyTasksByMethod(IEnumerable<string> classNames, string queue = DefaultQueue)
+   {
+       var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue);
+       return enqueuedJobs.Any(j => !j.Value.InEnqueuedState &&
+           classNames.Contains(j.Value.Job.Method.DeclaringType?.Name));
+   }
}
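Note: these two now-public helpers give other services (LibraryWatcher below) a cheap way to ask Hangfire's monitoring API whether scan work is pending or executing. A hypothetical guard composed from them:

    // Hypothetical: bail out of a new scan request if one is queued or running.
    if (TaskScheduler.HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary",
            new object[] {libraryId}, TaskScheduler.ScanQueue)
        || TaskScheduler.RunningAnyTasksByMethod(new[] {"ScannerService"}, TaskScheduler.ScanQueue))
    {
        return; // a scan for this library is already in flight
    }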
|
@ -142,7 +142,8 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService
|
||||
_logger.LogInformation("[WordCountAnalyzerService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
|
||||
}
|
||||
|
||||
private async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true)
|
||||
|
||||
public async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true)
|
||||
{
|
||||
var isEpub = series.Format == MangaFormat.Epub;
|
||||
var existingWordCount = series.WordCount;
|
||||
@ -208,6 +209,11 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService
|
||||
chapter.MinHoursToRead = est.MinHours;
|
||||
chapter.MaxHoursToRead = est.MaxHours;
|
||||
chapter.AvgHoursToRead = est.AvgHours;
|
||||
foreach (var file in chapter.Files)
|
||||
{
|
||||
file.LastFileAnalysis = DateTime.Now;
|
||||
_unitOfWork.MangaFileRepository.Update(file);
|
||||
}
|
||||
_unitOfWork.ChapterRepository.Update(chapter);
|
||||
}
|
||||
|
||||
|
API/Services/Tasks/Scanner/LibraryWatcher.cs (new file, 212 lines)
@@ -0,0 +1,212 @@
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using API.Data;
using Hangfire;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

namespace API.Services.Tasks.Scanner;

public interface ILibraryWatcher
{
    Task StartWatchingLibraries();
}

internal class FolderScanQueueable
{
    public DateTime QueueTime { get; set; }
    public string FolderPath { get; set; }
}

internal class FolderScanQueueableComparer : IEqualityComparer<FolderScanQueueable>
{
    public bool Equals(FolderScanQueueable x, FolderScanQueueable y)
    {
        if (ReferenceEquals(x, y)) return true;
        if (ReferenceEquals(x, null)) return false;
        if (ReferenceEquals(y, null)) return false;
        if (x.GetType() != y.GetType()) return false;
        return x.FolderPath == y.FolderPath;
    }

    public int GetHashCode(FolderScanQueueable obj)
    {
        return HashCode.Combine(obj.FolderPath);
    }
}
/// <summary>
/// Responsible for watching the file system and processing change events. This is mainly responsible for invoking
/// the Scanner to quickly pick up on changes.
/// </summary>
public class LibraryWatcher : ILibraryWatcher
{
    private readonly IDirectoryService _directoryService;
    private readonly IUnitOfWork _unitOfWork;
    private readonly ILogger<LibraryWatcher> _logger;
    private readonly IScannerService _scannerService;

    private readonly IList<FileSystemWatcher> _watchers = new List<FileSystemWatcher>();

    private readonly Dictionary<string, IList<FileSystemWatcher>> _watcherDictionary = new ();

    private IList<string> _libraryFolders = new List<string>();

    // TODO: This needs to be blocking so we can consume from another thread
    private readonly Queue<FolderScanQueueable> _scanQueue = new Queue<FolderScanQueueable>();
    //public readonly BlockingCollection<FolderScanQueueable> ScanQueue = new BlockingCollection<FolderScanQueueable>();
    private readonly TimeSpan _queueWaitTime;

    public LibraryWatcher(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger<LibraryWatcher> logger, IScannerService scannerService, IHostEnvironment environment)
    {
        _directoryService = directoryService;
        _unitOfWork = unitOfWork;
        _logger = logger;
        _scannerService = scannerService;

        _queueWaitTime = environment.IsDevelopment() ? TimeSpan.FromSeconds(10) : TimeSpan.FromMinutes(5);
    }

    public async Task StartWatchingLibraries()
    {
        _logger.LogInformation("Starting file watchers");
        _libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()).SelectMany(l => l.Folders).ToList();

        foreach (var library in await _unitOfWork.LibraryRepository.GetLibraryDtosAsync())
        {
            foreach (var libraryFolder in library.Folders)
            {
                _logger.LogInformation("Watching {FolderPath}", libraryFolder);
                var watcher = new FileSystemWatcher(libraryFolder);
                watcher.NotifyFilter = NotifyFilters.CreationTime
                                       | NotifyFilters.DirectoryName
                                       | NotifyFilters.FileName
                                       | NotifyFilters.LastWrite
                                       | NotifyFilters.Size;

                watcher.Changed += OnChanged;
                watcher.Created += OnCreated;
                watcher.Deleted += OnDeleted;
                watcher.Renamed += OnRenamed;

                watcher.Filter = "*.*"; // TODO: Configure with Parser files
                watcher.IncludeSubdirectories = true;
                watcher.EnableRaisingEvents = true;
                _logger.LogInformation("Watching {Folder}", libraryFolder);
                _watchers.Add(watcher);
                if (!_watcherDictionary.ContainsKey(libraryFolder))
                {
                    _watcherDictionary.Add(libraryFolder, new List<FileSystemWatcher>());
                }

                _watcherDictionary[libraryFolder].Add(watcher);
            }
        }
    }
    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        if (e.ChangeType != WatcherChangeTypes.Changed) return;
        Console.WriteLine($"Changed: {e.FullPath}, {e.Name}");
        ProcessChange(e.FullPath);
    }

    private void OnCreated(object sender, FileSystemEventArgs e)
    {
        Console.WriteLine($"Created: {e.FullPath}, {e.Name}");
        ProcessChange(e.FullPath);
    }

    private void OnDeleted(object sender, FileSystemEventArgs e) {
        Console.WriteLine($"Deleted: {e.FullPath}, {e.Name}");
        ProcessChange(e.FullPath);
    }

    private void OnRenamed(object sender, RenamedEventArgs e)
    {
        Console.WriteLine($"Renamed:");
        Console.WriteLine($"    Old: {e.OldFullPath}");
        Console.WriteLine($"    New: {e.FullPath}");
        ProcessChange(e.FullPath);
    }
    private void ProcessChange(string filePath)
    {
        if (!new Regex(Parser.Parser.SupportedExtensions).IsMatch(new FileInfo(filePath).Extension)) return;
        // Don't do anything if a Library or ScanSeries is in progress
        if (TaskScheduler.RunningAnyTasksByMethod(new[] {"MetadataService", "ScannerService"}))
        {
            _logger.LogDebug("Suppressing change as a scan is in progress");
            return;
        }

        var parentDirectory = _directoryService.GetParentDirectoryName(filePath);
        if (string.IsNullOrEmpty(parentDirectory)) return;

        // We need to find the library this creation belongs to
        // Multiple libraries can point to the same base folder. In this case, we need to use FirstOrDefault
        var libraryFolder = _libraryFolders.Select(Parser.Parser.NormalizePath).FirstOrDefault(f => f.Contains(parentDirectory));

        if (string.IsNullOrEmpty(libraryFolder)) return;

        var rootFolder = _directoryService.GetFoldersTillRoot(libraryFolder, filePath).ToList();
        if (!rootFolder.Any()) return;

        // Select the first folder and join with library folder, this should give us the folder to scan.
        var fullPath = _directoryService.FileSystem.Path.Join(libraryFolder, rootFolder.First());
        var queueItem = new FolderScanQueueable()
        {
            FolderPath = fullPath,
            QueueTime = DateTime.Now
        };
        if (_scanQueue.Contains(queueItem, new FolderScanQueueableComparer()))
        {
            ProcessQueue();
            return;
        }

        _scanQueue.Enqueue(queueItem);

        ProcessQueue();
    }
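Note: a worked example of the path math above, under an assumed layout:

    // Assumed: library folder /manga, change event for /manga/One Piece/Volume 01/ch1.cbz
    // parentDirectory    -> /manga/One Piece/Volume 01   (GetParentDirectoryName)
    // libraryFolder      -> /manga                       (first library folder containing it)
    // rootFolder.First() -> "One Piece"                  (folder directly under the library root)
    // fullPath           -> /manga/One Piece             (what gets queued for ScanFolder)

This lines up with Series.FolderPath: the queued target is the highest series-level folder, not the file's immediate directory.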
    /// <summary>
    /// Instead of making things complicated with a separate thread, this service will process the queue whenever a change occurs
    /// </summary>
    private void ProcessQueue()
    {
        var i = 0;
        while (i < _scanQueue.Count)
        {
            var item = _scanQueue.Peek();
            if (item.QueueTime < DateTime.Now.Subtract(_queueWaitTime))
            {
                _logger.LogDebug("Scheduling ScanSeriesFolder for {Folder}", item.FolderPath);
                BackgroundJob.Enqueue(() => _scannerService.ScanFolder(item.FolderPath));
                _scanQueue.Dequeue();
                i++;
            }
            else
            {
                break;
            }
        }

        if (_scanQueue.Count > 0)
        {
            Task.Delay(TimeSpan.FromSeconds(10)).ContinueWith(t => ProcessQueue());
        }
    }
}
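Note: the effect is a simple debounce. A folder only reaches ScanFolder once it has sat in the queue for _queueWaitTime, and the Task.Delay continuation re-polls while anything is pending. A hypothetical timeline with the development setting (10s wait):

    // t=0s    ch1.cbz copied in  -> "/manga/One Piece" enqueued (QueueTime = t0)
    // t=2s    ch2.cbz copied in  -> same folder already queued, ProcessQueue just re-runs
    // t>=10s  a ProcessQueue pass sees QueueTime older than the wait window
    //         -> BackgroundJob.Enqueue(ScanFolder("/manga/One Piece")), item dequeued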
@@ -1,37 +1,55 @@
 using System;
 using System.Collections.Concurrent;
 using System.Collections.Generic;
 using System.Diagnostics;
 using System.IO;
 using System.Linq;
 using System.Threading.Tasks;
 using API.Data.Metadata;
 using API.Entities;
 using API.Entities.Enums;
 using API.Extensions;
 using API.Helpers;
 using API.Parser;
 using API.SignalR;
 using Microsoft.AspNetCore.SignalR;
 using Microsoft.Extensions.Logging;

 namespace API.Services.Tasks.Scanner
 {
    public class ParsedSeries
    {
        /// <summary>
        /// Name of the Series
        /// </summary>
        public string Name { get; init; }
        /// <summary>
        /// Normalized Name of the Series
        /// </summary>
        public string NormalizedName { get; init; }
        /// <summary>
        /// Format of the Series
        /// </summary>
        public MangaFormat Format { get; init; }
    }

+   public enum Modified
+   {
+       Modified = 1,
+       NotModified = 2
+   }
+
+   public class SeriesModified
+   {
+       public string FolderPath { get; set; }
+       public string SeriesName { get; set; }
+       public DateTime LastScanned { get; set; }
+       public MangaFormat Format { get; set; }
+   }
    public class ParseScannedFiles
    {
-       private readonly ConcurrentDictionary<ParsedSeries, List<ParserInfo>> _scannedSeries;
        private readonly ILogger _logger;
        private readonly IDirectoryService _directoryService;
        private readonly IReadingItemService _readingItemService;
        private readonly IEventHub _eventHub;
-       private readonly DefaultParser _defaultParser;

        /// <summary>
        /// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
@@ -47,8 +65,6 @@ namespace API.Services.Tasks.Scanner
            _logger = logger;
            _directoryService = directoryService;
            _readingItemService = readingItemService;
-           _scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
-           _defaultParser = new DefaultParser(_directoryService);
            _eventHub = eventHub;
        }

@@ -58,7 +74,7 @@ namespace API.Services.Tasks.Scanner
        /// <param name="parsedSeries"></param>
        /// <param name="series"></param>
        /// <returns></returns>
-       public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries, Series series)
+       public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, IList<ParserInfo>> parsedSeries, Series series)
        {
            var allKeys = parsedSeries.Keys.Where(ps =>
                SeriesHelper.FindSeries(series, ps));
@@ -72,83 +88,46 @@ namespace API.Services.Tasks.Scanner
            return infos;
        }
        /// <summary>
-       /// Processes files found during a library scan.
-       /// Populates a collection of <see cref="ParserInfo"/> for DB updates later.
+       /// This will Scan all files in a folder path. For each folder within the folderPath, FolderAction will be invoked for all files contained
        /// </summary>
-       /// <param name="path">Path of a file</param>
-       /// <param name="rootPath"></param>
-       /// <param name="type">Library type to determine parsing to perform</param>
-       private void ProcessFile(string path, string rootPath, LibraryType type)
-       {
-           var info = _readingItemService.Parse(path, rootPath, type);
-           if (info == null)
-           {
-               // If the file is an image and literally a cover image, skip processing.
-               if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path)))
-               {
-                   _logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
-               }
-               return;
-           }
-
-           // This catches when original library type is Manga/Comic and when parsing with non
-           if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volumes != DefaultVolume?
-           {
-               info = _defaultParser.Parse(path, rootPath, LibraryType.Book);
-               var info2 = _readingItemService.Parse(path, rootPath, type);
-               info.Merge(info2);
-           }
-
-           info.ComicInfo = _readingItemService.GetComicInfo(path);
-           if (info.ComicInfo != null)
-           {
-               if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
-               {
-                   info.Volumes = info.ComicInfo.Volume;
-               }
-               if (!string.IsNullOrEmpty(info.ComicInfo.Series))
-               {
-                   info.Series = info.ComicInfo.Series.Trim();
-               }
-               if (!string.IsNullOrEmpty(info.ComicInfo.Number))
-               {
-                   info.Chapters = info.ComicInfo.Number;
-               }
-
-               // Patch in SeriesSort from ComicInfo
-               if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort))
-               {
-                   info.SeriesSort = info.ComicInfo.TitleSort.Trim();
-               }
-
-               if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format))
-               {
-                   info.IsSpecial = true;
-                   info.Chapters = Parser.Parser.DefaultChapter;
-                   info.Volumes = Parser.Parser.DefaultVolume;
-               }
-
-               if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort))
-               {
-                   info.SeriesSort = info.ComicInfo.SeriesSort.Trim();
-               }
-
-               if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries))
-               {
-                   info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim();
-               }
-           }
-
-           try
-           {
-               TrackSeries(info);
-           }
-           catch (Exception ex)
-           {
-               _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath);
-           }
-       }
+       /// <param name="scanDirectoryByDirectory">Scan directory by directory and for each, call folderAction</param>
+       /// <param name="folderPath">A library folder or series folder</param>
+       /// <param name="folderAction">A callback async Task to be called once all files for each folder path are found</param>
+       /// <param name="forceCheck">If we should bypass any folder last write time checks on the scan and force I/O</param>
+       public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory,
+           IDictionary<string, IList<SeriesModified>> seriesPaths, Func<IList<string>, string, Task> folderAction, bool forceCheck = false)
+       {
+           string normalizedPath;
+           if (scanDirectoryByDirectory)
+           {
+               var directories = _directoryService.GetDirectories(folderPath).ToList();
+
+               foreach (var directory in directories)
+               {
+                   normalizedPath = Parser.Parser.NormalizePath(directory);
+                   if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
+                   {
+                       await folderAction(new List<string>(), directory);
+                   }
+                   else
+                   {
+                       // For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication
+                       await folderAction(_directoryService.ScanFiles(directory), directory);
+                   }
+               }
+
+               return;
+           }
+
+           normalizedPath = Parser.Parser.NormalizePath(folderPath);
+           if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck))
+           {
+               await folderAction(new List<string>(), folderPath);
+               return;
+           }
+           await folderAction(_directoryService.ScanFiles(folderPath), folderPath);
+       }
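Note: a minimal sketch of how a caller drives ProcessFiles; the callback receives the file list and the folder it came from, and an empty list signals "unchanged, keep as-is" (parseScannedFiles and seriesPaths are assumed in scope):

    // Hypothetical: scanning one library folder, directory by directory.
    await parseScannedFiles.ProcessFiles("/manga", scanDirectoryByDirectory: true, seriesPaths,
        async (files, folder) =>
        {
            if (files.Count == 0) return; // folder unchanged since the last scan
            // parse files, track series, hand off for DB updates...
        });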
@@ -156,13 +135,14 @@ namespace API.Services.Tasks.Scanner
        /// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing.
        /// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
        /// </summary>
+       /// <param name="scannedSeries">A localized list of a series' parsed infos</param>
        /// <param name="info"></param>
-       private void TrackSeries(ParserInfo info)
+       private void TrackSeries(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
        {
            if (info.Series == string.Empty) return;

            // Check if normalized info.Series already exists and if so, update info to use that name instead
-           info.Series = MergeName(info);
+           info.Series = MergeName(scannedSeries, info);

            var normalizedSeries = Parser.Parser.Normalize(info.Series);
            var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort);
@@ -170,7 +150,7 @@ namespace API.Services.Tasks.Scanner

            try
            {
-               var existingKey = _scannedSeries.Keys.SingleOrDefault(ps =>
+               var existingKey = scannedSeries.Keys.SingleOrDefault(ps =>
                    ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
                                                 || ps.NormalizedName.Equals(normalizedLocalizedSeries)
                                                 || ps.NormalizedName.Equals(normalizedSortSeries)));
@@ -181,7 +161,7 @@ namespace API.Services.Tasks.Scanner
                    NormalizedName = normalizedSeries
                };

-               _scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
+               scannedSeries.AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
                {
                    oldValue ??= new List<ParserInfo>();
                    if (!oldValue.Contains(info))
@@ -195,7 +175,7 @@ namespace API.Services.Tasks.Scanner
            catch (Exception ex)
            {
                _logger.LogCritical(ex, "{SeriesName} matches against multiple series in the parsed series. This indicates a critical kavita issue. Key will be skipped", info.Series);
-               foreach (var seriesKey in _scannedSeries.Keys.Where(ps =>
+               foreach (var seriesKey in scannedSeries.Keys.Where(ps =>
                    ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries)
                                                 || ps.NormalizedName.Equals(normalizedLocalizedSeries)
                                                 || ps.NormalizedName.Equals(normalizedSortSeries))))
@@ -205,23 +185,24 @@ namespace API.Services.Tasks.Scanner
            }
        }

        /// <summary>
        /// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with
        /// the same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization.
        /// </summary>
        /// <param name="info"></param>
        /// <returns>Series Name to group this info into</returns>
-       public string MergeName(ParserInfo info)
+       public string MergeName(ConcurrentDictionary<ParsedSeries, List<ParserInfo>> scannedSeries, ParserInfo info)
        {
            var normalizedSeries = Parser.Parser.Normalize(info.Series);
            var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries);
            // We use FirstOrDefault because this was introduced late in development and users might have 2 series with both names

            try
            {
                var existingName =
-                   _scannedSeries.SingleOrDefault(p =>
-                       (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
-                        Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
+                   scannedSeries.SingleOrDefault(p =>
+                       (Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) ||
+                        Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) &&
                        p.Key.Format == info.Format)
                    .Key;

@@ -233,7 +214,7 @@ namespace API.Services.Tasks.Scanner
            catch (Exception ex)
            {
                _logger.LogCritical(ex, "Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath);
-               var values = _scannedSeries.Where(p =>
+               var values = scannedSeries.Where(p =>
                    (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries ||
                     Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) &&
                    p.Key.Format == info.Format);
@@ -247,34 +228,69 @@ namespace API.Services.Tasks.Scanner
            return info.Series;
        }
        /// <summary>
-       ///
+       /// This is a new version which will process series by folder groups.
        /// </summary>
-       /// <param name="libraryType">Type of library. Used for selecting the correct file extensions to search for and parsing files</param>
-       /// <param name="folders">The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders</param>
-       /// <param name="libraryName">Name of the Library</param>
+       /// <param name="libraryType"></param>
+       /// <param name="folders"></param>
+       /// <param name="libraryName"></param>
        /// <returns></returns>
-       public async Task<Dictionary<ParsedSeries, List<ParserInfo>>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable<string> folders, string libraryName)
+       public async Task ScanLibrariesForSeries(LibraryType libraryType,
+           IEnumerable<string> folders, string libraryName, bool isLibraryScan,
+           IDictionary<string, IList<SeriesModified>> seriesPaths, Action<Tuple<bool, IList<ParserInfo>>> processSeriesInfos, bool forceCheck = false)
        {
-           await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Started));
+           await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("Starting file scan", libraryName, ProgressEventType.Started));

            foreach (var folderPath in folders)
            {
                try
                {
-                   async void Action(string f)
-                   {
-                       try
-                       {
-                           ProcessFile(f, folderPath, libraryType);
-                           await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(f, libraryName, ProgressEventType.Updated));
-                       }
-                       catch (FileNotFoundException exception)
-                       {
-                           _logger.LogError(exception, "The file {Filename} could not be found", f);
-                       }
-                   }
-
-                   _directoryService.TraverseTreeParallelForEach(folderPath, Action, Parser.Parser.SupportedExtensions, _logger);
+                   await ProcessFiles(folderPath, isLibraryScan, seriesPaths, async (files, folder) =>
+                   {
+                       var normalizedFolder = Parser.Parser.NormalizePath(folder);
+                       if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck))
+                       {
+                           var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo()
+                           {
+                               Series = fp.SeriesName,
+                               Format = fp.Format,
+                           }).ToList();
+                           processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(true, parsedInfos));
+                           _logger.LogDebug("Skipped File Scan for {Folder} as it hasn't changed since last scan", folder);
+                           return;
+                       }
+
+                       _logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder);
+                       await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated));
+                       var scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
+                       var infos = files.Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)).Where(info => info != null).ToList();
+
+                       MergeLocalizedSeriesWithSeries(infos);
+
+                       foreach (var info in infos)
+                       {
+                           try
+                           {
+                               TrackSeries(scannedSeries, info);
+                           }
+                           catch (Exception ex)
+                           {
+                               _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath);
+                           }
+                       }
+
+                       // It would be really cool if we can emit an event when a folder hasn't been changed so we don't parse everything, but the first item to ensure we don't delete it
+                       // Otherwise, we can do a last step in the DB where we validate all files on disk exist and if not, delete them. (easy but slow)
+                       foreach (var series in scannedSeries.Keys)
+                       {
+                           if (scannedSeries[series].Count > 0 && processSeriesInfos != null)
+                           {
+                               processSeriesInfos.Invoke(new Tuple<bool, IList<ParserInfo>>(false, scannedSeries[series]));
+                           }
+                       }
+                   }, forceCheck);
                }
                catch (ArgumentException ex)
                {
@@ -282,20 +298,47 @@ namespace API.Services.Tasks.Scanner
                }
            }

-           await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Ended));
+           await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, libraryName, ProgressEventType.Ended));
|
||||
}
|
||||
|
||||
return SeriesWithInfos();
|
||||
private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary<string, IList<SeriesModified>> seriesPaths, string normalizedFolder, bool forceCheck = false)
|
||||
{
|
||||
if (forceCheck) return false;
|
||||
|
||||
return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerMinute) >=
|
||||
_directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerMinute));
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
/// Returns any series where there were parsed infos
|
||||
/// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so,
|
||||
/// rewrites the infos with series name instead of the localized name, so they stack.
|
||||
/// </summary>
|
||||
/// <returns></returns>
|
||||
private Dictionary<ParsedSeries, List<ParserInfo>> SeriesWithInfos()
|
||||
/// <example>
|
||||
/// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration"
|
||||
/// World of Acceleration v02.cbz has Series "World of Acceleration"
|
||||
/// After running this code, we'd have:
|
||||
/// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration"
|
||||
/// </example>
|
||||
/// <param name="infos">A collection of ParserInfos</param>
|
||||
private static void MergeLocalizedSeriesWithSeries(IReadOnlyCollection<ParserInfo> infos)
|
||||
{
|
||||
var filtered = _scannedSeries.Where(kvp => kvp.Value.Count > 0);
|
||||
var series = filtered.ToDictionary(v => v.Key, v => v.Value);
|
||||
return series;
|
||||
var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries));
|
||||
if (!hasLocalizedSeries) return;
|
||||
|
||||
var localizedSeries = infos.Select(i => i.LocalizedSeries).Distinct()
|
||||
.FirstOrDefault(i => !string.IsNullOrEmpty(i));
|
||||
if (string.IsNullOrEmpty(localizedSeries)) return;
|
||||
|
||||
var nonLocalizedSeries = infos.Select(i => i.Series).Distinct()
|
||||
.FirstOrDefault(series => !series.Equals(localizedSeries));
|
||||
|
||||
var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries);
|
||||
foreach (var infoNeedingMapping in infos.Where(i =>
|
||||
!Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries)))
|
||||
{
|
||||
infoNeedingMapping.Series = nonLocalizedSeries;
|
||||
infoNeedingMapping.LocalizedSeries = localizedSeries;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
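To make the localized-series merge above concrete, here is a minimal, self-contained C# sketch of the same rewrite. The Info type and the simplified Normalize are stand-ins for illustration only (the real Parser.Parser.Normalize does more than strip non-alphanumerics).

// Sketch only, not part of this PR: mirrors MergeLocalizedSeriesWithSeries above.
using System;
using System.Collections.Generic;
using System.Linq;

internal class Info // stand-in for API.Parser.ParserInfo
{
    public string Series = string.Empty;
    public string LocalizedSeries = string.Empty;
}

internal static class MergeDemo
{
    // Simplified stand-in for Parser.Parser.Normalize
    private static string Normalize(string s) =>
        new string(s.Where(char.IsLetterOrDigit).ToArray()).ToLowerInvariant();

    private static void MergeLocalizedSeriesWithSeries(IReadOnlyCollection<Info> infos)
    {
        if (!infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries))) return;

        var localizedSeries = infos.Select(i => i.LocalizedSeries).Distinct()
            .FirstOrDefault(i => !string.IsNullOrEmpty(i));
        if (string.IsNullOrEmpty(localizedSeries)) return;

        // The series name that is NOT the localized one wins
        var nonLocalizedSeries = infos.Select(i => i.Series).Distinct()
            .FirstOrDefault(series => !series.Equals(localizedSeries));

        var normalized = Normalize(nonLocalizedSeries);
        foreach (var info in infos.Where(i => !Normalize(i.Series).Equals(normalized)))
        {
            info.Series = nonLocalizedSeries;       // files now stack under one series
            info.LocalizedSeries = localizedSeries; // localized name is preserved
        }
    }

    private static void Main()
    {
        var infos = new List<Info>
        {
            new() { Series = "Accel World", LocalizedSeries = "World of Acceleration" },
            new() { Series = "World of Acceleration" }, // v02 named by the localized title
        };
        MergeLocalizedSeriesWithSeries(infos);
        Console.WriteLine(infos[1].Series); // Accel World
    }
}

After the merge, both files carry Series "Accel World" and so stack as one series, while the localized name is kept for future matching.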
API/Services/Tasks/Scanner/ProcessSeries.cs (new file, 776 lines)
@@ -0,0 +1,776 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Parser;
using API.Services.Tasks.Metadata;
using API.SignalR;
using Hangfire;
using Microsoft.Extensions.Logging;

namespace API.Services.Tasks.Scanner;

public interface IProcessSeries
{
    /// <summary>
    /// Do not allow this Prime to be invoked by multiple threads. It will break the DB.
    /// </summary>
    /// <returns></returns>
    Task Prime();
    Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library);
    void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false);
}

/// <summary>
/// All code needed to Update a Series from a Scan action
/// </summary>
public class ProcessSeries : IProcessSeries
{
    private readonly IUnitOfWork _unitOfWork;
    private readonly ILogger<ProcessSeries> _logger;
    private readonly IEventHub _eventHub;
    private readonly IDirectoryService _directoryService;
    private readonly ICacheHelper _cacheHelper;
    private readonly IReadingItemService _readingItemService;
    private readonly IFileService _fileService;
    private readonly IMetadataService _metadataService;
    private readonly IWordCountAnalyzerService _wordCountAnalyzerService;

    private IList<Genre> _genres;
    private IList<Person> _people;
    private IList<Tag> _tags;

    public ProcessSeries(IUnitOfWork unitOfWork, ILogger<ProcessSeries> logger, IEventHub eventHub,
        IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService,
        IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService)
    {
        _unitOfWork = unitOfWork;
        _logger = logger;
        _eventHub = eventHub;
        _directoryService = directoryService;
        _cacheHelper = cacheHelper;
        _readingItemService = readingItemService;
        _fileService = fileService;
        _metadataService = metadataService;
        _wordCountAnalyzerService = wordCountAnalyzerService;
    }

    /// <summary>
    /// Invoke this before processing any series, just once to prime all the needed data during a scan
    /// </summary>
    public async Task Prime()
    {
        _genres = await _unitOfWork.GenreRepository.GetAllGenresAsync();
        _people = await _unitOfWork.PersonRepository.GetAllPeople();
        _tags = await _unitOfWork.TagRepository.GetAllTagsAsync();
    }
    public async Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library)
    {
        if (!parsedInfos.Any()) return;

        var scanWatch = Stopwatch.StartNew();
        var seriesName = parsedInfos.First().Series;
        await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Updated, seriesName));
        _logger.LogInformation("[ScannerService] Beginning series update on {SeriesName}", seriesName);

        // Check if there is a Series
        var seriesAdded = false;
        var series = await _unitOfWork.SeriesRepository.GetFullSeriesByName(parsedInfos.First().Series, library.Id);
        if (series == null)
        {
            seriesAdded = true;
            series = DbFactory.Series(parsedInfos.First().Series);
        }
        if (series.LibraryId == 0) series.LibraryId = library.Id;

        try
        {
            _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName);

            UpdateVolumes(series, parsedInfos);
            series.Pages = series.Volumes.Sum(v => v.Pages);

            series.NormalizedName = Parser.Parser.Normalize(series.Name);
            series.OriginalName ??= parsedInfos[0].Series;
            if (series.Format == MangaFormat.Unknown)
            {
                series.Format = parsedInfos[0].Format;
            }

            if (string.IsNullOrEmpty(series.SortName))
            {
                series.SortName = series.Name;
            }
            if (!series.SortNameLocked)
            {
                series.SortName = series.Name;
                if (!string.IsNullOrEmpty(parsedInfos[0].SeriesSort))
                {
                    series.SortName = parsedInfos[0].SeriesSort;
                }
            }

            // parsedInfos[0] is not the first volume or chapter. We need to find it
            var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p));
            if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries))
            {
                series.LocalizedName = localizedSeries;
            }

            // Update series FolderPath here (TODO: Move this into its own private method)
            var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path), parsedInfos.Select(f => f.FullFilePath).ToList());
            if (seriesDirs.Keys.Count == 0)
            {
                _logger.LogCritical("Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are in a folder");
            }
            else
            {
                // Don't save FolderPath if it's a library Folder
                if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First()))
                {
                    series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First());
                }
            }

            series.Metadata ??= DbFactory.SeriesMetadata(new List<CollectionTag>());
            UpdateSeriesMetadata(series, library.Type);

            series.LastFolderScanned = DateTime.Now;
            _unitOfWork.SeriesRepository.Attach(series);

            try
            {
                await _unitOfWork.CommitAsync();
            }
            catch (Exception ex)
            {
                await _unitOfWork.RollbackAsync();
                _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB for series {@SeriesName}", series);

                await _eventHub.SendMessageAsync(MessageFactory.Error,
                    MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}",
                        string.Empty));
            }
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "[ScannerService] There was an exception updating series for {SeriesName}", series.Name);
        }

        if (seriesAdded)
        {
            await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded,
                MessageFactory.SeriesAddedEvent(series.Id, series.Name, series.LibraryId));
        }

        _logger.LogInformation("[ScannerService] Finished series update on {SeriesName} in {Milliseconds} ms", seriesName, scanWatch.ElapsedMilliseconds);
        EnqueuePostSeriesProcessTasks(series.LibraryId, series.Id, false);
    }
    public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false)
    {
        BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate));
        BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
    }

    private static void UpdateSeriesMetadata(Series series, LibraryType libraryType)
    {
        var isBook = libraryType == LibraryType.Book;
        var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook);

        var firstFile = firstChapter?.Files.FirstOrDefault();
        if (firstFile == null) return;
        if (Parser.Parser.IsPdf(firstFile.FilePath)) return;

        var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList();

        // Update Metadata based on Chapter metadata
        series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year);

        if (series.Metadata.ReleaseYear < 1000)
        {
            // Not a valid year, default to 0
            series.Metadata.ReleaseYear = 0;
        }

        // Set the AgeRating as highest in all the comicInfos
        if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating);

        series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount);
        series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count);
        // To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well.
        if (series.Metadata.MaxCount != series.Metadata.TotalCount)
        {
            var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name));
            var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range));
            if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume;
            else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter;
        }

        if (!series.Metadata.PublicationStatusLocked)
        {
            series.Metadata.PublicationStatus = PublicationStatus.OnGoing;
            if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0)
            {
                series.Metadata.PublicationStatus = PublicationStatus.Completed;
            } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0)
            {
                series.Metadata.PublicationStatus = PublicationStatus.Ended;
            }
        }

        if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked)
        {
            series.Metadata.Summary = firstChapter.Summary;
        }

        if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked)
        {
            series.Metadata.Language = firstChapter.Language;
        }

        // Handle People
        foreach (var chapter in chapters)
        {
            if (!series.Metadata.WriterLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Writer))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.CoverArtistLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.CoverArtist))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.PublisherLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Publisher))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.CharacterLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Character))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.ColoristLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Colorist))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.EditorLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Editor))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.InkerLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Inker))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.LettererLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Letterer))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.PencillerLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Penciller))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.TranslatorLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Translator))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.TagsLocked)
            {
                foreach (var tag in chapter.Tags)
                {
                    TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag);
                }
            }

            if (!series.Metadata.GenresLocked)
            {
                foreach (var genre in chapter.Genres)
                {
                    GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre);
                }
            }
        }

        // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it
        // I might be able to filter out people that are in locked fields?
        var people = chapters.SelectMany(c => c.People).ToList();
        PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People,
            people, person =>
            {
                switch (person.Role)
                {
                    case PersonRole.Writer:
                        if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Penciller:
                        if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Inker:
                        if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Colorist:
                        if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Letterer:
                        if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.CoverArtist:
                        if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Editor:
                        if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Publisher:
                        if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Character:
                        if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Translator:
                        if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person);
                        break;
                    default:
                        series.Metadata.People.Remove(person);
                        break;
                }
            });
    }
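As a quick worked example of the publication-status rules in UpdateSeriesMetadata above, this sketch replays the unlocked branch with sample counts; the local PublicationStatus enum simply mirrors the values used in the diff.

// Sketch only, not part of this PR: same decision logic as UpdateSeriesMetadata.
using System;

public enum PublicationStatus { OnGoing, Completed, Ended }

public static class PublicationStatusDemo
{
    public static PublicationStatus Derive(int maxCount, int totalCount)
    {
        var status = PublicationStatus.OnGoing;
        if (maxCount >= totalCount && totalCount > 0)
        {
            status = PublicationStatus.Completed; // e.g. all 12 of 12 issues are present
        }
        else if (totalCount > 0 && maxCount > 0)
        {
            status = PublicationStatus.Ended; // publisher finished, but the library isn't complete
        }
        return status;
    }

    public static void Main()
    {
        Console.WriteLine(Derive(12, 12)); // Completed
        Console.WriteLine(Derive(5, 12));  // Ended
        Console.WriteLine(Derive(0, 0));   // OnGoing
    }
}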
    private void UpdateVolumes(Series series, IList<ParserInfo> parsedInfos)
    {
        var startingVolumeCount = series.Volumes.Count;
        // Add new volumes and update chapters per volume
        var distinctVolumes = parsedInfos.DistinctVolumes();
        _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name);
        foreach (var volumeNumber in distinctVolumes)
        {
            var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber);
            if (volume == null)
            {
                volume = DbFactory.Volume(volumeNumber);
                volume.SeriesId = series.Id;
                series.Volumes.Add(volume);
                _unitOfWork.VolumeRepository.Add(volume);
            }

            volume.Name = volumeNumber;

            _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name);
            var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray();
            UpdateChapters(series, volume, infos);
            volume.Pages = volume.Chapters.Sum(c => c.Pages);

            // Update all the metadata on the Chapters
            foreach (var chapter in volume.Chapters)
            {
                var firstFile = chapter.Files.MinBy(x => x.Chapter);
                if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue;
                try
                {
                    var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath));
                    UpdateChapterFromComicInfo(chapter, firstChapterInfo?.ComicInfo);
                }
                catch (Exception ex)
                {
                    _logger.LogError(ex, "There was some issue when updating chapter's metadata");
                }
            }
        }

        // Remove existing volumes that aren't in parsedInfos
        var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList();
        if (series.Volumes.Count != nonDeletedVolumes.Count)
        {
            _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not mapping with volume name",
                (series.Volumes.Count - nonDeletedVolumes.Count), series.Name);
            var deletedVolumes = series.Volumes.Except(nonDeletedVolumes);
            foreach (var volume in deletedVolumes)
            {
                var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? "";
                if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file))
                {
                    _logger.LogError(
                        "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}",
                        file);
                }

                _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file);
            }

            series.Volumes = nonDeletedVolumes;
        }

        _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}",
            series.Name, startingVolumeCount, series.Volumes.Count);
    }
    private void UpdateChapters(Series series, Volume volume, IList<ParserInfo> parsedInfos)
    {
        // Add new chapters
        foreach (var info in parsedInfos)
        {
            // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0
            // also are treated like specials for UI grouping.
            Chapter chapter;
            try
            {
                chapter = volume.Chapters.GetChapterByRange(info);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters);
                continue;
            }

            if (chapter == null)
            {
                _logger.LogDebug(
                    "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters);
                chapter = DbFactory.Chapter(info);
                volume.Chapters.Add(chapter);
                series.LastChapterAdded = DateTime.Now;
            }
            else
            {
                chapter.UpdateFrom(info);
            }

            if (chapter == null) continue;
            // Add files
            var specialTreatment = info.IsSpecialInfo();
            AddOrUpdateFileForChapter(chapter, info);
            chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty;
            chapter.Range = specialTreatment ? info.Filename : info.Chapters;
        }

        // Remove chapters that aren't in parsedInfos or have no files linked
        var existingChapters = volume.Chapters.ToList();
        foreach (var existingChapter in existingChapters)
        {
            if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter))
            {
                _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series);
                volume.Chapters.Remove(existingChapter);
            }
            else
            {
                // Ensure we remove any files that no longer exist AND order
                existingChapter.Files = existingChapter.Files
                    .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath))
                    .OrderByNatural(f => f.FilePath).ToList();
                existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages);
            }
        }
    }
    private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info)
    {
        chapter.Files ??= new List<MangaFile>();
        var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath);
        if (existingFile != null)
        {
            existingFile.Format = info.Format;
            if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return;
            existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format);
            // We skip updating DB here with last modified time so that metadata refresh can do it
        }
        else
        {
            var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format));
            if (file == null) return;

            chapter.Files.Add(file);
        }
    }
#nullable enable
    private void UpdateChapterFromComicInfo(Chapter chapter, ComicInfo? info)
    {
        var firstFile = chapter.Files.MinBy(x => x.Chapter);
        if (firstFile == null ||
            _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return;

        var comicInfo = info;
        if (info == null)
        {
            comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath);
        }

        if (comicInfo == null) return;
        _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath);

        chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating);

        if (!string.IsNullOrEmpty(comicInfo.Title))
        {
            chapter.TitleName = comicInfo.Title.Trim();
        }

        if (!string.IsNullOrEmpty(comicInfo.Summary))
        {
            chapter.Summary = comicInfo.Summary;
        }

        if (!string.IsNullOrEmpty(comicInfo.LanguageISO))
        {
            chapter.Language = comicInfo.LanguageISO;
        }

        if (comicInfo.Count > 0)
        {
            chapter.TotalCount = comicInfo.Count;
        }

        // This needs to check against both Number and Volume to calculate Count
        if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0)
        {
            chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number));
        }
        if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0)
        {
            chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume)));
        }

        void AddPerson(Person person)
        {
            PersonHelper.AddPersonIfNotExists(chapter.People, person);
        }

        void AddGenre(Genre genre)
        {
            //chapter.Genres.Add(genre);
            GenreHelper.AddGenreIfNotExists(chapter.Genres, genre);
        }

        void AddTag(Tag tag, bool added)
        {
            //chapter.Tags.Add(tag);
            TagHelper.AddTagIfNotExists(chapter.Tags, tag);
        }

        if (comicInfo.Year > 0)
        {
            var day = Math.Max(comicInfo.Day, 1);
            var month = Math.Max(comicInfo.Month, 1);
            chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}");
        }

        var people = GetTagValues(comicInfo.Colorist);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist);
        UpdatePeople(people, PersonRole.Colorist, AddPerson);

        people = GetTagValues(comicInfo.Characters);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character);
        UpdatePeople(people, PersonRole.Character, AddPerson);

        people = GetTagValues(comicInfo.Translator);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator);
        UpdatePeople(people, PersonRole.Translator, AddPerson);

        people = GetTagValues(comicInfo.Writer);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer);
        UpdatePeople(people, PersonRole.Writer, AddPerson);

        people = GetTagValues(comicInfo.Editor);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor);
        UpdatePeople(people, PersonRole.Editor, AddPerson);

        people = GetTagValues(comicInfo.Inker);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker);
        UpdatePeople(people, PersonRole.Inker, AddPerson);

        people = GetTagValues(comicInfo.Letterer);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer);
        UpdatePeople(people, PersonRole.Letterer, AddPerson);

        people = GetTagValues(comicInfo.Penciller);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller);
        UpdatePeople(people, PersonRole.Penciller, AddPerson);

        people = GetTagValues(comicInfo.CoverArtist);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist);
        UpdatePeople(people, PersonRole.CoverArtist, AddPerson);

        people = GetTagValues(comicInfo.Publisher);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher);
        UpdatePeople(people, PersonRole.Publisher, AddPerson);

        var genres = GetTagValues(comicInfo.Genre);
        GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList());
        UpdateGenre(genres, false, AddGenre);

        var tags = GetTagValues(comicInfo.Tags);
        TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList());
        UpdateTag(tags, false, AddTag);
    }

    private static IList<string> GetTagValues(string comicInfoTagSeparatedByComma)
    {
        if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma))
        {
            return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList();
        }
        return ImmutableList<string>.Empty;
    }
#nullable disable
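The ComicInfo people fields handled above are plain comma-separated strings; this small sketch shows the exact split-and-trim behavior of GetTagValues with a made-up Writer value.

// Sketch only, not part of this PR: same splitting rule as GetTagValues above.
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;

public static class TagValuesDemo
{
    private static IList<string> GetTagValues(string comicInfoTagSeparatedByComma)
    {
        if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma))
        {
            return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList();
        }
        return ImmutableList<string>.Empty;
    }

    public static void Main()
    {
        // Hypothetical ComicInfo Writer field; note whitespace around names is trimmed
        var writers = GetTagValues("Reki Kawahara, HIMA ,  abec");
        Console.WriteLine(string.Join("|", writers)); // Reki Kawahara|HIMA|abec
    }
}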
    /// <summary>
    /// Given a list of all existing people, this will check the new names and roles and if it doesn't exist in allPeople, will create and
    /// add an entry. For each person in name, the callback will be executed.
    /// </summary>
    /// <remarks>This does not remove people if an empty list is passed into names</remarks>
    /// <remarks>This is used to add new people to a list without worrying about duplicating rows in the DB</remarks>
    /// <param name="names"></param>
    /// <param name="role"></param>
    /// <param name="action"></param>
    private void UpdatePeople(IEnumerable<string> names, PersonRole role, Action<Person> action)
    {
        var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList();

        foreach (var name in names)
        {
            var normalizedName = Parser.Parser.Normalize(name);
            var person = allPeopleTypeRole.FirstOrDefault(p =>
                p.NormalizedName.Equals(normalizedName));
            if (person == null)
            {
                person = DbFactory.Person(name, role);
                lock (_people)
                {
                    _people.Add(person);
                }
            }

            action(person);
        }
    }

    /// <summary>
    ///
    /// </summary>
    /// <param name="names"></param>
    /// <param name="isExternal"></param>
    /// <param name="action"></param>
    private void UpdateGenre(IEnumerable<string> names, bool isExternal, Action<Genre> action)
    {
        foreach (var name in names)
        {
            if (string.IsNullOrEmpty(name.Trim())) continue;

            var normalizedName = Parser.Parser.Normalize(name);
            var genre = _genres.FirstOrDefault(p =>
                p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
            if (genre == null)
            {
                genre = DbFactory.Genre(name, false);
                lock (_genres)
                {
                    _genres.Add(genre);
                }
            }

            action(genre);
        }
    }

    /// <summary>
    ///
    /// </summary>
    /// <param name="names"></param>
    /// <param name="isExternal"></param>
    /// <param name="action">Callback for every item. Will give said item back and a bool if item was added</param>
    private void UpdateTag(IEnumerable<string> names, bool isExternal, Action<Tag, bool> action)
    {
        foreach (var name in names)
        {
            if (string.IsNullOrEmpty(name.Trim())) continue;

            var added = false;
            var normalizedName = Parser.Parser.Normalize(name);

            var tag = _tags.FirstOrDefault(p =>
                p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
            if (tag == null)
            {
                added = true;
                tag = DbFactory.Tag(name, false);
                lock (_tags)
                {
                    _tags.Add(tag);
                }
            }

            action(tag, added);
        }
    }
}
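How a scan is expected to drive IProcessSeries, as a sketch inferred from the interface above and the warning on Prime; the ScanOrchestrationExample type and the way series batches are obtained are assumptions, not code from this PR.

// Sketch only, not part of this PR.
using System.Collections.Generic;
using System.Threading.Tasks;
using API.Entities;
using API.Parser;

namespace API.Services.Tasks.Scanner;

public class ScanOrchestrationExample
{
    private readonly IProcessSeries _processSeries;

    public ScanOrchestrationExample(IProcessSeries processSeries)
    {
        _processSeries = processSeries;
    }

    public async Task ScanAsync(Library library, IEnumerable<IList<ParserInfo>> seriesBatches)
    {
        // Prime once per scan, from a single thread (see the warning on IProcessSeries.Prime)
        await _processSeries.Prime();

        foreach (var batch in seriesBatches)
        {
            // Each batch holds all ParserInfos for one series; ProcessSeriesAsync
            // updates volumes/chapters/metadata and enqueues cover + word-count jobs itself
            await _processSeries.ProcessSeriesAsync(batch, library);
        }
    }
}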
API/Services/Tasks/Scanner/ScanLibrary.cs (new file, 111 lines)
@@ -0,0 +1,111 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Helpers;
using API.Parser;
using Kavita.Common.Helpers;
using Microsoft.Extensions.Logging;

namespace API.Services.Tasks.Scanner;

/// <summary>
/// This is responsible for scanning and updating a Library
/// </summary>
public class ScanLibrary
{
    private readonly IDirectoryService _directoryService;
    private readonly IUnitOfWork _unitOfWork;
    private readonly ILogger _logger;

    public ScanLibrary(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger logger)
    {
        _directoryService = directoryService;
        _unitOfWork = unitOfWork;
        _logger = logger;
    }

    // public Task UpdateLibrary(Library library)
    // {
    //
    //
    // }

    /// <summary>
    /// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list.
    /// </summary>
    /// <param name="parsedSeries"></param>
    /// <param name="series"></param>
    /// <returns></returns>
    public static IList<ParserInfo> GetInfosByName(Dictionary<ParsedSeries, List<ParserInfo>> parsedSeries, Series series)
    {
        var allKeys = parsedSeries.Keys.Where(ps =>
            SeriesHelper.FindSeries(series, ps));

        var infos = new List<ParserInfo>();
        foreach (var key in allKeys)
        {
            infos.AddRange(parsedSeries[key]);
        }

        return infos;
    }

    /// <summary>
    /// This will Scan all files in a folder path. For each folder within the folderPath, FolderAction will be invoked for all files contained
    /// </summary>
    /// <param name="folderPath">A library folder or series folder</param>
    /// <param name="folderAction">A callback async Task to be called once all files for each folder path are found</param>
    public async Task ProcessFiles(string folderPath, bool isLibraryFolder, Func<IEnumerable<string>, string, Task> folderAction)
    {
        if (isLibraryFolder)
        {
            var directories = _directoryService.GetDirectories(folderPath).ToList();

            foreach (var directory in directories)
            {
                // For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication
                await folderAction(_directoryService.ScanFiles(directory), directory);
            }
        }
        else
        {
            //folderAction(ScanFiles(folderPath));
            await folderAction(_directoryService.ScanFiles(folderPath), folderPath);
        }
    }

    private GlobMatcher CreateIgnoreMatcher(string ignoreFile)
    {
        if (!_directoryService.FileSystem.File.Exists(ignoreFile))
        {
            return null;
        }

        // Read file in and add each line to Matcher
        var lines = _directoryService.FileSystem.File.ReadAllLines(ignoreFile);
        if (lines.Length == 0)
        {
            _logger.LogError("Kavita Ignore file found but empty, ignoring: {IgnoreFile}", ignoreFile);
            return null;
        }

        GlobMatcher matcher = new();
        foreach (var line in lines)
        {
            matcher.AddExclude(line);
        }

        return matcher;
    }
}
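A sketch of invoking ScanLibrary.ProcessFiles above; the "/manga" path and the callback body are illustrative, and the dependency wiring is assumed to come from DI.

// Sketch only, not part of this PR.
using System.Linq;
using System.Threading.Tasks;
using API.Services.Tasks.Scanner;
using Microsoft.Extensions.Logging;

public static class ProcessFilesExample
{
    public static async Task RunAsync(ScanLibrary scanner, ILogger logger)
    {
        // For a library folder the callback fires once per top-level directory;
        // for a series folder (isLibraryFolder: false) it fires once for the folder itself.
        await scanner.ProcessFiles("/manga", true, async (files, folder) =>
        {
            logger.LogInformation("Found {Count} files under {Folder}", files.Count(), folder);
            await Task.CompletedTask; // real code would parse the files here
        });
    }
}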
(File diff suppressed because it is too large)
@@ -108,7 +108,10 @@ namespace API.SignalR
        /// When files are being scanned to calculate word count
        /// </summary>
        private const string WordCountAnalyzerProgress = "WordCountAnalyzerProgress";
+       /// <summary>
+       /// A generic message that can occur in background processing to inform user, but no direct action is needed
+       /// </summary>
+       public const string Info = "Info";

        public static SignalRMessage ScanSeriesEvent(int libraryId, int seriesId, string seriesName)
@@ -261,9 +264,7 @@ namespace API.SignalR
            };
        }

-       /**
-        * A generic error that will show on events widget in the UI
-        */
        public static SignalRMessage ErrorEvent(string title, string subtitle)
        {
            return new SignalRMessage
@@ -281,6 +282,23 @@ namespace API.SignalR
            };
        }

+       public static SignalRMessage InfoEvent(string title, string subtitle)
+       {
+           return new SignalRMessage
+           {
+               Name = Info,
+               Title = title,
+               SubTitle = subtitle,
+               Progress = ProgressType.None,
+               EventType = ProgressEventType.Single,
+               Body = new
+               {
+                   Title = title,
+                   SubTitle = subtitle,
+               }
+           };
+       }

        public static SignalRMessage LibraryModifiedEvent(int libraryId, string action)
        {
            return new SignalRMessage
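A sketch of raising the new Info event from a backend service, combining MessageFactory.InfoEvent above with the IEventHub.SendMessageAsync pattern used elsewhere in this diff; the surrounding class and the message text are assumptions.

// Sketch only, not part of this PR.
using System.Threading.Tasks;
using API.SignalR;

public class InfoEventExample
{
    private readonly IEventHub _eventHub;

    public InfoEventExample(IEventHub eventHub)
    {
        _eventHub = eventHub;
    }

    public async Task NotifyNothingToScanAsync(string libraryName)
    {
        // Surfaces in the events widget as an info entry (see the UI changes below)
        await _eventHub.SendMessageAsync(MessageFactory.Info,
            MessageFactory.InfoEvent($"Scan of {libraryName} skipped",
                "No folder changes detected since the last scan"));
    }
}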
@@ -152,8 +152,10 @@ namespace API
                .UseMemoryStorage());

            // Add the processing server as IHostedService
-           services.AddHangfireServer();
+           services.AddHangfireServer(options =>
+           {
+               options.Queues = new[] {TaskScheduler.ScanQueue, TaskScheduler.DefaultQueue};
+           });
            // Add IHostedService for startup tasks
            // Any services that should be bootstrapped go here
            services.AddHostedService<StartupTasksHostedService>();
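A sketch of routing a job onto the dedicated scan queue registered above. Hangfire's QueueAttribute is a real API, but the literal "scan" queue name and the example job are assumptions (the actual value lives in TaskScheduler.ScanQueue).

// Sketch only, not part of this PR.
using Hangfire;

public class ScanQueueExample
{
    // Hangfire picks the queue from the attribute on the invoked method;
    // the server registered in Startup now listens on both the scan and default queues.
    [Queue("scan")] // assumed to match TaskScheduler.ScanQueue
    public void ScanLibrary(int libraryId)
    {
        // ... scan work ...
    }

    public static void Enqueue(int libraryId)
    {
        BackgroundJob.Enqueue<ScanQueueExample>(s => s.ScanLibrary(libraryId));
    }
}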
@@ -5,7 +5,7 @@
  "TokenKey": "super secret unguessable key",
  "Logging": {
    "LogLevel": {
-     "Default": "Critical",
+     "Default": "Debug",
      "Microsoft": "Information",
      "Microsoft.Hosting.Lifetime": "Error",
      "Hangfire": "Information",
Kavita.Common/Helpers/GlobMatcher.cs (new file, 64 lines)
@@ -0,0 +1,64 @@
using System.Collections.Generic;
using System.Linq;
using DotNet.Globbing;

namespace Kavita.Common.Helpers;

/**
 * Matches against strings using Glob syntax
 */
public class GlobMatcher
{
    private readonly IList<Glob> _includes = new List<Glob>();
    private readonly IList<Glob> _excludes = new List<Glob>();

    public void AddInclude(string pattern)
    {
        _includes.Add(Glob.Parse(pattern));
    }

    public void AddExclude(string pattern)
    {
        _excludes.Add(Glob.Parse(pattern));
    }

    public bool ExcludeMatches(string file)
    {
        // NOTE: Glob.IsMatch() returns the opposite of what you'd expect
        return _excludes.Any(p => p.IsMatch(file));
    }

    /// <summary>
    ///
    /// </summary>
    /// <param name="file"></param>
    /// <param name="mustMatchIncludes"></param>
    /// <returns>True if any</returns>
    public bool IsMatch(string file, bool mustMatchIncludes = false)
    {
        // NOTE: Glob.IsMatch() returns the opposite of what you'd expect
        if (_excludes.Any(p => p.IsMatch(file))) return true;
        if (mustMatchIncludes)
        {
            return _includes.Any(p => p.IsMatch(file));
        }

        return false;
    }

    public void Merge(GlobMatcher matcher)
    {
        if (matcher == null) return;
        foreach (var glob in matcher._excludes)
        {
            _excludes.Add(glob);
        }

        foreach (var glob in matcher._includes)
        {
            _includes.Add(glob);
        }
    }
}
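A usage sketch for GlobMatcher, in the spirit of the .kavitaignore support described in the commit notes; the ignore patterns and file paths are made up.

// Sketch only, not part of this PR.
using System;
using Kavita.Common.Helpers;

public static class GlobMatcherExample
{
    public static void Main()
    {
        // Patterns as they might appear in a .kavitaignore file
        var matcher = new GlobMatcher();
        matcher.AddExclude("**/covers/*");
        matcher.AddExclude("*.tmp");

        // Nested ignore files stack by merging matchers (see GlobMatcher.Merge above)
        var nested = new GlobMatcher();
        nested.AddExclude("drafts/**");
        matcher.Merge(nested);

        Console.WriteLine(matcher.ExcludeMatches("series/covers/page01.jpg")); // True
        Console.WriteLine(matcher.ExcludeMatches("series/vol01.cbz"));         // False
    }
}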
@@ -9,6 +9,7 @@
  </PropertyGroup>

  <ItemGroup>
+   <PackageReference Include="DotNet.Glob" Version="3.1.3" />
    <PackageReference Include="Flurl.Http" Version="3.2.4" />
    <PackageReference Include="Microsoft.Extensions.Configuration.Abstractions" Version="6.0.0" />
    <PackageReference Include="Microsoft.Extensions.Hosting" Version="6.0.1" />
UI/Web/src/app/_models/events/info-event.ts (new file, 32 lines)
@@ -0,0 +1,32 @@
import { EVENTS } from "src/app/_services/message-hub.service";

export interface InfoEvent {
  /**
   * Payload of the event subtype
   */
  body: any;
  /**
   * Subtype event
   */
  name: EVENTS.Info;
  /**
   * Title to display in events widget
   */
  title: string;
  /**
   * Optional subtitle to display. Defaults to empty string
   */
  subTitle: string;
  /**
   * Type of event. Helps events widget to understand how to handle said event
   */
  eventType: 'single';
  /**
   * Type of progress. Helps widget understand how to display spinner
   */
  progress: 'none';
  /**
   * When event was sent
   */
  eventTime: string;
}
@@ -55,4 +55,8 @@ export interface Series {
  minHoursToRead: number;
  maxHoursToRead: number;
  avgHoursToRead: number;
+ /**
+  * Highest level folder containing this series
+  */
+ folderPath: string;
}
@@ -71,7 +71,11 @@ export enum EVENTS {
  /**
   * When files are being scanned to calculate word count
   */
- WordCountAnalyzerProgress = 'WordCountAnalyzerProgress'
+ WordCountAnalyzerProgress = 'WordCountAnalyzerProgress',
+ /**
+  * When the user needs to be informed, but it's not a big deal
+  */
+ Info = 'Info',
}

export interface Message<T> {
@@ -217,6 +221,13 @@ export class MessageHubService {
      });
    });

+   this.hubConnection.on(EVENTS.Info, resp => {
+     this.messagesSource.next({
+       event: EVENTS.Info,
+       payload: resp.body
+     });
+   });

    this.hubConnection.on(EVENTS.SeriesAdded, resp => {
      this.messagesSource.next({
        event: EVENTS.SeriesAdded,
@@ -345,10 +345,12 @@
        </div>
        <div class="row g-0 mb-2">
            <div class="col-md-6" >Created: {{series.created | date:'shortDate'}}</div>
-           <div class="col-md-6">Last Read: {{series.latestReadDate | date:'shortDate'}}</div>
-           <div class="col-md-6">Last Added To: {{series.lastChapterAdded | date:'shortDate'}}</div>
+           <div class="col-md-6">Last Read: {{series.latestReadDate | date:'shortDate' | defaultDate}}</div>
+           <div class="col-md-6">Last Added To: {{series.lastChapterAdded | date:'shortDate' | defaultDate}}</div>
+           <div class="col-md-6">Folder Path: {{series.folderPath | defaultValue}}</div>
        </div>
        <div class="row g-0 mb-2" *ngIf="metadata">
            <!-- TODO: Put tooltips in here to explain to the user what these are (ComicInfo tags) -->
            <div class="col-md-6">Max Items: {{metadata.maxCount}}</div>
            <div class="col-md-6">Total Items: {{metadata.totalCount}}</div>
            <div class="col-md-6">Publication Status: {{metadata.publicationStatus | publicationStatus}}</div>
@@ -1,8 +1,8 @@
import { CdkVirtualScrollViewport } from '@angular/cdk/scrolling';
import { DOCUMENT } from '@angular/common';
-import { AfterContentInit, AfterViewInit, ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core';
+import { AfterViewInit, ChangeDetectionStrategy, ChangeDetectorRef, Component, ContentChild, ElementRef, EventEmitter, HostListener, Inject, Input, OnChanges, OnDestroy, OnInit, Output, TemplateRef, TrackByFunction, ViewChild } from '@angular/core';
import { VirtualScrollerComponent } from '@iharbeck/ngx-virtual-scroller';
-import { first, Subject, takeUntil, takeWhile } from 'rxjs';
+import { Subject } from 'rxjs';
import { FilterSettings } from 'src/app/metadata-filter/filter-settings';
import { Breakpoint, UtilityService } from 'src/app/shared/_services/utility.service';
import { JumpKey } from 'src/app/_models/jumpbar/jump-key';
@@ -77,6 +77,7 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges,
    private jumpbarService: JumpbarService) {
    this.filter = this.seriesService.createSeriesFilter();
+   this.changeDetectionRef.markForCheck();
  }

  @HostListener('window:resize', ['$event'])
@@ -108,10 +109,11 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges,
        this.virtualScroller.refresh();
      });
    }

  }

  ngAfterViewInit(): void {
    // NOTE: I can't seem to figure out a way to resume the JumpKey with the scroller.
    // this.virtualScroller.vsUpdate.pipe(takeWhile(() => this.hasResumedJumpKey), takeUntil(this.onDestory)).subscribe(() => {
    //   const resumeKey = this.jumpbarService.getResumeKey(this.header);
    //   console.log('Resume key:', resumeKey);
@@ -130,7 +132,6 @@ export class CardDetailLayoutComponent implements OnInit, OnDestroy, OnChanges,
  ngOnChanges(): void {
    this.jumpBarKeysToRender = [...this.jumpBarKeys];
    this.resizeJumpBar();
  }
@@ -79,7 +79,8 @@ export class DashboardComponent implements OnInit, OnDestroy {
    );

    this.loadRecentlyAdded$.pipe(debounceTime(1000), takeUntil(this.onDestroy)).subscribe(() => {
-     this.loadRecentlyAdded();
+     this.loadRecentlyUpdated();
+     this.loadRecentlyAddedSeries();
      this.cdRef.markForCheck();
    });
  }
@@ -104,7 +105,7 @@ export class DashboardComponent implements OnInit, OnDestroy {

  reloadSeries() {
    this.loadOnDeck();
-   this.loadRecentlyAdded();
+   this.loadRecentlyUpdated();
    this.loadRecentlyAddedSeries();
  }
@@ -144,7 +145,7 @@ export class DashboardComponent implements OnInit, OnDestroy {
  }

- loadRecentlyAdded() {
+ loadRecentlyUpdated() {
    let api = this.seriesService.getRecentlyUpdatedSeries();
    if (this.libraryId > 0) {
      api = this.seriesService.getRecentlyUpdatedSeries();
@@ -26,6 +26,7 @@
    [trackByIdentity]="trackByIdentity"
    [filterOpen]="filterOpen"
    [jumpBarKeys]="jumpKeys"
+   [refresh]="refresh"
    (applyFilter)="updateFilter($event)"
>
    <ng-template #cardItem let-item let-position="idx">
@@ -41,6 +41,7 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
  filterOpen: EventEmitter<boolean> = new EventEmitter();
  filterActive: boolean = false;
  filterActiveCheck!: SeriesFilter;
+ refresh: EventEmitter<void> = new EventEmitter();

  jumpKeys: Array<JumpKey> = [];

@@ -141,15 +142,38 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
  }

  ngOnInit(): void {
-   this.hubService.messages$.pipe(debounceTime(6000), takeUntil(this.onDestroy)).subscribe((event) => {
+   this.hubService.messages$.pipe(takeUntil(this.onDestroy)).subscribe((event) => {
      if (event.event === EVENTS.SeriesAdded) {
        const seriesAdded = event.payload as SeriesAddedEvent;
        if (seriesAdded.libraryId !== this.libraryId) return;
-       this.loadPage();
+       if (!this.utilityService.deepEqual(this.filter, this.filterActiveCheck)) {
+         this.loadPage();
+         return;
+       }
+       this.seriesService.getSeries(seriesAdded.seriesId).subscribe(s => {
+         this.series = [...this.series, s].sort((s1: Series, s2: Series) => {
+           if (s1.sortName < s2.sortName) return -1;
+           if (s1.sortName > s2.sortName) return 1;
+           return 0;
+         });
+         this.pagination.totalItems++;
+         this.cdRef.markForCheck();
+         this.refresh.emit();
+       });

      } else if (event.event === EVENTS.SeriesRemoved) {
        const seriesRemoved = event.payload as SeriesRemovedEvent;
        if (seriesRemoved.libraryId !== this.libraryId) return;
-       this.loadPage();
+       if (!this.utilityService.deepEqual(this.filter, this.filterActiveCheck)) {
+         this.loadPage();
+         return;
+       }

+       this.series = this.series.filter(s => s.id != seriesRemoved.seriesId);
+       this.pagination.totalItems--;
+       this.cdRef.markForCheck();
+       this.refresh.emit();
      }
    });
  }
@@ -228,5 +252,5 @@ export class LibraryDetailComponent implements OnInit, OnDestroy {
    this.router.navigate(['library', this.libraryId, 'series', series.id]);
  }

- trackByIdentity = (index: number, item: Series) => `${item.name}_${item.localizedName}_${item.pagesRead}`;
+ trackByIdentity = (index: number, item: Series) => `${item.id}_${item.name}_${item.localizedName}_${item.pagesRead}`;
}
@@ -486,11 +486,6 @@ export class MangaReaderComponent implements OnInit, AfterViewInit, OnDestroy {

    this.updateForm();

-   this.generalSettingsForm.get('darkness')?.valueChanges.pipe(takeUntil(this.onDestroy)).subscribe(val => {
-     console.log('brightness: ', val);
-     //this.cdRef.markForCheck();
-   });

    this.generalSettingsForm.get('layoutMode')?.valueChanges.pipe(takeUntil(this.onDestroy)).subscribe(val => {

      const changeOccurred = parseInt(val, 10) !== this.layoutMode;
|
||||
<ng-container *ngIf="isAdmin$ | async">
|
||||
<ng-container *ngIf="downloadService.activeDownloads$ | async as activeDownloads">
|
||||
<ng-container *ngIf="errors$ | async as errors">
|
||||
<button type="button" class="btn btn-icon" [ngClass]="{'colored': activeEvents > 0 || activeDownloads.length > 0, 'colored-error': errors.length > 0}"
|
||||
[ngbPopover]="popContent" title="Activity" placement="bottom" [popoverClass]="'nav-events'" [autoClose]="'outside'">
|
||||
<i aria-hidden="true" class="fa fa-wave-square nav"></i>
|
||||
</button>
|
||||
<ng-container *ngIf="infos$ | async as infos">
|
||||
<button type="button" class="btn btn-icon" [ngClass]="{'colored': activeEvents > 0 || activeDownloads.length > 0, 'colored-error': errors.length > 0,
|
||||
'colored-info': infos.length > 0 && errors.length === 0}"
|
||||
[ngbPopover]="popContent" title="Activity" placement="bottom" [popoverClass]="'nav-events'" [autoClose]="'outside'">
|
||||
<i aria-hidden="true" class="fa fa-wave-square nav"></i>
|
||||
</button>
|
||||
</ng-container>
|
||||
</ng-container>
|
||||
</ng-container>
|
||||
|
||||
|
||||
<ng-template #popContent>
|
||||
<ul class="list-group list-group-flush dark-menu">
|
||||
|
||||
<ul class="list-group list-group-flush dark-menu">
|
||||
<ng-container *ngIf="errors$ | async as errors">
|
||||
<ng-container *ngIf="infos$ | async as infos">
|
||||
<li class="list-group-item dark-menu-item clickable" *ngIf="errors.length > 0 || infos.length > 0" (click)="clearAllErrorOrInfos()">
|
||||
Dismiss All
|
||||
</li>
|
||||
</ng-container>
|
||||
</ng-container>
|
||||
<ng-container *ngIf="debugMode">
|
||||
<li class="list-group-item dark-menu-item">
|
||||
<div class="h6 mb-1">Title goes here</div>
|
||||
@ -46,6 +56,13 @@
|
||||
</div>
|
||||
<button type="button" class="btn-close float-end" aria-label="close" ></button>
|
||||
</li>
|
||||
<li class="list-group-item dark-menu-item info">
|
||||
<div>
|
||||
<div class="h6 mb-1"><i class="fa-solid fa-circle-info me-2"></i>Scan didn't run becasuse nothing to do</div>
|
||||
<div class="accent-text mb-1">Click for more information</div>
|
||||
</div>
|
||||
<button type="button" class="btn-close float-end" aria-label="close" ></button>
|
||||
</li>
|
||||
<li class="list-group-item dark-menu-item">
|
||||
<div class="d-inline-flex">
|
||||
<span class="download">
|
||||
@ -59,6 +76,7 @@
|
||||
<div class="accent-text">PDFs</div>
|
||||
</li>
|
||||
</ng-container>
|
||||
|
||||
<!-- Progress Events-->
|
||||
<ng-container *ngIf="progressEvents$ | async as progressUpdates">
|
||||
<ng-container *ngFor="let message of progressUpdates">
|
||||
@ -119,12 +137,25 @@
|
||||
<!-- Errors -->
|
||||
<ng-container *ngIf="errors$ | async as errors">
|
||||
<ng-container *ngFor="let error of errors">
|
||||
<li class="list-group-item dark-menu-item error" role="alert" (click)="seeMoreError(error)">
|
||||
<li class="list-group-item dark-menu-item error" role="alert" (click)="seeMore(error)">
|
||||
<div>
|
||||
<div class="h6 mb-1"><i class="fa-solid fa-triangle-exclamation me-2"></i>{{error.title}}</div>
|
||||
<div class="accent-text mb-1">Click for more information</div>
|
||||
</div>
|
||||
<button type="button" class="btn-close float-end" aria-label="close" (click)="removeError(error, $event)"></button>
|
||||
<button type="button" class="btn-close float-end" aria-label="close" (click)="removeErrorOrInfo(error, $event)"></button>
|
||||
</li>
|
||||
</ng-container>
|
||||
</ng-container>
|
||||
|
||||
<!-- Infos -->
|
||||
<ng-container *ngIf="infos$ | async as infos">
|
||||
<ng-container *ngFor="let info of infos">
|
||||
<li class="list-group-item dark-menu-item info" role="alert" (click)="seeMore(info)">
|
||||
<div>
|
||||
<div class="h6 mb-1"><i class="fa-solid fa-circle-info me-2"></i>{{info.title}}</div>
|
||||
<div class="accent-text mb-1">Click for more information</div>
|
||||
</div>
|
||||
<button type="button" class="btn-close float-end" aria-label="close" (click)="removeErrorOrInfo(info, $event)"></button>
|
||||
</li>
|
||||
</ng-container>
|
||||
</ng-container>
|
||||
|
@@ -69,6 +69,11 @@
     border-radius: 60px;
 }

+.colored-info {
+    background-color: var(--event-widget-info-bg-color) !important;
+    border-radius: 60px;
+}
+
 .update-available {
     cursor: pointer;

@@ -95,4 +100,23 @@
         font-size: 11px;
         position: absolute;
     }
 }
+
+.info {
+    cursor: pointer;
+    position: relative;
+    .h6 {
+        color: var(--event-widget-info-bg-color);
+    }
+
+    i.fa {
+        color: var(--primary-color) !important;
+    }
+
+    .btn-close {
+        top: 10px;
+        right: 10px;
+        font-size: 11px;
+        position: absolute;
+    }
+}

@@ -7,6 +7,7 @@ import { ConfirmService } from 'src/app/shared/confirm.service';
 import { UpdateNotificationModalComponent } from 'src/app/shared/update-notification/update-notification-modal.component';
 import { DownloadService } from 'src/app/shared/_services/download.service';
 import { ErrorEvent } from 'src/app/_models/events/error-event';
+import { InfoEvent } from 'src/app/_models/events/info-event';
 import { NotificationProgressEvent } from 'src/app/_models/events/notification-progress-event';
 import { UpdateVersionEvent } from 'src/app/_models/events/update-version-event';
 import { User } from 'src/app/_models/user';

@@ -38,6 +39,9 @@ export class EventsWidgetComponent implements OnInit, OnDestroy {
   errorSource = new BehaviorSubject<ErrorEvent[]>([]);
   errors$ = this.errorSource.asObservable();

+  infoSource = new BehaviorSubject<InfoEvent[]>([]);
+  infos$ = this.infoSource.asObservable();
+
   private updateNotificationModalRef: NgbModalRef | null = null;

   activeEvents: number = 0;
@@ -64,6 +68,7 @@ export class EventsWidgetComponent implements OnInit, OnDestroy {
   ngOnInit(): void {
     this.messageHub.messages$.pipe(takeUntil(this.onDestroy)).subscribe(event => {
       if (event.event === EVENTS.NotificationProgress) {
+        console.log('[Event Widget]: Event came in ', event.payload);
         this.processNotificationProgressEvent(event);
       } else if (event.event === EVENTS.Error) {
         const values = this.errorSource.getValue();
@@ -71,6 +76,12 @@ export class EventsWidgetComponent implements OnInit, OnDestroy {
         this.errorSource.next(values);
         this.activeEvents += 1;
         this.cdRef.markForCheck();
+      } else if (event.event === EVENTS.Info) {
+        const values = this.infoSource.getValue();
+        values.push(event.payload as InfoEvent);
+        this.infoSource.next(values);
+        this.activeEvents += 1;
+        this.cdRef.markForCheck();
       }
     });
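
Side note: the Error and Info branches above are intentionally parallel — push onto a BehaviorSubject-backed list, bump activeEvents, mark for check. If a third event type ever lands here, the pattern could be factored out along these lines; this is a hypothetical sketch, not part of this commit, and appendEvent is a made-up name:

import { BehaviorSubject } from 'rxjs';

// Hypothetical helper: append a payload to a BehaviorSubject-backed list,
// emitting a fresh array reference so OnPush change detection notices it.
function appendEvent<T>(source: BehaviorSubject<T[]>, payload: T): void {
  source.next([...source.getValue(), payload]);
}

// e.g. appendEvent(this.errorSource, event.payload as ErrorEvent);
//      appendEvent(this.infoSource, event.payload as InfoEvent);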

@@ -139,28 +150,46 @@ export class EventsWidgetComponent implements OnInit, OnDestroy {
     });
   }

-  async seeMoreError(error: ErrorEvent) {
+  async seeMore(event: ErrorEvent | InfoEvent) {
     const config = new ConfirmConfig();
     config.buttons = [
-      {text: 'Dismiss', type: 'primary'},
       {text: 'Ok', type: 'secondary'},
     ];
-    config.header = error.title;
-    config.content = error.subTitle;
-    var result = await this.confirmService.alert(error.subTitle || error.title, config);
+    if (event.name === EVENTS.Error) {
+      config.buttons = [{text: 'Dismiss', type: 'primary'}, ...config.buttons];
+    }
+    config.header = event.title;
+    config.content = event.subTitle;
+    var result = await this.confirmService.alert(event.subTitle || event.title, config);
     if (result) {
-      this.removeError(error);
+      this.removeErrorOrInfo(event);
     }
   }
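
The button composition is the subtle bit of that change: infos only get 'Ok', while errors get 'Dismiss' prepended via array spread. A standalone sketch of that logic, with illustrative names that are not from the component:

// Illustrative only: mirrors the conditional prepend in seeMore above.
type Btn = { text: string, type: string };

function buttonsFor(isError: boolean): Btn[] {
  const base: Btn[] = [{text: 'Ok', type: 'secondary'}];
  return isError ? [{text: 'Dismiss', type: 'primary'}, ...base] : base;
}

console.log(buttonsFor(true));  // [{Dismiss}, {Ok}]
console.log(buttonsFor(false)); // [{Ok}]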

-  removeError(error: ErrorEvent, event?: MouseEvent) {
+  clearAllErrorOrInfos() {
+    const infoCount = this.infoSource.getValue().length;
+    const errorCount = this.errorSource.getValue().length;
+    this.infoSource.next([]);
+    this.errorSource.next([]);
+    this.activeEvents -= Math.max(infoCount + errorCount, 0);
+    this.cdRef.markForCheck();
+  }
+
+  removeErrorOrInfo(messageEvent: ErrorEvent | InfoEvent, event?: MouseEvent) {
     if (event) {
       event.stopPropagation();
       event.preventDefault();
     }
-    let data = this.errorSource.getValue();
-    data = data.filter(m => m !== error);
-    this.errorSource.next(data);
+    let data = [];
+    if (messageEvent.name === EVENTS.Info) {
+      data = this.infoSource.getValue();
+      data = data.filter(m => m !== messageEvent);
+      this.infoSource.next(data);
+    } else {
+      data = this.errorSource.getValue();
+      data = data.filter(m => m !== messageEvent);
+      this.errorSource.next(data);
+    }
     this.activeEvents = Math.max(this.activeEvents - 1, 0);
     this.cdRef.markForCheck();
   }
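
Worth noting: removeErrorOrInfo filters by object identity (m !== messageEvent), which works because the clicked item is the very reference held in the source list — two events with identical titles are dismissed one at a time. A quick sketch of that behavior, with made-up data:

// Made-up events for illustration; identity, not structural equality, drives the filter.
const infos = [{name: 'Info', title: 'Scan finished'}, {name: 'Info', title: 'Scan finished'}];
const clicked = infos[0];
const remaining = infos.filter(m => m !== clicked);
console.log(remaining.length); // 1 — only the clicked instance is dropped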

@@ -2,7 +2,185 @@
 <div class="container-fluid">
   <a class="visually-hidden-focusable focus-visible" href="javascript:void(0);" (click)="moveFocus()">Skip to main content</a>
   <a class="side-nav-toggle" *ngIf="navService?.sideNavVisibility$ | async" (click)="hideSideNav()"><i class="fas fa-bars"></i></a>
-  <a class="navbar-brand dark-exempt" routerLink="/libraries" routerLinkActive="active"><img width="28px" height="28px" class="logo" src="../../assets/images/logo.png" alt="kavita icon" aria-hidden="true"/><span class="d-none d-md-inline"> Kavita</span></a>
+  <a class="navbar-brand dark-exempt" routerLink="/libraries" routerLinkActive="active">
+    <svg
+      width="28px"
+      height="28px"
+      viewBox="0 0 135.46666 135.46667"
+      version="1.1"
+      id="svg5"
+      xml:space="preserve"
+      xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
+      xmlns:xlink="http://www.w3.org/1999/xlink"
+      xmlns="http://www.w3.org/2000/svg"
+      xmlns:svg="http://www.w3.org/2000/svg">
+      <g id="layer1"
+         transform="translate(-34.013356,-59.091761)">
+        <image
+          width="135.46666"
+          height="135.46666"
+          preserveAspectRatio="none"
+          style="image-rendering:optimizeQuality"
+          xlink:href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAgAAAAIACAMAAADDpiTIAAACE1BMVEUAAABQvplGxZNXt6FO2sxG... (base64-encoded PNG logo, ~150 lines, truncated)"
+          id="image71"
+          x="34.013355"
+          y="59.091763"/></g></svg>
+    <span class="d-none d-md-inline"> Kavita</span>
+  </a>
   <ul class="navbar-nav col me-auto">

   <div class="nav-item" *ngIf="(accountService.currentUser$ | async) as user">

UI/Web/src/app/pipe/default-date.pipe.ts (new file, 13 lines)
@@ -0,0 +1,13 @@
+import { Pipe, PipeTransform } from '@angular/core';
+
+@Pipe({
+  name: 'defaultDate'
+})
+export class DefaultDatePipe implements PipeTransform {
+
+  transform(value: any, replacementString = 'Never'): string {
+    if (value === null || value === undefined || value === '' || value === Infinity || Number.isNaN(value) || value === '1/1/01') return replacementString;
+    return value;
+  }
+
+}
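
A quick usage sketch for the new pipe — the sentinel values come straight from the transform above ('1/1/01' is presumably a DateTime.MinValue-style date from the API), and the test-style calls are illustrative:

import { DefaultDatePipe } from './default-date.pipe';

const pipe = new DefaultDatePipe();
pipe.transform(null);        // 'Never' — unset values fall back to the default
pipe.transform('1/1/01');    // 'Never' — sentinel "min date" treated as unset
pipe.transform('', 'N/A');   // 'N/A'   — custom replacement string
pipe.transform('7/29/2022'); // '7/29/2022' — real dates pass through untouched

In a template this reads as {{ someDate | defaultDate }} once the consuming module imports PipeModule.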

@@ -14,6 +14,7 @@ import { MangaFormatPipe } from './manga-format.pipe';
 import { MangaFormatIconPipe } from './manga-format-icon.pipe';
 import { LibraryTypePipe } from './library-type.pipe';
 import { SafeStylePipe } from './safe-style.pipe';
+import { DefaultDatePipe } from './default-date.pipe';

@@ -32,7 +33,8 @@ import { SafeStylePipe } from './safe-style.pipe';
     MangaFormatPipe,
     MangaFormatIconPipe,
     LibraryTypePipe,
-    SafeStylePipe
+    SafeStylePipe,
+    DefaultDatePipe
   ],
   imports: [
     CommonModule,
@@ -51,7 +53,8 @@ import { SafeStylePipe } from './safe-style.pipe';
     MangaFormatPipe,
     MangaFormatIconPipe,
     LibraryTypePipe,
-    SafeStylePipe
+    SafeStylePipe,
+    DefaultDatePipe
   ]
 })
 export class PipeModule { }
@@ -56,7 +56,7 @@ export class ConfirmEmailComponent {
       this.toastr.success('Account registration complete');
       this.router.navigateByUrl('login');
     }, err => {
-      console.log('error: ', err);
+      console.error('Error from Confirming Email: ', err);
       this.errors = err;
       this.cdRef.markForCheck();
     });
@@ -6,7 +6,7 @@ import {
   HttpResponse
 } from "@angular/common/http";
 import { Observable } from "rxjs";
-import { distinctUntilChanged, scan, map, tap } from "rxjs/operators";
+import { scan } from "rxjs/operators";

 function isHttpResponse<T>(event: HttpEvent<T>): event is HttpResponse<T> {
   return event.type === HttpEventType.Response;
@@ -37,6 +37,7 @@ export class SideNavItemComponent implements OnInit, OnDestroy {
       map(evt => evt as NavigationEnd))
       .subscribe((evt: NavigationEnd) => {
         this.updateHightlight(evt.url.split('?')[0]);
       });
   }

@@ -111,7 +111,6 @@ export class SideNavComponent implements OnInit, OnDestroy {

   toggleNavBar() {
     this.navService.toggleSideNav();
-    //this.cdRef.markForCheck();
   }

 }
@@ -219,7 +219,6 @@
     --carousel-hover-header-text-decoration: none;

     /** Drawer */
-    --drawer-background-color: black; // TODO: Remove this for bg
     --drawer-bg-color: #292929;
     --drawer-text-color: white;

@@ -229,6 +228,7 @@
     --event-widget-text-color: var(--body-text-color);
     --event-widget-item-border-color: rgba(53, 53, 53, 0.5);
     --event-widget-border-color: rgba(1, 4, 9, 0.5);
+    --event-widget-info-bg-color: #b6d4fe;

     /* Search */
     --search-result-text-lite-color: initial;
@@ -151,7 +151,6 @@
     --carousel-hover-header-text-decoration: none;

     /** Drawer */
-    --drawer-background-color: white; // TODO: Remove this for bg
     --drawer-bg-color: white;
     --drawer-text-color: black;