mirror of https://github.com/Kareadita/Kavita.git
synced 2025-06-01 04:34:49 -04:00
* New Scan Loop (#1447)
* Staging the code for the new scan loop.
* Implemented a basic idea of changes on drives triggering the scan loop. Issues: 1. Scan by folder does not work, 2. The queuing system is very hacky and needs a separate thread, 3. Performance degradation could be very real.
* Started writing unit tests for the new loop code
* Implemented a basic method to scan a folder path with ignore support (not implemented, code in place)
* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.
* Scan Series now uses the new code (folder-based parsing) and now handles the LocalizedSeries issue.
* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).
* Wrote some notes on the updated library scan loop.
* Removed migration for merge
* Reapplied the SeriesFolder migration after merge
* Refactored a check that used multiple DB calls into one.
* Made lots of progress on ignore support, but some confusion on the underlying library. Ticket created. On hold till then.
* Updated Scan Library and Scan Series to exit early if there are no changes on the underlying folders that need to be scanned.
* Implemented the ability to have .kavitaignore files within your directories; Kavita will parse them and ignore files and directories based on the rules within them (see the sketch after this list).
* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores
* Wrote out some basic code that showcases how we can scan a series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking.
* Things are working, kinda. I'm getting lost in my own code and complexity. I'm not sure it's worth it.
* Refactored ScanFiles out to DirectoryService.
* Refactored more code out to keep the code clean.
* More unit tests
* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked UpdateLibrary to work how it used to with the new scan loop code (note: using async update library/series does not work).
* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work), but with folder-based scanning.
* Prep for unit tests (updating broken ones with new implementations)
* Just some notes. Not sure I want to finish this work.
* Refactored the LibraryWatcher with some comments and state variables.
* Undid the migrations in case I don't move forward with this branch
* Started to clean the code and prepare for finishing this work.
* Fixed a bad merge
* Updated signatures to clean up the code and commit to the new strategy for scanning.
* Swapped out the code with async processing of series on a small library
* The new scan loop is working in both sync and async methods. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.
* Refactored UpdateSeries out of Scanner and into a dedicated file.
* Refactored how ProcessTasks are awaited to allow more async
* Fixed an issue where the side nav item wouldn't show the correct highlight and migrated to OnPush
* Moved where we start the stopwatch to encapsulate the full scan
* Cleaned up SignalR events to report correctly (still needs a redesign)
* Removed the "remove" code until I figure it out
* Put in extremely expensive series deletion code for library scan.
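For context on the .kavitaignore support above, here is a minimal sketch of how such a file could be parsed into glob-style ignore rules. The rule syntax, class, and method names here are illustrative assumptions, not Kavita's actual implementation:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

// Hypothetical ignore-rule matcher: one glob per line, '#' starts a comment.
public class KavitaIgnoreMatcher
{
    private readonly List<Regex> _rules;

    public KavitaIgnoreMatcher(string ignoreFilePath)
    {
        _rules = File.ReadAllLines(ignoreFilePath)
            .Select(l => l.Trim())
            .Where(l => l.Length > 0 && !l.StartsWith("#"))
            .Select(GlobToRegex)
            .ToList();
    }

    public bool IsIgnored(string relativePath) =>
        _rules.Any(r => r.IsMatch(relativePath.Replace('\\', '/')));

    // Translate a simple glob: '**' matches across path segments, '*' within one.
    private static Regex GlobToRegex(string glob)
    {
        var pattern = "^" + Regex.Escape(glob.Replace('\\', '/'))
            .Replace(@"\*\*", ".*")
            .Replace(@"\*", "[^/]*") + "$";
        return new Regex(pattern, RegexOptions.IgnoreCase);
    }
}

Nested ignore files stacking with higher-level ones, as the changelog describes, would amount to instantiating one matcher per directory level and OR-ing their IsIgnored results.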
* Have Genre and Tag update the DB immediately to avoid dup issues
* Taking a break
* Moving to a lock with People was successful. Need to apply to others.
* Refactored series-level code and tag and genre with the new locking strategy.
* New scan loop works. Next up: optimization
* Swapped out the Kavita logo with an SVG for faster load
* Refactored metadata updates to occur when the series are being updated.
* Code cleanup
* Added a new type of generic message (Info) to inform the user.
* Code cleanup
* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan (see the sketch after this list). This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds. Fixed a bug where File Analysis was running every time for each non-epub file.
* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.
* Some code cleanup
* Added experimental SignalR update code to have a more natural refresh of the library-detail page
* Hooked in the ability to send new series events to the UI
* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check if any existing task is being run and reschedule for 3 hours, and 10 mins for scan series.
* Implemented the info event in the events widget and added a clear-all button to dismiss all infos and errors. Added --event-widget-info-bg-color
* Removed --drawer-background-color since it's not used
* When a new series is added, inject it directly into the view.
* Some debug code cleanup
* Fixed up the unit tests
* Ensure all config directories exist on startup
* Disabled Library Watching (that will go in the next build)
* Ensure update for series is admin-only
* Lots of code changes; scan series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.
* Removed the SeriesFolder migration
* Added the SeriesFolder migration
* Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail.
* The scan optimizations now work for NTFS systems.
* Removed a TODO
* Migrated all the times to use DateTime.Now and not Utc.
* Refactored some repo calls to use the includes flag pattern
* Implemented a check for the library scan optimization to validate if the library was updated (type change, library rename, folder change, or series deleted) and let the optimization be bypassed.
* Added another optimization which will use just the folder attribute of last write time if the drive is not NTFS.
* Fixed a unit test
* Some code cleanup
* Bump versions by dotnet-bump-version.
* Misc UI Fixes (#1450)
* Fixed collection cover images not rendering
* Added a try/catch on sending email, so we fail silently if it doesn't send.
* Fixed Go Back not returning to the last scroll position due to layout mode change resetting, despite nothing changing.
* Fixed a bug where, when turning between pages in default mode, the height calculations could get skewed.
* Fixed a missing case for card item where it wouldn't show the tooltip title for series.
* Bump versions by dotnet-bump-version.
* New Scan Loop Fixes (#1452)
* Refactored ScanSeries to avoid a lot of extra work and fixed a bug where Scan Series would invoke the processing twice. Refactored the series selection code during processing such that we use Localized Name as well, for cases where the original name was changed. Undid an optimization around last write time, since Linux file systems match how NTFS works.
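A minimal sketch of the attribute-only skip check described above: compare the folder's last write time against the time the series folder was last scanned and bail out before touching any files. The helper and its names are illustrative assumptions; Kavita's real check also handles non-NTFS filesystems by checking files and subfolders:

using System;
using System.IO;

public static class ScanOptimization
{
    // Returns true when the folder, per its filesystem attribute, has not been
    // written to since the last scan, so the scan can be skipped entirely.
    // On NTFS a folder's LastWriteTime updates when its direct children change.
    public static bool CanSkipScan(string folderPath, DateTime lastFolderScanned)
    {
        if (!Directory.Exists(folderPath)) return false;
        return Directory.GetLastWriteTime(folderPath) <= lastFolderScanned;
    }
}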
* Fixed part of the query
* Added a NormalizedLocalizedName for quick searching when a series needs grouping. Reworked scan loop code a bit to ensure we don't do extra work. Tweaked the widget logic to help display better and not show "Nothing going on here".
* Fixed a bug where archives with ._ files would be counted as valid files, while they are actually just metadata files on Macs.
* Fixed a broken unit test
* Bump versions by dotnet-bump-version.
* Simplify parent lookup with Directory.GetParent (#1455) (see the sketch after this list)
* Simplify parent lookup with Directory.GetParent
* Address comments
* Bump versions by dotnet-bump-version.
* Scan Loop Fixes (#1459)
* Added Last Folder Scanned time to the series info modal. Tweaked the info event detail modal to have a primary button and thus be auto-dismissable
* Added an error event when multiple series are found while processing a series.
* Fixed a bug where a series could get stuck with other series due to a bad select query. Started adding the force flag hook for the UI and designing the confirm. Confirm service now also has the ability to hide the close button. Updated error events and logging in the loop to be more informative
* Fixed a bug where the confirm service wasn't showing the proper body content.
* Hooked up force scan series
* Refresh metadata now has force update
* Fixed up the messaging with the prompt on scan, and hooked it up properly in the scan library to avoid the check of whether the whole library even needs to be scanned. Fixed a bug where NormalizedLocalizedName wasn't being calculated on new entities. Started adding unit tests for this problematic repo method.
* Fixed a bug where we updated NormalizedLocalizedName before we set it.
* Send an info to the UI when series are spread between multiple library-level folders.
* Added some logger output when there are no files found in a folder. Return early if there are no files found, so we can avoid some small loops of code.
* Fixed an issue where multiple series in a folder with a localized series would cause unintended grouping. This is not supported, hence we will warn the user and allow the bad grouping.
* Added a case where scan series fails due to the folder being removed. We will now log an error
* Normalize paths when finding the highest directory till root.
* Fixed an issue with Scan Series where changing a series' folder to a different path, while the original series folder still contained another series, would cause the series to not be deleted.
* Fixed some bugs around specials causing a series merge issue on scan series.
* Removed a bug marker
* Cleaned up some of the scan loop and removed a test I don't need.
* Removed any prompts for the force flow; it doesn't work well. Leave the API as-is though.
* Fixed up a check for duplicate ScanLibrary calls
* Bump versions by dotnet-bump-version.
* Scroll Resume (#1460)
* When we navigate from a page then back, resume at the last scroll key (if clicked)
* Resume jump key position when navigating back to a page. Removed some extra blank space on collection detail when a collection doesn't have a summary or cover image.
* Ignore progress events on series cards
* Added a url to swagger for /, which could be the reverse proxy url
* Bump versions by dotnet-bump-version.
* Misc UI fixes (#1461)
* Misc fixes:
  - Fixed modal being stretched when not needed.
  - Fixed logo vertical align.
  - Fixed drawer content scroll, and it being squished due to being overridden by Bootstrap.
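To illustrate the Directory.GetParent simplification from #1455, a short before/after sketch; the hand-rolled "before" is an assumed equivalent, not the exact code that was replaced:

using System.IO;

public static class ParentLookup
{
    // Before (assumed): manual string surgery to find a parent directory.
    public static string ParentByHand(string path)
    {
        var trimmed = path.TrimEnd(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
        var idx = trimmed.LastIndexOfAny(new[] { Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar });
        return idx <= 0 ? null : trimmed.Substring(0, idx);
    }

    // After: let the BCL handle separators, roots, and normalization.
    public static string ParentViaBcl(string path) => Directory.GetParent(path)?.FullName;
}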
* Series detail cover image stretch fix: fixes the series detail cover image being stretched on larger resolutions
* Fixing empty lists scrollbar
* Fixing Want to Read error
* Fixing unnecessary scrollbar
* Fixing Recently Updated tooltip
* Bump versions by dotnet-bump-version.
* Folder Watching (#1467) (see the sketch after this list)
* Hooked in a server setting to enable/disable folder watching
* Validated the file rename change event
* Validated delete file works
* Tweaked some logic to determine if a change occurs on a folder or a file.
* Added a note for an upcoming branch
* Some minor changes in the loop that just shift where code runs.
* Implemented the ScanFolder API
* Ensure we restart watchers when we modify a library folder.
* Fixed a unit test
* Bump versions by dotnet-bump-version.
* More Scan Loop Bugfixes (#1471)
* Updated scan time for the watcher to 30 seconds for non-dev. Moved ScanFolder off the Scan queue as it doesn't need to be there. Updated loggers
* Fixed jumpbar missing
* Tweaked the messaging for CoverGen
* When we return early due to nothing being done on library and series scan, make sure we kick off other tasks that need to occur.
* Fixed a foreign key constraint issue on Volumes when we were adding to a new series.
* Fixed a case where, when picking a normalized series, capitalization differences wouldn't stack when they should.
* Reduced the logging output on dev and prod settings.
* Fixed a bug in the code that finds the highest directory from a file, where we were not checking against a normalized path.
* Cleaned up some code
* Fixed broken unit tests
* Bump versions by dotnet-bump-version.
* More Scan Loop Fixes (#1473)
* Added a ToList() to avoid a bug where a person could be removed from a list while iterating over it.
* When deleting a series, the Want to Read page will now automatically remove that series from the view.
* Fixed a series lookup which was ignoring format
* Ignore XML comment warnings
* Removed a note since it was already working that way
* Fixed unit test
* Bump versions by dotnet-bump-version.
* Misc UI Fixes (#1477)
* Tweaked a migration to log correctly only if something is going to be done.
* Refactored Reading List Controller code into a dedicated service and cleaned up some methods that aren't needed anymore.
* Fixed a bug where adding a new item to a reading list wasn't adding it at the end.
* Fixed an issue where the collection page would re-render the same covers on multiple items.
* Fixed a missing margin-top which made the page extras drawer not render correctly and hence be unclosable on small screens.
* Added some timeout on the manage users screen to give data time to flush. Added a dedicated token log for account flows, in case URL encoding plays a part (but from testing it doesn't).
* Reverted back to building for ES6 instead of ES2020 for old Safari 12.5.5 browsers (10MB difference in build size).
* Cleaned up the logic for removing series not found during the scan loop.
* Tweaked the timings for Library Watcher to 1 min and reprocess the queue every 30 seconds.
* Bump versions by dotnet-bump-version.
* Added fixes for libvips (#1479)
* Bump versions by dotnet-bump-version.
* Tachiyomi + Fixes (#1481)
* Fixed a bootstrap bug
* Fixed repeating images on collection detail
* Fixed up some logic in the library watcher which wasn't processing all of the queue.
* When parsing non-epubs in a Book library, use Manga parsing for Volume support to better support Light Novels
* Fixed some bugs with the Tachiyomi plugin APIs for progress tracking
* Bump versions by dotnet-bump-version.
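A minimal sketch of the folder-watching idea behind #1467: one FileSystemWatcher per library root whose events funnel into a scan queue. The queueing is simplified here; the real watcher debounces events and, per #1484 below, defers work to Hangfire:

using System;
using System.IO;

public sealed class LibraryFolderWatcher : IDisposable
{
    private readonly FileSystemWatcher _watcher;

    public LibraryFolderWatcher(string libraryFolder, Action<string> enqueueScan)
    {
        _watcher = new FileSystemWatcher(libraryFolder)
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.LastWrite
        };
        // Created/Changed/Deleted/Renamed all funnel into the same scan queue;
        // the consumer is expected to debounce duplicate paths.
        _watcher.Created += (_, e) => enqueueScan(e.FullPath);
        _watcher.Changed += (_, e) => enqueueScan(e.FullPath);
        _watcher.Deleted += (_, e) => enqueueScan(e.FullPath);
        _watcher.Renamed += (_, e) => enqueueScan(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    public void Dispose() => _watcher.Dispose();
}

Note that FileSystemWatcher has a finite internal buffer; as the #1484 entries below describe, a real implementation has to detect overflow and fall back to a full rescan.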
* Adding Health controller (#1480) (see the sketch after this list)
* Adding Health controller - Added: API endpoint for a health check to streamline Docker healthy status.
* Review comment fixes
* Bump versions by dotnet-bump-version.
* Simplify Folder Watcher (#1484)
* Refactored Library Watcher to use Hangfire under the hood.
* Support .kavitaignore at the root level.
* Refactored a lot of the library watching code to process faster and handle when FileSystemWatcher runs out of internal buffer space. It's still not perfect, but good enough for basic use.
* Marked folder watching as experimental and defaulted it to off.
* Revert #1479
* Tweaked the messaging for OPDS to remove a note about the download role. Moved some code closer to where it's used.
* Cleaned up how the events widget reports
* Fixed a null issue when deleting series in the UI
* Cleaned up some debug code
* Added more information for when we skip a scan
* Cleaned up some logging messages in CoverGen tasks
* More log message tweaks
* Added some debug to help identify a rare issue
* Fixed a bug where "save bookmarks as WebP" could get reset to false when saving other server settings
* Updated some documentation on the library watcher.
* Made LibraryWatcher fire every 5 mins
* Bump versions by dotnet-bump-version.
* Sort series by chapter number only when some chapters have no volume (#1487)
* Sort series by chapter number only when some chapters have no volume information
* Implement a Default static instance of ChapterSortComparer
* Further use Default static Comparers
* Add missing ToList() as per comments
* SQLite Hangfire (#1488)
* Updated Hangfire to use SQLite to retain information on tasks
* Updated all external links to have noopener noreferrer
* When watching folders, ensure the folders exist before creating watchers.
* Tweaked the messaging for Email Service and added a link to the project.
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Fixed typeahead not working correctly (#1490)
* Bump versions by dotnet-bump-version.
* Release Testing Day 1 (#1491)
* Fixed a bug where typeahead wouldn't automatically show results on the relationship screen without an additional click.
* Tweaked the code which checks if a modification occurred to check on seconds rather than minutes
* Clear cache will now clear the temp/ directory as well.
* Fixed an issue where Chrome was caching API responses when it shouldn't have.
* Added cleanup temp code
* Ensure genres get removed during series scan when removed from metadata.
* Fixed a bug where all epubs with a volume would show as Volume 0 in a reading list
* When a scan is in progress, don't let the user delete the library.
* Bump versions by dotnet-bump-version.
* Scan Loop Last Write Time Change (#1492)
* Refactored the invite user flow to separate error handling for the create user flow and the email flow. This should help users that have unique situations.
* Switched to using files to check LastWriteTime. Debug code in for Robbie to test on rclone
* Updated the Parser namespace. Changed the LastWriteTime check to cover all files and folders.
* Bump versions by dotnet-bump-version.
* Release Testing Day 2 (#1493)
* Added a no-data section to collection detail.
* Removed an optimization for skipping the whole library scan as it wasn't reliable
* When resetting a password, ensure the input is colored correctly
* Fixed setting a new password after resetting throwing an error despite actually being successful. Fixed incorrect messaging on the Password Reset page.
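A minimal sketch of a health-check endpoint of the kind #1480 describes, so a Docker HEALTHCHECK can probe the server without credentials. The controller shown follows standard ASP.NET Core conventions and is an assumption about the exact shape of Kavita's code:

using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

// Anonymous so container orchestrators can probe without an API key or login.
[AllowAnonymous]
[ApiController]
[Route("api/[controller]")]
public class HealthController : ControllerBase
{
    [HttpGet]
    public ActionResult GetHealth() => Ok("Ok");
}

A Dockerfile could then probe it with something like: HEALTHCHECK CMD curl --fail http://localhost:5000/api/health || exit 1.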
* Fixed a bug where reset password would show the side nav button and skew the page. Updated a lot of references to use the typed version for form controls.
* Removed a migration from 0.5.0, 6 releases ago.
* Added a null check so we don't throw an exception when connecting with SignalR for unauthenticated users.
* Bump versions by dotnet-bump-version.
* Fixed a bug where a series with a relationship couldn't be deleted. (#1495)
* Bump versions by dotnet-bump-version.
* Release Testing Day 3 (#1496)
* Tweaked log messaging for library scan when no files were scanned.
* When a theme that is set gets removed due to a scan, inform the user to refresh.
* Fixed a typo: make Darkness -> Brightness
* Made downloading theme files allowed for non-authenticated users, to allow new users to get the default theme.
* Hide the All Series side nav item if there are no libraries exposed to the user
* Fixed an API for Tachiyomi when syncing progress
* Fixed dashboard not responding to Series Removed and Added events. Ensure we send SeriesRemoved events when series are deleted.
* Reverted Hangfire SQLite due to aborted jobs being resumed when they shouldn't be. Fixed some scan loop issues where cover gen wouldn't always be invoked on new libraries.
* Bump versions by dotnet-bump-version.
* Updating series detail cover style (#1498) - Fixed: Fixed an issue with the series detail cover when scaled down.
* Bump versions by dotnet-bump-version.
* Version bump
* v0.5.6 Release (#1499)

Co-authored-by: tjarls <tjarls@gmail.com>
Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
820 lines
33 KiB
C#
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Parser;
using API.Services.Tasks.Metadata;
using API.SignalR;
using Hangfire;
using Microsoft.Extensions.Logging;

namespace API.Services.Tasks.Scanner;

public interface IProcessSeries
{
    /// <summary>
    /// Do not allow Prime to be invoked by multiple threads; concurrent invocation will break the DB.
    /// </summary>
    Task Prime();
    Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library);
    void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false);
}

/// <summary>
/// All code needed to Update a Series from a Scan action
/// </summary>
public class ProcessSeries : IProcessSeries
{
    private readonly IUnitOfWork _unitOfWork;
    private readonly ILogger<ProcessSeries> _logger;
    private readonly IEventHub _eventHub;
    private readonly IDirectoryService _directoryService;
    private readonly ICacheHelper _cacheHelper;
    private readonly IReadingItemService _readingItemService;
    private readonly IFileService _fileService;
    private readonly IMetadataService _metadataService;
    private readonly IWordCountAnalyzerService _wordCountAnalyzerService;

    // In-memory caches of all Genres, People, and Tags, primed once per scan so
    // per-series processing can avoid repeated DB lookups.
    private IList<Genre> _genres;
    private IList<Person> _people;
    private IList<Tag> _tags;

    public ProcessSeries(IUnitOfWork unitOfWork, ILogger<ProcessSeries> logger, IEventHub eventHub,
        IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService,
        IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService)
    {
        _unitOfWork = unitOfWork;
        _logger = logger;
        _eventHub = eventHub;
        _directoryService = directoryService;
        _cacheHelper = cacheHelper;
        _readingItemService = readingItemService;
        _fileService = fileService;
        _metadataService = metadataService;
        _wordCountAnalyzerService = wordCountAnalyzerService;
    }

    /// <summary>
    /// Invoke this before processing any series, just once, to prime all the needed data for a scan
    /// </summary>
    public async Task Prime()
    {
        _genres = await _unitOfWork.GenreRepository.GetAllGenresAsync();
        _people = await _unitOfWork.PersonRepository.GetAllPeople();
        _tags = await _unitOfWork.TagRepository.GetAllTagsAsync();
    }
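
    // Illustrative call order (not part of the original file): a scanner would
    // prime the caches once, then process each discovered series, e.g.:
    //
    //   await processSeries.Prime();
    //   foreach (var (name, infos) in parsedSeries)
    //       await processSeries.ProcessSeriesAsync(infos, library);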

    public async Task ProcessSeriesAsync(IList<ParserInfo> parsedInfos, Library library)
    {
        if (!parsedInfos.Any()) return;

        var seriesAdded = false;
        var scanWatch = Stopwatch.StartNew();
        var seriesName = parsedInfos.First().Series;
        await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress,
            MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Updated, seriesName));
        _logger.LogInformation("[ScannerService] Beginning series update on {SeriesName}", seriesName);

        // Check if there is a Series
        var firstInfo = parsedInfos.First();
        Series series;
        try
        {
            series = await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(firstInfo.Series, firstInfo.LocalizedSeries,
                library.Id, firstInfo.Format);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "There was an exception finding existing series for {SeriesName} with Localized name of {LocalizedName} for library {LibraryId}. This indicates you have duplicate series with the same name or localized name in the library. Correct this and rescan", firstInfo.Series, firstInfo.LocalizedSeries, library.Id);
            await _eventHub.SendMessageAsync(MessageFactory.Error,
                MessageFactory.ErrorEvent($"There was an exception finding existing series for {firstInfo.Series} with Localized name of {firstInfo.LocalizedSeries} for library {library.Id}",
                    "This indicates you have duplicate series with the same name or localized name in the library. Correct this and rescan."));
            return;
        }

        if (series == null)
        {
            seriesAdded = true;
            series = DbFactory.Series(firstInfo.Series, firstInfo.LocalizedSeries);
        }

        if (series.LibraryId == 0) series.LibraryId = library.Id;

        try
        {
            _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName);

            var firstParsedInfo = parsedInfos[0];

            UpdateVolumes(series, parsedInfos);
            series.Pages = series.Volumes.Sum(v => v.Pages);

            series.NormalizedName = Parser.Parser.Normalize(series.Name);
            series.OriginalName ??= firstParsedInfo.Series;
            if (series.Format == MangaFormat.Unknown)
            {
                series.Format = firstParsedInfo.Format;
            }

            if (string.IsNullOrEmpty(series.SortName))
            {
                series.SortName = series.Name;
            }
            if (!series.SortNameLocked)
            {
                series.SortName = series.Name;
                if (!string.IsNullOrEmpty(firstParsedInfo.SeriesSort))
                {
                    series.SortName = firstParsedInfo.SeriesSort;
                }
            }

            // parsedInfos[0] may not contain the LocalizedSeries, so find the first info that does
            var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p));
            if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries))
            {
                series.LocalizedName = localizedSeries;
                series.NormalizedLocalizedName = Parser.Parser.Normalize(series.LocalizedName);
            }

            UpdateSeriesMetadata(series, library.Type);

            // Update series FolderPath here
            await UpdateSeriesFolderPath(parsedInfos, library, series);

            series.LastFolderScanned = DateTime.Now;
            _unitOfWork.SeriesRepository.Attach(series);

            if (_unitOfWork.HasChanges())
            {
                try
                {
                    await _unitOfWork.CommitAsync();
                }
                catch (Exception ex)
                {
                    await _unitOfWork.RollbackAsync();
                    _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB for series {@SeriesName}", series);

                    await _eventHub.SendMessageAsync(MessageFactory.Error,
                        MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}",
                            ex.Message));
                    return;
                }

                if (seriesAdded)
                {
                    await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded,
                        MessageFactory.SeriesAddedEvent(series.Id, series.Name, series.LibraryId), false);
                }

                _logger.LogInformation("[ScannerService] Finished series update on {SeriesName} in {Milliseconds} ms", seriesName, scanWatch.ElapsedMilliseconds);
            }
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "[ScannerService] There was an exception updating series for {SeriesName}", series.Name);
        }

        await _metadataService.GenerateCoversForSeries(series, false);
        EnqueuePostSeriesProcessTasks(series.LibraryId, series.Id);
    }

    private async Task UpdateSeriesFolderPath(IEnumerable<ParserInfo> parsedInfos, Library library, Series series)
    {
        var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path),
            parsedInfos.Select(f => f.FullFilePath).ToList());
        if (seriesDirs.Keys.Count == 0)
        {
            _logger.LogCritical(
                "Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are under a single folder within the library root");
            await _eventHub.SendMessageAsync(MessageFactory.Info,
                MessageFactory.InfoEvent($"{series.Name} has files spread outside a single series folder",
                    "This has negative performance effects. Please ensure all series are under a single folder within the library root"));
        }
        else
        {
            // Don't save FolderPath if it's a library Folder
            if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First()))
            {
                series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First());
            }
        }
    }

    public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false)
    {
        //BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate));
        BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate));
    }

    private static void UpdateSeriesMetadata(Series series, LibraryType libraryType)
    {
        series.Metadata ??= DbFactory.SeriesMetadata(new List<CollectionTag>());
        var isBook = libraryType == LibraryType.Book;
        var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook);

        var firstFile = firstChapter?.Files.FirstOrDefault();
        if (firstFile == null) return;
        if (Parser.Parser.IsPdf(firstFile.FilePath)) return;

        var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList();

        // Update Metadata based on Chapter metadata
        series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year);

        if (series.Metadata.ReleaseYear < 1000)
        {
            // Not a valid year, default to 0
            series.Metadata.ReleaseYear = 0;
        }

        // Set the AgeRating as the highest across all the ComicInfos
        if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating);

        series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount);
        series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count);
        // To avoid relying completely on ComicInfo, try to work out whether the series is complete by checking parsed filenames as well.
        if (series.Metadata.MaxCount != series.Metadata.TotalCount)
        {
            var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name));
            var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range));
            if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume;
            else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter;
        }

        if (!series.Metadata.PublicationStatusLocked)
        {
            series.Metadata.PublicationStatus = PublicationStatus.OnGoing;
            if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0)
            {
                series.Metadata.PublicationStatus = PublicationStatus.Completed;
            } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0)
            {
                series.Metadata.PublicationStatus = PublicationStatus.Ended;
            }
        }

        if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked)
        {
            series.Metadata.Summary = firstChapter.Summary;
        }

        if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked)
        {
            series.Metadata.Language = firstChapter.Language;
        }

        // Handle People
        foreach (var chapter in chapters)
        {
            if (!series.Metadata.WriterLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Writer))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.CoverArtistLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.CoverArtist))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.PublisherLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Publisher))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.CharacterLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Character))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.ColoristLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Colorist))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.EditorLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Editor))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.InkerLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Inker))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.LettererLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Letterer))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.PencillerLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Penciller))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.TranslatorLocked)
            {
                foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Translator))
                {
                    PersonHelper.AddPersonIfNotExists(series.Metadata.People, person);
                }
            }

            if (!series.Metadata.TagsLocked)
            {
                foreach (var tag in chapter.Tags)
                {
                    TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag);
                }
            }

            if (!series.Metadata.GenresLocked)
            {
                foreach (var genre in chapter.Genres)
                {
                    GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre);
                }
            }
        }

        var genres = chapters.SelectMany(c => c.Genres).ToList();
        GenreHelper.KeepOnlySameGenreBetweenLists(series.Metadata.Genres.ToList(), genres, genre =>
        {
            if (series.Metadata.GenresLocked) return;
            series.Metadata.Genres.Remove(genre);
        });

        // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it
        // I might be able to filter out people that are in locked fields?
        var people = chapters.SelectMany(c => c.People).ToList();
        PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People.ToList(),
            people, person =>
            {
                switch (person.Role)
                {
                    case PersonRole.Writer:
                        if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Penciller:
                        if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Inker:
                        if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Colorist:
                        if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Letterer:
                        if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.CoverArtist:
                        if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Editor:
                        if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Publisher:
                        if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Character:
                        if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person);
                        break;
                    case PersonRole.Translator:
                        if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person);
                        break;
                    default:
                        series.Metadata.People.Remove(person);
                        break;
                }
            });
    }

    private void UpdateVolumes(Series series, IList<ParserInfo> parsedInfos)
    {
        var startingVolumeCount = series.Volumes.Count;
        // Add new volumes and update chapters per volume
        var distinctVolumes = parsedInfos.DistinctVolumes();
        _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name);
        foreach (var volumeNumber in distinctVolumes)
        {
            var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber);
            if (volume == null)
            {
                volume = DbFactory.Volume(volumeNumber);
                volume.SeriesId = series.Id;
                series.Volumes.Add(volume);
            }

            volume.Name = volumeNumber;

            _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name);
            var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray();
            UpdateChapters(series, volume, infos);
            volume.Pages = volume.Chapters.Sum(c => c.Pages);

            // Update all the metadata on the Chapters
            foreach (var chapter in volume.Chapters)
            {
                var firstFile = chapter.Files.MinBy(x => x.Chapter);
                if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue;
                try
                {
                    var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath));
                    UpdateChapterFromComicInfo(chapter, firstChapterInfo?.ComicInfo);
                }
                catch (Exception ex)
                {
                    _logger.LogError(ex, "There was some issue when updating chapter's metadata");
                }
            }
        }

        // Remove existing volumes that aren't in parsedInfos
        var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList();
        if (series.Volumes.Count != nonDeletedVolumes.Count)
        {
            _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos did not map to the volume name",
                (series.Volumes.Count - nonDeletedVolumes.Count), series.Name);
            var deletedVolumes = series.Volumes.Except(nonDeletedVolumes);
            foreach (var volume in deletedVolumes)
            {
                var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? "";
                if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file))
                {
                    _logger.LogError(
                        "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}",
                        file);
                }

                _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file);
            }

            series.Volumes = nonDeletedVolumes;
        }

        _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}",
            series.Name, startingVolumeCount, series.Volumes.Count);
    }

    private void UpdateChapters(Series series, Volume volume, IList<ParserInfo> parsedInfos)
    {
        // Add new chapters
        foreach (var info in parsedInfos)
        {
            // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0
            // are also treated like specials for UI grouping.
            Chapter chapter;
            try
            {
                chapter = volume.Chapters.GetChapterByRange(info);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters);
                continue;
            }

            if (chapter == null)
            {
                _logger.LogDebug(
                    "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters);
                chapter = DbFactory.Chapter(info);
                volume.Chapters.Add(chapter);
                series.LastChapterAdded = DateTime.Now;
            }
            else
            {
                chapter.UpdateFrom(info);
            }

            if (chapter == null) continue;
            // Add files
            var specialTreatment = info.IsSpecialInfo();
            AddOrUpdateFileForChapter(chapter, info);
            chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty;
            chapter.Range = specialTreatment ? info.Filename : info.Chapters;
        }

        // Remove chapters that aren't in parsedInfos or have no files linked
        var existingChapters = volume.Chapters.ToList();
        foreach (var existingChapter in existingChapters)
        {
            if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter))
            {
                _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series);
                volume.Chapters.Remove(existingChapter);
            }
            else
            {
                // Ensure we remove any files that no longer exist AND order
                existingChapter.Files = existingChapter.Files
                    .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath))
                    .OrderByNatural(f => f.FilePath).ToList();
                existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages);
            }
        }
    }

    private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info)
    {
        chapter.Files ??= new List<MangaFile>();
        var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath);
        if (existingFile != null)
        {
            existingFile.Format = info.Format;
            if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return;
            existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format);
            // We skip updating the DB with last modified time here so that metadata refresh can do it
        }
        else
        {
            var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format));
            if (file == null) return;

            chapter.Files.Add(file);
        }
    }

#nullable enable
    private void UpdateChapterFromComicInfo(Chapter chapter, ComicInfo? info)
    {
        var firstFile = chapter.Files.MinBy(x => x.Chapter);
        if (firstFile == null ||
            _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return;

        var comicInfo = info;
        if (info == null)
        {
            comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath);
        }

        if (comicInfo == null) return;
        _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath);

        chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating);

        if (!string.IsNullOrEmpty(comicInfo.Title))
        {
            chapter.TitleName = comicInfo.Title.Trim();
        }

        if (!string.IsNullOrEmpty(comicInfo.Summary))
        {
            chapter.Summary = comicInfo.Summary;
        }

        if (!string.IsNullOrEmpty(comicInfo.LanguageISO))
        {
            chapter.Language = comicInfo.LanguageISO;
        }

        if (comicInfo.Count > 0)
        {
            chapter.TotalCount = comicInfo.Count;
        }

        // This needs to check against both Number and Volume to calculate Count
        if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0)
        {
            chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number));
        }
        if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0)
        {
            chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume)));
        }

        void AddPerson(Person person)
        {
            PersonHelper.AddPersonIfNotExists(chapter.People, person);
        }

        void AddGenre(Genre genre)
        {
            GenreHelper.AddGenreIfNotExists(chapter.Genres, genre);
        }

        void AddTag(Tag tag, bool added)
        {
            TagHelper.AddTagIfNotExists(chapter.Tags, tag);
        }

        if (comicInfo.Year > 0)
        {
            var day = Math.Max(comicInfo.Day, 1);
            var month = Math.Max(comicInfo.Month, 1);
            // Construct the date directly to avoid culture-sensitive string parsing
            chapter.ReleaseDate = new DateTime(comicInfo.Year, month, day);
        }

        var people = GetTagValues(comicInfo.Colorist);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist);
        UpdatePeople(people, PersonRole.Colorist, AddPerson);

        people = GetTagValues(comicInfo.Characters);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character);
        UpdatePeople(people, PersonRole.Character, AddPerson);

        people = GetTagValues(comicInfo.Translator);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator);
        UpdatePeople(people, PersonRole.Translator, AddPerson);

        people = GetTagValues(comicInfo.Writer);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer);
        UpdatePeople(people, PersonRole.Writer, AddPerson);

        people = GetTagValues(comicInfo.Editor);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor);
        UpdatePeople(people, PersonRole.Editor, AddPerson);

        people = GetTagValues(comicInfo.Inker);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker);
        UpdatePeople(people, PersonRole.Inker, AddPerson);

        people = GetTagValues(comicInfo.Letterer);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer);
        UpdatePeople(people, PersonRole.Letterer, AddPerson);

        people = GetTagValues(comicInfo.Penciller);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller);
        UpdatePeople(people, PersonRole.Penciller, AddPerson);

        people = GetTagValues(comicInfo.CoverArtist);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist);
        UpdatePeople(people, PersonRole.CoverArtist, AddPerson);

        people = GetTagValues(comicInfo.Publisher);
        PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher);
        UpdatePeople(people, PersonRole.Publisher, AddPerson);

        var genres = GetTagValues(comicInfo.Genre);
        GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList());
        UpdateGenre(genres, false, AddGenre);

        var tags = GetTagValues(comicInfo.Tags);
        TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList());
        UpdateTag(tags, false, AddTag);
    }

    private static IList<string> GetTagValues(string comicInfoTagSeparatedByComma)
    {
        if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma))
        {
            return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList();
        }
        return ImmutableList<string>.Empty;
    }
#nullable disable

    /// <summary>
    /// Given the list of all existing people, this checks the new names and roles; if a person doesn't exist in allPeople, it creates and
    /// adds an entry. For each person in names, the callback is executed.
    /// </summary>
    /// <remarks>This does not remove people if an empty list is passed into names</remarks>
    /// <remarks>This is used to add new people to a list without worrying about duplicating rows in the DB</remarks>
    /// <param name="names"></param>
    /// <param name="role"></param>
    /// <param name="action"></param>
    private void UpdatePeople(IEnumerable<string> names, PersonRole role, Action<Person> action)
    {
        var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList();

        foreach (var name in names)
        {
            var normalizedName = Parser.Parser.Normalize(name);
            var person = allPeopleTypeRole.FirstOrDefault(p =>
                p.NormalizedName.Equals(normalizedName));
            if (person == null)
            {
                person = DbFactory.Person(name, role);
                // Lock the shared cache so concurrent series processing doesn't create duplicates
                lock (_people)
                {
                    _people.Add(person);
                }
            }

            action(person);
        }
    }

    /// <summary>
    /// Given the list of genre names, finds or creates the Genre entity against the primed cache and executes the callback for each.
    /// </summary>
    /// <param name="names"></param>
    /// <param name="isExternal"></param>
    /// <param name="action"></param>
    private void UpdateGenre(IEnumerable<string> names, bool isExternal, Action<Genre> action)
    {
        foreach (var name in names)
        {
            if (string.IsNullOrEmpty(name.Trim())) continue;

            var normalizedName = Parser.Parser.Normalize(name);
            var genre = _genres.FirstOrDefault(p =>
                p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
            if (genre == null)
            {
                // Create with the same ExternalTag flag we matched on
                genre = DbFactory.Genre(name, isExternal);
                lock (_genres)
                {
                    _genres.Add(genre);
                }
            }

            action(genre);
        }
    }

    /// <summary>
    /// Given the list of tag names, finds or creates the Tag entity against the primed cache and executes the callback for each,
    /// along with whether the tag was newly added.
    /// </summary>
    /// <param name="names"></param>
    /// <param name="isExternal"></param>
    /// <param name="action">Callback for every item. Will give said item back and a bool if item was added</param>
    private void UpdateTag(IEnumerable<string> names, bool isExternal, Action<Tag, bool> action)
    {
        foreach (var name in names)
        {
            if (string.IsNullOrEmpty(name.Trim())) continue;

            var added = false;
            var normalizedName = Parser.Parser.Normalize(name);

            var tag = _tags.FirstOrDefault(p =>
                p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal);
            if (tag == null)
            {
                added = true;
                // Create with the same ExternalTag flag we matched on
                tag = DbFactory.Tag(name, isExternal);
                lock (_tags)
                {
                    _tags.Add(tag);
                }
            }

            action(tag, added);
        }
    }
}