From 150e67031a21b798c54b164ba62bee5745fbced3 Mon Sep 17 00:00:00 2001 From: Joseph Milazzo Date: Fri, 2 Sep 2022 07:52:51 -0500 Subject: [PATCH] v0.5.6 - Performance Part 2 (Is that a new scan loop?) (#1500) * New Scan Loop (#1447) * Staging the code for the new scan loop. * Implemented a basic idea of changes on drives triggering scan loop. Issues: 1. Scan by folder does not work, 2. Queuing system is very hacky and needs a separate thread, 3. Performance degregation could be very real. * Started writing unit test for new loop code * Implemented a basic method to scan a folder path with ignore support (not implemented, code in place) * Added some code to the parser to build out the idea of processing series in batches based on some top level folder. * Scan Series now uses the new code (folder based parsing) and now handles the LocalizedSeries issue. * Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support). * Wrote some notes on update library scan loop. * Removed migration for merge * Reapplied the SeriesFolder migration after merge * Refactored a check that used multiple db calls into one. * Made lots of progress on ignore support, but some confusion on underlying library. Ticket created. On hold till then. * Updated Scan Library and Scan Series to exit early if no changes are on the underlying folders that need to be scanned. * Implemented the ability to have .kavitaignore files within your directories and Kavita will parse them and ignore files and directories based on rules within them. * Fixed an issue where ignore files nested wouldn't stack with higher level ignores * Wrote out some basic code that showcases how we can scan series or library based on file events on the underlying system. Very buggy, needs lots of edge case testing and logging and dupplication checking. * Things are working kinda. I'm getting lost in my own code and complexity. I'm not sure it's worth it. * Refactored ScanFiles out to Directory Service. * Refactored more code out to keep the code clean. * More unit tests * Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked the UpdateLibrary to work how it used to with new scan loop code (note: using async update library/series does not work). * Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similar to old code (except loose leaf files dont work) but with folder based scanning. * Prep for unit tests (updating broken ones with new implementations) * Just some notes. Not sure I want to finish this work. * Refactored the LibraryWatcher with some comments and state variables. * Undid the migrations in case I don't move forward with this branch * Started to clean the code and prepare for finishing this work. * Fixed a bad merge * Updated signatures to cleanup the code and commit to the new strategy for scanning. * Swapped out the code with async processing of series on a small library * The new scan loop is working in both Sync and Async methods. The code is slow and not optimized. This represents a good point to start polling and applying optimizations. * Refactored UpdateSeries out of Scanner and into a dedicated file. * Refactored how ProcessTasks are awaited to allow more async * Fixed an issue where side nav item wouldn't show correct highlight and migrated to OnPush * Moved where we start to stopwatch to encapsulate the full scan * Cleaned up SignalR events to report correctly (still needs a redesign) * Remove the "remove" code until I figure it out * Put in extremely expensive series deletion code for library scan. * Have Genre and Tag update the DB immediately to avoid dup issues * Taking a break * Moving to a lock with People was successful. Need to apply to others. * Refactored code for series level and tag and genre with new locking strategy. * New scan loop works. Next up optimization * Swapped out the Kavita log with svg for faster load * Refactored metadata updates to occur when the series are being updated. * Code cleanup * Added a new type of generic message (Info) to inform the user. * Code cleanup * Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds. Fixed a bug where File Analysis was running everytime for each non-epub file. * Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet. * Some code cleanup * Added experimental signalr update code to have a more natural refresh of library-detail page * Hooked in ability to send new series events to UI * Moved all scan (file scan only) tasks into Scan Queue. Made it so scheduled ScanLibraries will now check if any existing task is being run and reschedule for 3 hours, and 10 mins for scan series. * Implemented the info event in the events widget and added a clear all button to dismiss all infos and errors. Added --event-widget-info-bg-color * Remove --drawer-background-color since it's not used * When new series added, inject directly into the view. * Some debug code cleanup * Fixed up the unit tests * Ensure all config directories exist on startup * Disabled Library Watching (that will go in next build) * Ensure update for series is admin only * Lots of code changes, scan series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again. * Removed SeriesFolder migration * Added the SeriesFolder migration * Added a new pipe for dates so we can provide some nicer defaults. Added folder path to the series detail. * The scan optimizations now work for NTFS systems. * Removed a TODO * Migrated all the times to use DateTime.Now and not Utc. * Refactored some repo calls to use the includes flag pattern * Implemented a check for the library scan optimization check to validate if the library was updated (type change, library rename, folder change, or series deleted) and let the optimization be bypassed. * Added another optimization which will use just folder attribute of last write time if the drive is not NTFS. * Fixed a unit test * Some code cleanup * Bump versions by dotnet-bump-version. * Misc UI Fixes (#1450) * Fixed collection cover images not rendering * added a try/catch on sending email, so we fail silently if it doesn't send. * Fixed Go Back not returning to last scroll position due to layoutmode change resetting, despite nothing changing. * Fixed a bug where when turning between pages on default mode, the height calculations could get skewed. * Fixed a missing case for card item where it wouldn't show tooltip title for series. * Bump versions by dotnet-bump-version. * New Scan Loop Fixes (#1452) * Refactored ScanSeries to avoid a lot of extra work and fixed a bug where Scan Series would invoke the processing twice. Refactored the series selection code during process such that we use Localized Name as well, for cases where the original name was changed. Undid an optimization around Last Write time, since Linux file systems match how NTFS works. * Fixed part of the query * Added a NormalizedLocalizedName for quick searching in which a series needs grouping. Reworked scan loop code a bit to ensure we don't do extra work. Tweaked the widget logic to help display better and not show "Nothing going on here". * Fixed a bug where archives with ._ files would be counted as valid files, while they are actually just metadata files on Mac's. * Fixed a broken unit test * Bump versions by dotnet-bump-version. * Simplify parent lookup with Directory.GetParent (#1455) * Simplify parent lookup with Directory.GetParent * Address comments * Bump versions by dotnet-bump-version. * Scan Loop Fixes (#1459) * Added Last Folder Scanned time to series info modal. Tweaked the info event detail modal to have a primary and thus be auto-dismissable * Added an error event when multiple series are found in processing a series. * Fixed a bug where a series could get stuck with other series due to a bad select query. Started adding the force flag hook for the UI and designing the confirm. Confirm service now also has ability to hide the close button. Updated error events and logging in the loop, to be more informative * Fixed a bug where confirm service wasn't showing the proper body content. * Hooked up force scan series * refresh metadata now has force update * Fixed up the messaging with the prompt on scan, hooked it up properly in the scan library to avoid the check if the whole library needs to even be scanned. Fixed a bug where NormalizedLocalizedName wasn't being calculated on new entities. Started adding unit tests for this problematic repo method. * Fixed a bug where we updated NormalizedLocalizedName before we set it. * Send an info to the UI when series are spread between multiple library level folders. * Added some logger output when there are no files found in a folder. Return early if there are no files found, so we can avoid some small loops of code. * Fixed an issue where multiple series in a folder with localized series would cause unintended grouping. This is not supported and hence we will warn them and allow the bad grouping. * Added a case where scan series fails due to the folder being removed. We will now log an error * Normalize paths when finding the highest directory till root. * Fixed an issue with Scan Series where changing a series' folder to a different path but the original series folder existed with another series in it, would cause the series to not be deleted. * Fixed some bugs around specials causing a series merge issue on scan series. * Removed a bug marker * Cleaned up some of the scan loop and removed a test I don't need. * Remove any prompts for force flow, it doesn't work well. Leave the API as is though. * Fixed up a check for duplicate ScanLibrary calls * Bump versions by dotnet-bump-version. * Scroll Resume (#1460) * When we navigate from a page then back, resume back on the last scroll key (if clicked) * Resume jump key position when navigating back to a page. Removed some extra blank space on collection detail when a collection doesn't have a summary or cover image. * Ignore progress events on series cards * Added a url to swagger for /, which could be reverse proxy url * Bump versions by dotnet-bump-version. * Misc UI fixes (#1461) * Misc fixes - Fixed modal being stretched when not needed. - Fixed Logo vertical align - Fixed drawer content scroll, and from it being squished due to overridden by bootstrap. * series detail cover image stretch fix - Fixes: Fixes series detail cover image being stretched on larger resolutions * fixing empty lists scrollbar * Fixing want to read error * fixing unnecessary scrollbar * Fixing recently updated tooltip * Bump versions by dotnet-bump-version. * Folder Watching (#1467) * Hooked in a server setting to enable/disable folder watching * Validated the file rename change event * Validated delete file works * Tweaked some logic to determine if a change occurs on a folder or a file. * Added a note for an upcoming branch * Some minor changes in the loop that just shift where code runs. * Implemented ScanFolder api * Ensure we restart watchers when we modify a library folder. * Fixed a unit test * Bump versions by dotnet-bump-version. * More Scan Loop Bugfixes (#1471) * Updated scan time for watcher to 30 seconds for non-dev. Moved ScanFolder off the Scan queue as it doesn't need to be there. Updated loggers * Fixed jumpbar missing * Tweaked the messaging for CoverGen * When we return early due to nothing being done on library and series scan, make sure we kick off other tasks that need to occur. * Fixed a foreign constraint issue on Volumes when we were adding to a new series. * Fixed a case where when picking normalized series, capitalization differences wouldn't stack when they should. * Reduced the logging output on dev and prod settings. * Fixed a bug in the code that finds the highest directory from a file, where we were not checking against a normalized path. * Cleaned up some code * Fixed broken unit tests * Bump versions by dotnet-bump-version. * More Scan Loop Fixes (#1473) * Added a ToList() to avoid a bug where a person could be removed from a list while iterating over the list. * When deleting a series, want to read page will now automatically remove that series from the view. * Fixed a series lookup which was ignoring format * Ignore XML comment warnings * Removed a note since it was already working that way * Fixed unit test * Bump versions by dotnet-bump-version. * Misc UI Fixes (#1477) * Tweaked a Migration to log correctly only if something is going to be done. * Refactored Reading List Controller code into a dedicated service and cleaned up some methods that aren't needed anymore. * Fixed a bug where adding a new item to a reading list wasn't adding it at the end. * Fixed an issue where collection page would re-render the same covers on multiple items. * Fixed a missing margin-top which made the page extras drawer not render correctly and hence unclosable on small screens. * Added some timeout on manage users screen to give data time to flush. Added a dedicated token log for account flows, in case url encoding plays a part (but from testing it doesn't). * Reverted back to building for ES6 instead of es2020 for old Safari 12.5.5 browsers (10MB difference in build size). * Cleaned up the logic in removing series not found during scan loop. * Tweaked the timings for Library Watcher to 1 min and reprocess queue every 30 seconds. * Bump versions by dotnet-bump-version. * Added fixes for libvips (#1479) * Bump versions by dotnet-bump-version. * Tachiyomi + Fixes (#1481) * Fixed a bootstrap bug * Fixed repeating images on collection detail * Fixed up some logic in library watcher which wasn't processing all of the queue. * When parsing non-epubs in Book library, use Manga parsing for Volume support to better support Light Novels * Fixed some bugs with the tachiyomi plugin api's for progress tracking * Bump versions by dotnet-bump-version. * Adding Health controller (#1480) * Adding Health controller - Added: Added API endpoint for a health check to streamline docker healthy status. * review comment fixes * Bump versions by dotnet-bump-version. * Simplify Folder Watcher (#1484) * Refactored Library Watcher to use Hangfire under the hood. * Support .kavitaignore at root level. * Refactored a lot of the library watching code to process faster and handle when FileSystemWatcher runs out of internal buffer space. It's still not perfect, but good enough for basic use. * Make folder watching as experimental and default it to off by default. * Revert #1479 * Tweaked the messaging for OPDS to remove a note about download role. Moved some code closer to where it's used. * Cleaned up how the events widget reports * Fixed a null issue when deleting series in the UI * Cleaned up some debug code * Added more information for when we skip a scan * Cleaned up some logging messages in CoverGen tasks * More log message tweaks * Added some debug to help identify a rare issue * Fixed a bug where save bookmarks as webp could get reset to false when saving other server settings * Updated some documentation on library watcher. * Make LibraryWatcher fire every 5 mins * Bump versions by dotnet-bump-version. * Sort series by chapter number only when some chapters have no volume (#1487) * Sort series by chapter number only when some chapters have no volume information * Implement a Default static instance of ChapterSortComparer * Further use Default static Comparers * Add missing ToLit() as per comments * SQLite Hangfire (#1488) * Update to use SQLIte for Hangfire to retain information on tasks * Updated all external links to have noopener noreferrer * When watching folders, ensure the folders exist before creating watchers. * Tweaked the messaging for Email Service and added link to the project. * Bump versions by dotnet-bump-version. * Bump versions by dotnet-bump-version. * Fixed typeahead not working correctly (#1490) * Bump versions by dotnet-bump-version. * Release Testing Day 1 (#1491) * Fixed a bug where typeahead wouldn't automatically show results on relationship screen without an additional click. * Tweaked the code which checks if a modification occured to check on seconds rather than minutes * Clear cache will now clear temp/ directory as well. * Fixed an issue where Chrome was caching api responses when it shouldn't had. * Added a cleanup temp code * Ensure genres get removed during series scan when removed from metadata. * Fixed a bug where all epubs with a volume would show as Volume 0 in reading list * When a scan is in progress, don't let the user delete the library. * Bump versions by dotnet-bump-version. * Scan Loop Last Write Time Change (#1492) * Refactored invite user flow to separate error handling on create user flow and email flow. This should help users that have unique situations. * Switch to using files to check LastWriteTime. Debug code in for Robbie to test on rclone * Updated Parser namespace. Changed the LastWriteTime to check all files and folders. * Bump versions by dotnet-bump-version. * Release Testing Day 2 (#1493) * Added a no data section to collection detail. * Remove an optimization for skipping the whole library scan as it wasn't reliable * When resetting password, ensure the input is colored correctly * Fixed setting new password after resetting, throwing an error despite it actually being successful. Fixed incorrect messaging for Password Reset page. * Fixed a bug where reset password would show the side nav button and skew the page. Updated a lot of references to use Typed version for formcontrols. * Removed a migration from 0.5.0, 6 releases ago. * Added a null check so we don't throw an exception when connecting with signalR on unauthenticated users. * Bump versions by dotnet-bump-version. * Fixed a bug where a series with a relationship couldn't be deleted. (#1495) * Bump versions by dotnet-bump-version. * Release Testing Day 3 (#1496) * Tweaked log messaging for library scan when no files were scanned. * When a theme that is set gets removed due to a scan, inform the user to refresh. * Fixed a typo and make Darkness -> Brightness * Make download theme files allowed to be invoked by non-authenticated users, to allow new users to get the default theme. * Hide all series side nav item if there are no libraries exposed to the user * Fixed an API for Tachiyomi when syncing progress * Fixed dashboard not responding to Series Removed and Added events. Ensure we send SeriesRemoved events when they are deleted. * Reverted Hangfire SQLite due to aborted jobs being resumed, when they shouldnt. Fixed some scan loop issues where cover gen wouldn't be invoked always on new libraries. * Bump versions by dotnet-bump-version. * Updating series detail cover style (#1498) # FIxed - Fixed: Fixed an issue with series detail cover when scaled down. * Bump versions by dotnet-bump-version. * Version bump * v0.5.6 Release (#1499) Co-authored-by: tjarls Co-authored-by: Robbie Davis Co-authored-by: Chris Plaatjes --- .gitignore | 1 - API.Benchmark/ParseScannedFilesBenchmarks.cs | 69 - API.Tests/Entities/SeriesTest.cs | 2 +- .../ParserInfoListExtensionsTests.cs | 2 +- API.Tests/Extensions/SeriesExtensionsTests.cs | 8 +- API.Tests/Helpers/EntityFactory.cs | 8 +- API.Tests/Helpers/ParserInfoFactory.cs | 8 +- API.Tests/Helpers/ParserInfoHelperTests.cs | 8 +- API.Tests/Helpers/SeriesHelperTests.cs | 32 +- API.Tests/Parser/BookParserTests.cs | 6 +- API.Tests/Parser/ComicParserTests.cs | 8 +- API.Tests/Parser/MangaParserTests.cs | 15 +- API.Tests/Parser/ParserTest.cs | 4 +- API.Tests/Repository/SeriesRepositoryTests.cs | 160 ++ API.Tests/Services/ArchiveServiceTests.cs | 3 +- API.Tests/Services/BackupServiceTests.cs | 6 +- API.Tests/Services/BookmarkServiceTests.cs | 72 +- API.Tests/Services/CacheServiceTests.cs | 5 + API.Tests/Services/CleanupServiceTests.cs | 4 +- API.Tests/Services/DirectoryServiceTests.cs | 171 +- API.Tests/Services/ParseScannedFilesTests.cs | 345 ++-- API.Tests/Services/ReadingListServiceTests.cs | 109 ++ API.Tests/Services/ScannerServiceTests.cs | 12 +- API.Tests/Services/SiteThemeServiceTests.cs | 10 +- .../Archives/macos_withdotunder_one.zip | Bin 0 -> 424 bytes API/API.csproj | 7 + API/Comparators/ChapterSortComparer.cs | 4 + API/Controllers/AccountController.cs | 89 +- API/Controllers/CollectionController.cs | 2 +- API/Controllers/HealthController.cs | 17 + API/Controllers/LibraryController.cs | 60 +- API/Controllers/OPDSController.cs | 3 +- API/Controllers/PluginController.cs | 8 +- API/Controllers/ReaderController.cs | 10 +- API/Controllers/ReadingListController.cs | 144 +- API/Controllers/SeriesController.cs | 2 + API/Controllers/SettingsController.cs | 20 +- API/Controllers/TachiyomiController.cs | 12 +- API/Controllers/ThemeController.cs | 1 + API/DTOs/Reader/BookmarkDto.cs | 8 +- .../ReadingLists/UpdateReadingListPosition.cs | 10 +- API/DTOs/ScanFolderDto.cs | 17 + API/DTOs/SeriesDto.cs | 8 + API/DTOs/Settings/ServerSettingDTO.cs | 11 +- API/Data/DataContext.cs | 5 +- API/Data/DbFactory.cs | 36 +- API/Data/Metadata/ComicInfo.cs | 20 +- API/Data/MigrateBookmarks.cs | 105 -- API/Data/MigrateNormalizedLocalizedName.cs | 38 + API/Data/MigrateRemoveExtraThemes.cs | 5 +- .../20220817173731_SeriesFolder.Designer.cs | 1605 ++++++++++++++++ .../Migrations/20220817173731_SeriesFolder.cs | 37 + ...223212_NormalizedLocalizedName.Designer.cs | 1608 +++++++++++++++++ .../20220819223212_NormalizedLocalizedName.cs | 25 + .../Migrations/DataContextModelSnapshot.cs | 9 + .../Repositories/CollectionTagRepository.cs | 1 + API/Data/Repositories/GenreRepository.cs | 2 +- API/Data/Repositories/LibraryRepository.cs | 51 +- API/Data/Repositories/PersonRepository.cs | 2 +- API/Data/Repositories/SeriesRepository.cs | 233 ++- API/Data/Repositories/TagRepository.cs | 2 +- API/Data/Repositories/UserRepository.cs | 40 +- API/Data/Seed.cs | 3 +- API/Data/UnitOfWork.cs | 23 +- API/Entities/Enums/MangaFormat.cs | 4 +- API/Entities/Enums/ServerSettingKey.cs | 5 + API/Entities/FolderPath.cs | 3 +- API/Entities/Library.cs | 18 + API/Entities/Series.cs | 16 +- .../ApplicationServiceExtensions.cs | 4 + API/Extensions/SeriesExtensions.cs | 14 +- .../Converters/ServerSettingConverter.cs | 3 + API/Helpers/GenreHelper.cs | 16 +- API/Helpers/ParserInfoHelpers.cs | 8 +- API/Helpers/PersonHelper.cs | 25 +- API/Helpers/SeriesHelper.cs | 4 +- API/Helpers/TagHelper.cs | 17 +- API/Services/ArchiveService.cs | 33 +- API/Services/BookService.cs | 28 +- API/Services/BookmarkService.cs | 4 +- API/Services/CacheService.cs | 15 +- API/Services/DirectoryService.cs | 221 ++- API/Services/EmailService.cs | 11 +- .../StartupTasksHostedService.cs | 19 + API/Services/ImageService.cs | 2 +- API/Services/MetadataService.cs | 52 +- API/Services/ReaderService.cs | 4 +- API/Services/ReadingItemService.cs | 74 +- API/Services/ReadingListService.cs | 182 ++ API/Services/SeriesService.cs | 36 +- API/Services/TaskScheduler.cs | 83 +- API/Services/Tasks/CleanupService.cs | 22 +- .../Metadata/WordCountAnalyzerService.cs | 8 +- API/Services/Tasks/Scanner/LibraryWatcher.cs | 250 +++ .../Tasks/Scanner/ParseScannedFiles.cs | 316 ++-- .../Tasks/Scanner}/Parser/DefaultParser.cs | 70 +- .../Tasks/Scanner}/Parser/Parser.cs | 15 +- .../Tasks/Scanner}/Parser/ParserInfo.cs | 1 + API/Services/Tasks/Scanner/ProcessSeries.cs | 819 +++++++++ API/Services/Tasks/ScannerService.cs | 1188 ++++-------- API/Services/Tasks/SiteThemeService.cs | 8 +- API/SignalR/MessageFactory.cs | 45 +- API/SignalR/Presence/PresenceTracker.cs | 1 + API/Startup.cs | 36 +- API/config/appsettings.Development.json | 8 +- API/config/appsettings.json | 22 + Dockerfile | 2 +- Kavita.Common/Helpers/GlobMatcher.cs | 64 + Kavita.Common/Kavita.Common.csproj | 5 +- UI/Web/src/app/_models/events/info-event.ts | 32 + UI/Web/src/app/_models/series.ts | 8 + UI/Web/src/app/_services/account.service.ts | 5 +- .../app/_services/action-factory.service.ts | 8 +- UI/Web/src/app/_services/action.service.ts | 26 +- UI/Web/src/app/_services/jumpbar.service.ts | 2 +- UI/Web/src/app/_services/library.service.ts | 8 +- .../src/app/_services/message-hub.service.ts | 13 +- UI/Web/src/app/_services/series.service.ts | 4 +- UI/Web/src/app/_services/theme.service.ts | 15 +- .../directory-picker.component.html | 2 +- .../library-editor-modal.component.ts | 8 +- .../src/app/admin/_models/server-settings.ts | 1 + .../admin/edit-user/edit-user.component.ts | 8 +- .../invite-user/invite-user.component.html | 4 +- .../invite-user/invite-user.component.ts | 6 +- .../manage-email-settings.component.html | 4 +- .../manage-email-settings.component.ts | 6 +- .../manage-media-settings.component.ts | 6 +- .../manage-settings.component.html | 13 +- .../manage-settings.component.ts | 37 +- .../manage-system.component.html | 12 +- .../manage-system/manage-system.component.ts | 16 +- .../manage-tasks-settings.component.ts | 8 +- .../manage-users/manage-users.component.ts | 12 +- .../changelog/changelog.component.html | 4 +- .../book-reader/book-reader.component.ts | 19 +- .../reader-settings.component.ts | 18 +- .../bookmarks/bookmarks.component.html | 2 +- .../bulk-add-to-collection.component.scss | 4 +- .../bulk-add-to-collection.component.ts | 8 +- .../edit-collection-tags.component.ts | 12 +- .../edit-series-modal.component.html | 9 +- .../card-detail-drawer.component.scss | 2 +- .../card-detail-layout.component.html | 6 +- .../card-detail-layout.component.scss | 16 +- .../card-detail-layout.component.ts | 44 +- .../card-actionables.component.html | 2 +- .../cards/card-item/card-item.component.ts | 47 +- .../edit-series-relation.component.html | 4 +- .../edit-series-relation.component.ts | 24 +- .../series-card/series-card.component.ts | 2 +- .../all-collections.component.html | 7 +- .../all-collections.component.ts | 4 +- .../collection-detail.component.html | 17 +- .../collection-detail.component.ts | 1 + .../src/app/dashboard/dashboard.component.ts | 9 +- .../library-detail.component.html | 1 + .../library-detail.component.ts | 34 +- .../manga-reader/manga-reader.component.html | 2 +- .../manga-reader/manga-reader.component.ts | 5 - .../metadata-filter.component.ts | 24 +- .../events-widget.component.html | 47 +- .../events-widget.component.scss | 24 + .../events-widget/events-widget.component.ts | 58 +- .../grouped-typeahead.component.ts | 6 +- .../nav/nav-header/nav-header.component.html | 180 +- .../nav/nav-header/nav-header.component.scss | 2 +- UI/Web/src/app/pipe/default-date.pipe.ts | 13 + UI/Web/src/app/pipe/pipe.module.ts | 7 +- .../add-to-list-modal.component.ts | 8 +- .../edit-reading-list-modal.component.ts | 10 +- .../reading-list-item.component.ts | 7 +- .../reading-lists.component.html | 6 +- ...il-to-account-migration-modal.component.ts | 10 +- .../confirm-email/confirm-email.component.ts | 12 +- .../confirm-reset-password.component.html | 2 +- .../confirm-reset-password.component.scss | 4 + .../confirm-reset-password.component.ts | 15 +- .../register/register.component.ts | 10 +- .../user-login/user-login.component.ts | 8 +- .../review-series-modal.component.ts | 10 +- .../series-detail.component.html | 74 +- .../series-detail.component.scss | 8 + .../series-detail/series-detail.component.ts | 14 +- UI/Web/src/app/shared/_models/download.ts | 2 +- .../app/shared/_services/utility.service.ts | 6 + .../confirm-dialog/_models/confirm-button.ts | 2 +- .../confirm-dialog/_models/confirm-config.ts | 4 + .../confirm-dialog.component.html | 5 +- UI/Web/src/app/shared/confirm.service.ts | 3 + .../src/app/shared/image/image.component.ts | 16 + .../shared/tag-badge/tag-badge.component.ts | 2 +- .../update-notification-modal.component.html | 2 +- .../side-nav-item/side-nav-item.component.ts | 1 + .../sidenav/side-nav/side-nav.component.html | 2 +- .../sidenav/side-nav/side-nav.component.ts | 3 +- .../src/app/typeahead/typeahead-settings.ts | 4 +- .../src/app/typeahead/typeahead.component.ts | 23 +- .../theme-manager.component.html | 2 +- .../theme-manager/theme-manager.component.ts | 2 +- .../want-to-read/want-to-read.component.html | 18 +- .../want-to-read/want-to-read.component.ts | 35 +- UI/Web/src/theme/components/_modal.scss | 4 - UI/Web/src/theme/components/_offcanvas.scss | 2 +- UI/Web/src/theme/themes/dark.scss | 2 +- UI/Web/src/theme/themes/e-ink.scss | 1 - UI/Web/tsconfig.json | 2 +- 207 files changed, 8183 insertions(+), 2218 deletions(-) delete mode 100644 API.Benchmark/ParseScannedFilesBenchmarks.cs create mode 100644 API.Tests/Repository/SeriesRepositoryTests.cs create mode 100644 API.Tests/Services/ReadingListServiceTests.cs create mode 100644 API.Tests/Services/Test Data/ArchiveService/Archives/macos_withdotunder_one.zip create mode 100644 API/Controllers/HealthController.cs create mode 100644 API/DTOs/ScanFolderDto.cs delete mode 100644 API/Data/MigrateBookmarks.cs create mode 100644 API/Data/MigrateNormalizedLocalizedName.cs create mode 100644 API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs create mode 100644 API/Data/Migrations/20220817173731_SeriesFolder.cs create mode 100644 API/Data/Migrations/20220819223212_NormalizedLocalizedName.Designer.cs create mode 100644 API/Data/Migrations/20220819223212_NormalizedLocalizedName.cs create mode 100644 API/Services/ReadingListService.cs create mode 100644 API/Services/Tasks/Scanner/LibraryWatcher.cs rename API/{ => Services/Tasks/Scanner}/Parser/DefaultParser.cs (52%) rename API/{ => Services/Tasks/Scanner}/Parser/Parser.cs (98%) rename API/{ => Services/Tasks/Scanner}/Parser/ParserInfo.cs (99%) create mode 100644 API/Services/Tasks/Scanner/ProcessSeries.cs create mode 100644 API/config/appsettings.json create mode 100644 Kavita.Common/Helpers/GlobMatcher.cs create mode 100644 UI/Web/src/app/_models/events/info-event.ts create mode 100644 UI/Web/src/app/pipe/default-date.pipe.ts diff --git a/.gitignore b/.gitignore index eec036cbe..9e470748b 100644 --- a/.gitignore +++ b/.gitignore @@ -485,7 +485,6 @@ Thumbs.db ssl/ # App specific -appsettings.json /API/kavita.db /API/kavita.db-shm /API/kavita.db-wal diff --git a/API.Benchmark/ParseScannedFilesBenchmarks.cs b/API.Benchmark/ParseScannedFilesBenchmarks.cs deleted file mode 100644 index 1dcca79b9..000000000 --- a/API.Benchmark/ParseScannedFilesBenchmarks.cs +++ /dev/null @@ -1,69 +0,0 @@ -using System.IO; -using System.IO.Abstractions; -using System.Threading.Tasks; -using API.Entities.Enums; -using API.Parser; -using API.Services; -using API.Services.Tasks.Scanner; -using API.SignalR; -using BenchmarkDotNet.Attributes; -using BenchmarkDotNet.Order; -using Microsoft.Extensions.Logging; -using NSubstitute; - -namespace API.Benchmark -{ - [MemoryDiagnoser] - [Orderer(SummaryOrderPolicy.FastestToSlowest)] - [RankColumn] - //[SimpleJob(launchCount: 1, warmupCount: 3, targetCount: 5, invocationCount: 100, id: "Test"), ShortRunJob] - public class ParseScannedFilesBenchmarks - { - private readonly ParseScannedFiles _parseScannedFiles; - private readonly ILogger _logger = Substitute.For>(); - private readonly ILogger _bookLogger = Substitute.For>(); - private readonly IArchiveService _archiveService = Substitute.For(); - - public ParseScannedFilesBenchmarks() - { - var directoryService = new DirectoryService(Substitute.For>(), new FileSystem()); - _parseScannedFiles = new ParseScannedFiles( - Substitute.For(), - directoryService, - new ReadingItemService(_archiveService, new BookService(_bookLogger, directoryService, new ImageService(Substitute.For>(), directoryService)), Substitute.For(), directoryService), - Substitute.For()); - } - - // [Benchmark] - // public void Test() - // { - // var libraryPath = Path.Join(Directory.GetCurrentDirectory(), - // "../../../Services/Test Data/ScannerService/Manga"); - // var parsedSeries = _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new string[] {libraryPath}, - // out var totalFiles, out var scanElapsedTime); - // } - - /// - /// Generate a list of Series and another list with - /// - [Benchmark] - public async Task MergeName() - { - var libraryPath = Path.Join(Directory.GetCurrentDirectory(), - "../../../Services/Test Data/ScannerService/Manga"); - var p1 = new ParserInfo() - { - Chapters = "0", - Edition = "", - Format = MangaFormat.Archive, - FullFilePath = Path.Join(libraryPath, "A Town Where You Live", "A_Town_Where_You_Live_v01.zip"), - IsSpecial = false, - Series = "A Town Where You Live", - Title = "A Town Where You Live", - Volumes = "1" - }; - await _parseScannedFiles.ScanLibrariesForSeries(LibraryType.Manga, new [] {libraryPath}, "Manga"); - _parseScannedFiles.MergeName(p1); - } - } -} diff --git a/API.Tests/Entities/SeriesTest.cs b/API.Tests/Entities/SeriesTest.cs index f0cab0239..70897b49f 100644 --- a/API.Tests/Entities/SeriesTest.cs +++ b/API.Tests/Entities/SeriesTest.cs @@ -12,7 +12,7 @@ namespace API.Tests.Entities [InlineData("Darker than Black")] public void CreateSeries(string name) { - var key = API.Parser.Parser.Normalize(name); + var key = API.Services.Tasks.Scanner.Parser.Parser.Normalize(name); var series = DbFactory.Series(name); Assert.Equal(0, series.Id); Assert.Equal(0, series.Pages); diff --git a/API.Tests/Extensions/ParserInfoListExtensionsTests.cs b/API.Tests/Extensions/ParserInfoListExtensionsTests.cs index e7c8e9994..ff20403b1 100644 --- a/API.Tests/Extensions/ParserInfoListExtensionsTests.cs +++ b/API.Tests/Extensions/ParserInfoListExtensionsTests.cs @@ -14,7 +14,7 @@ namespace API.Tests.Extensions { public class ParserInfoListExtensions { - private readonly DefaultParser _defaultParser; + private readonly IDefaultParser _defaultParser; public ParserInfoListExtensions() { _defaultParser = diff --git a/API.Tests/Extensions/SeriesExtensionsTests.cs b/API.Tests/Extensions/SeriesExtensionsTests.cs index c00ade1e8..b339b306d 100644 --- a/API.Tests/Extensions/SeriesExtensionsTests.cs +++ b/API.Tests/Extensions/SeriesExtensionsTests.cs @@ -28,7 +28,7 @@ namespace API.Tests.Extensions Name = seriesInput[0], LocalizedName = seriesInput[1], OriginalName = seriesInput[2], - NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]), + NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]), Metadata = new SeriesMetadata() }; @@ -52,14 +52,14 @@ namespace API.Tests.Extensions Name = seriesInput[0], LocalizedName = seriesInput[1], OriginalName = seriesInput[2], - NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]), + NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]), Metadata = new SeriesMetadata(), }; var parserInfos = list.Select(s => new ParsedSeries() { Name = s, - NormalizedName = API.Parser.Parser.Normalize(s), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(s), }).ToList(); // This doesn't do any checks against format @@ -78,7 +78,7 @@ namespace API.Tests.Extensions Name = seriesInput[0], LocalizedName = seriesInput[1], OriginalName = seriesInput[2], - NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]), + NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Services.Tasks.Scanner.Parser.Parser.Normalize(seriesInput[0]), Metadata = new SeriesMetadata() }; var info = new ParserInfo(); diff --git a/API.Tests/Helpers/EntityFactory.cs b/API.Tests/Helpers/EntityFactory.cs index 3632ff9a0..55d947cf5 100644 --- a/API.Tests/Helpers/EntityFactory.cs +++ b/API.Tests/Helpers/EntityFactory.cs @@ -18,7 +18,7 @@ namespace API.Tests.Helpers Name = name, SortName = name, LocalizedName = name, - NormalizedName = API.Parser.Parser.Normalize(name), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(name), Volumes = new List(), Metadata = new SeriesMetadata() }; @@ -31,7 +31,7 @@ namespace API.Tests.Helpers return new Volume() { Name = volumeNumber, - Number = (int) API.Parser.Parser.MinNumberFromRange(volumeNumber), + Number = (int) API.Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(volumeNumber), Pages = pages, Chapters = chaps }; @@ -43,7 +43,7 @@ namespace API.Tests.Helpers { IsSpecial = isSpecial, Range = range, - Number = API.Parser.Parser.MinNumberFromRange(range) + string.Empty, + Number = API.Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(range) + string.Empty, Files = files ?? new List(), Pages = pageCount, @@ -73,7 +73,7 @@ namespace API.Tests.Helpers return new CollectionTag() { Id = id, - NormalizedTitle = API.Parser.Parser.Normalize(title).ToUpper(), + NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize(title).ToUpper(), Title = title, Summary = summary, Promoted = promoted diff --git a/API.Tests/Helpers/ParserInfoFactory.cs b/API.Tests/Helpers/ParserInfoFactory.cs index 2dc2f2869..4b4a8e22a 100644 --- a/API.Tests/Helpers/ParserInfoFactory.cs +++ b/API.Tests/Helpers/ParserInfoFactory.cs @@ -26,19 +26,19 @@ namespace API.Tests.Helpers }; } - public static void AddToParsedInfo(IDictionary> collectedSeries, ParserInfo info) + public static void AddToParsedInfo(IDictionary> collectedSeries, ParserInfo info) { var existingKey = collectedSeries.Keys.FirstOrDefault(ps => - ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series)); + ps.Format == info.Format && ps.NormalizedName == API.Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series)); existingKey ??= new ParsedSeries() { Format = info.Format, Name = info.Series, - NormalizedName = API.Parser.Parser.Normalize(info.Series) + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) }; if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>)) { - ((ConcurrentDictionary>) collectedSeries).AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => + ((ConcurrentDictionary>) collectedSeries).AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => { oldValue ??= new List(); if (!oldValue.Contains(info)) diff --git a/API.Tests/Helpers/ParserInfoHelperTests.cs b/API.Tests/Helpers/ParserInfoHelperTests.cs index d3b58d96b..e51362b81 100644 --- a/API.Tests/Helpers/ParserInfoHelperTests.cs +++ b/API.Tests/Helpers/ParserInfoHelperTests.cs @@ -16,7 +16,7 @@ public class ParserInfoHelperTests [Fact] public void SeriesHasMatchingParserInfoFormat_ShouldBeFalse() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive}); //AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub}); @@ -34,7 +34,7 @@ public class ParserInfoHelperTests Name = "1" } }, - NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"), Metadata = new SeriesMetadata(), Format = MangaFormat.Epub }; @@ -45,7 +45,7 @@ public class ParserInfoHelperTests [Fact] public void SeriesHasMatchingParserInfoFormat_ShouldBeTrue() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive}); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub}); @@ -63,7 +63,7 @@ public class ParserInfoHelperTests Name = "1" } }, - NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"), Metadata = new SeriesMetadata(), Format = MangaFormat.Epub }; diff --git a/API.Tests/Helpers/SeriesHelperTests.cs b/API.Tests/Helpers/SeriesHelperTests.cs index a8ffd95c3..139803e0a 100644 --- a/API.Tests/Helpers/SeriesHelperTests.cs +++ b/API.Tests/Helpers/SeriesHelperTests.cs @@ -22,21 +22,21 @@ public class SeriesHelperTests { Format = MangaFormat.Archive, Name = "Darker than Black", - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Archive, Name = "Darker than Black".ToLower(), - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Archive, Name = "Darker than Black".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); } @@ -50,21 +50,21 @@ public class SeriesHelperTests { Format = MangaFormat.Image, Name = "Darker than Black", - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Darker than Black".ToLower(), - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Darker than Black".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("Darker than Black") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker than Black") })); } @@ -78,28 +78,28 @@ public class SeriesHelperTests { Format = MangaFormat.Image, Name = "Something Random", - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Something Random".ToLower(), - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Something Random".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "SomethingRandom".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("SomethingRandom") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("SomethingRandom") })); } @@ -113,28 +113,28 @@ public class SeriesHelperTests { Format = MangaFormat.Image, Name = "Something Random", - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Something Random".ToLower(), - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "Something Random".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("Something Random") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something Random") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Image, Name = "SomethingRandom".ToUpper(), - NormalizedName = API.Parser.Parser.Normalize("SomethingRandom") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("SomethingRandom") })); } @@ -148,14 +148,14 @@ public class SeriesHelperTests { Format = MangaFormat.Archive, Name = "My Dress-Up Darling", - NormalizedName = API.Parser.Parser.Normalize("My Dress-Up Darling") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("My Dress-Up Darling") })); Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries() { Format = MangaFormat.Archive, Name = "Sono Bisque Doll wa Koi wo Suru".ToLower(), - NormalizedName = API.Parser.Parser.Normalize("Sono Bisque Doll wa Koi wo Suru") + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Sono Bisque Doll wa Koi wo Suru") })); } #endregion diff --git a/API.Tests/Parser/BookParserTests.cs b/API.Tests/Parser/BookParserTests.cs index cb91fc947..23b9c6e63 100644 --- a/API.Tests/Parser/BookParserTests.cs +++ b/API.Tests/Parser/BookParserTests.cs @@ -7,16 +7,18 @@ namespace API.Tests.Parser [Theory] [InlineData("Gifting The Wonderful World With Blessings! - 3 Side Stories [yuNS][Unknown]", "Gifting The Wonderful World With Blessings!")] [InlineData("BBC Focus 00 The Science of Happiness 2nd Edition (2018)", "BBC Focus 00 The Science of Happiness 2nd Edition")] + [InlineData("Faust - Volume 01 [Del Rey][Scans_Compressed]", "Faust")] public void ParseSeriesTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseSeries(filename)); } [Theory] [InlineData("Harrison, Kim - Dates from Hell - Hollows Vol 2.5.epub", "2.5")] + [InlineData("Faust - Volume 01 [Del Rey][Scans_Compressed]", "1")] public void ParseVolumeTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseVolume(filename)); } // [Theory] diff --git a/API.Tests/Parser/ComicParserTests.cs b/API.Tests/Parser/ComicParserTests.cs index 73f7cede4..74a2b8bb2 100644 --- a/API.Tests/Parser/ComicParserTests.cs +++ b/API.Tests/Parser/ComicParserTests.cs @@ -79,7 +79,7 @@ namespace API.Tests.Parser [InlineData("Fables 2010 Vol. 1 Legends in Exile", "Fables 2010")] public void ParseComicSeriesTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseComicSeries(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(filename)); } [Theory] @@ -126,7 +126,7 @@ namespace API.Tests.Parser [InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "1")] public void ParseComicVolumeTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseComicVolume(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(filename)); } [Theory] @@ -171,7 +171,7 @@ namespace API.Tests.Parser [InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "0")] public void ParseComicChapterTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseComicChapter(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(filename)); } @@ -190,7 +190,7 @@ namespace API.Tests.Parser [InlineData("Adventure Time 2013_-_Annual #001 (2013)", true)] public void ParseComicSpecialTest(string input, bool expected) { - Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseComicSpecial(input))); + Assert.Equal(expected, !string.IsNullOrEmpty(API.Services.Tasks.Scanner.Parser.Parser.ParseComicSpecial(input))); } } } diff --git a/API.Tests/Parser/MangaParserTests.cs b/API.Tests/Parser/MangaParserTests.cs index 546837fd1..12e312661 100644 --- a/API.Tests/Parser/MangaParserTests.cs +++ b/API.Tests/Parser/MangaParserTests.cs @@ -75,7 +75,7 @@ namespace API.Tests.Parser [InlineData("スライム倒して300年、知らないうちにレベルMAXになってました 1-3巻", "1-3")] public void ParseVolumeTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseVolume(filename)); } [Theory] @@ -180,9 +180,10 @@ namespace API.Tests.Parser [InlineData("Highschool of the Dead - Full Color Edition v02 [Uasaha] (Yen Press)", "Highschool of the Dead - Full Color Edition")] [InlineData("諌山創] 進撃の巨人 第23巻", "諌山創] 進撃の巨人")] [InlineData("(一般コミック) [奥浩哉] いぬやしき 第09巻", "いぬやしき")] + [InlineData("Highschool of the Dead - 02", "Highschool of the Dead")] public void ParseSeriesTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseSeries(filename)); } [Theory] @@ -260,7 +261,7 @@ namespace API.Tests.Parser [InlineData("[ハレム]ナナとカオル ~高校生のSMごっこ~ 第10話", "10")] public void ParseChaptersTest(string filename, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseChapter(filename)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseChapter(filename)); } @@ -276,7 +277,7 @@ namespace API.Tests.Parser [InlineData("Love Hina Omnibus v05 (2015) (Digital-HD) (Asgard-Empire).cbz", "Omnibus")] public void ParseEditionTest(string input, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseEdition(input)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseEdition(input)); } [Theory] [InlineData("Beelzebub Special OneShot - Minna no Kochikame x Beelzebub (2016) [Mangastream].cbz", true)] @@ -295,7 +296,7 @@ namespace API.Tests.Parser [InlineData("The League of Extra-ordinary Gentlemen", false)] public void ParseMangaSpecialTest(string input, bool expected) { - Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseMangaSpecial(input))); + Assert.Equal(expected, !string.IsNullOrEmpty(API.Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(input))); } [Theory] @@ -304,14 +305,14 @@ namespace API.Tests.Parser [InlineData("image.txt", MangaFormat.Unknown)] public void ParseFormatTest(string inputFile, MangaFormat expected) { - Assert.Equal(expected, API.Parser.Parser.ParseFormat(inputFile)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseFormat(inputFile)); } [Theory] [InlineData("Gifting The Wonderful World With Blessings! - 3 Side Stories [yuNS][Unknown].epub", "Side Stories")] public void ParseSpecialTest(string inputFile, string expected) { - Assert.Equal(expected, API.Parser.Parser.ParseMangaSpecial(inputFile)); + Assert.Equal(expected, API.Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(inputFile)); } diff --git a/API.Tests/Parser/ParserTest.cs b/API.Tests/Parser/ParserTest.cs index 4ae75d91b..c1ef966c9 100644 --- a/API.Tests/Parser/ParserTest.cs +++ b/API.Tests/Parser/ParserTest.cs @@ -1,6 +1,6 @@ using System.Linq; using Xunit; -using static API.Parser.Parser; +using static API.Services.Tasks.Scanner.Parser.Parser; namespace API.Tests.Parser { @@ -223,7 +223,7 @@ namespace API.Tests.Parser [InlineData("/manga/1/1/1", "/manga/1/1/1")] [InlineData("/manga/1/1/1.jpg", "/manga/1/1/1.jpg")] [InlineData(@"/manga/1/1\1.jpg", @"/manga/1/1/1.jpg")] - [InlineData("/manga/1/1//1", "/manga/1/1//1")] + [InlineData("/manga/1/1//1", "/manga/1/1/1")] [InlineData("/manga/1\\1\\1", "/manga/1/1/1")] [InlineData("C:/manga/1\\1\\1.jpg", "C:/manga/1/1/1.jpg")] public void NormalizePathTest(string inputPath, string expected) diff --git a/API.Tests/Repository/SeriesRepositoryTests.cs b/API.Tests/Repository/SeriesRepositoryTests.cs new file mode 100644 index 000000000..65491d333 --- /dev/null +++ b/API.Tests/Repository/SeriesRepositoryTests.cs @@ -0,0 +1,160 @@ +using System.Collections.Generic; +using System.Data.Common; +using System.IO.Abstractions.TestingHelpers; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Entities; +using API.Entities.Enums; +using API.Helpers; +using API.Services; +using AutoMapper; +using Microsoft.Data.Sqlite; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; +using Microsoft.Extensions.Logging; +using NSubstitute; +using Xunit; + +namespace API.Tests.Repository; + +public class SeriesRepositoryTests +{ + private readonly IUnitOfWork _unitOfWork; + + private readonly DbConnection _connection; + private readonly DataContext _context; + + private const string CacheDirectory = "C:/kavita/config/cache/"; + private const string CoverImageDirectory = "C:/kavita/config/covers/"; + private const string BackupDirectory = "C:/kavita/config/backups/"; + private const string DataDirectory = "C:/data/"; + + public SeriesRepositoryTests() + { + var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options; + _connection = RelationalOptionsExtension.Extract(contextOptions).Connection; + + _context = new DataContext(contextOptions); + Task.Run(SeedDb).GetAwaiter().GetResult(); + + var config = new MapperConfiguration(cfg => cfg.AddProfile()); + var mapper = config.CreateMapper(); + _unitOfWork = new UnitOfWork(_context, mapper, null); + } + + #region Setup + + private static DbConnection CreateInMemoryDatabase() + { + var connection = new SqliteConnection("Filename=:memory:"); + + connection.Open(); + + return connection; + } + + private async Task SeedDb() + { + await _context.Database.MigrateAsync(); + var filesystem = CreateFileSystem(); + + await Seed.SeedSettings(_context, + new DirectoryService(Substitute.For>(), filesystem)); + + var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync(); + setting.Value = CacheDirectory; + + setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync(); + setting.Value = BackupDirectory; + + _context.ServerSetting.Update(setting); + + var lib = new Library() + { + Name = "Manga", Folders = new List() {new FolderPath() {Path = "C:/data/"}} + }; + + _context.AppUser.Add(new AppUser() + { + UserName = "majora2007", + Libraries = new List() + { + lib + } + }); + + return await _context.SaveChangesAsync() > 0; + } + + private async Task ResetDb() + { + _context.Series.RemoveRange(_context.Series.ToList()); + _context.AppUserRating.RemoveRange(_context.AppUserRating.ToList()); + _context.Genre.RemoveRange(_context.Genre.ToList()); + _context.CollectionTag.RemoveRange(_context.CollectionTag.ToList()); + _context.Person.RemoveRange(_context.Person.ToList()); + + await _context.SaveChangesAsync(); + } + + private static MockFileSystem CreateFileSystem() + { + var fileSystem = new MockFileSystem(); + fileSystem.Directory.SetCurrentDirectory("C:/kavita/"); + fileSystem.AddDirectory("C:/kavita/config/"); + fileSystem.AddDirectory(CacheDirectory); + fileSystem.AddDirectory(CoverImageDirectory); + fileSystem.AddDirectory(BackupDirectory); + fileSystem.AddDirectory(DataDirectory); + + return fileSystem; + } + + #endregion + + private async Task SetupSeriesData() + { + var library = new Library() + { + Name = "Manga", + Type = LibraryType.Manga, + Folders = new List() + { + new FolderPath() {Path = "C:/data/manga/"} + } + }; + + var s = DbFactory.Series("The Idaten Deities Know Only Peace", "Heion Sedai no Idaten-tachi"); + s.Format = MangaFormat.Archive; + + library.Series = new List() + { + s, + }; + + _unitOfWork.LibraryRepository.Add(library); + await _unitOfWork.CommitAsync(); + } + + + [InlineData("Heion Sedai no Idaten-tachi", "", MangaFormat.Archive, "The Idaten Deities Know Only Peace")] // Matching on localized name in DB + [InlineData("Heion Sedai no Idaten-tachi", "", MangaFormat.Pdf, null)] + public async Task GetFullSeriesByAnyName_Should(string seriesName, string localizedName, string? expected) + { + var firstSeries = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(1); + var series = + await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(seriesName, localizedName, + 1, MangaFormat.Unknown); + if (expected == null) + { + Assert.Null(series); + } + else + { + Assert.NotNull(series); + Assert.Equal(expected, series.Name); + } + } + +} diff --git a/API.Tests/Services/ArchiveServiceTests.cs b/API.Tests/Services/ArchiveServiceTests.cs index de27464c9..2521d17af 100644 --- a/API.Tests/Services/ArchiveServiceTests.cs +++ b/API.Tests/Services/ArchiveServiceTests.cs @@ -68,6 +68,7 @@ namespace API.Tests.Services [InlineData("macos_none.zip", 0)] [InlineData("macos_one.zip", 1)] [InlineData("macos_native.zip", 21)] + [InlineData("macos_withdotunder_one.zip", 1)] public void GetNumberOfPagesFromArchiveTest(string archivePath, int expected) { var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives"); @@ -197,7 +198,7 @@ namespace API.Tests.Services var imageService = new ImageService(Substitute.For>(), _directoryService); var archiveService = Substitute.For(_logger, new DirectoryService(_directoryServiceLogger, new FileSystem()), imageService); - var testDirectory = API.Parser.Parser.NormalizePath(Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages"))); + var testDirectory = API.Services.Tasks.Scanner.Parser.Parser.NormalizePath(Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages"))); var outputDir = Path.Join(testDirectory, "output"); _directoryService.ClearDirectory(outputDir); diff --git a/API.Tests/Services/BackupServiceTests.cs b/API.Tests/Services/BackupServiceTests.cs index 4ad416dc6..ad7f8b9f9 100644 --- a/API.Tests/Services/BackupServiceTests.cs +++ b/API.Tests/Services/BackupServiceTests.cs @@ -147,7 +147,7 @@ public class BackupServiceTests var backupLogFiles = backupService.GetLogFiles(0, LogDirectory).ToList(); Assert.Single(backupLogFiles); - Assert.Equal(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log"), API.Parser.Parser.NormalizePath(backupLogFiles.First())); + Assert.Equal(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log"), API.Services.Tasks.Scanner.Parser.Parser.NormalizePath(backupLogFiles.First())); } [Fact] @@ -168,8 +168,8 @@ public class BackupServiceTests var backupService = new BackupService(_logger, _unitOfWork, ds, configuration, _messageHub); - var backupLogFiles = backupService.GetLogFiles(1, LogDirectory).Select(API.Parser.Parser.NormalizePath).ToList(); - Assert.NotEmpty(backupLogFiles.Where(file => file.Equals(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log")) || file.Equals(API.Parser.Parser.NormalizePath($"{LogDirectory}kavita1.log")))); + var backupLogFiles = backupService.GetLogFiles(1, LogDirectory).Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList(); + Assert.NotEmpty(backupLogFiles.Where(file => file.Equals(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita.log")) || file.Equals(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath($"{LogDirectory}kavita1.log")))); } diff --git a/API.Tests/Services/BookmarkServiceTests.cs b/API.Tests/Services/BookmarkServiceTests.cs index 0083a047d..88f0fc587 100644 --- a/API.Tests/Services/BookmarkServiceTests.cs +++ b/API.Tests/Services/BookmarkServiceTests.cs @@ -401,9 +401,79 @@ public class BookmarkServiceTests var files = await bookmarkService.GetBookmarkFilesById(new[] {1}); var actualFiles = ds.GetFiles(BookmarkDirectory, searchOption: SearchOption.AllDirectories); - Assert.Equal(files.Select(API.Parser.Parser.NormalizePath).ToList(), actualFiles.Select(API.Parser.Parser.NormalizePath).ToList()); + Assert.Equal(files.Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList(), actualFiles.Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList()); } + #endregion + + #region Misc + + [Fact] + public async Task ShouldNotDeleteBookmarkOnChapterDeletion() + { + var filesystem = CreateFileSystem(); + filesystem.AddFile($"{CacheDirectory}1/0001.jpg", new MockFileData("123")); + filesystem.AddFile($"{BookmarkDirectory}1/1/0001.jpg", new MockFileData("123")); + + // Delete all Series to reset state + await ResetDB(); + + _context.Series.Add(new Series() + { + Name = "Test", + Library = new Library() { + Name = "Test LIb", + Type = LibraryType.Manga, + }, + Volumes = new List() + { + new Volume() + { + Chapters = new List() + { + new Chapter() + { + + } + } + } + } + }); + + + _context.AppUser.Add(new AppUser() + { + UserName = "Joe", + Bookmarks = new List() + { + new AppUserBookmark() + { + Page = 1, + ChapterId = 1, + FileName = $"1/1/0001.jpg", + SeriesId = 1, + VolumeId = 1 + } + } + }); + + await _context.SaveChangesAsync(); + + + var ds = new DirectoryService(Substitute.For>(), filesystem); + var bookmarkService = Create(ds); + var user = await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Bookmarks); + + var vol = await _unitOfWork.VolumeRepository.GetVolumeAsync(1); + vol.Chapters = new List(); + _unitOfWork.VolumeRepository.Update(vol); + await _unitOfWork.CommitAsync(); + + + Assert.Equal(1, ds.GetFiles(BookmarkDirectory, searchOption:SearchOption.AllDirectories).Count()); + Assert.NotNull(await _unitOfWork.UserRepository.GetBookmarkAsync(1)); + } + #endregion } diff --git a/API.Tests/Services/CacheServiceTests.cs b/API.Tests/Services/CacheServiceTests.cs index c29a78036..a812e5bdd 100644 --- a/API.Tests/Services/CacheServiceTests.cs +++ b/API.Tests/Services/CacheServiceTests.cs @@ -55,6 +55,11 @@ namespace API.Tests.Services { throw new System.NotImplementedException(); } + + public ParserInfo ParseFile(string path, string rootPath, LibraryType type) + { + throw new System.NotImplementedException(); + } } public class CacheServiceTests { diff --git a/API.Tests/Services/CleanupServiceTests.cs b/API.Tests/Services/CleanupServiceTests.cs index a7575577c..a0934a5ca 100644 --- a/API.Tests/Services/CleanupServiceTests.cs +++ b/API.Tests/Services/CleanupServiceTests.cs @@ -312,13 +312,13 @@ public class CleanupServiceTests new ReadingList() { Title = "Something", - NormalizedTitle = API.Parser.Parser.Normalize("Something"), + NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something"), CoverImage = $"{ImageService.GetReadingListFormat(1)}.jpg" }, new ReadingList() { Title = "Something 2", - NormalizedTitle = API.Parser.Parser.Normalize("Something 2"), + NormalizedTitle = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Something 2"), CoverImage = $"{ImageService.GetReadingListFormat(2)}.jpg" } } diff --git a/API.Tests/Services/DirectoryServiceTests.cs b/API.Tests/Services/DirectoryServiceTests.cs index 23a7dfad1..b6ebf6722 100644 --- a/API.Tests/Services/DirectoryServiceTests.cs +++ b/API.Tests/Services/DirectoryServiceTests.cs @@ -34,7 +34,7 @@ namespace API.Tests.Services var ds = new DirectoryService(Substitute.For>(), fileSystem); var files = new List(); var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s), - API.Parser.Parser.ArchiveFileExtensions, _logger); + API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions, _logger); Assert.Equal(28, fileCount); Assert.Equal(28, files.Count); @@ -59,7 +59,7 @@ namespace API.Tests.Services try { var fileCount = ds.TraverseTreeParallelForEach("/manga/", s => files.Add(s), - API.Parser.Parser.ImageFileExtensions, _logger); + API.Services.Tasks.Scanner.Parser.Parser.ImageFileExtensions, _logger); Assert.Equal(1, fileCount); } catch (Exception ex) @@ -90,7 +90,7 @@ namespace API.Tests.Services var ds = new DirectoryService(Substitute.For>(), fileSystem); var files = new List(); var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s), - API.Parser.Parser.ArchiveFileExtensions, _logger); + API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions, _logger); Assert.Equal(28, fileCount); Assert.Equal(28, files.Count); @@ -111,7 +111,7 @@ namespace API.Tests.Services fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData("")); var ds = new DirectoryService(Substitute.For>(), fileSystem); - var files = ds.GetFilesWithExtension(testDirectory, API.Parser.Parser.ArchiveFileExtensions); + var files = ds.GetFilesWithExtension(testDirectory, API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions); Assert.Equal(10, files.Length); Assert.All(files, s => fileSystem.Path.GetExtension(s).Equals(".zip")); @@ -150,7 +150,7 @@ namespace API.Tests.Services fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData("")); var ds = new DirectoryService(Substitute.For>(), fileSystem); - var files = ds.GetFiles(testDirectory, API.Parser.Parser.ArchiveFileExtensions).ToList(); + var files = ds.GetFiles(testDirectory, API.Services.Tasks.Scanner.Parser.Parser.ArchiveFileExtensions).ToList(); Assert.Equal(10, files.Count()); Assert.All(files, s => fileSystem.Path.GetExtension(s).Equals(".zip")); @@ -586,12 +586,12 @@ namespace API.Tests.Services var ds = new DirectoryService(Substitute.For>(), fileSystem); ds.CopyFilesToDirectory(new []{MockUnixSupport.Path($"{testDirectory}file.zip")}, "/manga/output/"); ds.CopyFilesToDirectory(new []{MockUnixSupport.Path($"{testDirectory}file.zip")}, "/manga/output/"); - var outputFiles = ds.GetFiles("/manga/output/").Select(API.Parser.Parser.NormalizePath).ToList(); + var outputFiles = ds.GetFiles("/manga/output/").Select(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath).ToList(); Assert.Equal(4, outputFiles.Count()); // we have 2 already there and 2 copies // For some reason, this has C:/ on directory even though everything is emulated (System.IO.Abstractions issue, not changing) // https://github.com/TestableIO/System.IO.Abstractions/issues/831 - Assert.True(outputFiles.Contains(API.Parser.Parser.NormalizePath("/manga/output/file (3).zip")) - || outputFiles.Contains(API.Parser.Parser.NormalizePath("C:/manga/output/file (3).zip"))); + Assert.True(outputFiles.Contains(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath("/manga/output/file (3).zip")) + || outputFiles.Contains(API.Services.Tasks.Scanner.Parser.Parser.NormalizePath("C:/manga/output/file (3).zip"))); } #endregion @@ -677,6 +677,8 @@ namespace API.Tests.Services [InlineData(new [] {"C:/Manga/"}, new [] {"C:/Manga/Love Hina/Vol. 01.cbz"}, "C:/Manga/Love Hina")] [InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/Dir 2/"}, new [] {"C:/Manga/Dir 1/Love Hina/Vol. 01.cbz"}, "C:/Manga/Dir 1/Love Hina")] [InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/"}, new [] {"D:/Manga/Love Hina/Vol. 01.cbz", "D:/Manga/Vol. 01.cbz"}, "")] + [InlineData(new [] {"C:/Manga/"}, new [] {"C:/Manga//Love Hina/Vol. 01.cbz"}, "C:/Manga/Love Hina")] + [InlineData(new [] {@"C:\mount\drive\Library\Test Library\Comics\"}, new [] {@"C:\mount\drive\Library\Test Library\Comics\Bruce Lee (1994)\Bruce Lee #001 (1994).cbz"}, @"C:/mount/drive/Library/Test Library/Comics/Bruce Lee (1994)")] public void FindHighestDirectoriesFromFilesTest(string[] rootDirectories, string[] files, string expectedDirectory) { var fileSystem = new MockFileSystem(); @@ -841,5 +843,158 @@ namespace API.Tests.Services Assert.Equal(expected, DirectoryService.GetHumanReadableBytes(bytes)); } #endregion + + #region ScanFiles + + [Fact] + public Task ScanFiles_ShouldFindNoFiles_AllAreIgnored() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("*.*")); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + + + var allFiles = ds.ScanFiles("C:/Data/"); + + Assert.Equal(0, allFiles.Count); + + return Task.CompletedTask; + } + + + [Fact] + public Task ScanFiles_ShouldFindNoNestedFiles_IgnoreNestedFiles() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*")); + fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + + var allFiles = ds.ScanFiles("C:/Data/"); + + Assert.Equal(1, allFiles.Count); // Ignore files are not counted in files, only valid extensions + + return Task.CompletedTask; + } + + + [Fact] + public Task ScanFiles_NestedIgnore_IgnoreNestedFilesInOneDirectoryOnly() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddDirectory("C:/Data/Specials/"); + fileSystem.AddDirectory("C:/Data/Specials/ArtBooks/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/.kavitaignore", new MockFileData("**/Accel World/*")); + fileSystem.AddFile("C:/Data/Specials/.kavitaignore", new MockFileData("**/ArtBooks/*")); + fileSystem.AddFile("C:/Data/Specials/Hi.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Specials/ArtBooks/art book 01.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Hello.pdf", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + + var allFiles = ds.ScanFiles("C:/Data/"); + + Assert.Equal(2, allFiles.Count); // Ignore files are not counted in files, only valid extensions + + return Task.CompletedTask; + } + + + [Fact] + public Task ScanFiles_ShouldFindAllFiles() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.txt", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Nothing.pdf", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + + var allFiles = ds.ScanFiles("C:/Data/"); + + Assert.Equal(5, allFiles.Count); + + return Task.CompletedTask; + } + + #endregion + + #region GetAllDirectories + + [Fact] + public void GetAllDirectories_ShouldFindAllNestedDirectories() + { + const string testDirectory = "C:/manga/base/"; + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1")); + fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 2")); + fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "A")); + fileSystem.AddDirectory(fileSystem.Path.Join(testDirectory, "folder 1", "B")); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + Assert.Equal(2, ds.GetAllDirectories(fileSystem.Path.Join(testDirectory, "folder 1")).Count()); + } + + #endregion + + #region GetParentDirectory + + [Theory] + [InlineData(@"C:/file.txt", "C:/")] + [InlineData(@"C:/folder/file.txt", "C:/folder")] + [InlineData(@"C:/folder/subfolder/file.txt", "C:/folder/subfolder")] + public void GetParentDirectoryName_ShouldFindParentOfFiles(string path, string expected) + { + var fileSystem = new MockFileSystem(new Dictionary + { + { path, new MockFileData(string.Empty)} + }); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + Assert.Equal(expected, ds.GetParentDirectoryName(path)); + } + [Theory] + [InlineData(@"C:/folder", "C:/")] + [InlineData(@"C:/folder/subfolder", "C:/folder")] + [InlineData(@"C:/folder/subfolder/another", "C:/folder/subfolder")] + public void GetParentDirectoryName_ShouldFindParentOfDirectories(string path, string expected) + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory(path); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + Assert.Equal(expected, ds.GetParentDirectoryName(path)); + } + + #endregion } } diff --git a/API.Tests/Services/ParseScannedFilesTests.cs b/API.Tests/Services/ParseScannedFilesTests.cs index 39f990bbf..c019b9643 100644 --- a/API.Tests/Services/ParseScannedFilesTests.cs +++ b/API.Tests/Services/ParseScannedFilesTests.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Data.Common; using System.IO.Abstractions.TestingHelpers; @@ -14,6 +15,8 @@ using API.Services.Tasks.Scanner; using API.SignalR; using API.Tests.Helpers; using AutoMapper; +using DotNet.Globbing; +using Flurl.Util; using Microsoft.Data.Sqlite; using Microsoft.EntityFrameworkCore; using Microsoft.EntityFrameworkCore.Infrastructure; @@ -25,9 +28,9 @@ namespace API.Tests.Services; internal class MockReadingItemService : IReadingItemService { - private readonly DefaultParser _defaultParser; + private readonly IDefaultParser _defaultParser; - public MockReadingItemService(DefaultParser defaultParser) + public MockReadingItemService(IDefaultParser defaultParser) { _defaultParser = defaultParser; } @@ -56,6 +59,11 @@ internal class MockReadingItemService : IReadingItemService { return _defaultParser.Parse(path, rootPath, type); } + + public ParserInfo ParseFile(string path, string rootPath, LibraryType type) + { + return _defaultParser.Parse(path, rootPath, type); + } } public class ParseScannedFilesTests @@ -148,138 +156,73 @@ public class ParseScannedFilesTests #endregion - #region GetInfosByName - - [Fact] - public void GetInfosByName_ShouldReturnGivenMatchingSeriesName() - { - var fileSystem = new MockFileSystem(); - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - var infos = new List() - { - ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false), - ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false) - }; - var parsedSeries = new Dictionary> - { - { - new ParsedSeries() - { - Format = MangaFormat.Archive, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - infos - }, - { - new ParsedSeries() - { - Format = MangaFormat.Pdf, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - new List() - } - }; - - var series = DbFactory.Series("Accel World"); - series.Format = MangaFormat.Pdf; - - Assert.Empty(ParseScannedFiles.GetInfosByName(parsedSeries, series)); - - series.Format = MangaFormat.Archive; - Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count()); - - } - - [Fact] - public void GetInfosByName_ShouldReturnGivenMatchingNormalizedSeriesName() - { - var fileSystem = new MockFileSystem(); - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - var infos = new List() - { - ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false), - ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false) - }; - var parsedSeries = new Dictionary> - { - { - new ParsedSeries() - { - Format = MangaFormat.Archive, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - infos - }, - { - new ParsedSeries() - { - Format = MangaFormat.Pdf, - Name = "Accel World", - NormalizedName = API.Parser.Parser.Normalize("Accel World") - }, - new List() - } - }; - - var series = DbFactory.Series("accel world"); - series.Format = MangaFormat.Archive; - Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count()); - - } - - #endregion - #region MergeName - [Fact] - public async Task MergeName_ShouldMergeMatchingFormatAndName() - { - var fileSystem = new MockFileSystem(); - fileSystem.AddDirectory("C:/Data/"); - fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty)); - fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty)); - fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty)); - - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - - await psf.ScanLibrariesForSeries(LibraryType.Manga, new List() {"C:/Data/"}, "libraryName"); - - Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false))); - Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false))); - Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false))); - } - - [Fact] - public async Task MergeName_ShouldMerge_MismatchedFormatSameName() - { - var fileSystem = new MockFileSystem(); - fileSystem.AddDirectory("C:/Data/"); - fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty)); - fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty)); - fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty)); - - var ds = new DirectoryService(Substitute.For>(), fileSystem); - var psf = new ParseScannedFiles(Substitute.For>(), ds, - new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); - - - await psf.ScanLibrariesForSeries(LibraryType.Manga, new List() {"C:/Data/"}, "libraryName"); - - Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false))); - Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false))); - } + // NOTE: I don't think I can test MergeName as it relies on Tracking Files, which is more complicated than I need + // [Fact] + // public async Task MergeName_ShouldMergeMatchingFormatAndName() + // { + // var fileSystem = new MockFileSystem(); + // fileSystem.AddDirectory("C:/Data/"); + // fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty)); + // fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty)); + // fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty)); + // + // var ds = new DirectoryService(Substitute.For>(), fileSystem); + // var psf = new ParseScannedFiles(Substitute.For>(), ds, + // new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + // + // var parsedSeries = new Dictionary>(); + // var parsedFiles = new ConcurrentDictionary>(); + // + // void TrackFiles(Tuple> parsedInfo) + // { + // var skippedScan = parsedInfo.Item1; + // var parsedFiles = parsedInfo.Item2; + // if (parsedFiles.Count == 0) return; + // + // var foundParsedSeries = new ParsedSeries() + // { + // Name = parsedFiles.First().Series, + // NormalizedName = API.Parser.Parser.Normalize(parsedFiles.First().Series), + // Format = parsedFiles.First().Format + // }; + // + // parsedSeries.Add(foundParsedSeries, parsedFiles); + // } + // + // await psf.ScanLibrariesForSeries(LibraryType.Manga, new List() {"C:/Data/"}, "libraryName", + // false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles); + // + // Assert.Equal("Accel World", + // psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false))); + // Assert.Equal("Accel World", + // psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false))); + // Assert.Equal("Accel World", + // psf.MergeName(parsedFiles, ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false))); + // } + // + // [Fact] + // public async Task MergeName_ShouldMerge_MismatchedFormatSameName() + // { + // var fileSystem = new MockFileSystem(); + // fileSystem.AddDirectory("C:/Data/"); + // fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty)); + // fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty)); + // fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty)); + // + // var ds = new DirectoryService(Substitute.For>(), fileSystem); + // var psf = new ParseScannedFiles(Substitute.For>(), ds, + // new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + // + // + // await psf.ScanLibrariesForSeries(LibraryType.Manga, new List() {"C:/Data/"}, "libraryName"); + // + // Assert.Equal("Accel World", + // psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false))); + // Assert.Equal("Accel World", + // psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false))); + // } #endregion @@ -299,14 +242,150 @@ public class ParseScannedFilesTests var psf = new ParseScannedFiles(Substitute.For>(), ds, new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + var parsedSeries = new Dictionary>(); + + void TrackFiles(Tuple> parsedInfo) + { + var skippedScan = parsedInfo.Item1; + var parsedFiles = parsedInfo.Item2; + if (parsedFiles.Count == 0) return; + + var foundParsedSeries = new ParsedSeries() + { + Name = parsedFiles.First().Series, + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize(parsedFiles.First().Series), + Format = parsedFiles.First().Format + }; + + parsedSeries.Add(foundParsedSeries, parsedFiles); + } + + + await psf.ScanLibrariesForSeries(LibraryType.Manga, + new List() {"C:/Data/"}, "libraryName", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), TrackFiles); - var parsedSeries = await psf.ScanLibrariesForSeries(LibraryType.Manga, new List() {"C:/Data/"}, "libraryName"); Assert.Equal(3, parsedSeries.Values.Count); Assert.NotEmpty(parsedSeries.Keys.Where(p => p.Format == MangaFormat.Archive && p.Name.Equals("Accel World"))); + } + #endregion + + + #region ProcessFiles + + private static MockFileSystem CreateTestFilesystem() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty)); + + return fileSystem; + } + + [Fact] + public async Task ProcessFiles_ForLibraryMode_OnlyCallsFolderActionForEachTopLevelFolder() + { + var fileSystem = CreateTestFilesystem(); + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var directoriesSeen = new HashSet(); + await psf.ProcessFiles("C:/Data/", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1), + (files, directoryPath) => + { + directoriesSeen.Add(directoryPath); + return Task.CompletedTask; + }); + + Assert.Equal(2, directoriesSeen.Count); + } + + [Fact] + public async Task ProcessFiles_ForNonLibraryMode_CallsFolderActionOnce() + { + var fileSystem = CreateTestFilesystem(); + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var directoriesSeen = new HashSet(); + await psf.ProcessFiles("C:/Data/", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, directoryPath) => + { + directoriesSeen.Add(directoryPath); + return Task.CompletedTask; + }); + + Assert.Single(directoriesSeen); + directoriesSeen.TryGetValue("C:/Data/", out var actual); + Assert.Equal("C:/Data/", actual); + } + + [Fact] + public async Task ProcessFiles_ShouldCallFolderActionTwice() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var callCount = 0; + await psf.ProcessFiles("C:/Data", true, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) => + { + callCount++; + + return Task.CompletedTask; + }); + + Assert.Equal(2, callCount); } + /// + /// Due to this not being a library, it's going to consider everything under C:/Data as being one folder aka a series folder + /// + [Fact] + public async Task ProcessFiles_ShouldCallFolderActionOnce() + { + var fileSystem = new MockFileSystem(); + fileSystem.AddDirectory("C:/Data/"); + fileSystem.AddDirectory("C:/Data/Accel World"); + fileSystem.AddDirectory("C:/Data/Accel World/Specials/"); + fileSystem.AddFile("C:/Data/Accel World/Accel World v1.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Accel World v2.pdf", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Accel World/Specials/Accel World SP01.cbz", new MockFileData(string.Empty)); + fileSystem.AddFile("C:/Data/Black World/Black World SP01.cbz", new MockFileData(string.Empty)); + + var ds = new DirectoryService(Substitute.For>(), fileSystem); + var psf = new ParseScannedFiles(Substitute.For>(), ds, + new MockReadingItemService(new DefaultParser(ds)), Substitute.For()); + + var callCount = 0; + await psf.ProcessFiles("C:/Data", false, await _unitOfWork.SeriesRepository.GetFolderPathMap(1),(files, folderPath) => + { + callCount++; + return Task.CompletedTask; + }); + + Assert.Equal(1, callCount); + } + #endregion } diff --git a/API.Tests/Services/ReadingListServiceTests.cs b/API.Tests/Services/ReadingListServiceTests.cs new file mode 100644 index 000000000..4df8fb688 --- /dev/null +++ b/API.Tests/Services/ReadingListServiceTests.cs @@ -0,0 +1,109 @@ +using System.Collections.Generic; +using System.Data.Common; +using System.IO.Abstractions.TestingHelpers; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Entities; +using API.Entities.Enums; +using API.Helpers; +using API.Services; +using AutoMapper; +using Microsoft.Data.Sqlite; +using Microsoft.EntityFrameworkCore; +using Microsoft.Extensions.Logging; +using NSubstitute; +using Xunit; + +namespace API.Tests.Services; + +public class ReadingListServiceTests +{ + private readonly IUnitOfWork _unitOfWork; + private readonly IReadingListService _readingListService; + + private readonly DataContext _context; + + private const string CacheDirectory = "C:/kavita/config/cache/"; + private const string CoverImageDirectory = "C:/kavita/config/covers/"; + private const string BackupDirectory = "C:/kavita/config/backups/"; + private const string DataDirectory = "C:/data/"; + + public ReadingListServiceTests() + { + var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options; + + _context = new DataContext(contextOptions); + Task.Run(SeedDb).GetAwaiter().GetResult(); + + var config = new MapperConfiguration(cfg => cfg.AddProfile()); + var mapper = config.CreateMapper(); + _unitOfWork = new UnitOfWork(_context, mapper, null); + + _readingListService = new ReadingListService(_unitOfWork, Substitute.For>()); + } + + #region Setup + + private static DbConnection CreateInMemoryDatabase() + { + var connection = new SqliteConnection("Filename=:memory:"); + + connection.Open(); + + return connection; + } + + private async Task SeedDb() + { + await _context.Database.MigrateAsync(); + var filesystem = CreateFileSystem(); + + await Seed.SeedSettings(_context, + new DirectoryService(Substitute.For>(), filesystem)); + + var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync(); + setting.Value = CacheDirectory; + + setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync(); + setting.Value = BackupDirectory; + + _context.ServerSetting.Update(setting); + + _context.Library.Add(new Library() + { + Name = "Manga", Folders = new List() {new FolderPath() {Path = "C:/data/"}} + }); + return await _context.SaveChangesAsync() > 0; + } + + private async Task ResetDb() + { + _context.Series.RemoveRange(_context.Series.ToList()); + + await _context.SaveChangesAsync(); + } + + private static MockFileSystem CreateFileSystem() + { + var fileSystem = new MockFileSystem(); + fileSystem.Directory.SetCurrentDirectory("C:/kavita/"); + fileSystem.AddDirectory("C:/kavita/config/"); + fileSystem.AddDirectory(CacheDirectory); + fileSystem.AddDirectory(CoverImageDirectory); + fileSystem.AddDirectory(BackupDirectory); + fileSystem.AddDirectory(DataDirectory); + + return fileSystem; + } + + #endregion + + + #region RemoveFullyReadItems + + // TODO: Implement all methods here + + #endregion + +} diff --git a/API.Tests/Services/ScannerServiceTests.cs b/API.Tests/Services/ScannerServiceTests.cs index e3331bf6d..f54f2d3e9 100644 --- a/API.Tests/Services/ScannerServiceTests.cs +++ b/API.Tests/Services/ScannerServiceTests.cs @@ -16,7 +16,7 @@ namespace API.Tests.Services [Fact] public void FindSeriesNotOnDisk_Should_Remove1() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive}); //AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub}); @@ -36,7 +36,7 @@ namespace API.Tests.Services Name = "1" } }, - NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"), Metadata = new SeriesMetadata(), Format = MangaFormat.Epub } @@ -48,7 +48,7 @@ namespace API.Tests.Services [Fact] public void FindSeriesNotOnDisk_Should_RemoveNothing_Test() { - var infos = new Dictionary>(); + var infos = new Dictionary>(); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive}); ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive}); @@ -61,7 +61,7 @@ namespace API.Tests.Services Name = "Cage of Eden", LocalizedName = "Cage of Eden", OriginalName = "Cage of Eden", - NormalizedName = API.Parser.Parser.Normalize("Cage of Eden"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Cage of Eden"), Metadata = new SeriesMetadata(), Format = MangaFormat.Archive }, @@ -70,7 +70,7 @@ namespace API.Tests.Services Name = "Darker Than Black", LocalizedName = "Darker Than Black", OriginalName = "Darker Than Black", - NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Darker Than Black"), Metadata = new SeriesMetadata(), Format = MangaFormat.Archive } @@ -125,6 +125,8 @@ namespace API.Tests.Services // } + // TODO: I want a test for UpdateSeries where if I have chapter 10 and now it's mapping into Vol 2 Chapter 10, + // if I can do it without deleting the underlying chapter (aka id change) } } diff --git a/API.Tests/Services/SiteThemeServiceTests.cs b/API.Tests/Services/SiteThemeServiceTests.cs index ea43c6644..2ab523e59 100644 --- a/API.Tests/Services/SiteThemeServiceTests.cs +++ b/API.Tests/Services/SiteThemeServiceTests.cs @@ -157,7 +157,7 @@ public class SiteThemeServiceTests await siteThemeService.Scan(); var customThemes = (await _unitOfWork.SiteThemeRepository.GetThemeDtos()).Where(t => - API.Parser.Parser.Normalize(t.Name).Equals(API.Parser.Parser.Normalize("custom"))); + API.Services.Tasks.Scanner.Parser.Parser.Normalize(t.Name).Equals(API.Services.Tasks.Scanner.Parser.Parser.Normalize("custom"))); Assert.Single(customThemes); } @@ -177,7 +177,7 @@ public class SiteThemeServiceTests await siteThemeService.Scan(); var customThemes = (await _unitOfWork.SiteThemeRepository.GetThemeDtos()).Where(t => - API.Parser.Parser.Normalize(t.Name).Equals(API.Parser.Parser.Normalize("custom"))); + API.Services.Tasks.Scanner.Parser.Parser.Normalize(t.Name).Equals(API.Services.Tasks.Scanner.Parser.Parser.Normalize("custom"))); Assert.Empty(customThemes); } @@ -194,7 +194,7 @@ public class SiteThemeServiceTests _context.SiteTheme.Add(new SiteTheme() { Name = "Custom", - NormalizedName = API.Parser.Parser.Normalize("Custom"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"), Provider = ThemeProvider.User, FileName = "custom.css", IsDefault = false @@ -219,7 +219,7 @@ public class SiteThemeServiceTests _context.SiteTheme.Add(new SiteTheme() { Name = "Custom", - NormalizedName = API.Parser.Parser.Normalize("Custom"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"), Provider = ThemeProvider.User, FileName = "custom.css", IsDefault = false @@ -247,7 +247,7 @@ public class SiteThemeServiceTests _context.SiteTheme.Add(new SiteTheme() { Name = "Custom", - NormalizedName = API.Parser.Parser.Normalize("Custom"), + NormalizedName = API.Services.Tasks.Scanner.Parser.Parser.Normalize("Custom"), Provider = ThemeProvider.User, FileName = "custom.css", IsDefault = false diff --git a/API.Tests/Services/Test Data/ArchiveService/Archives/macos_withdotunder_one.zip b/API.Tests/Services/Test Data/ArchiveService/Archives/macos_withdotunder_one.zip new file mode 100644 index 0000000000000000000000000000000000000000..00ce4eb53f77bc837ea1cc185af93864af05a341 GIT binary patch literal 424 zcmWIWW@h1H0D(shEkR%glwbwYC8@x?jZ=gGnX5I?{hVJxPDVa9^Y zQ3hhLu`m}Q8-IrHP6^03bQ;-sOiv;k?*}s;;bmkC7F9Bafh1701;1702;1591 + + + True + $(NoWarn);1591 + + en @@ -48,6 +54,7 @@ + diff --git a/API/Comparators/ChapterSortComparer.cs b/API/Comparators/ChapterSortComparer.cs index 0e7f61a61..ca55381bc 100644 --- a/API/Comparators/ChapterSortComparer.cs +++ b/API/Comparators/ChapterSortComparer.cs @@ -23,6 +23,8 @@ namespace API.Comparators return x.CompareTo(y); } + + public static readonly ChapterSortComparer Default = new ChapterSortComparer(); } /// @@ -44,6 +46,8 @@ namespace API.Comparators return x.CompareTo(y); } + + public static readonly ChapterSortComparerZeroFirst Default = new ChapterSortComparerZeroFirst(); } public class SortComparerZeroLast : IComparer diff --git a/API/Controllers/AccountController.cs b/API/Controllers/AccountController.cs index d5336917c..8e549d5e1 100644 --- a/API/Controllers/AccountController.cs +++ b/API/Controllers/AccountController.cs @@ -354,7 +354,7 @@ namespace API.Controllers lib.AppUsers.Remove(user); } - libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList(); } foreach (var lib in libraries) @@ -458,11 +458,11 @@ namespace API.Controllers { _logger.LogInformation("{UserName} is being registered as admin. Granting access to all libraries", user.UserName); - libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync()).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibrariesAsync(LibraryIncludes.AppUser)).ToList(); } else { - libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries)).ToList(); + libraries = (await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(dto.Libraries, LibraryIncludes.AppUser)).ToList(); } foreach (var lib in libraries) @@ -472,37 +472,55 @@ namespace API.Controllers } var token = await _userManager.GenerateEmailConfirmationTokenAsync(user); - if (string.IsNullOrEmpty(token)) return BadRequest("There was an issue sending email"); + if (string.IsNullOrEmpty(token)) + { + _logger.LogError("There was an issue generating a token for the email"); + return BadRequest("There was an creating the invite user"); + } + user.ConfirmationToken = token; + await _unitOfWork.CommitAsync(); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an error during invite user flow, unable to create user. Deleting user for retry"); + _unitOfWork.UserRepository.Delete(user); + await _unitOfWork.CommitAsync(); + } - var emailLink = GenerateEmailLink(token, "confirm-email", dto.Email); + try + { + var emailLink = GenerateEmailLink(user.ConfirmationToken, "confirm-email", dto.Email); _logger.LogCritical("[Invite User]: Email Link for {UserName}: {Link}", user.UserName, emailLink); + _logger.LogCritical("[Invite User]: Token {UserName}: {Token}", user.UserName, user.ConfirmationToken); var host = _environment.IsDevelopment() ? "localhost:4200" : Request.Host.ToString(); var accessible = await _emailService.CheckIfAccessible(host); if (accessible) { - await _emailService.SendConfirmationEmail(new ConfirmationEmailDto() + try { - EmailAddress = dto.Email, - InvitingUser = adminUser.UserName, - ServerConfirmationLink = emailLink - }); + await _emailService.SendConfirmationEmail(new ConfirmationEmailDto() + { + EmailAddress = dto.Email, + InvitingUser = adminUser.UserName, + ServerConfirmationLink = emailLink + }); + } + catch (Exception) + { + /* Swallow exception */ + } } - user.ConfirmationToken = token; - - await _unitOfWork.CommitAsync(); - return Ok(new InviteUserResponse { EmailLink = emailLink, EmailSent = accessible }); } - catch (Exception) + catch (Exception ex) { - _unitOfWork.UserRepository.Delete(user); - await _unitOfWork.CommitAsync(); + _logger.LogError(ex, "There was an error during invite user flow, unable to send an email"); } return BadRequest("There was an error setting up your account. Please check the logs"); @@ -561,17 +579,26 @@ namespace API.Controllers [HttpPost("confirm-password-reset")] public async Task> ConfirmForgotPassword(ConfirmPasswordResetDto dto) { - var user = await _unitOfWork.UserRepository.GetUserByEmailAsync(dto.Email); - if (user == null) + try { - return BadRequest("Invalid Details"); + var user = await _unitOfWork.UserRepository.GetUserByEmailAsync(dto.Email); + if (user == null) + { + return BadRequest("Invalid Details"); + } + + var result = await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider, + "ResetPassword", dto.Token); + if (!result) return BadRequest("Unable to reset password, your email token is not correct."); + + var errors = await _accountService.ChangeUserPassword(user, dto.Password); + return errors.Any() ? BadRequest(errors) : Ok("Password updated"); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an unexpected error when confirming new password"); + return BadRequest("There was an unexpected error when confirming new password"); } - - var result = await _userManager.VerifyUserTokenAsync(user, TokenOptions.DefaultProvider, "ResetPassword", dto.Token); - if (!result) return BadRequest("Unable to reset password, your email token is not correct."); - - var errors = await _accountService.ChangeUserPassword(user, dto.Password); - return errors.Any() ? BadRequest(errors) : Ok("Password updated"); } @@ -597,8 +624,10 @@ namespace API.Controllers if (!roles.Any(r => r is PolicyConstants.AdminRole or PolicyConstants.ChangePasswordRole)) return Unauthorized("You are not permitted to this operation."); - var emailLink = GenerateEmailLink(await _userManager.GeneratePasswordResetTokenAsync(user), "confirm-reset-password", user.Email); + var token = await _userManager.GeneratePasswordResetTokenAsync(user); + var emailLink = GenerateEmailLink(token, "confirm-reset-password", user.Email); _logger.LogCritical("[Forgot Password]: Email Link for {UserName}: {Link}", user.UserName, emailLink); + _logger.LogCritical("[Forgot Password]: Token {UserName}: {Token}", user.UserName, token); var host = _environment.IsDevelopment() ? "localhost:4200" : Request.Host.ToString(); if (await _emailService.CheckIfAccessible(host)) { @@ -651,8 +680,10 @@ namespace API.Controllers "This user needs to migrate. Have them log out and login to trigger a migration flow"); if (user.EmailConfirmed) return BadRequest("User already confirmed"); - var emailLink = GenerateEmailLink(await _userManager.GenerateEmailConfirmationTokenAsync(user), "confirm-email", user.Email); + var token = await _userManager.GenerateEmailConfirmationTokenAsync(user); + var emailLink = GenerateEmailLink(token, "confirm-email", user.Email); _logger.LogCritical("[Email Migration]: Email Link: {Link}", emailLink); + _logger.LogCritical("[Email Migration]: Token {UserName}: {Token}", user.UserName, token); await _emailService.SendMigrationEmail(new EmailMigrationDto() { EmailAddress = user.Email, @@ -729,6 +760,8 @@ namespace API.Controllers var result = await _userManager.ConfirmEmailAsync(user, token); if (result.Succeeded) return true; + + _logger.LogCritical("[Account] Email validation failed"); if (!result.Errors.Any()) return false; diff --git a/API/Controllers/CollectionController.cs b/API/Controllers/CollectionController.cs index 0b2f2bcd6..f030bd166 100644 --- a/API/Controllers/CollectionController.cs +++ b/API/Controllers/CollectionController.cs @@ -76,7 +76,7 @@ namespace API.Controllers existingTag.Promoted = updatedTag.Promoted; existingTag.Title = updatedTag.Title.Trim(); - existingTag.NormalizedTitle = Parser.Parser.Normalize(updatedTag.Title).ToUpper(); + existingTag.NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(updatedTag.Title).ToUpper(); existingTag.Summary = updatedTag.Summary.Trim(); if (_unitOfWork.HasChanges()) diff --git a/API/Controllers/HealthController.cs b/API/Controllers/HealthController.cs new file mode 100644 index 000000000..8d588fb44 --- /dev/null +++ b/API/Controllers/HealthController.cs @@ -0,0 +1,17 @@ +using System; +using System.Threading.Tasks; +using Microsoft.AspNetCore.Authorization; +using Microsoft.AspNetCore.Mvc; + +namespace API.Controllers; + +[AllowAnonymous] +public class HealthController : BaseApiController +{ + + [HttpGet()] + public ActionResult GetHealth() + { + return Ok("Ok"); + } +} diff --git a/API/Controllers/LibraryController.cs b/API/Controllers/LibraryController.cs index 7b99763a2..3a387d83e 100644 --- a/API/Controllers/LibraryController.cs +++ b/API/Controllers/LibraryController.cs @@ -13,11 +13,14 @@ using API.Entities; using API.Entities.Enums; using API.Extensions; using API.Services; +using API.Services.Tasks.Scanner; using API.SignalR; using AutoMapper; +using Kavita.Common; using Microsoft.AspNetCore.Authorization; using Microsoft.AspNetCore.Mvc; using Microsoft.Extensions.Logging; +using TaskScheduler = API.Services.TaskScheduler; namespace API.Controllers { @@ -30,10 +33,11 @@ namespace API.Controllers private readonly ITaskScheduler _taskScheduler; private readonly IUnitOfWork _unitOfWork; private readonly IEventHub _eventHub; + private readonly ILibraryWatcher _libraryWatcher; public LibraryController(IDirectoryService directoryService, ILogger logger, IMapper mapper, ITaskScheduler taskScheduler, - IUnitOfWork unitOfWork, IEventHub eventHub) + IUnitOfWork unitOfWork, IEventHub eventHub, ILibraryWatcher libraryWatcher) { _directoryService = directoryService; _logger = logger; @@ -41,6 +45,7 @@ namespace API.Controllers _taskScheduler = taskScheduler; _unitOfWork = unitOfWork; _eventHub = eventHub; + _libraryWatcher = libraryWatcher; } /// @@ -77,6 +82,7 @@ namespace API.Controllers if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical issue. Please try again."); _logger.LogInformation("Created a new library: {LibraryName}", library.Name); + await _libraryWatcher.RestartWatching(); _taskScheduler.ScanLibrary(library.Id); await _eventHub.SendMessageAsync(MessageFactory.LibraryModified, MessageFactory.LibraryModifiedEvent(library.Id, "create"), false); @@ -129,7 +135,7 @@ namespace API.Controllers var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(updateLibraryForUserDto.Username); if (user == null) return BadRequest("Could not validate user"); - var libraryString = String.Join(",", updateLibraryForUserDto.SelectedLibraries.Select(x => x.Name)); + var libraryString = string.Join(",", updateLibraryForUserDto.SelectedLibraries.Select(x => x.Name)); _logger.LogInformation("Granting user {UserName} access to: {Libraries}", updateLibraryForUserDto.Username, libraryString); var allLibraries = await _unitOfWork.LibraryRepository.GetLibrariesAsync(); @@ -168,17 +174,17 @@ namespace API.Controllers [Authorize(Policy = "RequireAdminRole")] [HttpPost("scan")] - public ActionResult Scan(int libraryId) + public ActionResult Scan(int libraryId, bool force = false) { - _taskScheduler.ScanLibrary(libraryId); + _taskScheduler.ScanLibrary(libraryId, force); return Ok(); } [Authorize(Policy = "RequireAdminRole")] [HttpPost("refresh-metadata")] - public ActionResult RefreshMetadata(int libraryId) + public ActionResult RefreshMetadata(int libraryId, bool force = true) { - _taskScheduler.RefreshMetadata(libraryId); + _taskScheduler.RefreshMetadata(libraryId, force); return Ok(); } @@ -196,6 +202,37 @@ namespace API.Controllers return Ok(await _unitOfWork.LibraryRepository.GetLibraryDtosForUsernameAsync(User.GetUsername())); } + /// + /// Given a valid path, will invoke either a Scan Series or Scan Library. If the folder does not exist within Kavita, the request will be ignored + /// + /// + /// + [AllowAnonymous] + [HttpPost("scan-folder")] + public async Task ScanFolder(ScanFolderDto dto) + { + var userId = await _unitOfWork.UserRepository.GetUserIdByApiKeyAsync(dto.ApiKey); + var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId); + // Validate user has Admin privileges + var isAdmin = await _unitOfWork.UserRepository.IsUserAdminAsync(user); + if (!isAdmin) return BadRequest("API key must belong to an admin"); + if (dto.FolderPath.Contains("..")) return BadRequest("Invalid Path"); + + dto.FolderPath = Services.Tasks.Scanner.Parser.Parser.NormalizePath(dto.FolderPath); + + var libraryFolder = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()) + .SelectMany(l => l.Folders) + .Distinct() + .Select(Services.Tasks.Scanner.Parser.Parser.NormalizePath); + + var seriesFolder = _directoryService.FindHighestDirectoriesFromFiles(libraryFolder, + new List() {dto.FolderPath}); + + _taskScheduler.ScanFolder(seriesFolder.Keys.Count == 1 ? seriesFolder.Keys.First() : dto.FolderPath); + + return Ok(); + } + [Authorize(Policy = "RequireAdminRole")] [HttpDelete("delete")] public async Task> DeleteLibrary(int libraryId) @@ -207,10 +244,16 @@ namespace API.Controllers var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(seriesIds); - try { var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None); + if (TaskScheduler.HasScanTaskRunningForLibrary(libraryId)) + { + // TODO: Figure out how to cancel a job + _logger.LogInformation("User is attempting to delete a library while a scan is in progress"); + return BadRequest( + "You cannot delete a library while a scan is in progress. Please wait for scan to continue then try to delete"); + } _unitOfWork.LibraryRepository.Delete(library); await _unitOfWork.CommitAsync(); @@ -221,6 +264,8 @@ namespace API.Controllers _taskScheduler.CleanupChapters(chapterIds); } + await _libraryWatcher.RestartWatching(); + foreach (var seriesId in seriesIds) { await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved, @@ -264,6 +309,7 @@ namespace API.Controllers if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical issue updating the library."); if (originalFolders.Count != libraryForUserDto.Folders.Count() || typeUpdate) { + await _libraryWatcher.RestartWatching(); _taskScheduler.ScanLibrary(library.Id); } diff --git a/API/Controllers/OPDSController.cs b/API/Controllers/OPDSController.cs index 255c38f19..e5165ae42 100644 --- a/API/Controllers/OPDSController.cs +++ b/API/Controllers/OPDSController.cs @@ -6,6 +6,7 @@ using System.Threading.Tasks; using System.Xml.Serialization; using API.Comparators; using API.Data; +using API.Data.Repositories; using API.DTOs; using API.DTOs.CollectionTags; using API.DTOs.Filtering; @@ -305,7 +306,7 @@ public class OpdsController : BaseApiController var userId = await GetUser(apiKey); var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId); - var userWithLists = await _unitOfWork.UserRepository.GetUserWithReadingListsByUsernameAsync(user.UserName); + var userWithLists = await _unitOfWork.UserRepository.GetUserByUsernameAsync(user.UserName, AppUserIncludes.ReadingListsWithItems); var readingList = userWithLists.ReadingLists.SingleOrDefault(t => t.Id == readingListId); if (readingList == null) { diff --git a/API/Controllers/PluginController.cs b/API/Controllers/PluginController.cs index dfb32f406..4a1209710 100644 --- a/API/Controllers/PluginController.cs +++ b/API/Controllers/PluginController.cs @@ -1,4 +1,5 @@ -using System.Threading.Tasks; +using System.ComponentModel.DataAnnotations; +using System.Threading.Tasks; using API.Data; using API.DTOs; using API.Services; @@ -24,12 +25,13 @@ namespace API.Controllers /// /// Authenticate with the Server given an apiKey. This will log you in by returning the user object and the JWT token. /// - /// + /// This API is not fully built out and may require more information in later releases + /// API key which will be used to authenticate and return a valid user token back /// Name of the Plugin /// [AllowAnonymous] [HttpPost("authenticate")] - public async Task> Authenticate(string apiKey, string pluginName) + public async Task> Authenticate([Required] string apiKey, [Required] string pluginName) { // NOTE: In order to log information about plugins, we need some Plugin Description information for each request // Should log into access table so we can tell the user diff --git a/API/Controllers/ReaderController.cs b/API/Controllers/ReaderController.cs index bafac20d2..5569fb9f8 100644 --- a/API/Controllers/ReaderController.cs +++ b/API/Controllers/ReaderController.cs @@ -11,6 +11,7 @@ using API.Entities; using API.Entities.Enums; using API.Extensions; using API.Services; +using API.Services.Tasks; using Hangfire; using Microsoft.AspNetCore.Authorization; using Microsoft.AspNetCore.Mvc; @@ -60,6 +61,7 @@ namespace API.Controllers try { + var path = _cacheService.GetCachedFile(chapter); if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"Pdf doesn't exist when it should."); @@ -90,7 +92,7 @@ namespace API.Controllers try { var path = _cacheService.GetCachedPagePath(chapter, page); - if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}"); + if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}. Try refreshing to allow re-cache."); var format = Path.GetExtension(path).Replace(".", ""); return PhysicalFile(path, "image/" + format, Path.GetFileName(path), true); @@ -177,17 +179,17 @@ namespace API.Controllers info.Title += " - " + info.ChapterTitle; } - if (info.IsSpecial && dto.VolumeNumber.Equals(Parser.Parser.DefaultVolume)) + if (info.IsSpecial && dto.VolumeNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) { info.Subtitle = info.FileName; - } else if (!info.IsSpecial && info.VolumeNumber.Equals(Parser.Parser.DefaultVolume)) + } else if (!info.IsSpecial && info.VolumeNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) { info.Subtitle = _readerService.FormatChapterName(info.LibraryType, true, true) + info.ChapterNumber; } else { info.Subtitle = "Volume " + info.VolumeNumber; - if (!info.ChapterNumber.Equals(Parser.Parser.DefaultChapter)) + if (!info.ChapterNumber.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) { info.Subtitle += " " + _readerService.FormatChapterName(info.LibraryType, true, true) + info.ChapterNumber; diff --git a/API/Controllers/ReadingListController.cs b/API/Controllers/ReadingListController.cs index 53d6cfb56..5f2b61ff0 100644 --- a/API/Controllers/ReadingListController.cs +++ b/API/Controllers/ReadingListController.cs @@ -8,6 +8,7 @@ using API.DTOs.ReadingLists; using API.Entities; using API.Extensions; using API.Helpers; +using API.Services; using API.SignalR; using Microsoft.AspNetCore.Authorization; using Microsoft.AspNetCore.Mvc; @@ -19,12 +20,14 @@ namespace API.Controllers { private readonly IUnitOfWork _unitOfWork; private readonly IEventHub _eventHub; + private readonly IReadingListService _readingListService; private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst(); - public ReadingListController(IUnitOfWork unitOfWork, IEventHub eventHub) + public ReadingListController(IUnitOfWork unitOfWork, IEventHub eventHub, IReadingListService readingListService) { _unitOfWork = unitOfWork; _eventHub = eventHub; + _readingListService = readingListService; } /// @@ -55,6 +58,11 @@ namespace API.Controllers return Ok(items); } + /// + /// Returns all Reading Lists the user has access to that have a series within it. + /// + /// + /// [HttpGet("lists-for-series")] public async Task>> GetListsForSeries(int seriesId) { @@ -78,17 +86,6 @@ namespace API.Controllers return Ok(items); } - private async Task UserHasReadingListAccess(int readingListId) - { - var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), - AppUserIncludes.ReadingLists); - if (user.ReadingLists.SingleOrDefault(rl => rl.Id == readingListId) == null && !await _unitOfWork.UserRepository.IsUserAdminAsync(user)) - { - return null; - } - - return user; - } /// /// Updates an items position @@ -99,25 +96,14 @@ namespace API.Controllers public async Task UpdateListItemPosition(UpdateReadingListPosition dto) { // Make sure UI buffers events - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); } - var items = (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(dto.ReadingListId)).ToList(); - var item = items.Find(r => r.Id == dto.ReadingListItemId); - items.Remove(item); - items.Insert(dto.ToPosition, item); - for (var i = 0; i < items.Count; i++) - { - items[i].Order = i; - } + if (await _readingListService.UpdateReadingListItemPosition(dto)) return Ok("Updated"); - if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync()) - { - return Ok("Updated"); - } return BadRequest("Couldn't update position"); } @@ -130,25 +116,13 @@ namespace API.Controllers [HttpPost("delete-item")] public async Task DeleteListItem(UpdateReadingListPosition dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); } - var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId); - readingList.Items = readingList.Items.Where(r => r.Id != dto.ReadingListItemId).ToList(); - - var index = 0; - foreach (var readingListItem in readingList.Items) - { - readingListItem.Order = index; - index++; - } - - if (!_unitOfWork.HasChanges()) return Ok(); - - if (await _unitOfWork.CommitAsync()) + if (await _readingListService.DeleteReadingListItem(dto)) { return Ok("Updated"); } @@ -164,34 +138,16 @@ namespace API.Controllers [HttpPost("remove-read")] public async Task DeleteReadFromList([FromQuery] int readingListId) { - var user = await UserHasReadingListAccess(readingListId); + var user = await _readingListService.UserHasReadingListAccess(readingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); } - var items = await _unitOfWork.ReadingListRepository.GetReadingListItemDtosByIdAsync(readingListId, user.Id); - items = await _unitOfWork.ReadingListRepository.AddReadingProgressModifiers(user.Id, items.ToList()); - - // Collect all Ids to remove - var itemIdsToRemove = items.Where(item => item.PagesRead == item.PagesTotal).Select(item => item.Id); - - try + if (await _readingListService.RemoveFullyReadItems(readingListId, user)) { - var listItems = - (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(readingListId)).Where(r => - itemIdsToRemove.Contains(r.Id)); - _unitOfWork.ReadingListRepository.BulkRemove(listItems); - - if (!_unitOfWork.HasChanges()) return Ok("Nothing to remove"); - - await _unitOfWork.CommitAsync(); return Ok("Updated"); } - catch - { - await _unitOfWork.RollbackAsync(); - } return BadRequest("Could not remove read items"); } @@ -204,20 +160,13 @@ namespace API.Controllers [HttpDelete] public async Task DeleteList([FromQuery] int readingListId) { - var user = await UserHasReadingListAccess(readingListId); + var user = await _readingListService.UserHasReadingListAccess(readingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); } - var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(readingListId); - - user.ReadingLists.Remove(readingList); - - if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync()) - { - return Ok("Deleted"); - } + if (await _readingListService.DeleteReadingList(readingListId, user)) return Ok("List was deleted"); return BadRequest("There was an issue deleting reading list"); } @@ -230,7 +179,8 @@ namespace API.Controllers [HttpPost("create")] public async Task> CreateList(CreateReadingListDto dto) { - var user = await _unitOfWork.UserRepository.GetUserWithReadingListsByUsernameAsync(User.GetUsername()); + + var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), AppUserIncludes.ReadingListsWithItems); // When creating, we need to make sure Title is unique var hasExisting = user.ReadingLists.Any(l => l.Title.Equals(dto.Title)); @@ -260,7 +210,7 @@ namespace API.Controllers var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId); if (readingList == null) return BadRequest("List does not exist"); - var user = await UserHasReadingListAccess(readingList.Id); + var user = await _readingListService.UserHasReadingListAccess(readingList.Id, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -269,7 +219,7 @@ namespace API.Controllers if (!string.IsNullOrEmpty(dto.Title)) { readingList.Title = dto.Title; // Should I check if this is unique? - readingList.NormalizedTitle = Parser.Parser.Normalize(readingList.Title); + readingList.NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(readingList.Title); } if (!string.IsNullOrEmpty(dto.Title)) { @@ -308,7 +258,7 @@ namespace API.Controllers [HttpPost("update-by-series")] public async Task UpdateListBySeries(UpdateReadingListBySeriesDto dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -320,7 +270,7 @@ namespace API.Controllers await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new [] {dto.SeriesId}); // If there are adds, tell tracking this has been modified - if (await AddChaptersToReadingList(dto.SeriesId, chapterIdsForSeries, readingList)) + if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIdsForSeries, readingList)) { _unitOfWork.ReadingListRepository.Update(readingList); } @@ -350,7 +300,7 @@ namespace API.Controllers [HttpPost("update-by-multiple")] public async Task UpdateListByMultiple(UpdateReadingListByMultipleDto dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -365,7 +315,7 @@ namespace API.Controllers } // If there are adds, tell tracking this has been modified - if (await AddChaptersToReadingList(dto.SeriesId, chapterIds, readingList)) + if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIds, readingList)) { _unitOfWork.ReadingListRepository.Update(readingList); } @@ -394,7 +344,7 @@ namespace API.Controllers [HttpPost("update-by-multiple-series")] public async Task UpdateListByMultipleSeries(UpdateReadingListByMultipleSeriesDto dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -407,7 +357,7 @@ namespace API.Controllers foreach (var seriesId in ids.Keys) { // If there are adds, tell tracking this has been modified - if (await AddChaptersToReadingList(seriesId, ids[seriesId], readingList)) + if (await _readingListService.AddChaptersToReadingList(seriesId, ids[seriesId], readingList)) { _unitOfWork.ReadingListRepository.Update(readingList); } @@ -432,7 +382,7 @@ namespace API.Controllers [HttpPost("update-by-volume")] public async Task UpdateListByVolume(UpdateReadingListByVolumeDto dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -444,7 +394,7 @@ namespace API.Controllers (await _unitOfWork.ChapterRepository.GetChaptersAsync(dto.VolumeId)).Select(c => c.Id).ToList(); // If there are adds, tell tracking this has been modified - if (await AddChaptersToReadingList(dto.SeriesId, chapterIdsForVolume, readingList)) + if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, chapterIdsForVolume, readingList)) { _unitOfWork.ReadingListRepository.Update(readingList); } @@ -468,7 +418,7 @@ namespace API.Controllers [HttpPost("update-by-chapter")] public async Task UpdateListByChapter(UpdateReadingListByChapterDto dto) { - var user = await UserHasReadingListAccess(dto.ReadingListId); + var user = await _readingListService.UserHasReadingListAccess(dto.ReadingListId, User.GetUsername()); if (user == null) { return BadRequest("You do not have permissions on this reading list or the list doesn't exist"); @@ -477,7 +427,7 @@ namespace API.Controllers if (readingList == null) return BadRequest("Reading List does not exist"); // If there are adds, tell tracking this has been modified - if (await AddChaptersToReadingList(dto.SeriesId, new List() { dto.ChapterId }, readingList)) + if (await _readingListService.AddChaptersToReadingList(dto.SeriesId, new List() { dto.ChapterId }, readingList)) { _unitOfWork.ReadingListRepository.Update(readingList); } @@ -498,39 +448,7 @@ namespace API.Controllers return Ok("Nothing to do"); } - /// - /// Adds a list of Chapters as reading list items to the passed reading list. - /// - /// - /// - /// - /// True if new chapters were added - private async Task AddChaptersToReadingList(int seriesId, IList chapterIds, - ReadingList readingList) - { - // TODO: Move to ReadingListService and Unit Test - readingList.Items ??= new List(); - var lastOrder = 0; - if (readingList.Items.Any()) - { - lastOrder = readingList.Items.DefaultIfEmpty().Max(rli => rli.Order); - } - var existingChapterExists = readingList.Items.Select(rli => rli.ChapterId).ToHashSet(); - var chaptersForSeries = (await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds)) - .OrderBy(c => Parser.Parser.MinNumberFromRange(c.Volume.Name)) - .ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting); - - var index = lastOrder + 1; - foreach (var chapter in chaptersForSeries) - { - if (existingChapterExists.Contains(chapter.Id)) continue; - readingList.Items.Add(DbFactory.ReadingListItem(index, seriesId, chapter.VolumeId, chapter.Id)); - index += 1; - } - - return index > lastOrder + 1; - } /// /// Returns the next chapter within the reading list diff --git a/API/Controllers/SeriesController.cs b/API/Controllers/SeriesController.cs index bc3acb1b8..6f458b6b8 100644 --- a/API/Controllers/SeriesController.cs +++ b/API/Controllers/SeriesController.cs @@ -156,12 +156,14 @@ namespace API.Controllers } series.Name = updateSeries.Name.Trim(); + series.NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name); if (!string.IsNullOrEmpty(updateSeries.SortName.Trim())) { series.SortName = updateSeries.SortName.Trim(); } series.LocalizedName = updateSeries.LocalizedName.Trim(); + series.NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName); series.NameLocked = updateSeries.NameLocked; series.SortNameLocked = updateSeries.SortNameLocked; diff --git a/API/Controllers/SettingsController.cs b/API/Controllers/SettingsController.cs index a712a39cc..e1a758775 100644 --- a/API/Controllers/SettingsController.cs +++ b/API/Controllers/SettingsController.cs @@ -10,6 +10,7 @@ using API.Entities.Enums; using API.Extensions; using API.Helpers.Converters; using API.Services; +using API.Services.Tasks.Scanner; using AutoMapper; using Flurl.Http; using Kavita.Common; @@ -29,9 +30,10 @@ namespace API.Controllers private readonly IDirectoryService _directoryService; private readonly IMapper _mapper; private readonly IEmailService _emailService; + private readonly ILibraryWatcher _libraryWatcher; public SettingsController(ILogger logger, IUnitOfWork unitOfWork, ITaskScheduler taskScheduler, - IDirectoryService directoryService, IMapper mapper, IEmailService emailService) + IDirectoryService directoryService, IMapper mapper, IEmailService emailService, ILibraryWatcher libraryWatcher) { _logger = logger; _unitOfWork = unitOfWork; @@ -39,6 +41,7 @@ namespace API.Controllers _directoryService = directoryService; _mapper = mapper; _emailService = emailService; + _libraryWatcher = libraryWatcher; } [AllowAnonymous] @@ -227,6 +230,21 @@ namespace API.Controllers _unitOfWork.SettingsRepository.Update(setting); } + + if (setting.Key == ServerSettingKey.EnableFolderWatching && updateSettingsDto.EnableFolderWatching + string.Empty != setting.Value) + { + setting.Value = updateSettingsDto.EnableFolderWatching + string.Empty; + _unitOfWork.SettingsRepository.Update(setting); + + if (updateSettingsDto.EnableFolderWatching) + { + await _libraryWatcher.StartWatching(); + } + else + { + _libraryWatcher.StopWatching(); + } + } } if (!_unitOfWork.HasChanges()) return Ok(updateSettingsDto); diff --git a/API/Controllers/TachiyomiController.cs b/API/Controllers/TachiyomiController.cs index 5a9fdeded..f1f6a1f03 100644 --- a/API/Controllers/TachiyomiController.cs +++ b/API/Controllers/TachiyomiController.cs @@ -49,9 +49,8 @@ public class TachiyomiController : BaseApiController // If prevChapterId is -1, this means either nothing is read or everything is read. if (prevChapterId == -1) { - var userWithProgress = await _unitOfWork.UserRepository.GetUserByIdAsync(userId, AppUserIncludes.Progress); - var userHasProgress = - userWithProgress.Progresses.Any(x => x.SeriesId == seriesId); + var series = await _unitOfWork.SeriesRepository.GetSeriesDtoByIdAsync(seriesId, userId); + var userHasProgress = series.PagesRead != 0 && series.PagesRead < series.Pages; // If the user doesn't have progress, then return null, which the extension will catch as 204 (no content) and report nothing as read if (!userHasProgress) return null; @@ -61,21 +60,22 @@ public class TachiyomiController : BaseApiController var looseLeafChapterVolume = volumes.FirstOrDefault(v => v.Number == 0); if (looseLeafChapterVolume == null) { - var volumeChapter = _mapper.Map(volumes.Last().Chapters.OrderBy(c => float.Parse(c.Number), new ChapterSortComparerZeroFirst()).Last()); + var volumeChapter = _mapper.Map(volumes.Last().Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparerZeroFirst.Default).Last()); return Ok(new ChapterDto() { Number = $"{int.Parse(volumeChapter.Number) / 100f}" }); } - var lastChapter = looseLeafChapterVolume.Chapters.OrderBy(c => float.Parse(c.Number), new ChapterSortComparer()).Last(); + var lastChapter = looseLeafChapterVolume.Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default).Last(); return Ok(_mapper.Map(lastChapter)); } // There is progress, we now need to figure out the highest volume or chapter and return that. var prevChapter = await _unitOfWork.ChapterRepository.GetChapterDtoAsync(prevChapterId); var volumeWithProgress = await _unitOfWork.VolumeRepository.GetVolumeDtoAsync(prevChapter.VolumeId, userId); - if (volumeWithProgress.Number != 0) + // We only encode for single-file volumes + if (volumeWithProgress.Number != 0 && volumeWithProgress.Chapters.Count == 1) { // The progress is on a volume, encode it as a fake chapterDTO return Ok(new ChapterDto() diff --git a/API/Controllers/ThemeController.cs b/API/Controllers/ThemeController.cs index 69793df46..6defbe574 100644 --- a/API/Controllers/ThemeController.cs +++ b/API/Controllers/ThemeController.cs @@ -51,6 +51,7 @@ public class ThemeController : BaseApiController /// Returns css content to the UI. UI is expected to escape the content /// /// + [AllowAnonymous] [HttpGet("download-content")] public async Task> GetThemeContent(int themeId) { diff --git a/API/DTOs/Reader/BookmarkDto.cs b/API/DTOs/Reader/BookmarkDto.cs index 3653bcaa0..33f55cf8d 100644 --- a/API/DTOs/Reader/BookmarkDto.cs +++ b/API/DTOs/Reader/BookmarkDto.cs @@ -1,11 +1,17 @@ -namespace API.DTOs.Reader +using System.ComponentModel.DataAnnotations; + +namespace API.DTOs.Reader { public class BookmarkDto { public int Id { get; set; } + [Required] public int Page { get; set; } + [Required] public int VolumeId { get; set; } + [Required] public int SeriesId { get; set; } + [Required] public int ChapterId { get; set; } } } diff --git a/API/DTOs/ReadingLists/UpdateReadingListPosition.cs b/API/DTOs/ReadingLists/UpdateReadingListPosition.cs index 023849024..5407a1ad5 100644 --- a/API/DTOs/ReadingLists/UpdateReadingListPosition.cs +++ b/API/DTOs/ReadingLists/UpdateReadingListPosition.cs @@ -1,10 +1,18 @@ -namespace API.DTOs.ReadingLists +using System.ComponentModel.DataAnnotations; + +namespace API.DTOs.ReadingLists { + /// + /// DTO for moving a reading list item to another position within the same list + /// public class UpdateReadingListPosition { + [Required] public int ReadingListId { get; set; } + [Required] public int ReadingListItemId { get; set; } public int FromPosition { get; set; } + [Required] public int ToPosition { get; set; } } } diff --git a/API/DTOs/ScanFolderDto.cs b/API/DTOs/ScanFolderDto.cs new file mode 100644 index 000000000..59ce4d0b5 --- /dev/null +++ b/API/DTOs/ScanFolderDto.cs @@ -0,0 +1,17 @@ +namespace API.DTOs; + +/// +/// DTO for requesting a folder to be scanned +/// +public class ScanFolderDto +{ + /// + /// Api key for a user with Admin permissions + /// + public string ApiKey { get; set; } + /// + /// Folder Path to Scan + /// + /// JSON cannot accept /, so you may need to use // escaping on paths + public string FolderPath { get; set; } +} diff --git a/API/DTOs/SeriesDto.cs b/API/DTOs/SeriesDto.cs index b5fc63473..bbf65e9fb 100644 --- a/API/DTOs/SeriesDto.cs +++ b/API/DTOs/SeriesDto.cs @@ -54,5 +54,13 @@ namespace API.DTOs public int MaxHoursToRead { get; set; } /// public int AvgHoursToRead { get; set; } + /// + /// The highest level folder for this Series + /// + public string FolderPath { get; set; } + /// + /// The last time the folder for this series was scanned + /// + public DateTime LastFolderScanned { get; set; } } } diff --git a/API/DTOs/Settings/ServerSettingDTO.cs b/API/DTOs/Settings/ServerSettingDTO.cs index 9f33b6908..f979684af 100644 --- a/API/DTOs/Settings/ServerSettingDTO.cs +++ b/API/DTOs/Settings/ServerSettingDTO.cs @@ -1,5 +1,4 @@ -using System.Collections.Generic; -using API.Services; +using API.Services; namespace API.DTOs.Settings { @@ -43,7 +42,9 @@ namespace API.DTOs.Settings /// Represents a unique Id to this Kavita installation. Only used in Stats to identify unique installs. /// public string InstallId { get; set; } - + /// + /// If the server should save bookmarks as WebP encoding + /// public bool ConvertBookmarkToWebP { get; set; } /// /// If the Swagger UI Should be exposed. Does not require authentication, but does require a JWT. @@ -55,5 +56,9 @@ namespace API.DTOs.Settings /// /// Value should be between 1 and 30 public int TotalBackups { get; set; } = 30; + /// + /// If Kavita should watch the library folders and process changes + /// + public bool EnableFolderWatching { get; set; } = true; } } diff --git a/API/Data/DataContext.cs b/API/Data/DataContext.cs index 7b2ca2654..7c76e4a78 100644 --- a/API/Data/DataContext.cs +++ b/API/Data/DataContext.cs @@ -43,6 +43,7 @@ namespace API.Data public DbSet Tag { get; set; } public DbSet SiteTheme { get; set; } public DbSet SeriesRelation { get; set; } + public DbSet FolderPath { get; set; } protected override void OnModelCreating(ModelBuilder builder) @@ -71,7 +72,9 @@ namespace API.Data builder.Entity() .HasOne(pt => pt.TargetSeries) .WithMany(t => t.RelationOf) - .HasForeignKey(pt => pt.TargetSeriesId); + .HasForeignKey(pt => pt.TargetSeriesId) + .OnDelete(DeleteBehavior.ClientCascade); + builder.Entity() .Property(b => b.BookThemeName) diff --git a/API/Data/DbFactory.cs b/API/Data/DbFactory.cs index ad97958da..921b55c54 100644 --- a/API/Data/DbFactory.cs +++ b/API/Data/DbFactory.cs @@ -23,7 +23,27 @@ namespace API.Data Name = name, OriginalName = name, LocalizedName = name, - NormalizedName = Parser.Parser.Normalize(name), + NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name), + NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name), + SortName = name, + Volumes = new List(), + Metadata = SeriesMetadata(Array.Empty()) + }; + } + + public static Series Series(string name, string localizedName) + { + if (string.IsNullOrEmpty(localizedName)) + { + localizedName = name; + } + return new Series + { + Name = name, + OriginalName = name, + LocalizedName = localizedName, + NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name), + NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(localizedName), SortName = name, Volumes = new List(), Metadata = SeriesMetadata(Array.Empty()) @@ -35,7 +55,7 @@ namespace API.Data return new Volume() { Name = volumeNumber, - Number = (int) Parser.Parser.MinNumberFromRange(volumeNumber), + Number = (int) Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(volumeNumber), Chapters = new List() }; } @@ -46,7 +66,7 @@ namespace API.Data var specialTitle = specialTreatment ? info.Filename : info.Chapters; return new Chapter() { - Number = specialTreatment ? "0" : Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty, + Number = specialTreatment ? "0" : Services.Tasks.Scanner.Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty, Range = specialTreatment ? info.Filename : info.Chapters, Title = (specialTreatment && info.Format == MangaFormat.Epub) ? info.Title @@ -75,7 +95,7 @@ namespace API.Data return new CollectionTag() { Id = id, - NormalizedTitle = API.Parser.Parser.Normalize(title?.Trim()).ToUpper(), + NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(title?.Trim()).ToUpper(), Title = title?.Trim(), Summary = summary?.Trim(), Promoted = promoted @@ -86,7 +106,7 @@ namespace API.Data { return new ReadingList() { - NormalizedTitle = API.Parser.Parser.Normalize(title?.Trim()).ToUpper(), + NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(title?.Trim()).ToUpper(), Title = title?.Trim(), Summary = summary?.Trim(), Promoted = promoted, @@ -110,7 +130,7 @@ namespace API.Data return new Genre() { Title = name.Trim().SentenceCase(), - NormalizedTitle = Parser.Parser.Normalize(name), + NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(name), ExternalTag = external }; } @@ -120,7 +140,7 @@ namespace API.Data return new Tag() { Title = name.Trim().SentenceCase(), - NormalizedTitle = Parser.Parser.Normalize(name), + NormalizedTitle = Services.Tasks.Scanner.Parser.Parser.Normalize(name), ExternalTag = external }; } @@ -130,7 +150,7 @@ namespace API.Data return new Person() { Name = name.Trim(), - NormalizedName = Parser.Parser.Normalize(name), + NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name), Role = role }; } diff --git a/API/Data/Metadata/ComicInfo.cs b/API/Data/Metadata/ComicInfo.cs index 167638a01..d34901daa 100644 --- a/API/Data/Metadata/ComicInfo.cs +++ b/API/Data/Metadata/ComicInfo.cs @@ -107,16 +107,16 @@ namespace API.Data.Metadata info.SeriesSort = info.SeriesSort.Trim(); info.LocalizedSeries = info.LocalizedSeries.Trim(); - info.Writer = Parser.Parser.CleanAuthor(info.Writer); - info.Colorist = Parser.Parser.CleanAuthor(info.Colorist); - info.Editor = Parser.Parser.CleanAuthor(info.Editor); - info.Inker = Parser.Parser.CleanAuthor(info.Inker); - info.Letterer = Parser.Parser.CleanAuthor(info.Letterer); - info.Penciller = Parser.Parser.CleanAuthor(info.Penciller); - info.Publisher = Parser.Parser.CleanAuthor(info.Publisher); - info.Characters = Parser.Parser.CleanAuthor(info.Characters); - info.Translator = Parser.Parser.CleanAuthor(info.Translator); - info.CoverArtist = Parser.Parser.CleanAuthor(info.CoverArtist); + info.Writer = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Writer); + info.Colorist = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Colorist); + info.Editor = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Editor); + info.Inker = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Inker); + info.Letterer = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Letterer); + info.Penciller = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Penciller); + info.Publisher = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Publisher); + info.Characters = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Characters); + info.Translator = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.Translator); + info.CoverArtist = Services.Tasks.Scanner.Parser.Parser.CleanAuthor(info.CoverArtist); } diff --git a/API/Data/MigrateBookmarks.cs b/API/Data/MigrateBookmarks.cs deleted file mode 100644 index 6649e83e7..000000000 --- a/API/Data/MigrateBookmarks.cs +++ /dev/null @@ -1,105 +0,0 @@ -using System; -using System.Linq; -using System.Threading.Tasks; -using API.Comparators; -using API.Entities.Enums; -using API.Services; -using Microsoft.Extensions.Logging; - -namespace API.Data; - -/// -/// Responsible to migrate existing bookmarks to files. Introduced in v0.4.9.27 -/// -public static class MigrateBookmarks -{ - /// - /// This will migrate existing bookmarks to bookmark folder based. - /// If the bookmarks folder already exists, this will not run. - /// - /// Bookmark directory is configurable. This will always use the default bookmark directory. - /// - /// - /// - /// - /// - public static async Task Migrate(IDirectoryService directoryService, IUnitOfWork unitOfWork, - ILogger logger, ICacheService cacheService) - { - var bookmarkDirectory = (await unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)) - .Value; - if (string.IsNullOrEmpty(bookmarkDirectory)) - { - bookmarkDirectory = directoryService.BookmarkDirectory; - } - - if (directoryService.Exists(bookmarkDirectory)) return; - - logger.LogInformation("Bookmark migration is needed....This may take some time"); - - var allBookmarks = (await unitOfWork.UserRepository.GetAllBookmarksAsync()).ToList(); - - var uniqueChapterIds = allBookmarks.Select(b => b.ChapterId).Distinct().ToList(); - var uniqueUserIds = allBookmarks.Select(b => b.AppUserId).Distinct().ToList(); - foreach (var userId in uniqueUserIds) - { - foreach (var chapterId in uniqueChapterIds) - { - var chapterBookmarks = allBookmarks.Where(b => b.ChapterId == chapterId).ToList(); - var chapterPages = chapterBookmarks - .Select(b => b.Page).ToList(); - var seriesId = chapterBookmarks - .Select(b => b.SeriesId).First(); - var mangaFiles = await unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId); - var chapterExtractPath = directoryService.FileSystem.Path.Join(directoryService.TempDirectory, $"bookmark_c{chapterId}_u{userId}_s{seriesId}"); - - var numericComparer = new NumericComparer(); - if (!mangaFiles.Any()) continue; - - switch (mangaFiles.First().Format) - { - case MangaFormat.Image: - directoryService.ExistOrCreate(chapterExtractPath); - directoryService.CopyFilesToDirectory(mangaFiles.Select(f => f.FilePath), chapterExtractPath); - break; - case MangaFormat.Archive: - case MangaFormat.Pdf: - cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList()); - break; - case MangaFormat.Epub: - continue; - default: - continue; - } - - var files = directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions); - // Filter out images that aren't in bookmarks - Array.Sort(files, numericComparer); - foreach (var chapterPage in chapterPages) - { - var file = files.ElementAt(chapterPage); - var bookmark = allBookmarks.FirstOrDefault(b => - b.ChapterId == chapterId && b.SeriesId == seriesId && b.AppUserId == userId && - b.Page == chapterPage); - if (bookmark == null) continue; - - var filename = directoryService.FileSystem.Path.GetFileName(file); - var newLocation = directoryService.FileSystem.Path.Join( - ReaderService.FormatBookmarkFolderPath(String.Empty, userId, seriesId, chapterId), - filename); - bookmark.FileName = newLocation; - directoryService.CopyFileToDirectory(file, - ReaderService.FormatBookmarkFolderPath(bookmarkDirectory, userId, seriesId, chapterId)); - unitOfWork.UserRepository.Update(bookmark); - } - } - // Clear temp after each user to avoid too much space being eaten - directoryService.ClearDirectory(directoryService.TempDirectory); - } - - await unitOfWork.CommitAsync(); - // Run CleanupService as we cache a ton of files - directoryService.ClearDirectory(directoryService.TempDirectory); - - } -} diff --git a/API/Data/MigrateNormalizedLocalizedName.cs b/API/Data/MigrateNormalizedLocalizedName.cs new file mode 100644 index 000000000..37ea705e3 --- /dev/null +++ b/API/Data/MigrateNormalizedLocalizedName.cs @@ -0,0 +1,38 @@ +using System.Linq; +using System.Threading.Tasks; +using Microsoft.EntityFrameworkCore; +using Microsoft.Extensions.Logging; + +namespace API.Data; + +/// +/// v0.5.6 introduced Normalized Localized Name, which allows for faster lookups and less memory usage. This migration will calculate them once +/// +public static class MigrateNormalizedLocalizedName +{ + public static async Task Migrate(IUnitOfWork unitOfWork, DataContext dataContext, ILogger logger) + { + if (!await dataContext.Series.Where(s => s.NormalizedLocalizedName == null).AnyAsync()) + { + return; + } + logger.LogInformation("Running MigrateNormalizedLocalizedName migration. Please be patient, this may take some time"); + + + foreach (var series in await dataContext.Series.ToListAsync()) + { + series.NormalizedLocalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName ?? string.Empty); + logger.LogInformation("Updated {SeriesName} normalized localized name: {LocalizedName}", series.Name, series.NormalizedLocalizedName); + unitOfWork.SeriesRepository.Update(series); + } + + if (unitOfWork.HasChanges()) + { + await unitOfWork.CommitAsync(); + } + + logger.LogInformation("MigrateNormalizedLocalizedName migration finished"); + + } + +} diff --git a/API/Data/MigrateRemoveExtraThemes.cs b/API/Data/MigrateRemoveExtraThemes.cs index 1c9a1e9b0..747c910c0 100644 --- a/API/Data/MigrateRemoveExtraThemes.cs +++ b/API/Data/MigrateRemoveExtraThemes.cs @@ -13,16 +13,15 @@ public static class MigrateRemoveExtraThemes { public static async Task Migrate(IUnitOfWork unitOfWork, IThemeService themeService) { - Console.WriteLine("Removing Dark and E-Ink themes"); - var themes = (await unitOfWork.SiteThemeRepository.GetThemes()).ToList(); if (themes.FirstOrDefault(t => t.Name.Equals("Light")) == null) { - Console.WriteLine("Done. Nothing to do"); return; } + Console.WriteLine("Removing Dark and E-Ink themes"); + var darkTheme = themes.Single(t => t.Name.Equals("Dark")); var lightTheme = themes.Single(t => t.Name.Equals("Light")); var eInkTheme = themes.Single(t => t.Name.Equals("E-Ink")); diff --git a/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs b/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs new file mode 100644 index 000000000..96fed7004 --- /dev/null +++ b/API/Data/Migrations/20220817173731_SeriesFolder.Designer.cs @@ -0,0 +1,1605 @@ +// +using System; +using API.Data; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; +using Microsoft.EntityFrameworkCore.Migrations; +using Microsoft.EntityFrameworkCore.Storage.ValueConversion; + +#nullable disable + +namespace API.Data.Migrations +{ + [DbContext(typeof(DataContext))] + [Migration("20220817173731_SeriesFolder")] + partial class SeriesFolder + { + protected override void BuildTargetModel(ModelBuilder modelBuilder) + { +#pragma warning disable 612, 618 + modelBuilder.HasAnnotation("ProductVersion", "6.0.7"); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("Name") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedName") + .IsUnique() + .HasDatabaseName("RoleNameIndex"); + + b.ToTable("AspNetRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AccessFailedCount") + .HasColumnType("INTEGER"); + + b.Property("ApiKey") + .HasColumnType("TEXT"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("ConfirmationToken") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("Email") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("EmailConfirmed") + .HasColumnType("INTEGER"); + + b.Property("LastActive") + .HasColumnType("TEXT"); + + b.Property("LockoutEnabled") + .HasColumnType("INTEGER"); + + b.Property("LockoutEnd") + .HasColumnType("TEXT"); + + b.Property("NormalizedEmail") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedUserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("PasswordHash") + .HasColumnType("TEXT"); + + b.Property("PhoneNumber") + .HasColumnType("TEXT"); + + b.Property("PhoneNumberConfirmed") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SecurityStamp") + .HasColumnType("TEXT"); + + b.Property("TwoFactorEnabled") + .HasColumnType("INTEGER"); + + b.Property("UserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedEmail") + .HasDatabaseName("EmailIndex"); + + b.HasIndex("NormalizedUserName") + .IsUnique() + .HasDatabaseName("UserNameIndex"); + + b.ToTable("AspNetUsers", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Page") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("AppUserBookmark"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AutoCloseMenu") + .HasColumnType("INTEGER"); + + b.Property("BackgroundColor") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("#000000"); + + b.Property("BlurUnreadSummaries") + .HasColumnType("INTEGER"); + + b.Property("BookReaderFontFamily") + .HasColumnType("TEXT"); + + b.Property("BookReaderFontSize") + .HasColumnType("INTEGER"); + + b.Property("BookReaderImmersiveMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLayoutMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLineSpacing") + .HasColumnType("INTEGER"); + + b.Property("BookReaderMargin") + .HasColumnType("INTEGER"); + + b.Property("BookReaderReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("BookReaderTapToPaginate") + .HasColumnType("INTEGER"); + + b.Property("BookThemeName") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("Dark"); + + b.Property("GlobalPageLayoutMode") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER") + .HasDefaultValue(0); + + b.Property("LayoutMode") + .HasColumnType("INTEGER"); + + b.Property("PageSplitOption") + .HasColumnType("INTEGER"); + + b.Property("PromptForDownloadSize") + .HasColumnType("INTEGER"); + + b.Property("ReaderMode") + .HasColumnType("INTEGER"); + + b.Property("ReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("ScalingOption") + .HasColumnType("INTEGER"); + + b.Property("ShowScreenHints") + .HasColumnType("INTEGER"); + + b.Property("ThemeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId") + .IsUnique(); + + b.HasIndex("ThemeId"); + + b.ToTable("AppUserPreferences"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("BookScrollId") + .HasColumnType("TEXT"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("PagesRead") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("SeriesId"); + + b.ToTable("AppUserProgresses"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("Rating") + .HasColumnType("INTEGER"); + + b.Property("Review") + .HasColumnType("TEXT"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("SeriesId"); + + b.ToTable("AppUserRating"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("UserId", "RoleId"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetUserRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Count") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("IsSpecial") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Number") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("Range") + .HasColumnType("TEXT"); + + b.Property("ReleaseDate") + .HasColumnType("TEXT"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.Property("TitleName") + .HasColumnType("TEXT"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("VolumeId"); + + b.ToTable("Chapter"); + }); + + modelBuilder.Entity("API.Entities.CollectionTag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("Id", "Promoted") + .IsUnique(); + + b.ToTable("CollectionTag"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("Path") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("LibraryId"); + + b.ToTable("FolderPath"); + }); + + modelBuilder.Entity("API.Entities.Genre", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Genre"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Type") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FilePath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastFileAnalysis") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.ToTable("MangaFile"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AgeRatingLocked") + .HasColumnType("INTEGER"); + + b.Property("CharacterLocked") + .HasColumnType("INTEGER"); + + b.Property("ColoristLocked") + .HasColumnType("INTEGER"); + + b.Property("CoverArtistLocked") + .HasColumnType("INTEGER"); + + b.Property("EditorLocked") + .HasColumnType("INTEGER"); + + b.Property("GenresLocked") + .HasColumnType("INTEGER"); + + b.Property("InkerLocked") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LanguageLocked") + .HasColumnType("INTEGER"); + + b.Property("LettererLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxCount") + .HasColumnType("INTEGER"); + + b.Property("PencillerLocked") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatus") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatusLocked") + .HasColumnType("INTEGER"); + + b.Property("PublisherLocked") + .HasColumnType("INTEGER"); + + b.Property("ReleaseYear") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("SummaryLocked") + .HasColumnType("INTEGER"); + + b.Property("TagsLocked") + .HasColumnType("INTEGER"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("TranslatorLocked") + .HasColumnType("INTEGER"); + + b.Property("WriterLocked") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId") + .IsUnique(); + + b.HasIndex("Id", "SeriesId") + .IsUnique(); + + b.ToTable("SeriesMetadata"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("RelationKind") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("TargetSeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.HasIndex("TargetSeriesId"); + + b.ToTable("SeriesRelation"); + }); + + modelBuilder.Entity("API.Entities.Person", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Role") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Person"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("ReadingList"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Order") + .HasColumnType("INTEGER"); + + b.Property("ReadingListId") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.HasIndex("ReadingListId"); + + b.HasIndex("SeriesId"); + + b.HasIndex("VolumeId"); + + b.ToTable("ReadingListItem"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FolderPath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastChapterAdded") + .HasColumnType("TEXT"); + + b.Property("LastFolderScanned") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("LocalizedName") + .HasColumnType("TEXT"); + + b.Property("LocalizedNameLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NameLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("OriginalName") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SortName") + .HasColumnType("TEXT"); + + b.Property("SortNameLocked") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("LibraryId"); + + b.ToTable("Series"); + }); + + modelBuilder.Entity("API.Entities.ServerSetting", b => + { + b.Property("Key") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("Key"); + + b.ToTable("ServerSetting"); + }); + + modelBuilder.Entity("API.Entities.SiteTheme", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("IsDefault") + .HasColumnType("INTEGER"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Provider") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("SiteTheme"); + }); + + modelBuilder.Entity("API.Entities.Tag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Tag"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Number") + .HasColumnType("INTEGER"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.ToTable("Volume"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.Property("AppUsersId") + .HasColumnType("INTEGER"); + + b.Property("LibrariesId") + .HasColumnType("INTEGER"); + + b.HasKey("AppUsersId", "LibrariesId"); + + b.HasIndex("LibrariesId"); + + b.ToTable("AppUserLibrary"); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "GenresId"); + + b.HasIndex("GenresId"); + + b.ToTable("ChapterGenre"); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.Property("ChapterMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.HasKey("ChapterMetadatasId", "PeopleId"); + + b.HasIndex("PeopleId"); + + b.ToTable("ChapterPerson"); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("ChapterTag"); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.Property("CollectionTagsId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("CollectionTagsId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("CollectionTagSeriesMetadata"); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("GenresId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("GenreSeriesMetadata"); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetRoleClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("ProviderKey") + .HasColumnType("TEXT"); + + b.Property("ProviderDisplayName") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("LoginProvider", "ProviderKey"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserLogins", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("UserId", "LoginProvider", "Name"); + + b.ToTable("AspNetUserTokens", (string)null); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("PeopleId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("PersonSeriesMetadata"); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("SeriesMetadatasId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("SeriesMetadataTag"); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Bookmarks") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithOne("UserPreferences") + .HasForeignKey("API.Entities.AppUserPreferences", "AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.SiteTheme", "Theme") + .WithMany() + .HasForeignKey("ThemeId"); + + b.Navigation("AppUser"); + + b.Navigation("Theme"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Progresses") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Progress") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Ratings") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Ratings") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.HasOne("API.Entities.AppRole", "Role") + .WithMany("UserRoles") + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.AppUser", "User") + .WithMany("UserRoles") + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Role"); + + b.Navigation("User"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.HasOne("API.Entities.Volume", "Volume") + .WithMany("Chapters") + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.HasOne("API.Entities.Library", "Library") + .WithMany("Folders") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany("Files") + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithOne("Metadata") + .HasForeignKey("API.Entities.Metadata.SeriesMetadata", "SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Relations") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.ClientCascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "TargetSeries") + .WithMany("RelationOf") + .HasForeignKey("TargetSeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + + b.Navigation("TargetSeries"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("ReadingLists") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany() + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.ReadingList", "ReadingList") + .WithMany("Items") + .HasForeignKey("ReadingListId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "Series") + .WithMany() + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Volume", "Volume") + .WithMany() + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + + b.Navigation("ReadingList"); + + b.Navigation("Series"); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany("WantToRead") + .HasForeignKey("AppUserId"); + + b.HasOne("API.Entities.Library", "Library") + .WithMany("Series") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Volumes") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("AppUsersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Library", null) + .WithMany() + .HasForeignKey("LibrariesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChapterMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.HasOne("API.Entities.CollectionTag", null) + .WithMany() + .HasForeignKey("CollectionTagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.HasOne("API.Entities.AppRole", null) + .WithMany() + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Navigation("UserRoles"); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Navigation("Bookmarks"); + + b.Navigation("Progresses"); + + b.Navigation("Ratings"); + + b.Navigation("ReadingLists"); + + b.Navigation("UserPreferences"); + + b.Navigation("UserRoles"); + + b.Navigation("WantToRead"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Navigation("Files"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Navigation("Folders"); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Navigation("Items"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Navigation("Metadata"); + + b.Navigation("Progress"); + + b.Navigation("Ratings"); + + b.Navigation("RelationOf"); + + b.Navigation("Relations"); + + b.Navigation("Volumes"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Navigation("Chapters"); + }); +#pragma warning restore 612, 618 + } + } +} diff --git a/API/Data/Migrations/20220817173731_SeriesFolder.cs b/API/Data/Migrations/20220817173731_SeriesFolder.cs new file mode 100644 index 000000000..33373c0c4 --- /dev/null +++ b/API/Data/Migrations/20220817173731_SeriesFolder.cs @@ -0,0 +1,37 @@ +using System; +using Microsoft.EntityFrameworkCore.Migrations; + +#nullable disable + +namespace API.Data.Migrations +{ + public partial class SeriesFolder : Migration + { + protected override void Up(MigrationBuilder migrationBuilder) + { + migrationBuilder.AddColumn( + name: "FolderPath", + table: "Series", + type: "TEXT", + nullable: true); + + migrationBuilder.AddColumn( + name: "LastFolderScanned", + table: "Series", + type: "TEXT", + nullable: false, + defaultValue: new DateTime(1, 1, 1, 0, 0, 0, 0, DateTimeKind.Unspecified)); + } + + protected override void Down(MigrationBuilder migrationBuilder) + { + migrationBuilder.DropColumn( + name: "FolderPath", + table: "Series"); + + migrationBuilder.DropColumn( + name: "LastFolderScanned", + table: "Series"); + } + } +} diff --git a/API/Data/Migrations/20220819223212_NormalizedLocalizedName.Designer.cs b/API/Data/Migrations/20220819223212_NormalizedLocalizedName.Designer.cs new file mode 100644 index 000000000..41bf29e94 --- /dev/null +++ b/API/Data/Migrations/20220819223212_NormalizedLocalizedName.Designer.cs @@ -0,0 +1,1608 @@ +// +using System; +using API.Data; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; +using Microsoft.EntityFrameworkCore.Migrations; +using Microsoft.EntityFrameworkCore.Storage.ValueConversion; + +#nullable disable + +namespace API.Data.Migrations +{ + [DbContext(typeof(DataContext))] + [Migration("20220819223212_NormalizedLocalizedName")] + partial class NormalizedLocalizedName + { + protected override void BuildTargetModel(ModelBuilder modelBuilder) + { +#pragma warning disable 612, 618 + modelBuilder.HasAnnotation("ProductVersion", "6.0.7"); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("Name") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedName") + .IsUnique() + .HasDatabaseName("RoleNameIndex"); + + b.ToTable("AspNetRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AccessFailedCount") + .HasColumnType("INTEGER"); + + b.Property("ApiKey") + .HasColumnType("TEXT"); + + b.Property("ConcurrencyStamp") + .IsConcurrencyToken() + .HasColumnType("TEXT"); + + b.Property("ConfirmationToken") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("Email") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("EmailConfirmed") + .HasColumnType("INTEGER"); + + b.Property("LastActive") + .HasColumnType("TEXT"); + + b.Property("LockoutEnabled") + .HasColumnType("INTEGER"); + + b.Property("LockoutEnd") + .HasColumnType("TEXT"); + + b.Property("NormalizedEmail") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("NormalizedUserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.Property("PasswordHash") + .HasColumnType("TEXT"); + + b.Property("PhoneNumber") + .HasColumnType("TEXT"); + + b.Property("PhoneNumberConfirmed") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SecurityStamp") + .HasColumnType("TEXT"); + + b.Property("TwoFactorEnabled") + .HasColumnType("INTEGER"); + + b.Property("UserName") + .HasMaxLength(256) + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedEmail") + .HasDatabaseName("EmailIndex"); + + b.HasIndex("NormalizedUserName") + .IsUnique() + .HasDatabaseName("UserNameIndex"); + + b.ToTable("AspNetUsers", (string)null); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Page") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("AppUserBookmark"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AutoCloseMenu") + .HasColumnType("INTEGER"); + + b.Property("BackgroundColor") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("#000000"); + + b.Property("BlurUnreadSummaries") + .HasColumnType("INTEGER"); + + b.Property("BookReaderFontFamily") + .HasColumnType("TEXT"); + + b.Property("BookReaderFontSize") + .HasColumnType("INTEGER"); + + b.Property("BookReaderImmersiveMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLayoutMode") + .HasColumnType("INTEGER"); + + b.Property("BookReaderLineSpacing") + .HasColumnType("INTEGER"); + + b.Property("BookReaderMargin") + .HasColumnType("INTEGER"); + + b.Property("BookReaderReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("BookReaderTapToPaginate") + .HasColumnType("INTEGER"); + + b.Property("BookThemeName") + .ValueGeneratedOnAdd() + .HasColumnType("TEXT") + .HasDefaultValue("Dark"); + + b.Property("GlobalPageLayoutMode") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER") + .HasDefaultValue(0); + + b.Property("LayoutMode") + .HasColumnType("INTEGER"); + + b.Property("PageSplitOption") + .HasColumnType("INTEGER"); + + b.Property("PromptForDownloadSize") + .HasColumnType("INTEGER"); + + b.Property("ReaderMode") + .HasColumnType("INTEGER"); + + b.Property("ReadingDirection") + .HasColumnType("INTEGER"); + + b.Property("ScalingOption") + .HasColumnType("INTEGER"); + + b.Property("ShowScreenHints") + .HasColumnType("INTEGER"); + + b.Property("ThemeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId") + .IsUnique(); + + b.HasIndex("ThemeId"); + + b.ToTable("AppUserPreferences"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("BookScrollId") + .HasColumnType("TEXT"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("PagesRead") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("SeriesId"); + + b.ToTable("AppUserProgresses"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("Rating") + .HasColumnType("INTEGER"); + + b.Property("Review") + .HasColumnType("TEXT"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("SeriesId"); + + b.ToTable("AppUserRating"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("UserId", "RoleId"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetUserRoles", (string)null); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Count") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("IsSpecial") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Number") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("Range") + .HasColumnType("TEXT"); + + b.Property("ReleaseDate") + .HasColumnType("TEXT"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.Property("TitleName") + .HasColumnType("TEXT"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("VolumeId"); + + b.ToTable("Chapter"); + }); + + modelBuilder.Entity("API.Entities.CollectionTag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("Id", "Promoted") + .IsUnique(); + + b.ToTable("CollectionTag"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("Path") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("LibraryId"); + + b.ToTable("FolderPath"); + }); + + modelBuilder.Entity("API.Entities.Genre", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Genre"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LastScanned") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Type") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FilePath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastFileAnalysis") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.ToTable("MangaFile"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AgeRating") + .HasColumnType("INTEGER"); + + b.Property("AgeRatingLocked") + .HasColumnType("INTEGER"); + + b.Property("CharacterLocked") + .HasColumnType("INTEGER"); + + b.Property("ColoristLocked") + .HasColumnType("INTEGER"); + + b.Property("CoverArtistLocked") + .HasColumnType("INTEGER"); + + b.Property("EditorLocked") + .HasColumnType("INTEGER"); + + b.Property("GenresLocked") + .HasColumnType("INTEGER"); + + b.Property("InkerLocked") + .HasColumnType("INTEGER"); + + b.Property("Language") + .HasColumnType("TEXT"); + + b.Property("LanguageLocked") + .HasColumnType("INTEGER"); + + b.Property("LettererLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxCount") + .HasColumnType("INTEGER"); + + b.Property("PencillerLocked") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatus") + .HasColumnType("INTEGER"); + + b.Property("PublicationStatusLocked") + .HasColumnType("INTEGER"); + + b.Property("PublisherLocked") + .HasColumnType("INTEGER"); + + b.Property("ReleaseYear") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("SummaryLocked") + .HasColumnType("INTEGER"); + + b.Property("TagsLocked") + .HasColumnType("INTEGER"); + + b.Property("TotalCount") + .HasColumnType("INTEGER"); + + b.Property("TranslatorLocked") + .HasColumnType("INTEGER"); + + b.Property("WriterLocked") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId") + .IsUnique(); + + b.HasIndex("Id", "SeriesId") + .IsUnique(); + + b.ToTable("SeriesMetadata"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("RelationKind") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("TargetSeriesId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.HasIndex("TargetSeriesId"); + + b.ToTable("SeriesRelation"); + }); + + modelBuilder.Entity("API.Entities.Person", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Role") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("Person"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Promoted") + .HasColumnType("INTEGER"); + + b.Property("Summary") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.ToTable("ReadingList"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ChapterId") + .HasColumnType("INTEGER"); + + b.Property("Order") + .HasColumnType("INTEGER"); + + b.Property("ReadingListId") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("VolumeId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("ChapterId"); + + b.HasIndex("ReadingListId"); + + b.HasIndex("SeriesId"); + + b.HasIndex("VolumeId"); + + b.ToTable("ReadingListItem"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AppUserId") + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("CoverImageLocked") + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FolderPath") + .HasColumnType("TEXT"); + + b.Property("Format") + .HasColumnType("INTEGER"); + + b.Property("LastChapterAdded") + .HasColumnType("TEXT"); + + b.Property("LastFolderScanned") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("LibraryId") + .HasColumnType("INTEGER"); + + b.Property("LocalizedName") + .HasColumnType("TEXT"); + + b.Property("LocalizedNameLocked") + .HasColumnType("INTEGER"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NameLocked") + .HasColumnType("INTEGER"); + + b.Property("NormalizedLocalizedName") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("OriginalName") + .HasColumnType("TEXT"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SortName") + .HasColumnType("TEXT"); + + b.Property("SortNameLocked") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("AppUserId"); + + b.HasIndex("LibraryId"); + + b.ToTable("Series"); + }); + + modelBuilder.Entity("API.Entities.ServerSetting", b => + { + b.Property("Key") + .HasColumnType("INTEGER"); + + b.Property("RowVersion") + .IsConcurrencyToken() + .HasColumnType("INTEGER"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("Key"); + + b.ToTable("ServerSetting"); + }); + + modelBuilder.Entity("API.Entities.SiteTheme", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("FileName") + .HasColumnType("TEXT"); + + b.Property("IsDefault") + .HasColumnType("INTEGER"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("NormalizedName") + .HasColumnType("TEXT"); + + b.Property("Provider") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.ToTable("SiteTheme"); + }); + + modelBuilder.Entity("API.Entities.Tag", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ExternalTag") + .HasColumnType("INTEGER"); + + b.Property("NormalizedTitle") + .HasColumnType("TEXT"); + + b.Property("Title") + .HasColumnType("TEXT"); + + b.HasKey("Id"); + + b.HasIndex("NormalizedTitle", "ExternalTag") + .IsUnique(); + + b.ToTable("Tag"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("AvgHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("CoverImage") + .HasColumnType("TEXT"); + + b.Property("Created") + .HasColumnType("TEXT"); + + b.Property("LastModified") + .HasColumnType("TEXT"); + + b.Property("MaxHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("MinHoursToRead") + .HasColumnType("INTEGER"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Number") + .HasColumnType("INTEGER"); + + b.Property("Pages") + .HasColumnType("INTEGER"); + + b.Property("SeriesId") + .HasColumnType("INTEGER"); + + b.Property("WordCount") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("SeriesId"); + + b.ToTable("Volume"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.Property("AppUsersId") + .HasColumnType("INTEGER"); + + b.Property("LibrariesId") + .HasColumnType("INTEGER"); + + b.HasKey("AppUsersId", "LibrariesId"); + + b.HasIndex("LibrariesId"); + + b.ToTable("AppUserLibrary"); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "GenresId"); + + b.HasIndex("GenresId"); + + b.ToTable("ChapterGenre"); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.Property("ChapterMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.HasKey("ChapterMetadatasId", "PeopleId"); + + b.HasIndex("PeopleId"); + + b.ToTable("ChapterPerson"); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.Property("ChaptersId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("ChaptersId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("ChapterTag"); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.Property("CollectionTagsId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("CollectionTagsId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("CollectionTagSeriesMetadata"); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.Property("GenresId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("GenresId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("GenreSeriesMetadata"); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("RoleId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("RoleId"); + + b.ToTable("AspNetRoleClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.Property("Id") + .ValueGeneratedOnAdd() + .HasColumnType("INTEGER"); + + b.Property("ClaimType") + .HasColumnType("TEXT"); + + b.Property("ClaimValue") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("Id"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserClaims", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("ProviderKey") + .HasColumnType("TEXT"); + + b.Property("ProviderDisplayName") + .HasColumnType("TEXT"); + + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.HasKey("LoginProvider", "ProviderKey"); + + b.HasIndex("UserId"); + + b.ToTable("AspNetUserLogins", (string)null); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.Property("UserId") + .HasColumnType("INTEGER"); + + b.Property("LoginProvider") + .HasColumnType("TEXT"); + + b.Property("Name") + .HasColumnType("TEXT"); + + b.Property("Value") + .HasColumnType("TEXT"); + + b.HasKey("UserId", "LoginProvider", "Name"); + + b.ToTable("AspNetUserTokens", (string)null); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.Property("PeopleId") + .HasColumnType("INTEGER"); + + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.HasKey("PeopleId", "SeriesMetadatasId"); + + b.HasIndex("SeriesMetadatasId"); + + b.ToTable("PersonSeriesMetadata"); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.Property("SeriesMetadatasId") + .HasColumnType("INTEGER"); + + b.Property("TagsId") + .HasColumnType("INTEGER"); + + b.HasKey("SeriesMetadatasId", "TagsId"); + + b.HasIndex("TagsId"); + + b.ToTable("SeriesMetadataTag"); + }); + + modelBuilder.Entity("API.Entities.AppUserBookmark", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Bookmarks") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserPreferences", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithOne("UserPreferences") + .HasForeignKey("API.Entities.AppUserPreferences", "AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.SiteTheme", "Theme") + .WithMany() + .HasForeignKey("ThemeId"); + + b.Navigation("AppUser"); + + b.Navigation("Theme"); + }); + + modelBuilder.Entity("API.Entities.AppUserProgress", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Progresses") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Progress") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRating", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("Ratings") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", null) + .WithMany("Ratings") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.AppUserRole", b => + { + b.HasOne("API.Entities.AppRole", "Role") + .WithMany("UserRoles") + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.AppUser", "User") + .WithMany("UserRoles") + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Role"); + + b.Navigation("User"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.HasOne("API.Entities.Volume", "Volume") + .WithMany("Chapters") + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.FolderPath", b => + { + b.HasOne("API.Entities.Library", "Library") + .WithMany("Folders") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.MangaFile", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany("Files") + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithOne("Metadata") + .HasForeignKey("API.Entities.Metadata.SeriesMetadata", "SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.Metadata.SeriesRelation", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Relations") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.ClientCascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "TargetSeries") + .WithMany("RelationOf") + .HasForeignKey("TargetSeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + + b.Navigation("TargetSeries"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.HasOne("API.Entities.AppUser", "AppUser") + .WithMany("ReadingLists") + .HasForeignKey("AppUserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("AppUser"); + }); + + modelBuilder.Entity("API.Entities.ReadingListItem", b => + { + b.HasOne("API.Entities.Chapter", "Chapter") + .WithMany() + .HasForeignKey("ChapterId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.ReadingList", "ReadingList") + .WithMany("Items") + .HasForeignKey("ReadingListId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Series", "Series") + .WithMany() + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Volume", "Volume") + .WithMany() + .HasForeignKey("VolumeId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Chapter"); + + b.Navigation("ReadingList"); + + b.Navigation("Series"); + + b.Navigation("Volume"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany("WantToRead") + .HasForeignKey("AppUserId"); + + b.HasOne("API.Entities.Library", "Library") + .WithMany("Series") + .HasForeignKey("LibraryId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Library"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.HasOne("API.Entities.Series", "Series") + .WithMany("Volumes") + .HasForeignKey("SeriesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("AppUserLibrary", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("AppUsersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Library", null) + .WithMany() + .HasForeignKey("LibrariesId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterGenre", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterPerson", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChapterMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("ChapterTag", b => + { + b.HasOne("API.Entities.Chapter", null) + .WithMany() + .HasForeignKey("ChaptersId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("CollectionTagSeriesMetadata", b => + { + b.HasOne("API.Entities.CollectionTag", null) + .WithMany() + .HasForeignKey("CollectionTagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("GenreSeriesMetadata", b => + { + b.HasOne("API.Entities.Genre", null) + .WithMany() + .HasForeignKey("GenresId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim", b => + { + b.HasOne("API.Entities.AppRole", null) + .WithMany() + .HasForeignKey("RoleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken", b => + { + b.HasOne("API.Entities.AppUser", null) + .WithMany() + .HasForeignKey("UserId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("PersonSeriesMetadata", b => + { + b.HasOne("API.Entities.Person", null) + .WithMany() + .HasForeignKey("PeopleId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("SeriesMetadataTag", b => + { + b.HasOne("API.Entities.Metadata.SeriesMetadata", null) + .WithMany() + .HasForeignKey("SeriesMetadatasId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + + b.HasOne("API.Entities.Tag", null) + .WithMany() + .HasForeignKey("TagsId") + .OnDelete(DeleteBehavior.Cascade) + .IsRequired(); + }); + + modelBuilder.Entity("API.Entities.AppRole", b => + { + b.Navigation("UserRoles"); + }); + + modelBuilder.Entity("API.Entities.AppUser", b => + { + b.Navigation("Bookmarks"); + + b.Navigation("Progresses"); + + b.Navigation("Ratings"); + + b.Navigation("ReadingLists"); + + b.Navigation("UserPreferences"); + + b.Navigation("UserRoles"); + + b.Navigation("WantToRead"); + }); + + modelBuilder.Entity("API.Entities.Chapter", b => + { + b.Navigation("Files"); + }); + + modelBuilder.Entity("API.Entities.Library", b => + { + b.Navigation("Folders"); + + b.Navigation("Series"); + }); + + modelBuilder.Entity("API.Entities.ReadingList", b => + { + b.Navigation("Items"); + }); + + modelBuilder.Entity("API.Entities.Series", b => + { + b.Navigation("Metadata"); + + b.Navigation("Progress"); + + b.Navigation("Ratings"); + + b.Navigation("RelationOf"); + + b.Navigation("Relations"); + + b.Navigation("Volumes"); + }); + + modelBuilder.Entity("API.Entities.Volume", b => + { + b.Navigation("Chapters"); + }); +#pragma warning restore 612, 618 + } + } +} diff --git a/API/Data/Migrations/20220819223212_NormalizedLocalizedName.cs b/API/Data/Migrations/20220819223212_NormalizedLocalizedName.cs new file mode 100644 index 000000000..600a3a6b2 --- /dev/null +++ b/API/Data/Migrations/20220819223212_NormalizedLocalizedName.cs @@ -0,0 +1,25 @@ +using Microsoft.EntityFrameworkCore.Migrations; + +#nullable disable + +namespace API.Data.Migrations +{ + public partial class NormalizedLocalizedName : Migration + { + protected override void Up(MigrationBuilder migrationBuilder) + { + migrationBuilder.AddColumn( + name: "NormalizedLocalizedName", + table: "Series", + type: "TEXT", + nullable: true); + } + + protected override void Down(MigrationBuilder migrationBuilder) + { + migrationBuilder.DropColumn( + name: "NormalizedLocalizedName", + table: "Series"); + } + } +} diff --git a/API/Data/Migrations/DataContextModelSnapshot.cs b/API/Data/Migrations/DataContextModelSnapshot.cs index 6a4eba753..d65cc4adb 100644 --- a/API/Data/Migrations/DataContextModelSnapshot.cs +++ b/API/Data/Migrations/DataContextModelSnapshot.cs @@ -782,12 +782,18 @@ namespace API.Data.Migrations b.Property("Created") .HasColumnType("TEXT"); + b.Property("FolderPath") + .HasColumnType("TEXT"); + b.Property("Format") .HasColumnType("INTEGER"); b.Property("LastChapterAdded") .HasColumnType("TEXT"); + b.Property("LastFolderScanned") + .HasColumnType("TEXT"); + b.Property("LastModified") .HasColumnType("TEXT"); @@ -812,6 +818,9 @@ namespace API.Data.Migrations b.Property("NameLocked") .HasColumnType("INTEGER"); + b.Property("NormalizedLocalizedName") + .HasColumnType("TEXT"); + b.Property("NormalizedName") .HasColumnType("TEXT"); diff --git a/API/Data/Repositories/CollectionTagRepository.cs b/API/Data/Repositories/CollectionTagRepository.cs index da44d5e18..7b9398b85 100644 --- a/API/Data/Repositories/CollectionTagRepository.cs +++ b/API/Data/Repositories/CollectionTagRepository.cs @@ -56,6 +56,7 @@ public class CollectionTagRepository : ICollectionTagRepository /// public async Task RemoveTagsWithoutSeries() { + // TODO: Write a Unit test to validate this works var tagsToDelete = await _context.CollectionTag .Include(c => c.SeriesMetadatas) .Where(c => c.SeriesMetadatas.Count == 0) diff --git a/API/Data/Repositories/GenreRepository.cs b/API/Data/Repositories/GenreRepository.cs index c5b151ac7..7457adb24 100644 --- a/API/Data/Repositories/GenreRepository.cs +++ b/API/Data/Repositories/GenreRepository.cs @@ -44,7 +44,7 @@ public class GenreRepository : IGenreRepository public async Task FindByNameAsync(string genreName) { - var normalizedName = Parser.Parser.Normalize(genreName); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(genreName); return await _context.Genre .FirstOrDefaultAsync(g => g.NormalizedTitle.Equals(normalizedName)); } diff --git a/API/Data/Repositories/LibraryRepository.cs b/API/Data/Repositories/LibraryRepository.cs index 782247a1a..b967cece8 100644 --- a/API/Data/Repositories/LibraryRepository.cs +++ b/API/Data/Repositories/LibraryRepository.cs @@ -34,19 +34,19 @@ public interface ILibraryRepository Task> GetLibraryDtosAsync(); Task LibraryExists(string libraryName); Task GetLibraryForIdAsync(int libraryId, LibraryIncludes includes); - Task GetFullLibraryForIdAsync(int libraryId); - Task GetFullLibraryForIdAsync(int libraryId, int seriesId); Task> GetLibraryDtosForUsernameAsync(string userName); - Task> GetLibrariesAsync(); + Task> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None); Task DeleteLibrary(int libraryId); Task> GetLibrariesForUserIdAsync(int userId); Task GetLibraryTypeAsync(int libraryId); - Task> GetLibraryForIdsAsync(IList libraryIds); + Task> GetLibraryForIdsAsync(IEnumerable libraryIds, LibraryIncludes includes = LibraryIncludes.None); Task GetTotalFiles(); IEnumerable GetJumpBarAsync(int libraryId); Task> GetAllAgeRatingsDtosForLibrariesAsync(List libraryIds); Task> GetAllLanguagesForLibrariesAsync(List libraryIds); IEnumerable GetAllPublicationStatusesDtosForLibrariesAsync(List libraryIds); + Task DoAnySeriesFoldersMatch(IEnumerable folders); + Library GetLibraryByFolder(string folder); } public class LibraryRepository : ILibraryRepository @@ -87,11 +87,19 @@ public class LibraryRepository : ILibraryRepository .ToListAsync(); } - public async Task> GetLibrariesAsync() + /// + /// Returns all libraries including their AppUsers + extra includes + /// + /// + /// + public async Task> GetLibrariesAsync(LibraryIncludes includes = LibraryIncludes.None) { - return await _context.Library + var query = _context.Library .Include(l => l.AppUsers) - .ToListAsync(); + .Select(l => l); + + query = AddIncludesToQuery(query, includes); + return await query.ToListAsync(); } public async Task DeleteLibrary(int libraryId) @@ -120,11 +128,13 @@ public class LibraryRepository : ILibraryRepository .SingleAsync(); } - public async Task> GetLibraryForIdsAsync(IList libraryIds) + public async Task> GetLibraryForIdsAsync(IEnumerable libraryIds, LibraryIncludes includes = LibraryIncludes.None) { - return await _context.Library - .Where(x => libraryIds.Contains(x.Id)) - .ToListAsync(); + var query = _context.Library + .Where(x => libraryIds.Contains(x.Id)); + + AddIncludesToQuery(query, includes); + return await query.ToListAsync(); } public async Task GetTotalFiles() @@ -317,4 +327,23 @@ public class LibraryRepository : ILibraryRepository .OrderBy(s => s.Title); } + /// + /// Checks if any series folders match the folders passed in + /// + /// + /// + public async Task DoAnySeriesFoldersMatch(IEnumerable folders) + { + var normalized = folders.Select(Services.Tasks.Scanner.Parser.Parser.NormalizePath); + return await _context.Series.AnyAsync(s => normalized.Contains(s.FolderPath)); + } + + public Library? GetLibraryByFolder(string folder) + { + var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder); + return _context.Library + .Include(l => l.Folders) + .AsSplitQuery() + .SingleOrDefault(l => l.Folders.Select(f => f.Path).Contains(normalized)); + } } diff --git a/API/Data/Repositories/PersonRepository.cs b/API/Data/Repositories/PersonRepository.cs index ff59fe596..83aa18f62 100644 --- a/API/Data/Repositories/PersonRepository.cs +++ b/API/Data/Repositories/PersonRepository.cs @@ -42,7 +42,7 @@ public class PersonRepository : IPersonRepository public async Task FindByNameAsync(string name) { - var normalizedName = Parser.Parser.Normalize(name); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name); return await _context.Person .Where(p => normalizedName.Equals(p.NormalizedName)) .SingleOrDefaultAsync(); diff --git a/API/Data/Repositories/SeriesRepository.cs b/API/Data/Repositories/SeriesRepository.cs index c859ed2de..43b748a2a 100644 --- a/API/Data/Repositories/SeriesRepository.cs +++ b/API/Data/Repositories/SeriesRepository.cs @@ -1,6 +1,5 @@ using System; using System.Collections.Generic; -using System.Globalization; using System.Linq; using System.Text.RegularExpressions; using System.Threading.Tasks; @@ -19,12 +18,11 @@ using API.Extensions; using API.Helpers; using API.Services; using API.Services.Tasks; +using API.Services.Tasks.Scanner; using AutoMapper; using AutoMapper.QueryableExtensions; -using Kavita.Common.Extensions; -using Microsoft.AspNetCore.Mvc; using Microsoft.EntityFrameworkCore; -using SQLitePCL; + namespace API.Data.Repositories; @@ -120,6 +118,12 @@ public interface ISeriesRepository Task GetSeriesForMangaFile(int mangaFileId, int userId); Task GetSeriesForChapter(int chapterId, int userId); Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter); + Task GetSeriesIdByFolder(string folder); + Task GetSeriesByFolderPath(string folder); + Task GetFullSeriesByName(string series, int libraryId); + Task GetFullSeriesByAnyName(string seriesName, string localizedName, int libraryId, MangaFormat format, bool withFullIncludes = true); + Task> RemoveSeriesNotInList(IList seenSeries, int libraryId); + Task>> GetFolderPathMap(int libraryId); } public class SeriesRepository : ISeriesRepository @@ -156,6 +160,7 @@ public class SeriesRepository : ISeriesRepository /// Returns if a series name and format exists already in a library /// /// Name of series + /// /// Format of series /// public async Task DoesSeriesNameExistInLibrary(string name, int libraryId, MangaFormat format) @@ -179,6 +184,7 @@ public class SeriesRepository : ISeriesRepository /// Used for to /// /// + /// /// public async Task> GetFullSeriesForLibraryIdAsync(int libraryId, UserParams userParams) { @@ -224,6 +230,7 @@ public class SeriesRepository : ISeriesRepository { return await _context.Series .Where(s => s.Id == seriesId) + .Include(s => s.Relations) .Include(s => s.Metadata) .ThenInclude(m => m.People) .Include(s => s.Metadata) @@ -295,7 +302,7 @@ public class SeriesRepository : ISeriesRepository { const int maxRecords = 15; var result = new SearchResultGroupDto(); - var searchQueryNormalized = Parser.Parser.Normalize(searchQuery); + var searchQueryNormalized = Services.Tasks.Scanner.Parser.Parser.Normalize(searchQuery); var seriesIds = _context.Series .Where(s => libraryIds.Contains(s.LibraryId)) @@ -432,6 +439,7 @@ public class SeriesRepository : ISeriesRepository /// Returns Volumes, Metadata (Incl Genres and People), and Collection Tags /// /// + /// /// public async Task GetSeriesByIdAsync(int seriesId, SeriesIncludes includes = SeriesIncludes.Volumes | SeriesIncludes.Metadata) { @@ -477,6 +485,7 @@ public class SeriesRepository : ISeriesRepository .Include(s => s.Volumes) .Include(s => s.Metadata) .ThenInclude(m => m.CollectionTags) + .Include(s => s.Relations) .Where(s => seriesIds.Contains(s.Id)) .AsSplitQuery() .ToListAsync(); @@ -1136,21 +1145,162 @@ public class SeriesRepository : ISeriesRepository .SingleOrDefaultAsync(); } - public async Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter) + /// + /// Given a folder path return a Series with the that matches. + /// + /// This will apply normalization on the path. + /// + /// + public async Task GetSeriesIdByFolder(string folder) { - var libraryIds = GetLibraryIdsForUser(userId); - var query = _context.AppUser - .Where(user => user.Id == userId) - .SelectMany(u => u.WantToRead) - .Where(s => libraryIds.Contains(s.LibraryId)) - .AsSplitQuery() - .AsNoTracking(); - - var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query); - - return await PagedList.CreateAsync(filteredQuery.ProjectTo(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize); + var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder); + var series = await _context.Series + .Where(s => s.FolderPath.Equals(normalized)) + .SingleOrDefaultAsync(); + return series?.Id ?? 0; } + /// + /// Return a Series by Folder path. Null if not found. + /// + /// This will be normalized in the query + /// + public async Task GetSeriesByFolderPath(string folder) + { + var normalized = Services.Tasks.Scanner.Parser.Parser.NormalizePath(folder); + return await _context.Series.SingleOrDefaultAsync(s => s.FolderPath.Equals(normalized)); + } + + /// + /// Finds a series by series name for a given library. + /// + /// This pulls everything with the Series, so should be used only when needing tracking on all related tables + /// + /// + /// + public Task GetFullSeriesByName(string series, int libraryId) + { + var localizedSeries = Services.Tasks.Scanner.Parser.Parser.Normalize(series); + return _context.Series + .Where(s => (s.NormalizedName.Equals(localizedSeries) + || s.LocalizedName.Equals(series)) && s.LibraryId == libraryId) + .Include(s => s.Metadata) + .ThenInclude(m => m.People) + .Include(s => s.Metadata) + .ThenInclude(m => m.Genres) + .Include(s => s.Library) + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(cm => cm.People) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Genres) + + + .Include(s => s.Metadata) + .ThenInclude(m => m.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Files) + .AsSplitQuery() + .SingleOrDefaultAsync(); + } + + /// + /// Finds a series by series name or localized name for a given library. + /// + /// This pulls everything with the Series, so should be used only when needing tracking on all related tables + /// + /// + /// + /// Defaults to true. This will query against all foreign keys (deep). If false, just the series will come back + /// + public Task GetFullSeriesByAnyName(string seriesName, string localizedName, int libraryId, MangaFormat format, bool withFullIncludes = true) + { + var normalizedSeries = Services.Tasks.Scanner.Parser.Parser.Normalize(seriesName); + var normalizedLocalized = Services.Tasks.Scanner.Parser.Parser.Normalize(localizedName); + var query = _context.Series + .Where(s => s.LibraryId == libraryId) + .Where(s => s.Format == format && format != MangaFormat.Unknown) + .Where(s => s.NormalizedName.Equals(normalizedSeries) + || (s.NormalizedLocalizedName.Equals(normalizedSeries) && s.NormalizedLocalizedName != string.Empty)); + if (!string.IsNullOrEmpty(normalizedLocalized)) + { + query = query.Where(s => + s.NormalizedName.Equals(normalizedLocalized) || s.NormalizedLocalizedName.Equals(normalizedLocalized)); + } + + if (!withFullIncludes) + { + return query.SingleOrDefaultAsync(); + } + + return query.Include(s => s.Metadata) + .ThenInclude(m => m.People) + .Include(s => s.Metadata) + .ThenInclude(m => m.Genres) + .Include(s => s.Library) + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(cm => cm.People) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Genres) + + + .Include(s => s.Metadata) + .ThenInclude(m => m.Tags) + + .Include(s => s.Volumes) + .ThenInclude(v => v.Chapters) + .ThenInclude(c => c.Files) + .AsSplitQuery() + .SingleOrDefaultAsync(); + } + + + /// + /// Removes series that are not in the seenSeries list. Does not commit. + /// + /// + /// + public async Task> RemoveSeriesNotInList(IList seenSeries, int libraryId) + { + if (seenSeries.Count == 0) return new List(); + var ids = new List(); + foreach (var parsedSeries in seenSeries) + { + var series = await _context.Series + .Where(s => s.Format == parsedSeries.Format && s.NormalizedName == parsedSeries.NormalizedName && + s.LibraryId == libraryId) + .Select(s => s.Id) + .SingleOrDefaultAsync(); + if (series > 0) + { + ids.Add(series); + } + } + + var seriesToRemove = await _context.Series + .Where(s => s.LibraryId == libraryId) + .Where(s => !ids.Contains(s.Id)) + .ToListAsync(); + + _context.Series.RemoveRange(seriesToRemove); + + return seriesToRemove; + } public async Task> GetHighlyRated(int userId, int libraryId, UserParams userParams) { @@ -1320,4 +1470,53 @@ public class SeriesRepository : ISeriesRepository .AsEnumerable(); return ret; } + + public async Task> GetWantToReadForUserAsync(int userId, UserParams userParams, FilterDto filter) + { + var libraryIds = GetLibraryIdsForUser(userId); + var query = _context.AppUser + .Where(user => user.Id == userId) + .SelectMany(u => u.WantToRead) + .Where(s => libraryIds.Contains(s.LibraryId)) + .AsSplitQuery() + .AsNoTracking(); + + var filteredQuery = await CreateFilteredSearchQueryable(userId, 0, filter, query); + + return await PagedList.CreateAsync(filteredQuery.ProjectTo(_mapper.ConfigurationProvider), userParams.PageNumber, userParams.PageSize); + } + + public async Task>> GetFolderPathMap(int libraryId) + { + var info = await _context.Series + .Where(s => s.LibraryId == libraryId) + .AsNoTracking() + .Where(s => s.FolderPath != null) + .Select(s => new SeriesModified() + { + LastScanned = s.LastFolderScanned, + SeriesName = s.Name, + FolderPath = s.FolderPath, + Format = s.Format + }).ToListAsync(); + + var map = new Dictionary>(); + foreach (var series in info) + { + if (!map.ContainsKey(series.FolderPath)) + { + map.Add(series.FolderPath, new List() + { + series + }); + } + else + { + map[series.FolderPath].Add(series); + } + + } + + return map; + } } diff --git a/API/Data/Repositories/TagRepository.cs b/API/Data/Repositories/TagRepository.cs index 8ddb52d67..8faf0440b 100644 --- a/API/Data/Repositories/TagRepository.cs +++ b/API/Data/Repositories/TagRepository.cs @@ -43,7 +43,7 @@ public class TagRepository : ITagRepository public async Task FindByNameAsync(string tagName) { - var normalizedName = Parser.Parser.Normalize(tagName); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(tagName); return await _context.Tag .FirstOrDefaultAsync(g => g.NormalizedTitle.Equals(normalizedName)); } diff --git a/API/Data/Repositories/UserRepository.cs b/API/Data/Repositories/UserRepository.cs index a9e78fe73..e02f414f4 100644 --- a/API/Data/Repositories/UserRepository.cs +++ b/API/Data/Repositories/UserRepository.cs @@ -23,7 +23,9 @@ public enum AppUserIncludes ReadingLists = 8, Ratings = 16, UserPreferences = 32, - WantToRead = 64 + WantToRead = 64, + ReadingListsWithItems = 128, + } public interface IUserRepository @@ -36,7 +38,6 @@ public interface IUserRepository Task> GetEmailConfirmedMemberDtosAsync(); Task> GetPendingMemberDtosAsync(); Task> GetAdminUsersAsync(); - Task> GetNonAdminUsersAsync(); Task IsUserAdminAsync(AppUser user); Task GetUserRatingAsync(int seriesId, int userId); Task GetPreferencesAsync(string username); @@ -51,11 +52,9 @@ public interface IUserRepository Task GetUserByUsernameAsync(string username, AppUserIncludes includeFlags = AppUserIncludes.None); Task GetUserByIdAsync(int userId, AppUserIncludes includeFlags = AppUserIncludes.None); Task GetUserIdByUsernameAsync(string username); - Task GetUserWithReadingListsByUsernameAsync(string username); Task> GetAllBookmarksByIds(IList bookmarkIds); Task GetUserByEmailAsync(string email); Task> GetAllUsers(); - Task> GetAllPreferencesByThemeAsync(int themeId); Task HasAccessToLibrary(int libraryId, int userId); Task> GetAllUsersAsync(AppUserIncludes includeFlags); @@ -167,6 +166,11 @@ public class UserRepository : IUserRepository query = query.Include(u => u.ReadingLists); } + if (includeFlags.HasFlag(AppUserIncludes.ReadingListsWithItems)) + { + query = query.Include(u => u.ReadingLists).ThenInclude(r => r.Items); + } + if (includeFlags.HasFlag(AppUserIncludes.Ratings)) { query = query.Include(u => u.Ratings); @@ -201,19 +205,6 @@ public class UserRepository : IUserRepository .SingleOrDefaultAsync(); } - /// - /// Gets an AppUser by username. Returns back Reading List and their Items. - /// - /// - /// - public async Task GetUserWithReadingListsByUsernameAsync(string username) - { - return await _context.Users - .Include(u => u.ReadingLists) - .ThenInclude(l => l.Items) - .AsSplitQuery() - .SingleOrDefaultAsync(x => x.UserName == username); - } /// /// Returns all Bookmarks for a given set of Ids @@ -267,11 +258,6 @@ public class UserRepository : IUserRepository return await _userManager.GetUsersInRoleAsync(PolicyConstants.AdminRole); } - public async Task> GetNonAdminUsersAsync() - { - return await _userManager.GetUsersInRoleAsync(PolicyConstants.PlebRole); - } - public async Task IsUserAdminAsync(AppUser user) { return await _userManager.IsInRoleAsync(user, PolicyConstants.AdminRole); @@ -404,14 +390,4 @@ public class UserRepository : IUserRepository .AsNoTracking() .ToListAsync(); } - - public async Task ValidateUserExists(string username) - { - if (await _userManager.Users.AnyAsync(x => x.NormalizedUserName == username.ToUpper())) - { - throw new ValidationException("Username is taken."); - } - - return true; - } } diff --git a/API/Data/Seed.cs b/API/Data/Seed.cs index 893256357..97e141eab 100644 --- a/API/Data/Seed.cs +++ b/API/Data/Seed.cs @@ -29,7 +29,7 @@ namespace API.Data new() { Name = "Dark", - NormalizedName = Parser.Parser.Normalize("Dark"), + NormalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize("Dark"), Provider = ThemeProvider.System, FileName = "dark.scss", IsDefault = true, @@ -103,6 +103,7 @@ namespace API.Data new() {Key = ServerSettingKey.ConvertBookmarkToWebP, Value = "false"}, new() {Key = ServerSettingKey.EnableSwaggerUi, Value = "false"}, new() {Key = ServerSettingKey.TotalBackups, Value = "30"}, + new() {Key = ServerSettingKey.EnableFolderWatching, Value = "false"}, }.ToArray()); foreach (var defaultSetting in DefaultSettings) diff --git a/API/Data/UnitOfWork.cs b/API/Data/UnitOfWork.cs index 2e99ac32d..2d2adac42 100644 --- a/API/Data/UnitOfWork.cs +++ b/API/Data/UnitOfWork.cs @@ -1,4 +1,5 @@ -using System.Threading.Tasks; +using System; +using System.Threading.Tasks; using API.Data.Repositories; using API.Entities; using AutoMapper; @@ -26,7 +27,6 @@ public interface IUnitOfWork bool Commit(); Task CommitAsync(); bool HasChanges(); - bool Rollback(); Task RollbackAsync(); } public class UnitOfWork : IUnitOfWork @@ -93,16 +93,15 @@ public class UnitOfWork : IUnitOfWork /// public async Task RollbackAsync() { - await _context.DisposeAsync(); - return true; - } - /// - /// Rollback transaction - /// - /// - public bool Rollback() - { - _context.Dispose(); + try + { + await _context.Database.RollbackTransactionAsync(); + } + catch (Exception) + { + // Swallow exception (this might be used in places where a transaction isn't setup) + } + return true; } } diff --git a/API/Entities/Enums/MangaFormat.cs b/API/Entities/Enums/MangaFormat.cs index 45497df4b..07e34ed77 100644 --- a/API/Entities/Enums/MangaFormat.cs +++ b/API/Entities/Enums/MangaFormat.cs @@ -9,13 +9,13 @@ namespace API.Entities.Enums { /// /// Image file - /// See for supported extensions + /// See for supported extensions /// [Description("Image")] Image = 0, /// /// Archive based file - /// See for supported extensions + /// See for supported extensions /// [Description("Archive")] Archive = 1, diff --git a/API/Entities/Enums/ServerSettingKey.cs b/API/Entities/Enums/ServerSettingKey.cs index b387f1d85..3fcf938b2 100644 --- a/API/Entities/Enums/ServerSettingKey.cs +++ b/API/Entities/Enums/ServerSettingKey.cs @@ -91,5 +91,10 @@ namespace API.Entities.Enums /// [Description("TotalBackups")] TotalBackups = 16, + /// + /// If Kavita should watch the library folders and process changes + /// + [Description("EnableFolderWatching")] + EnableFolderWatching = 17, } } diff --git a/API/Entities/FolderPath.cs b/API/Entities/FolderPath.cs index 267564fe8..20ba4f466 100644 --- a/API/Entities/FolderPath.cs +++ b/API/Entities/FolderPath.cs @@ -8,8 +8,9 @@ namespace API.Entities public int Id { get; set; } public string Path { get; set; } /// - /// Used when scanning to see if we can skip if nothing has changed. (not implemented) + /// Used when scanning to see if we can skip if nothing has changed /// + /// Time stored in UTC public DateTime LastScanned { get; set; } // Relationship diff --git a/API/Entities/Library.cs b/API/Entities/Library.cs index c77fb68dd..fd9956b1f 100644 --- a/API/Entities/Library.cs +++ b/API/Entities/Library.cs @@ -1,5 +1,7 @@ using System; using System.Collections.Generic; +using System.IO; +using System.Linq; using API.Entities.Enums; using API.Entities.Interfaces; @@ -9,6 +11,10 @@ namespace API.Entities { public int Id { get; set; } public string Name { get; set; } + /// + /// Update this summary with a way it's used, else let's remove it. + /// + [Obsolete("This has never been coded for. Likely we can remove it.")] public string CoverImage { get; set; } public LibraryType Type { get; set; } public DateTime Created { get; set; } @@ -16,10 +22,22 @@ namespace API.Entities /// /// Last time Library was scanned /// + /// Time stored in UTC public DateTime LastScanned { get; set; } public ICollection Folders { get; set; } public ICollection AppUsers { get; set; } public ICollection Series { get; set; } + // Methods + /// + /// Has there been any modifications to the FolderPath's directory since the date + /// + /// + public bool AnyModificationsSinceLastScan() + { + // NOTE: I don't think we can do this due to NTFS + return Folders.All(folder => File.GetLastWriteTimeUtc(folder.Path) > folder.LastScanned); + } + } } diff --git a/API/Entities/Series.cs b/API/Entities/Series.cs index f345386d3..7fa02f67b 100644 --- a/API/Entities/Series.cs +++ b/API/Entities/Series.cs @@ -14,10 +14,14 @@ public class Series : IEntityDate, IHasReadTimeEstimate /// public string Name { get; set; } /// - /// Used internally for name matching. + /// Used internally for name matching. /// public string NormalizedName { get; set; } /// + /// Used internally for localized name matching. + /// + public string NormalizedLocalizedName { get; set; } + /// /// The name used to sort the Series. By default, will be the same as Name. /// public string SortName { get; set; } @@ -50,7 +54,15 @@ public class Series : IEntityDate, IHasReadTimeEstimate /// Sum of all Volume page counts /// public int Pages { get; set; } - + /// + /// Highest path (that is under library root) that contains the series. + /// + /// must be used before setting + public string FolderPath { get; set; } + /// + /// Last time the folder was scanned + /// + public DateTime LastFolderScanned { get; set; } /// /// The type of all the files attached to this series /// diff --git a/API/Extensions/ApplicationServiceExtensions.cs b/API/Extensions/ApplicationServiceExtensions.cs index 1b637b25f..d4fa19258 100644 --- a/API/Extensions/ApplicationServiceExtensions.cs +++ b/API/Extensions/ApplicationServiceExtensions.cs @@ -4,6 +4,7 @@ using API.Helpers; using API.Services; using API.Services.Tasks; using API.Services.Tasks.Metadata; +using API.Services.Tasks.Scanner; using API.SignalR; using API.SignalR.Presence; using Kavita.Common; @@ -46,10 +47,13 @@ namespace API.Extensions services.AddScoped(); services.AddScoped(); services.AddScoped(); + services.AddScoped(); + services.AddScoped(); services.AddScoped(); services.AddScoped(); services.AddScoped(); + services.AddScoped(); diff --git a/API/Extensions/SeriesExtensions.cs b/API/Extensions/SeriesExtensions.cs index cd3254e34..acd828480 100644 --- a/API/Extensions/SeriesExtensions.cs +++ b/API/Extensions/SeriesExtensions.cs @@ -16,8 +16,8 @@ namespace API.Extensions /// public static bool NameInList(this Series series, IEnumerable list) { - return list.Any(name => Parser.Parser.Normalize(name) == series.NormalizedName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.Name) - || name == series.Name || name == series.LocalizedName || name == series.OriginalName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.OriginalName)); + return list.Any(name => Services.Tasks.Scanner.Parser.Parser.Normalize(name) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name) + || name == series.Name || name == series.LocalizedName || name == series.OriginalName || Services.Tasks.Scanner.Parser.Parser.Normalize(name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName)); } /// @@ -28,8 +28,8 @@ namespace API.Extensions /// public static bool NameInList(this Series series, IEnumerable list) { - return list.Any(name => Parser.Parser.Normalize(name.Name) == series.NormalizedName || Parser.Parser.Normalize(name.Name) == Parser.Parser.Normalize(series.Name) - || name.Name == series.Name || name.Name == series.LocalizedName || name.Name == series.OriginalName || Parser.Parser.Normalize(name.Name) == Parser.Parser.Normalize(series.OriginalName) && series.Format == name.Format); + return list.Any(name => Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name) + || name.Name == series.Name || name.Name == series.LocalizedName || name.Name == series.OriginalName || Services.Tasks.Scanner.Parser.Parser.Normalize(name.Name) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName) && series.Format == name.Format); } /// @@ -41,9 +41,9 @@ namespace API.Extensions public static bool NameInParserInfo(this Series series, ParserInfo info) { if (info == null) return false; - return Parser.Parser.Normalize(info.Series) == series.NormalizedName || Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.Name) - || info.Series == series.Name || info.Series == series.LocalizedName || info.Series == series.OriginalName - || Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.OriginalName); + return Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == series.NormalizedName || Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name) + || info.Series == series.Name || info.Series == series.LocalizedName || info.Series == series.OriginalName + || Services.Tasks.Scanner.Parser.Parser.Normalize(info.Series) == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName); } } } diff --git a/API/Helpers/Converters/ServerSettingConverter.cs b/API/Helpers/Converters/ServerSettingConverter.cs index 35c2428f3..6cc48e9eb 100644 --- a/API/Helpers/Converters/ServerSettingConverter.cs +++ b/API/Helpers/Converters/ServerSettingConverter.cs @@ -60,6 +60,9 @@ namespace API.Helpers.Converters case ServerSettingKey.InstallId: destination.InstallId = row.Value; break; + case ServerSettingKey.EnableFolderWatching: + destination.EnableFolderWatching = bool.Parse(row.Value); + break; } } diff --git a/API/Helpers/GenreHelper.cs b/API/Helpers/GenreHelper.cs index aa465f58e..5eadea8fa 100644 --- a/API/Helpers/GenreHelper.cs +++ b/API/Helpers/GenreHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -21,7 +22,7 @@ public static class GenreHelper { if (string.IsNullOrEmpty(name.Trim())) continue; - var normalizedName = Parser.Parser.Normalize(name); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name); var genre = allGenres.FirstOrDefault(p => p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); if (genre == null) @@ -34,6 +35,7 @@ public static class GenreHelper } } + public static void KeepOnlySameGenreBetweenLists(ICollection existingGenres, ICollection removeAllExcept, Action action = null) { var existing = existingGenres.ToList(); @@ -55,7 +57,17 @@ public static class GenreHelper public static void AddGenreIfNotExists(ICollection metadataGenres, Genre genre) { var existingGenre = metadataGenres.FirstOrDefault(p => - p.NormalizedTitle == Parser.Parser.Normalize(genre.Title)); + p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(genre.Title)); + if (existingGenre == null) + { + metadataGenres.Add(genre); + } + } + + public static void AddGenreIfNotExists(BlockingCollection metadataGenres, Genre genre) + { + var existingGenre = metadataGenres.FirstOrDefault(p => + p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(genre.Title)); if (existingGenre == null) { metadataGenres.Add(genre); diff --git a/API/Helpers/ParserInfoHelpers.cs b/API/Helpers/ParserInfoHelpers.cs index a97601a43..c303fd2fb 100644 --- a/API/Helpers/ParserInfoHelpers.cs +++ b/API/Helpers/ParserInfoHelpers.cs @@ -16,20 +16,20 @@ public static class ParserInfoHelpers /// /// public static bool SeriesHasMatchingParserInfoFormat(Series series, - Dictionary> parsedSeries) + Dictionary> parsedSeries) { var format = MangaFormat.Unknown; foreach (var pSeries in parsedSeries.Keys) { var name = pSeries.Name; - var normalizedName = Parser.Parser.Normalize(name); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name); //if (series.NameInParserInfo(pSeries.)) if (normalizedName == series.NormalizedName || - normalizedName == Parser.Parser.Normalize(series.Name) || + normalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(series.Name) || name == series.Name || name == series.LocalizedName || name == series.OriginalName || - normalizedName == Parser.Parser.Normalize(series.OriginalName)) + normalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName)) { format = pSeries.Format; if (format == series.Format) diff --git a/API/Helpers/PersonHelper.cs b/API/Helpers/PersonHelper.cs index 18dbe1f2e..adcdd4b08 100644 --- a/API/Helpers/PersonHelper.cs +++ b/API/Helpers/PersonHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -25,7 +26,7 @@ public static class PersonHelper foreach (var name in names) { - var normalizedName = Parser.Parser.Normalize(name); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name); var person = allPeopleTypeRole.FirstOrDefault(p => p.NormalizedName.Equals(normalizedName)); if (person == null) @@ -48,7 +49,7 @@ public static class PersonHelper /// Callback which will be executed for each person removed public static void RemovePeople(ICollection existingPeople, IEnumerable people, PersonRole role, Action action = null) { - var normalizedPeople = people.Select(Parser.Parser.Normalize).ToList(); + var normalizedPeople = people.Select(Services.Tasks.Scanner.Parser.Parser.Normalize).ToList(); if (normalizedPeople.Count == 0) { var peopleToRemove = existingPeople.Where(p => p.Role == role).ToList(); @@ -81,7 +82,8 @@ public static class PersonHelper { foreach (var person in existingPeople) { - var existingPerson = removeAllExcept.FirstOrDefault(p => p.Role == person.Role && person.NormalizedName.Equals(p.NormalizedName)); + var existingPerson = removeAllExcept + .FirstOrDefault(p => p.Role == person.Role && person.NormalizedName.Equals(p.NormalizedName)); if (existingPerson == null) { action?.Invoke(person); @@ -97,7 +99,22 @@ public static class PersonHelper public static void AddPersonIfNotExists(ICollection metadataPeople, Person person) { var existingPerson = metadataPeople.SingleOrDefault(p => - p.NormalizedName == Parser.Parser.Normalize(person.Name) && p.Role == person.Role); + p.NormalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(person.Name) && p.Role == person.Role); + if (existingPerson == null) + { + metadataPeople.Add(person); + } + } + + /// + /// Adds the person to the list if it's not already in there + /// + /// + /// + public static void AddPersonIfNotExists(BlockingCollection metadataPeople, Person person) + { + var existingPerson = metadataPeople.SingleOrDefault(p => + p.NormalizedName == Services.Tasks.Scanner.Parser.Parser.Normalize(person.Name) && p.Role == person.Role); if (existingPerson == null) { metadataPeople.Add(person); diff --git a/API/Helpers/SeriesHelper.cs b/API/Helpers/SeriesHelper.cs index f92039a24..b30969805 100644 --- a/API/Helpers/SeriesHelper.cs +++ b/API/Helpers/SeriesHelper.cs @@ -17,8 +17,8 @@ public static class SeriesHelper public static bool FindSeries(Series series, ParsedSeries parsedInfoKey) { return (series.NormalizedName.Equals(parsedInfoKey.NormalizedName) || - Parser.Parser.Normalize(series.LocalizedName).Equals(parsedInfoKey.NormalizedName) || - Parser.Parser.Normalize(series.OriginalName).Equals(parsedInfoKey.NormalizedName)) + Services.Tasks.Scanner.Parser.Parser.Normalize(series.LocalizedName).Equals(parsedInfoKey.NormalizedName) || + Services.Tasks.Scanner.Parser.Parser.Normalize(series.OriginalName).Equals(parsedInfoKey.NormalizedName)) && (series.Format == parsedInfoKey.Format || series.Format == MangaFormat.Unknown); } diff --git a/API/Helpers/TagHelper.cs b/API/Helpers/TagHelper.cs index 4c230a053..f7b1abfd4 100644 --- a/API/Helpers/TagHelper.cs +++ b/API/Helpers/TagHelper.cs @@ -1,4 +1,5 @@ using System; +using System.Collections.Concurrent; using System.Collections.Generic; using System.Linq; using API.Data; @@ -22,7 +23,7 @@ public static class TagHelper if (string.IsNullOrEmpty(name.Trim())) continue; var added = false; - var normalizedName = Parser.Parser.Normalize(name); + var normalizedName = Services.Tasks.Scanner.Parser.Parser.Normalize(name); var genre = allTags.FirstOrDefault(p => p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); @@ -58,7 +59,17 @@ public static class TagHelper public static void AddTagIfNotExists(ICollection metadataTags, Tag tag) { var existingGenre = metadataTags.FirstOrDefault(p => - p.NormalizedTitle == Parser.Parser.Normalize(tag.Title)); + p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(tag.Title)); + if (existingGenre == null) + { + metadataTags.Add(tag); + } + } + + public static void AddTagIfNotExists(BlockingCollection metadataTags, Tag tag) + { + var existingGenre = metadataTags.FirstOrDefault(p => + p.NormalizedTitle == Services.Tasks.Scanner.Parser.Parser.Normalize(tag.Title)); if (existingGenre == null) { metadataTags.Add(tag); @@ -75,7 +86,7 @@ public static class TagHelper /// Callback which will be executed for each tag removed public static void RemoveTags(ICollection existingTags, IEnumerable tags, bool isExternal, Action action = null) { - var normalizedTags = tags.Select(Parser.Parser.Normalize).ToList(); + var normalizedTags = tags.Select(Services.Tasks.Scanner.Parser.Parser.Normalize).ToList(); foreach (var person in normalizedTags) { var existingTag = existingTags.FirstOrDefault(p => p.ExternalTag == isExternal && person.Equals(p.NormalizedTitle)); diff --git a/API/Services/ArchiveService.cs b/API/Services/ArchiveService.cs index f9f5b7588..58a2b0aae 100644 --- a/API/Services/ArchiveService.cs +++ b/API/Services/ArchiveService.cs @@ -60,7 +60,7 @@ namespace API.Services /// public virtual ArchiveLibrary CanOpen(string archivePath) { - if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported; + if (string.IsNullOrEmpty(archivePath) || !(File.Exists(archivePath) && Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported; var ext = _directoryService.FileSystem.Path.GetExtension(archivePath).ToUpper(); if (ext.Equals(".CBR") || ext.Equals(".RAR")) return ArchiveLibrary.SharpCompress; @@ -100,14 +100,14 @@ namespace API.Services case ArchiveLibrary.Default: { using var archive = ZipFile.OpenRead(archivePath); - return archive.Entries.Count(e => !Parser.Parser.HasBlacklistedFolderInPath(e.FullName) && Parser.Parser.IsImage(e.FullName)); + return archive.Entries.Count(e => !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName) && Tasks.Scanner.Parser.Parser.IsImage(e.FullName)); } case ArchiveLibrary.SharpCompress: { using var archive = ArchiveFactory.Open(archivePath); return archive.Entries.Count(entry => !entry.IsDirectory && - !Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty) - && Parser.Parser.IsImage(entry.Key)); + !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty) + && Tasks.Scanner.Parser.Parser.IsImage(entry.Key)); } case ArchiveLibrary.NotSupported: _logger.LogWarning("[GetNumberOfPagesFromArchive] This archive cannot be read: {ArchivePath}. Defaulting to 0 pages", archivePath); @@ -132,24 +132,25 @@ namespace API.Services public static string FindFolderEntry(IEnumerable entryFullNames) { var result = entryFullNames - .Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith))) + .Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith))) .OrderByNatural(Path.GetFileNameWithoutExtension) - .FirstOrDefault(Parser.Parser.IsCoverImage); + .FirstOrDefault(Tasks.Scanner.Parser.Parser.IsCoverImage); return string.IsNullOrEmpty(result) ? null : result; } /// - /// Returns first entry that is an image and is not in a blacklisted folder path. Uses for ordering files + /// Returns first entry that is an image and is not in a blacklisted folder path. Uses for ordering files /// /// + /// /// Entry name of match, null if no match public static string? FirstFileEntry(IEnumerable entryFullNames, string archiveName) { // First check if there are any files that are not in a nested folder before just comparing by filename. This is needed // because NaturalSortComparer does not work with paths and doesn't seem 001.jpg as before chapter 1/001.jpg. var fullNames = entryFullNames - .Where(path => !(Path.EndsInDirectorySeparator(path) || Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)) && Parser.Parser.IsImage(path)) + .Where(path => !(Path.EndsInDirectorySeparator(path) || Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(path) || path.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)) && Tasks.Scanner.Parser.Parser.IsImage(path)) .OrderByNatural(c => c.GetFullPathWithoutExtension()) .ToList(); if (fullNames.Count == 0) return null; @@ -186,7 +187,7 @@ namespace API.Services /// /// Generates byte array of cover image. - /// Given a path to a compressed file , will ensure the first image (respects directory structure) is returned unless + /// Given a path to a compressed file , will ensure the first image (respects directory structure) is returned unless /// a folder/cover.(image extension) exists in the the compressed file (if duplicate, the first is chosen) /// /// This skips over any __MACOSX folder/file iteration. @@ -264,7 +265,7 @@ namespace API.Services // Sometimes ZipArchive will list the directory and others it will just keep it in the FullName return archive.Entries.Count > 0 && !Path.HasExtension(archive.Entries.ElementAt(0).FullName) || - archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Parser.Parser.HasBlacklistedFolderInPath(e.FullName)); + archive.Entries.Any(e => e.FullName.Contains(Path.AltDirectorySeparatorChar) && !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(e.FullName)); } /// @@ -321,7 +322,7 @@ namespace API.Services return false; } - if (Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath)) return true; + if (Tasks.Scanner.Parser.Parser.IsArchive(archivePath) || Tasks.Scanner.Parser.Parser.IsEpub(archivePath)) return true; _logger.LogWarning("Archive {ArchivePath} is not a valid archive", archivePath); return false; @@ -330,10 +331,10 @@ namespace API.Services private static bool ValidComicInfoArchiveEntry(string fullName, string name) { var filenameWithoutExtension = Path.GetFileNameWithoutExtension(name).ToLower(); - return !Parser.Parser.HasBlacklistedFolderInPath(fullName) + return !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(fullName) && filenameWithoutExtension.Equals(ComicInfoFilename, StringComparison.InvariantCultureIgnoreCase) - && !filenameWithoutExtension.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith) - && Parser.Parser.IsXml(name); + && !filenameWithoutExtension.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith) + && Tasks.Scanner.Parser.Parser.IsXml(name); } /// @@ -466,8 +467,8 @@ namespace API.Services { using var archive = ArchiveFactory.Open(archivePath); ExtractArchiveEntities(archive.Entries.Where(entry => !entry.IsDirectory - && !Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty) - && Parser.Parser.IsImage(entry.Key)), extractPath); + && !Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty) + && Tasks.Scanner.Parser.Parser.IsImage(entry.Key)), extractPath); break; } case ArchiveLibrary.NotSupported: diff --git a/API/Services/BookService.cs b/API/Services/BookService.cs index 26c41edbb..d28183f9e 100644 --- a/API/Services/BookService.cs +++ b/API/Services/BookService.cs @@ -167,7 +167,7 @@ namespace API.Services // @Import statements will be handled by browser, so we must inline the css into the original file that request it, so they can be Scoped var prepend = filename.Length > 0 ? filename.Replace(Path.GetFileName(filename), string.Empty) : string.Empty; var importBuilder = new StringBuilder(); - foreach (Match match in Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml)) + foreach (Match match in Tasks.Scanner.Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml)) { if (!match.Success) continue; @@ -218,7 +218,7 @@ namespace API.Services private static void EscapeCssImportReferences(ref string stylesheetHtml, string apiBase, string prepend) { - foreach (Match match in Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml)) + foreach (Match match in Tasks.Scanner.Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml)) { if (!match.Success) continue; var importFile = match.Groups["Filename"].Value; @@ -228,7 +228,7 @@ namespace API.Services private static void EscapeFontFamilyReferences(ref string stylesheetHtml, string apiBase, string prepend) { - foreach (Match match in Parser.Parser.FontSrcUrlRegex.Matches(stylesheetHtml)) + foreach (Match match in Tasks.Scanner.Parser.Parser.FontSrcUrlRegex.Matches(stylesheetHtml)) { if (!match.Success) continue; var importFile = match.Groups["Filename"].Value; @@ -238,7 +238,7 @@ namespace API.Services private static void EscapeCssImageReferences(ref string stylesheetHtml, string apiBase, EpubBookRef book) { - var matches = Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml); + var matches = Tasks.Scanner.Parser.Parser.CssImageUrlRegex.Matches(stylesheetHtml); foreach (Match match in matches) { if (!match.Success) continue; @@ -394,7 +394,7 @@ namespace API.Services public ComicInfo GetComicInfo(string filePath) { - if (!IsValidFile(filePath) || Parser.Parser.IsPdf(filePath)) return null; + if (!IsValidFile(filePath) || Tasks.Scanner.Parser.Parser.IsPdf(filePath)) return null; try { @@ -425,7 +425,7 @@ namespace API.Services var info = new ComicInfo() { Summary = epubBook.Schema.Package.Metadata.Description, - Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Parser.Parser.CleanAuthor(c.Creator))), + Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Tasks.Scanner.Parser.Parser.CleanAuthor(c.Creator))), Publisher = string.Join(",", epubBook.Schema.Package.Metadata.Publishers), Month = month, Day = day, @@ -468,7 +468,7 @@ namespace API.Services return false; } - if (Parser.Parser.IsBook(filePath)) return true; + if (Tasks.Scanner.Parser.Parser.IsBook(filePath)) return true; _logger.LogWarning("[BookService] Book {EpubFile} is not a valid EPUB/PDF", filePath); return false; @@ -480,7 +480,7 @@ namespace API.Services try { - if (Parser.Parser.IsPdf(filePath)) + if (Tasks.Scanner.Parser.Parser.IsPdf(filePath)) { using var docReader = DocLib.Instance.GetDocReader(filePath, new PageDimensions(1080, 1920)); return docReader.GetPageCount(); @@ -536,7 +536,7 @@ namespace API.Services /// public ParserInfo ParseInfo(string filePath) { - if (!Parser.Parser.IsEpub(filePath)) return null; + if (!Tasks.Scanner.Parser.Parser.IsEpub(filePath)) return null; try { @@ -601,7 +601,7 @@ namespace API.Services } var info = new ParserInfo() { - Chapters = Parser.Parser.DefaultChapter, + Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter, Edition = string.Empty, Format = MangaFormat.Epub, Filename = Path.GetFileName(filePath), @@ -628,7 +628,7 @@ namespace API.Services return new ParserInfo() { - Chapters = Parser.Parser.DefaultChapter, + Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter, Edition = string.Empty, Format = MangaFormat.Epub, Filename = Path.GetFileName(filePath), @@ -636,7 +636,7 @@ namespace API.Services FullFilePath = filePath, IsSpecial = false, Series = epubBook.Title.Trim(), - Volumes = Parser.Parser.DefaultVolume, + Volumes = Tasks.Scanner.Parser.Parser.DefaultVolume, }; } catch (Exception ex) @@ -876,7 +876,7 @@ namespace API.Services { if (!IsValidFile(fileFilePath)) return string.Empty; - if (Parser.Parser.IsPdf(fileFilePath)) + if (Tasks.Scanner.Parser.Parser.IsPdf(fileFilePath)) { return GetPdfCoverImage(fileFilePath, fileName, outputDirectory); } @@ -887,7 +887,7 @@ namespace API.Services { // Try to get the cover image from OPF file, if not set, try to parse it from all the files, then result to the first one. var coverImageContent = epubBook.Content.Cover - ?? epubBook.Content.Images.Values.FirstOrDefault(file => Parser.Parser.IsCoverImage(file.FileName)) + ?? epubBook.Content.Images.Values.FirstOrDefault(file => Tasks.Scanner.Parser.Parser.IsCoverImage(file.FileName)) ?? epubBook.Content.Images.Values.FirstOrDefault(); if (coverImageContent == null) return string.Empty; diff --git a/API/Services/BookmarkService.cs b/API/Services/BookmarkService.cs index 6b07efd00..c798c47d1 100644 --- a/API/Services/BookmarkService.cs +++ b/API/Services/BookmarkService.cs @@ -51,7 +51,7 @@ public class BookmarkService : IBookmarkService var bookmarkDirectory = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value; - var bookmarkFilesToDelete = bookmarks.Select(b => Parser.Parser.NormalizePath( + var bookmarkFilesToDelete = bookmarks.Select(b => Tasks.Scanner.Parser.Parser.NormalizePath( _directoryService.FileSystem.Path.Join(bookmarkDirectory, b.FileName))).ToList(); @@ -165,7 +165,7 @@ public class BookmarkService : IBookmarkService var bookmarks = await _unitOfWork.UserRepository.GetAllBookmarksByIds(bookmarkIds.ToList()); return bookmarks - .Select(b => Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(bookmarkDirectory, + .Select(b => Tasks.Scanner.Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(bookmarkDirectory, b.FileName))); } diff --git a/API/Services/CacheService.cs b/API/Services/CacheService.cs index e9bb693eb..b81b87d91 100644 --- a/API/Services/CacheService.cs +++ b/API/Services/CacheService.cs @@ -57,7 +57,7 @@ namespace API.Services { // Calculate what chapter the page belongs to var path = GetBookmarkCachePath(seriesId); - var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions); + var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions); files = files .AsEnumerable() .OrderByNatural(Path.GetFileNameWithoutExtension) @@ -100,11 +100,9 @@ namespace API.Services var chapter = await _unitOfWork.ChapterRepository.GetChapterAsync(chapterId); var extractPath = GetCachePath(chapterId); - if (!_directoryService.Exists(extractPath)) - { - var files = chapter.Files.ToList(); - ExtractChapterFiles(extractPath, files); - } + if (_directoryService.Exists(extractPath)) return chapter; + var files = chapter.Files.ToList(); + ExtractChapterFiles(extractPath, files); return chapter; } @@ -215,9 +213,8 @@ namespace API.Services { // Calculate what chapter the page belongs to var path = GetCachePath(chapter.Id); - var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions); - files = files - .AsEnumerable() + // TODO: We can optimize this by extracting and renaming, so we don't need to scan for the files and can do a direct access + var files = _directoryService.GetFilesWithExtension(path, Tasks.Scanner.Parser.Parser.ImageFileExtensions) .OrderByNatural(Path.GetFileNameWithoutExtension) .ToArray(); diff --git a/API/Services/DirectoryService.cs b/API/Services/DirectoryService.cs index d3976da67..54757f651 100644 --- a/API/Services/DirectoryService.cs +++ b/API/Services/DirectoryService.cs @@ -1,6 +1,7 @@ using System; using System.Collections.Generic; using System.Collections.Immutable; +using System.Diagnostics; using System.IO; using System.IO.Abstractions; using System.Linq; @@ -9,6 +10,8 @@ using System.Threading.Tasks; using API.DTOs.System; using API.Entities.Enums; using API.Extensions; +using Kavita.Common.Helpers; +using Microsoft.Extensions.FileSystemGlobbing; using Microsoft.Extensions.Logging; namespace API.Services @@ -57,9 +60,23 @@ namespace API.Services void RemoveNonImages(string directoryName); void Flatten(string directoryName); Task CheckWriteAccess(string directoryName); + + IEnumerable GetFilesWithCertainExtensions(string path, + string searchPatternExpression = "", + SearchOption searchOption = SearchOption.TopDirectoryOnly); + + IEnumerable GetDirectories(string folderPath); + IEnumerable GetDirectories(string folderPath, GlobMatcher matcher); + string GetParentDirectoryName(string fileOrFolder); + #nullable enable + IList ScanFiles(string folderPath, GlobMatcher? matcher = null); + DateTime GetLastWriteTime(string folderPath); + GlobMatcher CreateMatcherFromFile(string filePath); +#nullable disable } public class DirectoryService : IDirectoryService { + public const string KavitaIgnoreFile = ".kavitaignore"; public IFileSystem FileSystem { get; } public string CacheDirectory { get; } public string CoverImageDirectory { get; } @@ -100,12 +117,12 @@ namespace API.Services /// /// Given a set of regex search criteria, get files in the given path. /// - /// This will always exclude patterns + /// This will always exclude patterns /// Directory to search /// Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files. /// SearchOption to use, defaults to TopDirectoryOnly /// List of file paths - private IEnumerable GetFilesWithCertainExtensions(string path, + public IEnumerable GetFilesWithCertainExtensions(string path, string searchPatternExpression = "", SearchOption searchOption = SearchOption.TopDirectoryOnly) { @@ -114,7 +131,7 @@ namespace API.Services return FileSystem.Directory.EnumerateFiles(path, "*", searchOption) .Where(file => - reSearchPattern.IsMatch(FileSystem.Path.GetExtension(file)) && !FileSystem.Path.GetFileName(file).StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)); + reSearchPattern.IsMatch(FileSystem.Path.GetExtension(file)) && !FileSystem.Path.GetFileName(file).StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)); } @@ -191,12 +208,12 @@ namespace API.Services { var fileName = FileSystem.Path.GetFileName(file); return reSearchPattern.IsMatch(fileName) && - !fileName.StartsWith(Parser.Parser.MacOsMetadataFileStartsWith); + !fileName.StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith); }); } return FileSystem.Directory.EnumerateFiles(path, "*", searchOption).Where(file => - !FileSystem.Path.GetFileName(file).StartsWith(Parser.Parser.MacOsMetadataFileStartsWith)); + !FileSystem.Path.GetFileName(file).StartsWith(Tasks.Scanner.Parser.Parser.MacOsMetadataFileStartsWith)); } /// @@ -480,10 +497,10 @@ namespace API.Services { var stopLookingForDirectories = false; var dirs = new Dictionary(); - foreach (var folder in libraryFolders) + foreach (var folder in libraryFolders.Select(Tasks.Scanner.Parser.Parser.NormalizePath)) { if (stopLookingForDirectories) break; - foreach (var file in filePaths) + foreach (var file in filePaths.Select(Tasks.Scanner.Parser.Parser.NormalizePath)) { if (!file.Contains(folder)) continue; @@ -496,7 +513,7 @@ namespace API.Services break; } - var fullPath = Path.Join(folder, parts.Last()); + var fullPath = Tasks.Scanner.Parser.Parser.NormalizePath(Path.Join(folder, parts.Last())); if (!dirs.ContainsKey(fullPath)) { dirs.Add(fullPath, string.Empty); @@ -507,10 +524,161 @@ namespace API.Services return dirs; } + /// + /// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope. + /// + /// + /// List of directory paths, empty if path doesn't exist + public IEnumerable GetDirectories(string folderPath) + { + if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray.Empty; + return FileSystem.Directory.GetDirectories(folderPath) + .Where(path => ExcludeDirectories.Matches(path).Count == 0); + } + + /// + /// Gets a set of directories from the folder path. Automatically excludes directories that shouldn't be in scope. + /// + /// + /// A set of glob rules that will filter directories out + /// List of directory paths, empty if path doesn't exist + public IEnumerable GetDirectories(string folderPath, GlobMatcher matcher) + { + if (matcher == null) return GetDirectories(folderPath); + + return GetDirectories(folderPath) + .Where(folder => !matcher.ExcludeMatches( + $"{FileSystem.DirectoryInfo.FromDirectoryName(folder).Name}{FileSystem.Path.AltDirectorySeparatorChar}")); + } + + /// + /// Returns all directories, including subdirectories. Automatically excludes directories that shouldn't be in scope. + /// + /// + /// + public IEnumerable GetAllDirectories(string folderPath) + { + if (!FileSystem.Directory.Exists(folderPath)) return ImmutableArray.Empty; + var directories = new List(); + + var foundDirs = GetDirectories(folderPath); + foreach (var foundDir in foundDirs) + { + directories.Add(foundDir); + directories.AddRange(GetAllDirectories(foundDir)); + } + + return directories; + } + + /// + /// Returns the parent directories name for a file or folder. Empty string is path is not valid. + /// + /// + /// + public string GetParentDirectoryName(string fileOrFolder) + { + try + { + return Tasks.Scanner.Parser.Parser.NormalizePath(Directory.GetParent(fileOrFolder)?.FullName); + } + catch (Exception) + { + return string.Empty; + } + } + + /// + /// Scans a directory by utilizing a recursive folder search. If a .kavitaignore file is found, will ignore matching patterns + /// + /// + /// + /// + public IList ScanFiles(string folderPath, GlobMatcher? matcher = null) + { + _logger.LogDebug("[ScanFiles] called on {Path}", folderPath); + var files = new List(); + if (!Exists(folderPath)) return files; + + var potentialIgnoreFile = FileSystem.Path.Join(folderPath, KavitaIgnoreFile); + if (matcher == null) + { + matcher = CreateMatcherFromFile(potentialIgnoreFile); + } + else + { + matcher.Merge(CreateMatcherFromFile(potentialIgnoreFile)); + } + + + var directories = GetDirectories(folderPath, matcher); + + foreach (var directory in directories) + { + files.AddRange(ScanFiles(directory, matcher)); + } + + + // Get the matcher from either ignore or global (default setup) + if (matcher == null) + { + files.AddRange(GetFilesWithCertainExtensions(folderPath, Tasks.Scanner.Parser.Parser.SupportedExtensions)); + } + else + { + var foundFiles = GetFilesWithCertainExtensions(folderPath, + Tasks.Scanner.Parser.Parser.SupportedExtensions) + .Where(file => !matcher.ExcludeMatches(FileSystem.FileInfo.FromFileName(file).Name)); + files.AddRange(foundFiles); + } + + return files; + } + + /// + /// Recursively scans a folder and returns the max last write time on any folders and files + /// + /// + /// Max Last Write Time + public DateTime GetLastWriteTime(string folderPath) + { + if (!FileSystem.Directory.Exists(folderPath)) throw new IOException($"{folderPath} does not exist"); + return Directory.GetFileSystemEntries(folderPath, "*.*", SearchOption.AllDirectories).Max(path => FileSystem.File.GetLastWriteTime(path)); + } + + /// + /// Generates a GlobMatcher from a .kavitaignore file found at path. Returns null otherwise. + /// + /// + /// + public GlobMatcher CreateMatcherFromFile(string filePath) + { + if (!FileSystem.File.Exists(filePath)) + { + return null; + } + + // Read file in and add each line to Matcher + var lines = FileSystem.File.ReadAllLines(filePath); + if (lines.Length == 0) + { + return null; + } + + GlobMatcher matcher = new(); + foreach (var line in lines) + { + matcher.AddExclude(line); + } + + return matcher; + } + /// /// Recursively scans files and applies an action on them. This uses as many cores the underlying PC has to speed /// up processing. + /// NOTE: This is no longer parallel due to user's machines locking up /// /// Directory to scan /// Action to apply on file path @@ -538,18 +706,16 @@ namespace API.Services string[] files; try { - subDirs = FileSystem.Directory.GetDirectories(currentDir).Where(path => ExcludeDirectories.Matches(path).Count == 0); + subDirs = GetDirectories(currentDir); } // Thrown if we do not have discovery permission on the directory. catch (UnauthorizedAccessException e) { - Console.WriteLine(e.Message); - logger.LogError(e, "Unauthorized access on {Directory}", currentDir); + logger.LogCritical(e, "Unauthorized access on {Directory}", currentDir); continue; } // Thrown if another process has deleted the directory after we retrieved its name. catch (DirectoryNotFoundException e) { - Console.WriteLine(e.Message); - logger.LogError(e, "Directory not found on {Directory}", currentDir); + logger.LogCritical(e, "Directory not found on {Directory}", currentDir); continue; } @@ -558,15 +724,15 @@ namespace API.Services .ToArray(); } catch (UnauthorizedAccessException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "Unauthorized access on a file in {Directory}", currentDir); continue; } catch (DirectoryNotFoundException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "Directory not found on a file in {Directory}", currentDir); continue; } catch (IOException e) { - Console.WriteLine(e.Message); + logger.LogCritical(e, "IO exception on a file in {Directory}", currentDir); continue; } @@ -577,19 +743,16 @@ namespace API.Services foreach (var file in files) { action(file); fileCount++; - } + } } catch (AggregateException ae) { ae.Handle((ex) => { - if (ex is UnauthorizedAccessException) { - // Here we just output a message and go on. - Console.WriteLine(ex.Message); - _logger.LogError(ex, "Unauthorized access on file"); - return true; - } - // Handle other exceptions here if necessary... + if (ex is not UnauthorizedAccessException) return false; + // Here we just output a message and go on. + _logger.LogError(ex, "Unauthorized access on file"); + return true; + // Handle other exceptions here if necessary... - return false; }); } @@ -682,7 +845,7 @@ namespace API.Services /// Fully qualified directory public void RemoveNonImages(string directoryName) { - DeleteFiles(GetFiles(directoryName, searchOption:SearchOption.AllDirectories).Where(file => !Parser.Parser.IsImage(file))); + DeleteFiles(GetFiles(directoryName, searchOption:SearchOption.AllDirectories).Where(file => !Tasks.Scanner.Parser.Parser.IsImage(file))); } @@ -755,9 +918,9 @@ namespace API.Services foreach (var file in directory.EnumerateFiles().OrderByNatural(file => file.FullName)) { if (file.Directory == null) continue; - var paddedIndex = Parser.Parser.PadZeros(directoryIndex + ""); + var paddedIndex = Tasks.Scanner.Parser.Parser.PadZeros(directoryIndex + ""); // We need to rename the files so that after flattening, they are in the order we found them - var newName = $"{paddedIndex}_{Parser.Parser.PadZeros(fileIndex + "")}{file.Extension}"; + var newName = $"{paddedIndex}_{Tasks.Scanner.Parser.Parser.PadZeros(fileIndex + "")}{file.Extension}"; var newPath = Path.Join(root.FullName, newName); if (!File.Exists(newPath)) file.MoveTo(newPath); fileIndex++; @@ -769,7 +932,7 @@ namespace API.Services foreach (var subDirectory in directory.EnumerateDirectories().OrderByNatural(d => d.FullName)) { // We need to check if the directory is not a blacklisted (ie __MACOSX) - if (Parser.Parser.HasBlacklistedFolderInPath(subDirectory.FullName)) continue; + if (Tasks.Scanner.Parser.Parser.HasBlacklistedFolderInPath(subDirectory.FullName)) continue; FlattenDirectory(root, subDirectory, ref directoryIndex); } diff --git a/API/Services/EmailService.cs b/API/Services/EmailService.cs index c5ba90464..819a0c77a 100644 --- a/API/Services/EmailService.cs +++ b/API/Services/EmailService.cs @@ -82,8 +82,15 @@ public class EmailService : IEmailService public async Task CheckIfAccessible(string host) { // This is the only exception for using the default because we need an external service to check if the server is accessible for emails - if (IsLocalIpAddress(host)) return false; - return await SendEmailWithGet(DefaultApiUrl + "/api/email/reachable?host=" + host); + try + { + if (IsLocalIpAddress(host)) return false; + return await SendEmailWithGet(DefaultApiUrl + "/api/email/reachable?host=" + host); + } + catch (Exception) + { + return false; + } } public async Task SendMigrationEmail(EmailMigrationDto data) diff --git a/API/Services/HostedServices/StartupTasksHostedService.cs b/API/Services/HostedServices/StartupTasksHostedService.cs index 099c44cc8..df7692c7c 100644 --- a/API/Services/HostedServices/StartupTasksHostedService.cs +++ b/API/Services/HostedServices/StartupTasksHostedService.cs @@ -1,6 +1,8 @@ using System; using System.Threading; using System.Threading.Tasks; +using API.Data; +using API.Services.Tasks.Scanner; using Microsoft.Extensions.DependencyInjection; using Microsoft.Extensions.Hosting; @@ -23,6 +25,8 @@ namespace API.Services.HostedServices await taskScheduler.ScheduleTasks(); taskScheduler.ScheduleUpdaterTasks(); + + try { // These methods will automatically check if stat collection is disabled to prevent sending any data regardless @@ -34,6 +38,21 @@ namespace API.Services.HostedServices { //If stats startup fail the user can keep using the app } + + try + { + var unitOfWork = scope.ServiceProvider.GetRequiredService(); + if ((await unitOfWork.SettingsRepository.GetSettingsDtoAsync()).EnableFolderWatching) + { + var libraryWatcher = scope.ServiceProvider.GetRequiredService(); + await libraryWatcher.StartWatching(); + } + } + catch (Exception) + { + // Fail silently + } + } public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask; diff --git a/API/Services/ImageService.cs b/API/Services/ImageService.cs index 03d589776..1d1271ad5 100644 --- a/API/Services/ImageService.cs +++ b/API/Services/ImageService.cs @@ -63,7 +63,7 @@ public class ImageService : IImageService else { _directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(fileFilePath), targetDirectory, - Parser.Parser.ImageFileExtensions); + Tasks.Scanner.Parser.Parser.ImageFileExtensions); } } diff --git a/API/Services/MetadataService.cs b/API/Services/MetadataService.cs index 3c0df0ec7..0dd980a59 100644 --- a/API/Services/MetadataService.cs +++ b/API/Services/MetadataService.cs @@ -36,11 +36,15 @@ public interface IMetadataService /// /// /// Overrides any cache logic and forces execution + Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true); + Task GenerateCoversForSeries(Series series, bool forceUpdate = false); + Task RemoveAbandonedMetadataKeys(); } public class MetadataService : IMetadataService { + public const string Name = "MetadataService"; private readonly IUnitOfWork _unitOfWork; private readonly ILogger _logger; private readonly IEventHub _eventHub; @@ -77,9 +81,7 @@ public class MetadataService : IMetadataService _logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile.FilePath); chapter.CoverImage = _readingItemService.GetCoverImage(firstFile.FilePath, ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId), firstFile.Format); - - // await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, - // MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter), false); + _unitOfWork.ChapterRepository.Update(chapter); _updateEvents.Add(MessageFactory.CoverUpdateEvent(chapter.Id, MessageFactoryEntityTypes.Chapter)); return Task.FromResult(true); } @@ -110,7 +112,6 @@ public class MetadataService : IMetadataService if (firstChapter == null) return Task.FromResult(false); volume.CoverImage = firstChapter.CoverImage; - //await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(volume.Id, MessageFactoryEntityTypes.Volume), false); _updateEvents.Add(MessageFactory.CoverUpdateEvent(volume.Id, MessageFactoryEntityTypes.Volume)); return Task.FromResult(true); @@ -147,7 +148,6 @@ public class MetadataService : IMetadataService } } series.CoverImage = firstCover?.CoverImage ?? coverImage; - //await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series), false); _updateEvents.Add(MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series)); return Task.CompletedTask; } @@ -160,7 +160,7 @@ public class MetadataService : IMetadataService /// private async Task ProcessSeriesCoverGen(Series series, bool forceUpdate) { - _logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName); + _logger.LogDebug("[MetadataService] Processing cover image generation for series: {SeriesName}", series.OriginalName); try { var volumeIndex = 0; @@ -194,7 +194,7 @@ public class MetadataService : IMetadataService } catch (Exception ex) { - _logger.LogError(ex, "[MetadataService] There was an exception during updating metadata for {SeriesName} ", series.Name); + _logger.LogError(ex, "[MetadataService] There was an exception during cover generation for {SeriesName} ", series.Name); } } @@ -210,14 +210,14 @@ public class MetadataService : IMetadataService public async Task GenerateCoversForLibrary(int libraryId, bool forceUpdate = false) { var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None); - _logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name); + _logger.LogInformation("[MetadataService] Beginning cover generation refresh of {LibraryName}", library.Name); _updateEvents.Clear(); var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id); var stopwatch = Stopwatch.StartNew(); var totalTime = 0L; - _logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize); + _logger.LogInformation("[MetadataService] Refreshing Library {LibraryName} for cover generation. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize); await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.CoverUpdateProgressEvent(library.Id, 0F, ProgressEventType.Started, $"Starting {library.Name}")); @@ -228,7 +228,7 @@ public class MetadataService : IMetadataService totalTime += stopwatch.ElapsedMilliseconds; stopwatch.Restart(); - _logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}", + _logger.LogDebug("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd})", chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize); var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, @@ -254,7 +254,7 @@ public class MetadataService : IMetadataService } catch (Exception ex) { - _logger.LogError(ex, "[MetadataService] There was an exception during metadata refresh for {SeriesName}", series.Name); + _logger.LogError(ex, "[MetadataService] There was an exception during cover generation refresh for {SeriesName}", series.Name); } seriesIndex++; } @@ -271,17 +271,18 @@ public class MetadataService : IMetadataService await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.CoverUpdateProgressEvent(library.Id, 1F, ProgressEventType.Ended, $"Complete")); - await RemoveAbandonedMetadataKeys(); - - _logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime); + _logger.LogInformation("[MetadataService] Updated covers for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime); } - private async Task RemoveAbandonedMetadataKeys() + public async Task RemoveAbandonedMetadataKeys() { await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated(); await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated(); await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated(); + await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries(); + await _unitOfWork.AppUserProgressRepository.CleanupAbandonedChapters(); + } /// @@ -292,7 +293,6 @@ public class MetadataService : IMetadataService /// Overrides any cache logic and forces execution public async Task GenerateCoversForSeries(int libraryId, int seriesId, bool forceUpdate = true) { - var sw = Stopwatch.StartNew(); var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId); if (series == null) { @@ -300,8 +300,19 @@ public class MetadataService : IMetadataService return; } + await GenerateCoversForSeries(series, forceUpdate); + } + + /// + /// Generate Cover for a Series. This is used by Scan Loop and should not be invoked directly via User Interaction. + /// + /// A full Series, with metadata, chapters, etc + /// + public async Task GenerateCoversForSeries(Series series, bool forceUpdate = false) + { + var sw = Stopwatch.StartNew(); await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, - MessageFactory.CoverUpdateProgressEvent(libraryId, 0F, ProgressEventType.Started, series.Name)); + MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 0F, ProgressEventType.Started, series.Name)); await ProcessSeriesCoverGen(series, forceUpdate); @@ -309,17 +320,14 @@ public class MetadataService : IMetadataService if (_unitOfWork.HasChanges()) { await _unitOfWork.CommitAsync(); + _logger.LogInformation("[MetadataService] Updated covers for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, - MessageFactory.CoverUpdateProgressEvent(libraryId, 1F, ProgressEventType.Ended, series.Name)); - - await RemoveAbandonedMetadataKeys(); + MessageFactory.CoverUpdateProgressEvent(series.LibraryId, 1F, ProgressEventType.Ended, series.Name)); await _eventHub.SendMessageAsync(MessageFactory.CoverUpdate, MessageFactory.CoverUpdateEvent(series.Id, MessageFactoryEntityTypes.Series), false); await FlushEvents(); - - _logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } private async Task FlushEvents() diff --git a/API/Services/ReaderService.cs b/API/Services/ReaderService.cs index 1c993a487..197731a8b 100644 --- a/API/Services/ReaderService.cs +++ b/API/Services/ReaderService.cs @@ -59,7 +59,7 @@ public class ReaderService : IReaderService public static string FormatBookmarkFolderPath(string baseDirectory, int userId, int seriesId, int chapterId) { - return Parser.Parser.NormalizePath(Path.Join(baseDirectory, $"{userId}", $"{seriesId}", $"{chapterId}")); + return Tasks.Scanner.Parser.Parser.NormalizePath(Path.Join(baseDirectory, $"{userId}", $"{seriesId}", $"{chapterId}")); } /// @@ -496,7 +496,7 @@ public class ReaderService : IReaderService { var chapters = volume.Chapters .OrderBy(c => float.Parse(c.Number)) - .Where(c => !c.IsSpecial && Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber); + .Where(c => !c.IsSpecial && Tasks.Scanner.Parser.Parser.MaxNumberFromRange(c.Range) <= chapterNumber); await MarkChaptersAsRead(user, volume.SeriesId, chapters); } } diff --git a/API/Services/ReadingItemService.cs b/API/Services/ReadingItemService.cs index 3b2e0bf4c..8e4676639 100644 --- a/API/Services/ReadingItemService.cs +++ b/API/Services/ReadingItemService.cs @@ -12,6 +12,7 @@ public interface IReadingItemService string GetCoverImage(string filePath, string fileName, MangaFormat format); void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1); ParserInfo Parse(string path, string rootPath, LibraryType type); + ParserInfo ParseFile(string path, string rootPath, LibraryType type); } public class ReadingItemService : IReadingItemService @@ -20,7 +21,7 @@ public class ReadingItemService : IReadingItemService private readonly IBookService _bookService; private readonly IImageService _imageService; private readonly IDirectoryService _directoryService; - private readonly DefaultParser _defaultParser; + private readonly IDefaultParser _defaultParser; public ReadingItemService(IArchiveService archiveService, IBookService bookService, IImageService imageService, IDirectoryService directoryService) { @@ -39,12 +40,12 @@ public class ReadingItemService : IReadingItemService /// public ComicInfo? GetComicInfo(string filePath) { - if (Parser.Parser.IsEpub(filePath)) + if (Tasks.Scanner.Parser.Parser.IsEpub(filePath)) { return _bookService.GetComicInfo(filePath); } - if (Parser.Parser.IsComicInfoExtension(filePath)) + if (Tasks.Scanner.Parser.Parser.IsComicInfoExtension(filePath)) { return _archiveService.GetComicInfo(filePath); } @@ -52,6 +53,71 @@ public class ReadingItemService : IReadingItemService return null; } + /// + /// Processes files found during a library scan. + /// + /// Path of a file + /// + /// Library type to determine parsing to perform + public ParserInfo ParseFile(string path, string rootPath, LibraryType type) + { + var info = Parse(path, rootPath, type); + if (info == null) + { + return null; + } + + + // This catches when original library type is Manga/Comic and when parsing with non + if (Tasks.Scanner.Parser.Parser.IsEpub(path) && Tasks.Scanner.Parser.Parser.ParseVolume(info.Series) != Tasks.Scanner.Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume? + { + info = _defaultParser.Parse(path, rootPath, LibraryType.Book); + var info2 = Parse(path, rootPath, type); + info.Merge(info2); + } + + info.ComicInfo = GetComicInfo(path); + if (info.ComicInfo == null) return info; + + if (!string.IsNullOrEmpty(info.ComicInfo.Volume)) + { + info.Volumes = info.ComicInfo.Volume; + } + if (!string.IsNullOrEmpty(info.ComicInfo.Series)) + { + info.Series = info.ComicInfo.Series.Trim(); + } + if (!string.IsNullOrEmpty(info.ComicInfo.Number)) + { + info.Chapters = info.ComicInfo.Number; + } + + // Patch is SeriesSort from ComicInfo + if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort)) + { + info.SeriesSort = info.ComicInfo.TitleSort.Trim(); + } + + if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Tasks.Scanner.Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format)) + { + info.IsSpecial = true; + info.Chapters = Tasks.Scanner.Parser.Parser.DefaultChapter; + info.Volumes = Tasks.Scanner.Parser.Parser.DefaultVolume; + } + + if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort)) + { + info.SeriesSort = info.ComicInfo.SeriesSort.Trim(); + } + + if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries)) + { + info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim(); + } + + return info; + } + /// /// /// @@ -134,6 +200,6 @@ public class ReadingItemService : IReadingItemService /// public ParserInfo Parse(string path, string rootPath, LibraryType type) { - return Parser.Parser.IsEpub(path) ? _bookService.ParseInfo(path) : _defaultParser.Parse(path, rootPath, type); + return Tasks.Scanner.Parser.Parser.IsEpub(path) ? _bookService.ParseInfo(path) : _defaultParser.Parse(path, rootPath, type); } } diff --git a/API/Services/ReadingListService.cs b/API/Services/ReadingListService.cs new file mode 100644 index 000000000..60314e3a9 --- /dev/null +++ b/API/Services/ReadingListService.cs @@ -0,0 +1,182 @@ +using System.Collections.Generic; +using System.Linq; +using System.Threading.Tasks; +using API.Comparators; +using API.Data; +using API.Data.Repositories; +using API.DTOs.ReadingLists; +using API.Entities; +using Microsoft.Extensions.Logging; + +namespace API.Services; + +public interface IReadingListService +{ + Task RemoveFullyReadItems(int readingListId, AppUser user); + Task UpdateReadingListItemPosition(UpdateReadingListPosition dto); + Task DeleteReadingListItem(UpdateReadingListPosition dto); + Task UserHasReadingListAccess(int readingListId, string username); + Task DeleteReadingList(int readingListId, AppUser user); + + Task AddChaptersToReadingList(int seriesId, IList chapterIds, + ReadingList readingList); +} + +/// +/// Methods responsible for management of Reading Lists +/// +/// If called from API layer, expected for to be called beforehand +public class ReadingListService : IReadingListService +{ + private readonly IUnitOfWork _unitOfWork; + private readonly ILogger _logger; + private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst(); + + public ReadingListService(IUnitOfWork unitOfWork, ILogger logger) + { + _unitOfWork = unitOfWork; + _logger = logger; + } + + + + /// + /// Removes all entries that are fully read from the reading list + /// + /// If called from API layer, expected for to be called beforehand + /// Reading List Id + /// User + /// + public async Task RemoveFullyReadItems(int readingListId, AppUser user) + { + var items = await _unitOfWork.ReadingListRepository.GetReadingListItemDtosByIdAsync(readingListId, user.Id); + items = await _unitOfWork.ReadingListRepository.AddReadingProgressModifiers(user.Id, items.ToList()); + + // Collect all Ids to remove + var itemIdsToRemove = items.Where(item => item.PagesRead == item.PagesTotal).Select(item => item.Id); + + try + { + var listItems = + (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(readingListId)).Where(r => + itemIdsToRemove.Contains(r.Id)); + _unitOfWork.ReadingListRepository.BulkRemove(listItems); + + if (!_unitOfWork.HasChanges()) return true; + + await _unitOfWork.CommitAsync(); + return true; + } + catch + { + await _unitOfWork.RollbackAsync(); + } + + return false; + } + + /// + /// Updates a reading list item from one position to another. This will cause items at that position to be pushed one index. + /// + /// + /// + public async Task UpdateReadingListItemPosition(UpdateReadingListPosition dto) + { + var items = (await _unitOfWork.ReadingListRepository.GetReadingListItemsByIdAsync(dto.ReadingListId)).ToList(); + var item = items.Find(r => r.Id == dto.ReadingListItemId); + items.Remove(item); + items.Insert(dto.ToPosition, item); + + for (var i = 0; i < items.Count; i++) + { + items[i].Order = i; + } + + if (!_unitOfWork.HasChanges()) return true; + + return await _unitOfWork.CommitAsync(); + } + + public async Task DeleteReadingListItem(UpdateReadingListPosition dto) + { + var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(dto.ReadingListId); + readingList.Items = readingList.Items.Where(r => r.Id != dto.ReadingListItemId).ToList(); + + var index = 0; + foreach (var readingListItem in readingList.Items) + { + readingListItem.Order = index; + index++; + } + + if (!_unitOfWork.HasChanges()) return true; + + return await _unitOfWork.CommitAsync(); + } + + /// + /// Validates the user has access to the reading list to perform actions on it + /// + /// + /// + /// + public async Task UserHasReadingListAccess(int readingListId, string username) + { + var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username, + AppUserIncludes.ReadingListsWithItems); + if (user.ReadingLists.SingleOrDefault(rl => rl.Id == readingListId) == null && !await _unitOfWork.UserRepository.IsUserAdminAsync(user)) + { + return null; + } + + return user; + } + + /// + /// Removes the Reading List from kavita + /// + /// + /// User should have ReadingLists populated + /// + public async Task DeleteReadingList(int readingListId, AppUser user) + { + var readingList = await _unitOfWork.ReadingListRepository.GetReadingListByIdAsync(readingListId); + user.ReadingLists.Remove(readingList); + + if (!_unitOfWork.HasChanges()) return true; + + return await _unitOfWork.CommitAsync(); + } + + /// + /// Adds a list of Chapters as reading list items to the passed reading list. + /// + /// + /// + /// + /// True if new chapters were added + public async Task AddChaptersToReadingList(int seriesId, IList chapterIds, ReadingList readingList) + { + readingList.Items ??= new List(); + var lastOrder = 0; + if (readingList.Items.Any()) + { + lastOrder = readingList.Items.DefaultIfEmpty().Max(rli => rli.Order); + } + + var existingChapterExists = readingList.Items.Select(rli => rli.ChapterId).ToHashSet(); + var chaptersForSeries = (await _unitOfWork.ChapterRepository.GetChaptersByIdsAsync(chapterIds)) + .OrderBy(c => Tasks.Scanner.Parser.Parser.MinNumberFromRange(c.Volume.Name)) + .ThenBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting); + + var index = lastOrder + 1; + foreach (var chapter in chaptersForSeries) + { + if (existingChapterExists.Contains(chapter.Id)) continue; + readingList.Items.Add(DbFactory.ReadingListItem(index, seriesId, chapter.VolumeId, chapter.Id)); + index += 1; + } + + return index > lastOrder + 1; + } +} diff --git a/API/Services/SeriesService.cs b/API/Services/SeriesService.cs index f869ea12a..471cb2b16 100644 --- a/API/Services/SeriesService.cs +++ b/API/Services/SeriesService.cs @@ -8,7 +8,6 @@ using API.Data; using API.DTOs; using API.DTOs.CollectionTags; using API.DTOs.Metadata; -using API.DTOs.Reader; using API.DTOs.SeriesDetail; using API.Entities; using API.Entities.Enums; @@ -51,8 +50,8 @@ public class SeriesService : ISeriesService /// public static Chapter GetFirstChapterForMetadata(Series series, bool isBookLibrary) { - return series.Volumes.OrderBy(v => v.Number, new ChapterSortComparer()) - .SelectMany(v => v.Chapters.OrderBy(c => float.Parse(c.Number), new ChapterSortComparer())) + return series.Volumes.OrderBy(v => v.Number, ChapterSortComparer.Default) + .SelectMany(v => v.Chapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default)) .FirstOrDefault(); } @@ -255,7 +254,7 @@ public class SeriesService : ISeriesService // At this point, all tags that aren't in dto have been removed. foreach (var tagTitle in tags.Select(t => t.Title)) { - var normalizedTitle = Parser.Parser.Normalize(tagTitle); + var normalizedTitle = Tasks.Scanner.Parser.Parser.Normalize(tagTitle); var existingTag = allTags.SingleOrDefault(t => t.NormalizedTitle == normalizedTitle); if (existingTag != null) { @@ -296,7 +295,7 @@ public class SeriesService : ISeriesService // At this point, all tags that aren't in dto have been removed. foreach (var tagTitle in tags.Select(t => t.Title)) { - var normalizedTitle = Parser.Parser.Normalize(tagTitle); + var normalizedTitle = Tasks.Scanner.Parser.Parser.Normalize(tagTitle); var existingTag = allTags.SingleOrDefault(t => t.NormalizedTitle.Equals(normalizedTitle)); if (existingTag != null) { @@ -422,8 +421,17 @@ public class SeriesService : ISeriesService } var series = await _unitOfWork.SeriesRepository.GetSeriesByIdsAsync(seriesIds); + var libraryIds = series.Select(s => s.LibraryId); + var libraries = await _unitOfWork.LibraryRepository.GetLibraryForIdsAsync(libraryIds); + foreach (var library in libraries) + { + library.LastModified = DateTime.Now; + _unitOfWork.LibraryRepository.Update(library); + } + _unitOfWork.SeriesRepository.Remove(series); + if (!_unitOfWork.HasChanges() || !await _unitOfWork.CommitAsync()) return true; foreach (var s in series) @@ -457,7 +465,7 @@ public class SeriesService : ISeriesService var libraryType = await _unitOfWork.LibraryRepository.GetLibraryTypeAsync(series.LibraryId); var volumes = (await _unitOfWork.VolumeRepository.GetVolumesDtoAsync(seriesId, userId)) - .OrderBy(v => Parser.Parser.MinNumberFromRange(v.Name)) + .OrderBy(v => Tasks.Scanner.Parser.Parser.MinNumberFromRange(v.Name)) .ToList(); // For books, the Name of the Volume is remapped to the actual name of the book, rather than Volume number. @@ -485,7 +493,7 @@ public class SeriesService : ISeriesService if (v.Number == 0) return c; c.VolumeTitle = v.Name; return c; - }).OrderBy(c => float.Parse(c.Number), new ChapterSortComparer())); + }).OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default)).ToList(); foreach (var chapter in chapters) { @@ -510,7 +518,13 @@ public class SeriesService : ISeriesService var storylineChapters = volumes .Where(v => v.Number == 0) .SelectMany(v => v.Chapters.Where(c => !c.IsSpecial)) - .OrderBy(c => float.Parse(c.Number), new ChapterSortComparer()); + .OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default) + .ToList(); + + // When there's chapters without a volume number revert to chapter sorting only as opposed to volume then chapter + if (storylineChapters.Any()) { + retChapters = retChapters.OrderBy(c => float.Parse(c.Number), ChapterSortComparer.Default); + } return new SeriesDetailDto() { @@ -528,7 +542,7 @@ public class SeriesService : ISeriesService /// private static bool ShouldIncludeChapter(ChapterDto chapter) { - return !chapter.IsSpecial && !chapter.Number.Equals(Parser.Parser.DefaultChapter); + return !chapter.IsSpecial && !chapter.Number.Equals(Tasks.Scanner.Parser.Parser.DefaultChapter); } public static void RenameVolumeName(ChapterDto firstChapter, VolumeDto volume, LibraryType libraryType) @@ -537,7 +551,7 @@ public class SeriesService : ISeriesService { if (string.IsNullOrEmpty(firstChapter.TitleName)) { - if (firstChapter.Range.Equals(Parser.Parser.DefaultVolume)) return; + if (firstChapter.Range.Equals(Tasks.Scanner.Parser.Parser.DefaultVolume)) return; var title = Path.GetFileNameWithoutExtension(firstChapter.Range); if (string.IsNullOrEmpty(title)) return; volume.Name += $" - {title}"; @@ -558,7 +572,7 @@ public class SeriesService : ISeriesService { if (isSpecial) { - return Parser.Parser.CleanSpecialTitle(chapterTitle); + return Tasks.Scanner.Parser.Parser.CleanSpecialTitle(chapterTitle); } var hashSpot = withHash ? "#" : string.Empty; diff --git a/API/Services/TaskScheduler.cs b/API/Services/TaskScheduler.cs index e9030b969..affbec32b 100644 --- a/API/Services/TaskScheduler.cs +++ b/API/Services/TaskScheduler.cs @@ -8,8 +8,8 @@ using API.Entities.Enums; using API.Helpers.Converters; using API.Services.Tasks; using API.Services.Tasks.Metadata; +using API.Services.Tasks.Scanner; using Hangfire; -using Hangfire.Storage; using Microsoft.Extensions.Logging; namespace API.Services; @@ -19,7 +19,8 @@ public interface ITaskScheduler Task ScheduleTasks(); Task ScheduleStatsTasks(); void ScheduleUpdaterTasks(); - void ScanLibrary(int libraryId); + void ScanFolder(string folderPath); + void ScanLibrary(int libraryId, bool force = false); void CleanupChapters(int[] chapterIds); void RefreshMetadata(int libraryId, bool forceUpdate = true); void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false); @@ -29,8 +30,6 @@ public interface ITaskScheduler void CancelStatsTasks(); Task RunStatCollection(); void ScanSiteThemes(); - - } public class TaskScheduler : ITaskScheduler { @@ -48,6 +47,12 @@ public class TaskScheduler : ITaskScheduler private readonly IWordCountAnalyzerService _wordCountAnalyzerService; public static BackgroundJobServer Client => new BackgroundJobServer(); + public const string ScanQueue = "scan"; + public const string DefaultQueue = "default"; + + public static readonly IList ScanTasks = new List() + {"ScannerService", "ScanLibrary", "ScanLibraries", "ScanFolder", "ScanSeries"}; + private static readonly Random Rnd = new Random(); @@ -83,7 +88,7 @@ public class TaskScheduler : ITaskScheduler } else { - RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily, TimeZoneInfo.Local); + RecurringJob.AddOrUpdate("scan-libraries", () => ScanLibraries(), Cron.Daily, TimeZoneInfo.Local); } setting = (await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Value; @@ -149,6 +154,7 @@ public class TaskScheduler : ITaskScheduler BackgroundJob.Enqueue(() => _themeService.Scan()); } + #endregion #region UpdateTasks @@ -159,17 +165,44 @@ public class TaskScheduler : ITaskScheduler // Schedule update check between noon and 6pm local time RecurringJob.AddOrUpdate("check-updates", () => CheckForUpdate(), Cron.Daily(Rnd.Next(12, 18)), TimeZoneInfo.Local); } + + public void ScanFolder(string folderPath) + { + _scannerService.ScanFolder(Tasks.Scanner.Parser.Parser.NormalizePath(folderPath)); + } + #endregion - public void ScanLibrary(int libraryId) + public void ScanLibraries() { - if (HasAlreadyEnqueuedTask("ScannerService","ScanLibrary", new object[] {libraryId})) + if (RunningAnyTasksByMethod(ScanTasks, ScanQueue)) + { + _logger.LogInformation("A Scan is already running, rescheduling ScanLibraries in 3 hours"); + BackgroundJob.Schedule(() => ScanLibraries(), TimeSpan.FromHours(3)); + return; + } + _scannerService.ScanLibraries(); + } + + public void ScanLibrary(int libraryId, bool force = false) + { + var alreadyEnqueued = + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) || + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue); + if (alreadyEnqueued) { _logger.LogInformation("A duplicate request to scan library for library occured. Skipping"); return; } + if (RunningAnyTasksByMethod(ScanTasks, ScanQueue)) + { + _logger.LogInformation("A Scan is already running, rescheduling ScanLibrary in 3 hours"); + BackgroundJob.Schedule(() => ScanLibrary(libraryId, force), TimeSpan.FromHours(3)); + return; + } + _logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId); - BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId)); + BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId, force)); // When we do a scan, force cache to re-unpack in case page numbers change BackgroundJob.Enqueue(() => _cleanupService.CleanupCacheDirectory()); } @@ -181,7 +214,11 @@ public class TaskScheduler : ITaskScheduler public void RefreshMetadata(int libraryId, bool forceUpdate = true) { - if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadata", new object[] {libraryId, forceUpdate})) + var alreadyEnqueued = HasAlreadyEnqueuedTask(MetadataService.Name, "GenerateCoversForLibrary", + new object[] {libraryId, true}) || + HasAlreadyEnqueuedTask("MetadataService", "GenerateCoversForLibrary", + new object[] {libraryId, false}); + if (alreadyEnqueued) { _logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping"); return; @@ -193,7 +230,7 @@ public class TaskScheduler : ITaskScheduler public void RefreshSeriesMetadata(int libraryId, int seriesId, bool forceUpdate = false) { - if (HasAlreadyEnqueuedTask("MetadataService","RefreshMetadataForSeries", new object[] {libraryId, seriesId, forceUpdate})) + if (HasAlreadyEnqueuedTask(MetadataService.Name,"GenerateCoversForSeries", new object[] {libraryId, seriesId, forceUpdate})) { _logger.LogInformation("A duplicate request to refresh metadata for library occured. Skipping"); return; @@ -205,14 +242,20 @@ public class TaskScheduler : ITaskScheduler public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false) { - if (HasAlreadyEnqueuedTask("ScannerService", "ScanSeries", new object[] {libraryId, seriesId, forceUpdate})) + if (HasAlreadyEnqueuedTask(ScannerService.Name, "ScanSeries", new object[] {seriesId, forceUpdate}, ScanQueue)) { _logger.LogInformation("A duplicate request to scan series occured. Skipping"); return; } + if (RunningAnyTasksByMethod(ScanTasks, ScanQueue)) + { + _logger.LogInformation("A Scan is already running, rescheduling ScanSeries in 10 minutes"); + BackgroundJob.Schedule(() => ScanSeries(libraryId, seriesId, forceUpdate), TimeSpan.FromMinutes(10)); + return; + } _logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId); - BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, CancellationToken.None)); + BackgroundJob.Enqueue(() => _scannerService.ScanSeries(seriesId, forceUpdate)); } public void AnalyzeFilesForSeries(int libraryId, int seriesId, bool forceUpdate = false) @@ -242,6 +285,13 @@ public class TaskScheduler : ITaskScheduler await _versionUpdaterService.PushUpdate(update); } + public static bool HasScanTaskRunningForLibrary(int libraryId) + { + return + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, true}, ScanQueue) || + HasAlreadyEnqueuedTask("ScannerService", "ScanLibrary", new object[] {libraryId, false}, ScanQueue); + } + /// /// Checks if this same invocation is already enqueued /// @@ -250,7 +300,7 @@ public class TaskScheduler : ITaskScheduler /// object[] of arguments in the order they are passed to enqueued job /// Queue to check against. Defaults to "default" /// - private static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = "default") + public static bool HasAlreadyEnqueuedTask(string className, string methodName, object[] args, string queue = DefaultQueue) { var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue); return enqueuedJobs.Any(j => j.Value.InEnqueuedState && @@ -258,4 +308,11 @@ public class TaskScheduler : ITaskScheduler j.Value.Job.Method.Name.Equals(methodName) && j.Value.Job.Method.DeclaringType.Name.Equals(className)); } + + public static bool RunningAnyTasksByMethod(IEnumerable classNames, string queue = DefaultQueue) + { + var enqueuedJobs = JobStorage.Current.GetMonitoringApi().EnqueuedJobs(queue, 0, int.MaxValue); + return enqueuedJobs.Any(j => !j.Value.InEnqueuedState && + classNames.Contains(j.Value.Job.Method.DeclaringType?.Name)); + } } diff --git a/API/Services/Tasks/CleanupService.cs b/API/Services/Tasks/CleanupService.cs index 4420adedb..c33459681 100644 --- a/API/Services/Tasks/CleanupService.cs +++ b/API/Services/Tasks/CleanupService.cs @@ -20,6 +20,7 @@ namespace API.Services.Tasks Task DeleteChapterCoverImages(); Task DeleteTagCoverImages(); Task CleanupBackups(); + void CleanupTemp(); } /// /// Cleans up after operations on reoccurring basis @@ -127,16 +128,18 @@ namespace API.Services.Tasks } /// - /// Removes all files and directories in the cache directory + /// Removes all files and directories in the cache and temp directory /// public void CleanupCacheDirectory() { _logger.LogInformation("Performing cleanup of Cache directory"); _directoryService.ExistOrCreate(_directoryService.CacheDirectory); + _directoryService.ExistOrCreate(_directoryService.TempDirectory); try { _directoryService.ClearDirectory(_directoryService.CacheDirectory); + _directoryService.ClearDirectory(_directoryService.TempDirectory); } catch (Exception ex) { @@ -175,5 +178,22 @@ namespace API.Services.Tasks } _logger.LogInformation("Finished cleanup of Database backups at {Time}", DateTime.Now); } + + public void CleanupTemp() + { + _logger.LogInformation("Performing cleanup of Temp directory"); + _directoryService.ExistOrCreate(_directoryService.TempDirectory); + + try + { + _directoryService.ClearDirectory(_directoryService.TempDirectory); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup"); + } + + _logger.LogInformation("Temp directory purged"); + } } } diff --git a/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs b/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs index 8c71b92d3..1bc20a359 100644 --- a/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs +++ b/API/Services/Tasks/Metadata/WordCountAnalyzerService.cs @@ -142,7 +142,8 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService _logger.LogInformation("[WordCountAnalyzerService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds); } - private async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true) + + public async Task ProcessSeries(Series series, bool forceUpdate = false, bool useFileName = true) { var isEpub = series.Format == MangaFormat.Epub; var existingWordCount = series.WordCount; @@ -208,6 +209,11 @@ public class WordCountAnalyzerService : IWordCountAnalyzerService chapter.MinHoursToRead = est.MinHours; chapter.MaxHoursToRead = est.MaxHours; chapter.AvgHoursToRead = est.AvgHours; + foreach (var file in chapter.Files) + { + file.LastFileAnalysis = DateTime.Now; + _unitOfWork.MangaFileRepository.Update(file); + } _unitOfWork.ChapterRepository.Update(chapter); } diff --git a/API/Services/Tasks/Scanner/LibraryWatcher.cs b/API/Services/Tasks/Scanner/LibraryWatcher.cs new file mode 100644 index 000000000..17ea744c9 --- /dev/null +++ b/API/Services/Tasks/Scanner/LibraryWatcher.cs @@ -0,0 +1,250 @@ +using System; +using System.Collections.Generic; +using System.Diagnostics; +using System.IO; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using Hangfire; +using Microsoft.Extensions.Hosting; +using Microsoft.Extensions.Logging; + +namespace API.Services.Tasks.Scanner; + +/// +/// Change information +/// +public class Change +{ + /// + /// Gets or sets the type of the change. + /// + /// + /// The type of the change. + /// + public WatcherChangeTypes ChangeType { get; set; } + + /// + /// Gets or sets the full path. + /// + /// + /// The full path. + /// + public string FullPath { get; set; } + + /// + /// Gets or sets the name. + /// + /// + /// The name. + /// + public string Name { get; set; } + + /// + /// Gets or sets the old full path. + /// + /// + /// The old full path. + /// + public string OldFullPath { get; set; } + + /// + /// Gets or sets the old name. + /// + /// + /// The old name. + /// + public string OldName { get; set; } +} + +public interface ILibraryWatcher +{ + /// + /// Start watching all library folders + /// + /// + Task StartWatching(); + /// + /// Stop watching all folders + /// + void StopWatching(); + /// + /// Essentially stops then starts watching. Useful if there is a change in folders or libraries + /// + /// + Task RestartWatching(); +} + +/// +/// Responsible for watching the file system and processing change events. This is mainly responsible for invoking +/// Scanner to quickly pickup on changes. +/// +public class LibraryWatcher : ILibraryWatcher +{ + private readonly IDirectoryService _directoryService; + private readonly IUnitOfWork _unitOfWork; + private readonly ILogger _logger; + private readonly IScannerService _scannerService; + + private readonly Dictionary> _watcherDictionary = new (); + /// + /// This is just here to prevent GC from Disposing our watchers + /// + private readonly IList _fileWatchers = new List(); + private IList _libraryFolders = new List(); + + private readonly TimeSpan _queueWaitTime; + + + public LibraryWatcher(IDirectoryService directoryService, IUnitOfWork unitOfWork, ILogger logger, IScannerService scannerService, IHostEnvironment environment) + { + _directoryService = directoryService; + _unitOfWork = unitOfWork; + _logger = logger; + _scannerService = scannerService; + + _queueWaitTime = environment.IsDevelopment() ? TimeSpan.FromSeconds(30) : TimeSpan.FromMinutes(5); + + } + + public async Task StartWatching() + { + _logger.LogInformation("Starting file watchers"); + + _libraryFolders = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()) + .SelectMany(l => l.Folders) + .Distinct() + .Select(Parser.Parser.NormalizePath) + .Where(_directoryService.Exists) + .ToList(); + foreach (var libraryFolder in _libraryFolders) + { + _logger.LogDebug("Watching {FolderPath}", libraryFolder); + var watcher = new FileSystemWatcher(libraryFolder); + + watcher.Changed += OnChanged; + watcher.Created += OnCreated; + watcher.Deleted += OnDeleted; + watcher.Error += OnError; + + watcher.Filter = "*.*"; + watcher.IncludeSubdirectories = true; + watcher.EnableRaisingEvents = true; + _fileWatchers.Add(watcher); + if (!_watcherDictionary.ContainsKey(libraryFolder)) + { + _watcherDictionary.Add(libraryFolder, new List()); + } + + _watcherDictionary[libraryFolder].Add(watcher); + } + } + + public void StopWatching() + { + _logger.LogInformation("Stopping watching folders"); + foreach (var fileSystemWatcher in _watcherDictionary.Values.SelectMany(watcher => watcher)) + { + fileSystemWatcher.EnableRaisingEvents = false; + fileSystemWatcher.Changed -= OnChanged; + fileSystemWatcher.Created -= OnCreated; + fileSystemWatcher.Deleted -= OnDeleted; + fileSystemWatcher.Dispose(); + } + _fileWatchers.Clear(); + _watcherDictionary.Clear(); + } + + public async Task RestartWatching() + { + StopWatching(); + await StartWatching(); + } + + private void OnChanged(object sender, FileSystemEventArgs e) + { + if (e.ChangeType != WatcherChangeTypes.Changed) return; + _logger.LogDebug("[LibraryWatcher] Changed: {FullPath}, {Name}", e.FullPath, e.Name); + ProcessChange(e.FullPath, string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name))); + } + + private void OnCreated(object sender, FileSystemEventArgs e) + { + _logger.LogDebug("[LibraryWatcher] Created: {FullPath}, {Name}", e.FullPath, e.Name); + ProcessChange(e.FullPath, !_directoryService.FileSystem.File.Exists(e.Name)); + } + + /// + /// From testing, on Deleted only needs to pass through the event when a folder is deleted. If a file is deleted, Changed will handle automatically. + /// + /// + /// + private void OnDeleted(object sender, FileSystemEventArgs e) { + var isDirectory = string.IsNullOrEmpty(_directoryService.FileSystem.Path.GetExtension(e.Name)); + if (!isDirectory) return; + _logger.LogDebug("[LibraryWatcher] Deleted: {FullPath}, {Name}", e.FullPath, e.Name); + ProcessChange(e.FullPath, true); + } + + + private void OnError(object sender, ErrorEventArgs e) + { + _logger.LogError(e.GetException(), "[LibraryWatcher] An error occured, likely too many watches occured at once. Restarting Watchers"); + Task.Run(RestartWatching); + } + + + /// + /// Processes the file or folder change. If the change is a file change and not from a supported extension, it will be ignored. + /// + /// This will ignore image files that are added to the system. However, they may still trigger scans due to folder changes. + /// File or folder that changed + /// If the change is on a directory and not a file + private void ProcessChange(string filePath, bool isDirectoryChange = false) + { + var sw = Stopwatch.StartNew(); + try + { + // We need to check if directory or not + if (!isDirectoryChange && + !(Parser.Parser.IsArchive(filePath) || Parser.Parser.IsBook(filePath))) return; + + var parentDirectory = _directoryService.GetParentDirectoryName(filePath); + if (string.IsNullOrEmpty(parentDirectory)) return; + + // We need to find the library this creation belongs to + // Multiple libraries can point to the same base folder. In this case, we need use FirstOrDefault + var libraryFolder = _libraryFolders.FirstOrDefault(f => parentDirectory.Contains(f)); + if (string.IsNullOrEmpty(libraryFolder)) return; + + var rootFolder = _directoryService.GetFoldersTillRoot(libraryFolder, filePath).ToList(); + if (!rootFolder.Any()) return; + + // Select the first folder and join with library folder, this should give us the folder to scan. + var fullPath = + Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(libraryFolder, rootFolder.First())); + + var alreadyScheduled = + TaskScheduler.HasAlreadyEnqueuedTask(ScannerService.Name, "ScanFolder", new object[] {fullPath}); + _logger.LogDebug("{FullPath} already enqueued: {Value}", fullPath, alreadyScheduled); + if (!alreadyScheduled) + { + _logger.LogDebug("[LibraryWatcher] Scheduling ScanFolder for {Folder}", fullPath); + BackgroundJob.Schedule(() => _scannerService.ScanFolder(fullPath), _queueWaitTime); + } + else + { + _logger.LogDebug("[LibraryWatcher] Skipped scheduling ScanFolder for {Folder} as a job already queued", + fullPath); + } + } + catch (Exception ex) + { + _logger.LogError(ex, "[LibraryWatcher] An error occured when processing a watch event"); + } + _logger.LogDebug("ProcessChange occured in {ElapsedMilliseconds}ms", sw.ElapsedMilliseconds); + } + + + +} diff --git a/API/Services/Tasks/Scanner/ParseScannedFiles.cs b/API/Services/Tasks/Scanner/ParseScannedFiles.cs index 785f9ad46..d31879e84 100644 --- a/API/Services/Tasks/Scanner/ParseScannedFiles.cs +++ b/API/Services/Tasks/Scanner/ParseScannedFiles.cs @@ -1,37 +1,53 @@ using System; using System.Collections.Concurrent; using System.Collections.Generic; -using System.Diagnostics; -using System.IO; using System.Linq; using System.Threading.Tasks; -using API.Data.Metadata; -using API.Entities; using API.Entities.Enums; -using API.Helpers; +using API.Extensions; using API.Parser; using API.SignalR; -using Microsoft.AspNetCore.SignalR; using Microsoft.Extensions.Logging; namespace API.Services.Tasks.Scanner { public class ParsedSeries { + /// + /// Name of the Series + /// public string Name { get; init; } + /// + /// Normalized Name of the Series + /// public string NormalizedName { get; init; } + /// + /// Format of the Series + /// public MangaFormat Format { get; init; } } + public enum Modified + { + Modified = 1, + NotModified = 2 + } + + public class SeriesModified + { + public string FolderPath { get; set; } + public string SeriesName { get; set; } + public DateTime LastScanned { get; set; } + public MangaFormat Format { get; set; } + } + public class ParseScannedFiles { - private readonly ConcurrentDictionary> _scannedSeries; private readonly ILogger _logger; private readonly IDirectoryService _directoryService; private readonly IReadingItemService _readingItemService; private readonly IEventHub _eventHub; - private readonly DefaultParser _defaultParser; /// /// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos. @@ -47,108 +63,51 @@ namespace API.Services.Tasks.Scanner _logger = logger; _directoryService = directoryService; _readingItemService = readingItemService; - _scannedSeries = new ConcurrentDictionary>(); - _defaultParser = new DefaultParser(_directoryService); _eventHub = eventHub; } - /// - /// Gets the list of all parserInfos given a Series (Will match on Name, LocalizedName, OriginalName). If the series does not exist within, return empty list. - /// - /// - /// - /// - public static IList GetInfosByName(Dictionary> parsedSeries, Series series) - { - var allKeys = parsedSeries.Keys.Where(ps => - SeriesHelper.FindSeries(series, ps)); - - var infos = new List(); - foreach (var key in allKeys) - { - infos.AddRange(parsedSeries[key]); - } - - return infos; - } /// - /// Processes files found during a library scan. - /// Populates a collection of for DB updates later. + /// This will Scan all files in a folder path. For each folder within the folderPath, FolderAction will be invoked for all files contained /// - /// Path of a file - /// - /// Library type to determine parsing to perform - private void ProcessFile(string path, string rootPath, LibraryType type) + /// Scan directory by directory and for each, call folderAction + /// A library folder or series folder + /// A callback async Task to be called once all files for each folder path are found + /// If we should bypass any folder last write time checks on the scan and force I/O + public async Task ProcessFiles(string folderPath, bool scanDirectoryByDirectory, + IDictionary> seriesPaths, Func, string,Task> folderAction, bool forceCheck = false) { - var info = _readingItemService.Parse(path, rootPath, type); - if (info == null) + string normalizedPath; + if (scanDirectoryByDirectory) { - // If the file is an image and literally a cover image, skip processing. - if (!(Parser.Parser.IsImage(path) && Parser.Parser.IsCoverImage(path))) + // This is used in library scan, so we should check first for a ignore file and use that here as well + var potentialIgnoreFile = _directoryService.FileSystem.Path.Join(folderPath, DirectoryService.KavitaIgnoreFile); + var directories = _directoryService.GetDirectories(folderPath, _directoryService.CreateMatcherFromFile(potentialIgnoreFile)).ToList(); + + foreach (var directory in directories) { - _logger.LogWarning("[Scanner] Could not parse series from {Path}", path); + normalizedPath = Parser.Parser.NormalizePath(directory); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck)) + { + await folderAction(new List(), directory); + } + else + { + // For a scan, this is doing everything in the directory loop before the folder Action is called...which leads to no progress indication + await folderAction(_directoryService.ScanFiles(directory), directory); + } } + return; } - - // This catches when original library type is Manga/Comic and when parsing with non - if (Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume) // Shouldn't this be info.Volume != DefaultVolume? + normalizedPath = Parser.Parser.NormalizePath(folderPath); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedPath, forceCheck)) { - info = _defaultParser.Parse(path, rootPath, LibraryType.Book); - var info2 = _readingItemService.Parse(path, rootPath, type); - info.Merge(info2); - } - - info.ComicInfo = _readingItemService.GetComicInfo(path); - if (info.ComicInfo != null) - { - if (!string.IsNullOrEmpty(info.ComicInfo.Volume)) - { - info.Volumes = info.ComicInfo.Volume; - } - if (!string.IsNullOrEmpty(info.ComicInfo.Series)) - { - info.Series = info.ComicInfo.Series.Trim(); - } - if (!string.IsNullOrEmpty(info.ComicInfo.Number)) - { - info.Chapters = info.ComicInfo.Number; - } - - // Patch is SeriesSort from ComicInfo - if (!string.IsNullOrEmpty(info.ComicInfo.TitleSort)) - { - info.SeriesSort = info.ComicInfo.TitleSort.Trim(); - } - - if (!string.IsNullOrEmpty(info.ComicInfo.Format) && Parser.Parser.HasComicInfoSpecial(info.ComicInfo.Format)) - { - info.IsSpecial = true; - info.Chapters = Parser.Parser.DefaultChapter; - info.Volumes = Parser.Parser.DefaultVolume; - } - - if (!string.IsNullOrEmpty(info.ComicInfo.SeriesSort)) - { - info.SeriesSort = info.ComicInfo.SeriesSort.Trim(); - } - - if (!string.IsNullOrEmpty(info.ComicInfo.LocalizedSeries)) - { - info.LocalizedSeries = info.ComicInfo.LocalizedSeries.Trim(); - } - } - - try - { - TrackSeries(info); - } - catch (Exception ex) - { - _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath); + await folderAction(new List(), folderPath); + return; } + await folderAction(_directoryService.ScanFiles(folderPath), folderPath); } @@ -156,13 +115,14 @@ namespace API.Services.Tasks.Scanner /// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing. /// This will check if the name matches an existing series name (multiple fields) /// + /// A localized list of a series' parsed infos /// - private void TrackSeries(ParserInfo info) + private void TrackSeries(ConcurrentDictionary> scannedSeries, ParserInfo info) { if (info.Series == string.Empty) return; // Check if normalized info.Series already exists and if so, update info to use that name instead - info.Series = MergeName(info); + info.Series = MergeName(scannedSeries, info); var normalizedSeries = Parser.Parser.Normalize(info.Series); var normalizedSortSeries = Parser.Parser.Normalize(info.SeriesSort); @@ -170,7 +130,7 @@ namespace API.Services.Tasks.Scanner try { - var existingKey = _scannedSeries.Keys.SingleOrDefault(ps => + var existingKey = scannedSeries.Keys.SingleOrDefault(ps => ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries) || ps.NormalizedName.Equals(normalizedLocalizedSeries) || ps.NormalizedName.Equals(normalizedSortSeries))); @@ -181,7 +141,7 @@ namespace API.Services.Tasks.Scanner NormalizedName = normalizedSeries }; - _scannedSeries.AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => + scannedSeries.AddOrUpdate(existingKey, new List() {info}, (_, oldValue) => { oldValue ??= new List(); if (!oldValue.Contains(info)) @@ -195,7 +155,7 @@ namespace API.Services.Tasks.Scanner catch (Exception ex) { _logger.LogCritical(ex, "{SeriesName} matches against multiple series in the parsed series. This indicates a critical kavita issue. Key will be skipped", info.Series); - foreach (var seriesKey in _scannedSeries.Keys.Where(ps => + foreach (var seriesKey in scannedSeries.Keys.Where(ps => ps.Format == info.Format && (ps.NormalizedName.Equals(normalizedSeries) || ps.NormalizedName.Equals(normalizedLocalizedSeries) || ps.NormalizedName.Equals(normalizedSortSeries)))) @@ -205,23 +165,24 @@ namespace API.Services.Tasks.Scanner } } + /// /// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with /// same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization. /// /// /// Series Name to group this info into - public string MergeName(ParserInfo info) + private string MergeName(ConcurrentDictionary> scannedSeries, ParserInfo info) { var normalizedSeries = Parser.Parser.Normalize(info.Series); var normalizedLocalSeries = Parser.Parser.Normalize(info.LocalizedSeries); - // We use FirstOrDefault because this was introduced late in development and users might have 2 series with both names + try { var existingName = - _scannedSeries.SingleOrDefault(p => - (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries || - Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) && + scannedSeries.SingleOrDefault(p => + (Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedSeries) || + Parser.Parser.Normalize(p.Key.NormalizedName).Equals(normalizedLocalSeries)) && p.Key.Format == info.Format) .Key; @@ -233,7 +194,7 @@ namespace API.Services.Tasks.Scanner catch (Exception ex) { _logger.LogCritical(ex, "Multiple series detected for {SeriesName} ({File})! This is critical to fix! There should only be 1", info.Series, info.FullFilePath); - var values = _scannedSeries.Where(p => + var values = scannedSeries.Where(p => (Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedSeries || Parser.Parser.Normalize(p.Key.NormalizedName) == normalizedLocalSeries) && p.Key.Format == info.Format); @@ -247,34 +208,77 @@ namespace API.Services.Tasks.Scanner return info.Series; } + /// - /// + /// This will process series by folder groups. /// - /// Type of library. Used for selecting the correct file extensions to search for and parsing files - /// The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders - /// Name of the Library + /// + /// + /// /// - public async Task>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable folders, string libraryName) + public async Task ScanLibrariesForSeries(LibraryType libraryType, + IEnumerable folders, string libraryName, bool isLibraryScan, + IDictionary> seriesPaths, Action>> processSeriesInfos, bool forceCheck = false) { - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Started)); + + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Starting", libraryName, ProgressEventType.Started)); + foreach (var folderPath in folders) { try { - async void Action(string f) + await ProcessFiles(folderPath, isLibraryScan, seriesPaths, async (files, folder) => { - try + var normalizedFolder = Parser.Parser.NormalizePath(folder); + if (HasSeriesFolderNotChangedSinceLastScan(seriesPaths, normalizedFolder, forceCheck)) { - ProcessFile(f, folderPath, libraryType); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(f, libraryName, ProgressEventType.Updated)); + var parsedInfos = seriesPaths[normalizedFolder].Select(fp => new ParserInfo() + { + Series = fp.SeriesName, + Format = fp.Format, + }).ToList(); + processSeriesInfos.Invoke(new Tuple>(true, parsedInfos)); + _logger.LogDebug("Skipped File Scan for {Folder} as it hasn't changed since last scan", folder); + return; } - catch (FileNotFoundException exception) + _logger.LogDebug("Found {Count} files for {Folder}", files.Count, folder); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(folderPath, libraryName, ProgressEventType.Updated)); + if (files.Count == 0) { - _logger.LogError(exception, "The file {Filename} could not be found", f); + _logger.LogInformation("[ScannerService] {Folder} is empty", folder); + return; } - } + var scannedSeries = new ConcurrentDictionary>(); + var infos = files + .Select(file => _readingItemService.ParseFile(file, folderPath, libraryType)) + .Where(info => info != null) + .ToList(); - _directoryService.TraverseTreeParallelForEach(folderPath, Action, Parser.Parser.SupportedExtensions, _logger); + + MergeLocalizedSeriesWithSeries(infos); + + foreach (var info in infos) + { + try + { + TrackSeries(scannedSeries, info); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an exception that occurred during tracking {FilePath}. Skipping this file", info.FullFilePath); + } + } + + // It would be really cool if we can emit an event when a folder hasn't been changed so we don't parse everything, but the first item to ensure we don't delete it + // Otherwise, we can do a last step in the DB where we validate all files on disk exist and if not, delete them. (easy but slow) + foreach (var series in scannedSeries.Keys) + { + if (scannedSeries[series].Count > 0 && processSeriesInfos != null) + { + processSeriesInfos.Invoke(new Tuple>(false, scannedSeries[series])); + } + } + }, forceCheck); } catch (ArgumentException ex) { @@ -282,20 +286,76 @@ namespace API.Services.Tasks.Scanner } } - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("", libraryName, ProgressEventType.Ended)); - - return SeriesWithInfos(); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent("File Scan Done", libraryName, ProgressEventType.Ended)); } /// - /// Returns any series where there were parsed infos + /// Checks against all folder paths on file if the last scanned is >= the directory's last write down to the second /// + /// + /// + /// /// - private Dictionary> SeriesWithInfos() + private bool HasSeriesFolderNotChangedSinceLastScan(IDictionary> seriesPaths, string normalizedFolder, bool forceCheck = false) { - var filtered = _scannedSeries.Where(kvp => kvp.Value.Count > 0); - var series = filtered.ToDictionary(v => v.Key, v => v.Value); - return series; + if (forceCheck) return false; + + return seriesPaths.ContainsKey(normalizedFolder) && seriesPaths[normalizedFolder].All(f => f.LastScanned.Truncate(TimeSpan.TicksPerSecond) >= + _directoryService.GetLastWriteTime(normalizedFolder).Truncate(TimeSpan.TicksPerSecond)); + } + + /// + /// Checks if there are any ParserInfos that have a Series that matches the LocalizedSeries field in any other info. If so, + /// rewrites the infos with series name instead of the localized name, so they stack. + /// + /// + /// Accel World v01.cbz has Series "Accel World" and Localized Series "World of Acceleration" + /// World of Acceleration v02.cbz has Series "World of Acceleration" + /// After running this code, we'd have: + /// World of Acceleration v02.cbz having Series "Accel World" and Localized Series of "World of Acceleration" + /// + /// A collection of ParserInfos + private void MergeLocalizedSeriesWithSeries(IReadOnlyCollection infos) + { + var hasLocalizedSeries = infos.Any(i => !string.IsNullOrEmpty(i.LocalizedSeries)); + if (!hasLocalizedSeries) return; + + var localizedSeries = infos + .Where(i => !i.IsSpecial) + .Select(i => i.LocalizedSeries) + .Distinct() + .FirstOrDefault(i => !string.IsNullOrEmpty(i)); + if (string.IsNullOrEmpty(localizedSeries)) return; + + // NOTE: If we have multiple series in a folder with a localized title, then this will fail. It will group into one series. User needs to fix this themselves. + string nonLocalizedSeries; + // Normalize this as many of the cases is a capitalization difference + var nonLocalizedSeriesFound = infos + .Where(i => !i.IsSpecial) + .Select(i => i.Series).DistinctBy(Parser.Parser.Normalize).ToList(); + if (nonLocalizedSeriesFound.Count == 1) + { + nonLocalizedSeries = nonLocalizedSeriesFound.First(); + } + else + { + // There can be a case where there are multiple series in a folder that causes merging. + if (nonLocalizedSeriesFound.Count > 2) + { + _logger.LogError("[ScannerService] There are multiple series within one folder that contain localized series. This will cause them to group incorrectly. Please separate series into their own dedicated folder or ensure there is only 2 potential series (localized and series): {LocalizedSeries}", string.Join(", ", nonLocalizedSeriesFound)); + } + nonLocalizedSeries = nonLocalizedSeriesFound.FirstOrDefault(s => !s.Equals(localizedSeries)); + } + + if (string.IsNullOrEmpty(nonLocalizedSeries)) return; + + var normalizedNonLocalizedSeries = Parser.Parser.Normalize(nonLocalizedSeries); + foreach (var infoNeedingMapping in infos.Where(i => + !Parser.Parser.Normalize(i.Series).Equals(normalizedNonLocalizedSeries))) + { + infoNeedingMapping.Series = nonLocalizedSeries; + infoNeedingMapping.LocalizedSeries = localizedSeries; + } } } } diff --git a/API/Parser/DefaultParser.cs b/API/Services/Tasks/Scanner/Parser/DefaultParser.cs similarity index 52% rename from API/Parser/DefaultParser.cs rename to API/Services/Tasks/Scanner/Parser/DefaultParser.cs index 161a1533b..60317e97d 100644 --- a/API/Parser/DefaultParser.cs +++ b/API/Services/Tasks/Scanner/Parser/DefaultParser.cs @@ -5,10 +5,16 @@ using API.Services; namespace API.Parser; +public interface IDefaultParser +{ + ParserInfo Parse(string filePath, string rootPath, LibraryType type = LibraryType.Manga); + void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret); +} + /// /// This is an implementation of the Parser that is the basis for everything /// -public class DefaultParser +public class DefaultParser : IDefaultParser { private readonly IDirectoryService _directoryService; @@ -30,15 +36,15 @@ public class DefaultParser var fileName = _directoryService.FileSystem.Path.GetFileNameWithoutExtension(filePath); ParserInfo ret; - if (Parser.IsEpub(filePath)) + if (Services.Tasks.Scanner.Parser.Parser.IsEpub(filePath)) { ret = new ParserInfo() { - Chapters = Parser.ParseChapter(fileName) ?? Parser.ParseComicChapter(fileName), - Series = Parser.ParseSeries(fileName) ?? Parser.ParseComicSeries(fileName), - Volumes = Parser.ParseVolume(fileName) ?? Parser.ParseComicVolume(fileName), + Chapters = Services.Tasks.Scanner.Parser.Parser.ParseChapter(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(fileName), + Series = Services.Tasks.Scanner.Parser.Parser.ParseSeries(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(fileName), + Volumes = Services.Tasks.Scanner.Parser.Parser.ParseVolume(fileName) ?? Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(fileName), Filename = Path.GetFileName(filePath), - Format = Parser.ParseFormat(filePath), + Format = Services.Tasks.Scanner.Parser.Parser.ParseFormat(filePath), FullFilePath = filePath }; } @@ -46,65 +52,65 @@ public class DefaultParser { ret = new ParserInfo() { - Chapters = type == LibraryType.Manga ? Parser.ParseChapter(fileName) : Parser.ParseComicChapter(fileName), - Series = type == LibraryType.Manga ? Parser.ParseSeries(fileName) : Parser.ParseComicSeries(fileName), - Volumes = type == LibraryType.Manga ? Parser.ParseVolume(fileName) : Parser.ParseComicVolume(fileName), + Chapters = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseChapter(fileName), + Series = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicSeries(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseSeries(fileName), + Volumes = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseVolume(fileName), Filename = Path.GetFileName(filePath), - Format = Parser.ParseFormat(filePath), + Format = Services.Tasks.Scanner.Parser.Parser.ParseFormat(filePath), Title = Path.GetFileNameWithoutExtension(fileName), FullFilePath = filePath }; } - if (Parser.IsImage(filePath) && Parser.IsCoverImage(filePath)) return null; + if (Services.Tasks.Scanner.Parser.Parser.IsImage(filePath) && Services.Tasks.Scanner.Parser.Parser.IsCoverImage(filePath)) return null; - if (Parser.IsImage(filePath)) + if (Services.Tasks.Scanner.Parser.Parser.IsImage(filePath)) { // Reset Chapters, Volumes, and Series as images are not good to parse information out of. Better to use folders. - ret.Volumes = Parser.DefaultVolume; - ret.Chapters = Parser.DefaultChapter; + ret.Volumes = Services.Tasks.Scanner.Parser.Parser.DefaultVolume; + ret.Chapters = Services.Tasks.Scanner.Parser.Parser.DefaultChapter; ret.Series = string.Empty; } - if (ret.Series == string.Empty || Parser.IsImage(filePath)) + if (ret.Series == string.Empty || Services.Tasks.Scanner.Parser.Parser.IsImage(filePath)) { // Try to parse information out of each folder all the way to rootPath ParseFromFallbackFolders(filePath, rootPath, type, ref ret); } - var edition = Parser.ParseEdition(fileName); + var edition = Services.Tasks.Scanner.Parser.Parser.ParseEdition(fileName); if (!string.IsNullOrEmpty(edition)) { - ret.Series = Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic); + ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(ret.Series.Replace(edition, ""), type is LibraryType.Comic); ret.Edition = edition; } - var isSpecial = type == LibraryType.Comic ? Parser.ParseComicSpecial(fileName) : Parser.ParseMangaSpecial(fileName); + var isSpecial = type == LibraryType.Comic ? Services.Tasks.Scanner.Parser.Parser.ParseComicSpecial(fileName) : Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(fileName); // We must ensure that we can only parse a special out. As some files will have v20 c171-180+Omake and that // could cause a problem as Omake is a special term, but there is valid volume/chapter information. - if (ret.Chapters == Parser.DefaultChapter && ret.Volumes == Parser.DefaultVolume && !string.IsNullOrEmpty(isSpecial)) + if (ret.Chapters == Services.Tasks.Scanner.Parser.Parser.DefaultChapter && ret.Volumes == Services.Tasks.Scanner.Parser.Parser.DefaultVolume && !string.IsNullOrEmpty(isSpecial)) { ret.IsSpecial = true; ParseFromFallbackFolders(filePath, rootPath, type, ref ret); // NOTE: This can cause some complications, we should try to be a bit less aggressive to fallback to folder } // If we are a special with marker, we need to ensure we use the correct series name. we can do this by falling back to Folder name - if (Parser.HasSpecialMarker(fileName)) + if (Services.Tasks.Scanner.Parser.Parser.HasSpecialMarker(fileName)) { ret.IsSpecial = true; - ret.Chapters = Parser.DefaultChapter; - ret.Volumes = Parser.DefaultVolume; + ret.Chapters = Services.Tasks.Scanner.Parser.Parser.DefaultChapter; + ret.Volumes = Services.Tasks.Scanner.Parser.Parser.DefaultVolume; ParseFromFallbackFolders(filePath, rootPath, type, ref ret); } if (string.IsNullOrEmpty(ret.Series)) { - ret.Series = Parser.CleanTitle(fileName, type is LibraryType.Comic); + ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(fileName, type is LibraryType.Comic); } // Pdfs may have .pdf in the series name, remove that - if (Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf")) + if (Services.Tasks.Scanner.Parser.Parser.IsPdf(filePath) && ret.Series.ToLower().EndsWith(".pdf")) { ret.Series = ret.Series.Substring(0, ret.Series.Length - ".pdf".Length); } @@ -125,18 +131,18 @@ public class DefaultParser for (var i = 0; i < fallbackFolders.Count; i++) { var folder = fallbackFolders[i]; - if (!string.IsNullOrEmpty(Parser.ParseMangaSpecial(folder))) continue; + if (!string.IsNullOrEmpty(Services.Tasks.Scanner.Parser.Parser.ParseMangaSpecial(folder))) continue; - var parsedVolume = type is LibraryType.Manga ? Parser.ParseVolume(folder) : Parser.ParseComicVolume(folder); - var parsedChapter = type is LibraryType.Manga ? Parser.ParseChapter(folder) : Parser.ParseComicChapter(folder); + var parsedVolume = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseVolume(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicVolume(folder); + var parsedChapter = type is LibraryType.Manga ? Services.Tasks.Scanner.Parser.Parser.ParseChapter(folder) : Services.Tasks.Scanner.Parser.Parser.ParseComicChapter(folder); - if (!parsedVolume.Equals(Parser.DefaultVolume) || !parsedChapter.Equals(Parser.DefaultChapter)) + if (!parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume) || !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) { - if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Parser.DefaultVolume)) && !parsedVolume.Equals(Parser.DefaultVolume)) + if ((string.IsNullOrEmpty(ret.Volumes) || ret.Volumes.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) && !parsedVolume.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultVolume)) { ret.Volumes = parsedVolume; } - if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Parser.DefaultChapter)) && !parsedChapter.Equals(Parser.DefaultChapter)) + if ((string.IsNullOrEmpty(ret.Chapters) || ret.Chapters.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) && !parsedChapter.Equals(Services.Tasks.Scanner.Parser.Parser.DefaultChapter)) { ret.Chapters = parsedChapter; } @@ -145,11 +151,11 @@ public class DefaultParser // Generally users group in series folders. Let's try to parse series from the top folder if (!folder.Equals(ret.Series) && i == fallbackFolders.Count - 1) { - var series = Parser.ParseSeries(folder); + var series = Services.Tasks.Scanner.Parser.Parser.ParseSeries(folder); if (string.IsNullOrEmpty(series)) { - ret.Series = Parser.CleanTitle(folder, type is LibraryType.Comic); + ret.Series = Services.Tasks.Scanner.Parser.Parser.CleanTitle(folder, type is LibraryType.Comic); break; } diff --git a/API/Parser/Parser.cs b/API/Services/Tasks/Scanner/Parser/Parser.cs similarity index 98% rename from API/Parser/Parser.cs rename to API/Services/Tasks/Scanner/Parser/Parser.cs index b79ad0889..8db88333e 100644 --- a/API/Parser/Parser.cs +++ b/API/Services/Tasks/Scanner/Parser/Parser.cs @@ -5,7 +5,7 @@ using System.Linq; using System.Text.RegularExpressions; using API.Entities.Enums; -namespace API.Parser +namespace API.Services.Tasks.Scanner.Parser { public static class Parser { @@ -15,7 +15,7 @@ namespace API.Parser public const string ImageFileExtensions = @"^(\.png|\.jpeg|\.jpg|\.webp|\.gif)"; public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|\.cb7|\.cbt"; - public const string BookFileExtensions = @"\.epub|\.pdf"; + private const string BookFileExtensions = @"\.epub|\.pdf"; public const string MacOsMetadataFileStartsWith = @"._"; public const string SupportedExtensions = @@ -1031,9 +1031,15 @@ namespace API.Parser return IsImage(filename) && CoverImageRegex.IsMatch(filename); } + /// + /// Validates that a Path doesn't start with certain blacklisted folders, like __MACOSX, @Recently-Snapshot, etc and that if a full path, the filename + /// doesn't start with ._, which is a metadata file on MACOSX. + /// + /// + /// public static bool HasBlacklistedFolderInPath(string path) { - return path.Contains("__MACOSX") || path.StartsWith("@Recently-Snapshot") || path.StartsWith("@recycle") || path.StartsWith("._") || path.Contains(".qpkg"); + return path.Contains("__MACOSX") || path.StartsWith("@Recently-Snapshot") || path.StartsWith("@recycle") || path.StartsWith("._") || Path.GetFileName(path).StartsWith("._") || path.Contains(".qpkg"); } @@ -1066,7 +1072,8 @@ namespace API.Parser /// public static string NormalizePath(string path) { - return path.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar); + return path.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar) + .Replace(@"//", Path.AltDirectorySeparatorChar + string.Empty); } /// diff --git a/API/Parser/ParserInfo.cs b/API/Services/Tasks/Scanner/Parser/ParserInfo.cs similarity index 99% rename from API/Parser/ParserInfo.cs rename to API/Services/Tasks/Scanner/Parser/ParserInfo.cs index caae49f84..4a0a3fdc6 100644 --- a/API/Parser/ParserInfo.cs +++ b/API/Services/Tasks/Scanner/Parser/ParserInfo.cs @@ -1,5 +1,6 @@ using API.Data.Metadata; using API.Entities.Enums; +using API.Services.Tasks.Scanner.Parser; namespace API.Parser { diff --git a/API/Services/Tasks/Scanner/ProcessSeries.cs b/API/Services/Tasks/Scanner/ProcessSeries.cs new file mode 100644 index 000000000..e8db2a97a --- /dev/null +++ b/API/Services/Tasks/Scanner/ProcessSeries.cs @@ -0,0 +1,819 @@ +using System; +using System.Collections.Generic; +using System.Collections.Immutable; +using System.Diagnostics; +using System.Linq; +using System.Threading.Tasks; +using API.Data; +using API.Data.Metadata; +using API.Entities; +using API.Entities.Enums; +using API.Extensions; +using API.Helpers; +using API.Parser; +using API.Services.Tasks.Metadata; +using API.SignalR; +using Hangfire; +using Microsoft.Extensions.Logging; + +namespace API.Services.Tasks.Scanner; + +public interface IProcessSeries +{ + /// + /// Do not allow this Prime to be invoked by multiple threads. It will break the DB. + /// + /// + Task Prime(); + Task ProcessSeriesAsync(IList parsedInfos, Library library); + void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false); +} + +/// +/// All code needed to Update a Series from a Scan action +/// +public class ProcessSeries : IProcessSeries +{ + private readonly IUnitOfWork _unitOfWork; + private readonly ILogger _logger; + private readonly IEventHub _eventHub; + private readonly IDirectoryService _directoryService; + private readonly ICacheHelper _cacheHelper; + private readonly IReadingItemService _readingItemService; + private readonly IFileService _fileService; + private readonly IMetadataService _metadataService; + private readonly IWordCountAnalyzerService _wordCountAnalyzerService; + + private IList _genres; + private IList _people; + private IList _tags; + + + + public ProcessSeries(IUnitOfWork unitOfWork, ILogger logger, IEventHub eventHub, + IDirectoryService directoryService, ICacheHelper cacheHelper, IReadingItemService readingItemService, + IFileService fileService, IMetadataService metadataService, IWordCountAnalyzerService wordCountAnalyzerService) + { + _unitOfWork = unitOfWork; + _logger = logger; + _eventHub = eventHub; + _directoryService = directoryService; + _cacheHelper = cacheHelper; + _readingItemService = readingItemService; + _fileService = fileService; + _metadataService = metadataService; + _wordCountAnalyzerService = wordCountAnalyzerService; + } + + /// + /// Invoke this before processing any series, just once to prime all the needed data during a scan + /// + public async Task Prime() + { + _genres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); + _people = await _unitOfWork.PersonRepository.GetAllPeople(); + _tags = await _unitOfWork.TagRepository.GetAllTagsAsync(); + } + + public async Task ProcessSeriesAsync(IList parsedInfos, Library library) + { + if (!parsedInfos.Any()) return; + + var seriesAdded = false; + var scanWatch = Stopwatch.StartNew(); + var seriesName = parsedInfos.First().Series; + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, + MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Updated, seriesName)); + _logger.LogInformation("[ScannerService] Beginning series update on {SeriesName}", seriesName); + + // Check if there is a Series + var firstInfo = parsedInfos.First(); + Series series; + try + { + series = + await _unitOfWork.SeriesRepository.GetFullSeriesByAnyName(firstInfo.Series, firstInfo.LocalizedSeries, + library.Id, firstInfo.Format); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was an exception finding existing series for {SeriesName} with Localized name of {LocalizedName} for library {LibraryId}. This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan", firstInfo.Series, firstInfo.LocalizedSeries, library.Id); + await _eventHub.SendMessageAsync(MessageFactory.Error, + MessageFactory.ErrorEvent($"There was an exception finding existing series for {firstInfo.Series} with Localized name of {firstInfo.LocalizedSeries} for library {library.Id}", + "This indicates you have duplicate series with same name or localized name in the library. Correct this and rescan.")); + return; + } + + if (series == null) + { + seriesAdded = true; + series = DbFactory.Series(firstInfo.Series, firstInfo.LocalizedSeries); + } + + if (series.LibraryId == 0) series.LibraryId = library.Id; + + try + { + _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName); + + var firstParsedInfo = parsedInfos[0]; + + UpdateVolumes(series, parsedInfos); + series.Pages = series.Volumes.Sum(v => v.Pages); + + series.NormalizedName = Parser.Parser.Normalize(series.Name); + series.OriginalName ??= firstParsedInfo.Series; + if (series.Format == MangaFormat.Unknown) + { + series.Format = firstParsedInfo.Format; + } + + if (string.IsNullOrEmpty(series.SortName)) + { + series.SortName = series.Name; + } + if (!series.SortNameLocked) + { + series.SortName = series.Name; + if (!string.IsNullOrEmpty(firstParsedInfo.SeriesSort)) + { + series.SortName = firstParsedInfo.SeriesSort; + } + } + + // parsedInfos[0] is not the first volume or chapter. We need to find it + var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p)); + if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries)) + { + series.LocalizedName = localizedSeries; + series.NormalizedLocalizedName = Parser.Parser.Normalize(series.LocalizedName); + } + + UpdateSeriesMetadata(series, library.Type); + + // Update series FolderPath here + await UpdateSeriesFolderPath(parsedInfos, library, series); + + series.LastFolderScanned = DateTime.Now; + _unitOfWork.SeriesRepository.Attach(series); + + if (_unitOfWork.HasChanges()) + { + try + { + await _unitOfWork.CommitAsync(); + } + catch (Exception ex) + { + await _unitOfWork.RollbackAsync(); + _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the for series {@SeriesName}", series); + + await _eventHub.SendMessageAsync(MessageFactory.Error, + MessageFactory.ErrorEvent($"There was an issue writing to the DB for Series {series}", + ex.Message)); + return; + } + + if (seriesAdded) + { + await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded, + MessageFactory.SeriesAddedEvent(series.Id, series.Name, series.LibraryId), false); + } + + _logger.LogInformation("[ScannerService] Finished series update on {SeriesName} in {Milliseconds} ms", seriesName, scanWatch.ElapsedMilliseconds); + } + } + catch (Exception ex) + { + _logger.LogError(ex, "[ScannerService] There was an exception updating series for {SeriesName}", series.Name); + } + + await _metadataService.GenerateCoversForSeries(series, false); + EnqueuePostSeriesProcessTasks(series.LibraryId, series.Id); + } + + private async Task UpdateSeriesFolderPath(IEnumerable parsedInfos, Library library, Series series) + { + var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(library.Folders.Select(l => l.Path), + parsedInfos.Select(f => f.FullFilePath).ToList()); + if (seriesDirs.Keys.Count == 0) + { + _logger.LogCritical( + "Scan Series has files spread outside a main series folder. This has negative performance effects. Please ensure all series are under a single folder from library"); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} has files spread outside a single series folder", + "This has negative performance effects. Please ensure all series are under a single folder from library")); + } + else + { + // Don't save FolderPath if it's a library Folder + if (!library.Folders.Select(f => f.Path).Contains(seriesDirs.Keys.First())) + { + series.FolderPath = Parser.Parser.NormalizePath(seriesDirs.Keys.First()); + } + } + } + + public void EnqueuePostSeriesProcessTasks(int libraryId, int seriesId, bool forceUpdate = false) + { + //BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, seriesId, forceUpdate)); + BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, seriesId, forceUpdate)); + } + + private static void UpdateSeriesMetadata(Series series, LibraryType libraryType) + { + series.Metadata ??= DbFactory.SeriesMetadata(new List()); + var isBook = libraryType == LibraryType.Book; + var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook); + + var firstFile = firstChapter?.Files.FirstOrDefault(); + if (firstFile == null) return; + if (Parser.Parser.IsPdf(firstFile.FilePath)) return; + + var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList(); + + // Update Metadata based on Chapter metadata + series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year); + + if (series.Metadata.ReleaseYear < 1000) + { + // Not a valid year, default to 0 + series.Metadata.ReleaseYear = 0; + } + + // Set the AgeRating as highest in all the comicInfos + if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating); + + series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount); + series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count); + // To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well. + if (series.Metadata.MaxCount != series.Metadata.TotalCount) + { + var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name)); + var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range)); + if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume; + else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter; + } + + + if (!series.Metadata.PublicationStatusLocked) + { + series.Metadata.PublicationStatus = PublicationStatus.OnGoing; + if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0) + { + series.Metadata.PublicationStatus = PublicationStatus.Completed; + } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0) + { + series.Metadata.PublicationStatus = PublicationStatus.Ended; + } + } + + if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked) + { + series.Metadata.Summary = firstChapter.Summary; + } + + if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked) + { + series.Metadata.Language = firstChapter.Language; + } + + // Handle People + foreach (var chapter in chapters) + { + if (!series.Metadata.WriterLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Writer)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.CoverArtistLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.CoverArtist)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.PublisherLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Publisher)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.CharacterLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Character)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.ColoristLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Colorist)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.EditorLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Editor)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.InkerLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Inker)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.LettererLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Letterer)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.PencillerLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Penciller)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.TranslatorLocked) + { + foreach (var person in chapter.People.Where(p => p.Role == PersonRole.Translator)) + { + PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); + } + } + + if (!series.Metadata.TagsLocked) + { + foreach (var tag in chapter.Tags) + { + TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag); + } + } + + if (!series.Metadata.GenresLocked) + { + foreach (var genre in chapter.Genres) + { + GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre); + } + } + } + + var genres = chapters.SelectMany(c => c.Genres).ToList(); + GenreHelper.KeepOnlySameGenreBetweenLists(series.Metadata.Genres.ToList(), genres, genre => + { + if (series.Metadata.GenresLocked) return; + series.Metadata.Genres.Remove(genre); + }); + + // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it + // I might be able to filter out people that are in locked fields? + var people = chapters.SelectMany(c => c.People).ToList(); + PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People.ToList(), + people, person => + { + switch (person.Role) + { + case PersonRole.Writer: + if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Penciller: + if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Inker: + if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Colorist: + if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Letterer: + if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.CoverArtist: + if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Editor: + if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Publisher: + if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Character: + if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person); + break; + case PersonRole.Translator: + if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person); + break; + default: + series.Metadata.People.Remove(person); + break; + } + }); + } + + private void UpdateVolumes(Series series, IList parsedInfos) + { + var startingVolumeCount = series.Volumes.Count; + // Add new volumes and update chapters per volume + var distinctVolumes = parsedInfos.DistinctVolumes(); + _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name); + foreach (var volumeNumber in distinctVolumes) + { + var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber); + if (volume == null) + { + volume = DbFactory.Volume(volumeNumber); + volume.SeriesId = series.Id; + series.Volumes.Add(volume); + } + + volume.Name = volumeNumber; + + _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name); + var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray(); + UpdateChapters(series, volume, infos); + volume.Pages = volume.Chapters.Sum(c => c.Pages); + + // Update all the metadata on the Chapters + foreach (var chapter in volume.Chapters) + { + var firstFile = chapter.Files.MinBy(x => x.Chapter); + if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue; + try + { + var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath)); + UpdateChapterFromComicInfo(chapter, firstChapterInfo?.ComicInfo); + } + catch (Exception ex) + { + _logger.LogError(ex, "There was some issue when updating chapter's metadata"); + } + } + } + + // Remove existing volumes that aren't in parsedInfos + var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList(); + if (series.Volumes.Count != nonDeletedVolumes.Count) + { + _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not mapping with volume name", + (series.Volumes.Count - nonDeletedVolumes.Count), series.Name); + var deletedVolumes = series.Volumes.Except(nonDeletedVolumes); + foreach (var volume in deletedVolumes) + { + var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? ""; + if (!string.IsNullOrEmpty(file) && _directoryService.FileSystem.File.Exists(file)) + { + _logger.LogError( + "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}", + file); + } + + _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file); + } + + series.Volumes = nonDeletedVolumes; + } + + _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}", + series.Name, startingVolumeCount, series.Volumes.Count); + } + + private void UpdateChapters(Series series, Volume volume, IList parsedInfos) + { + // Add new chapters + foreach (var info in parsedInfos) + { + // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0 + // also are treated like specials for UI grouping. + Chapter chapter; + try + { + chapter = volume.Chapters.GetChapterByRange(info); + } + catch (Exception ex) + { + _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters); + continue; + } + + if (chapter == null) + { + _logger.LogDebug( + "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters); + chapter = DbFactory.Chapter(info); + volume.Chapters.Add(chapter); + series.LastChapterAdded = DateTime.Now; + } + else + { + chapter.UpdateFrom(info); + } + + if (chapter == null) continue; + // Add files + var specialTreatment = info.IsSpecialInfo(); + AddOrUpdateFileForChapter(chapter, info); + chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty; + chapter.Range = specialTreatment ? info.Filename : info.Chapters; + } + + + // Remove chapters that aren't in parsedInfos or have no files linked + var existingChapters = volume.Chapters.ToList(); + foreach (var existingChapter in existingChapters) + { + if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter)) + { + _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series); + volume.Chapters.Remove(existingChapter); + } + else + { + // Ensure we remove any files that no longer exist AND order + existingChapter.Files = existingChapter.Files + .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath)) + .OrderByNatural(f => f.FilePath).ToList(); + existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages); + } + } + } + + private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info) + { + chapter.Files ??= new List(); + var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath); + if (existingFile != null) + { + existingFile.Format = info.Format; + if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return; + existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format); + // We skip updating DB here with last modified time so that metadata refresh can do it + } + else + { + var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format)); + if (file == null) return; + + chapter.Files.Add(file); + } + } + + #nullable enable + private void UpdateChapterFromComicInfo(Chapter chapter, ComicInfo? info) + { + var firstFile = chapter.Files.MinBy(x => x.Chapter); + if (firstFile == null || + _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return; + + var comicInfo = info; + if (info == null) + { + comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath); + } + + if (comicInfo == null) return; + _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath); + + chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating); + + if (!string.IsNullOrEmpty(comicInfo.Title)) + { + chapter.TitleName = comicInfo.Title.Trim(); + } + + if (!string.IsNullOrEmpty(comicInfo.Summary)) + { + chapter.Summary = comicInfo.Summary; + } + + if (!string.IsNullOrEmpty(comicInfo.LanguageISO)) + { + chapter.Language = comicInfo.LanguageISO; + } + + if (comicInfo.Count > 0) + { + chapter.TotalCount = comicInfo.Count; + } + + // This needs to check against both Number and Volume to calculate Count + if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0) + { + chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number)); + } + if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0) + { + chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume))); + } + + void AddPerson(Person person) + { + PersonHelper.AddPersonIfNotExists(chapter.People, person); + } + + void AddGenre(Genre genre) + { + //chapter.Genres.Add(genre); + GenreHelper.AddGenreIfNotExists(chapter.Genres, genre); + } + + void AddTag(Tag tag, bool added) + { + //chapter.Tags.Add(tag); + TagHelper.AddTagIfNotExists(chapter.Tags, tag); + } + + + if (comicInfo.Year > 0) + { + var day = Math.Max(comicInfo.Day, 1); + var month = Math.Max(comicInfo.Month, 1); + chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}"); + } + + var people = GetTagValues(comicInfo.Colorist); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist); + UpdatePeople(people, PersonRole.Colorist, + AddPerson); + + people = GetTagValues(comicInfo.Characters); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character); + UpdatePeople(people, PersonRole.Character, + AddPerson); + + + people = GetTagValues(comicInfo.Translator); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator); + UpdatePeople(people, PersonRole.Translator, + AddPerson); + + + people = GetTagValues(comicInfo.Writer); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer); + UpdatePeople(people, PersonRole.Writer, + AddPerson); + + people = GetTagValues(comicInfo.Editor); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor); + UpdatePeople(people, PersonRole.Editor, + AddPerson); + + people = GetTagValues(comicInfo.Inker); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker); + UpdatePeople(people, PersonRole.Inker, + AddPerson); + + people = GetTagValues(comicInfo.Letterer); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer); + UpdatePeople(people, PersonRole.Letterer, + AddPerson); + + + people = GetTagValues(comicInfo.Penciller); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller); + UpdatePeople(people, PersonRole.Penciller, + AddPerson); + + people = GetTagValues(comicInfo.CoverArtist); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist); + UpdatePeople(people, PersonRole.CoverArtist, + AddPerson); + + people = GetTagValues(comicInfo.Publisher); + PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher); + UpdatePeople(people, PersonRole.Publisher, + AddPerson); + + var genres = GetTagValues(comicInfo.Genre); + GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList()); + UpdateGenre(genres, false, + AddGenre); + + var tags = GetTagValues(comicInfo.Tags); + TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList()); + UpdateTag(tags, false, + AddTag); + } + + private static IList GetTagValues(string comicInfoTagSeparatedByComma) + { + + if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma)) + { + return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList(); + } + return ImmutableList.Empty; + } + #nullable disable + + /// + /// Given a list of all existing people, this will check the new names and roles and if it doesn't exist in allPeople, will create and + /// add an entry. For each person in name, the callback will be executed. + /// + /// This does not remove people if an empty list is passed into names + /// This is used to add new people to a list without worrying about duplicating rows in the DB + /// + /// + /// + private void UpdatePeople(IEnumerable names, PersonRole role, Action action) + { + + var allPeopleTypeRole = _people.Where(p => p.Role == role).ToList(); + + foreach (var name in names) + { + var normalizedName = Parser.Parser.Normalize(name); + var person = allPeopleTypeRole.FirstOrDefault(p => + p.NormalizedName.Equals(normalizedName)); + if (person == null) + { + person = DbFactory.Person(name, role); + lock (_people) + { + _people.Add(person); + } + } + + action(person); + } + } + + /// + /// + /// + /// + /// + /// + private void UpdateGenre(IEnumerable names, bool isExternal, Action action) + { + foreach (var name in names) + { + if (string.IsNullOrEmpty(name.Trim())) continue; + + var normalizedName = Parser.Parser.Normalize(name); + var genre = _genres.FirstOrDefault(p => + p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); + if (genre == null) + { + genre = DbFactory.Genre(name, false); + lock (_genres) + { + _genres.Add(genre); + } + } + + action(genre); + } + } + + /// + /// + /// + /// + /// + /// Callback for every item. Will give said item back and a bool if item was added + private void UpdateTag(IEnumerable names, bool isExternal, Action action) + { + foreach (var name in names) + { + if (string.IsNullOrEmpty(name.Trim())) continue; + + var added = false; + var normalizedName = Parser.Parser.Normalize(name); + + var tag = _tags.FirstOrDefault(p => + p.NormalizedTitle.Equals(normalizedName) && p.ExternalTag == isExternal); + if (tag == null) + { + added = true; + tag = DbFactory.Tag(name, false); + lock (_tags) + { + _tags.Add(tag); + } + } + + action(tag, added); + } + } + +} diff --git a/API/Services/Tasks/ScannerService.cs b/API/Services/Tasks/ScannerService.cs index d04290b11..662016415 100644 --- a/API/Services/Tasks/ScannerService.cs +++ b/API/Services/Tasks/ScannerService.cs @@ -1,16 +1,13 @@ using System; using System.Collections.Generic; -using System.Collections.Immutable; using System.Diagnostics; +using System.Globalization; using System.IO; using System.Linq; -using System.Threading; using System.Threading.Tasks; using API.Data; -using API.Data.Metadata; using API.Data.Repositories; using API.Entities; -using API.Entities.Enums; using API.Extensions; using API.Helpers; using API.Parser; @@ -28,196 +25,322 @@ public interface IScannerService /// cover images if forceUpdate is true. /// /// Library to scan against + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - Task ScanLibrary(int libraryId); + Task ScanLibrary(int libraryId, bool forceUpdate = false); + + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] Task ScanLibraries(); + + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - Task ScanSeries(int libraryId, int seriesId, CancellationToken token); + Task ScanSeries(int seriesId, bool bypassFolderOptimizationChecks = true); + + Task ScanFolder(string folder); + } +public enum ScanCancelReason +{ + /// + /// Don't cancel, everything is good + /// + NoCancel = 0, + /// + /// A folder is completely empty or missing + /// + FolderMount = 1, + /// + /// There has been no change to the filesystem since last scan + /// + NoChange = 2, + /// + /// The underlying folder is missing + /// + FolderMissing = 3 +} + +/** + * Responsible for Scanning the disk and importing/updating/deleting files -> DB entities. + */ public class ScannerService : IScannerService { + public const string Name = "ScannerService"; private readonly IUnitOfWork _unitOfWork; private readonly ILogger _logger; private readonly IMetadataService _metadataService; private readonly ICacheService _cacheService; private readonly IEventHub _eventHub; - private readonly IFileService _fileService; private readonly IDirectoryService _directoryService; private readonly IReadingItemService _readingItemService; - private readonly ICacheHelper _cacheHelper; + private readonly IProcessSeries _processSeries; private readonly IWordCountAnalyzerService _wordCountAnalyzerService; public ScannerService(IUnitOfWork unitOfWork, ILogger logger, IMetadataService metadataService, ICacheService cacheService, IEventHub eventHub, - IFileService fileService, IDirectoryService directoryService, IReadingItemService readingItemService, - ICacheHelper cacheHelper, IWordCountAnalyzerService wordCountAnalyzerService) + IDirectoryService directoryService, IReadingItemService readingItemService, + IProcessSeries processSeries, IWordCountAnalyzerService wordCountAnalyzerService) { _unitOfWork = unitOfWork; _logger = logger; _metadataService = metadataService; _cacheService = cacheService; _eventHub = eventHub; - _fileService = fileService; _directoryService = directoryService; _readingItemService = readingItemService; - _cacheHelper = cacheHelper; + _processSeries = processSeries; _wordCountAnalyzerService = wordCountAnalyzerService; } - [DisableConcurrentExecution(60 * 60 * 60)] - [AutomaticRetry(Attempts = 3, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - public async Task ScanSeries(int libraryId, int seriesId, CancellationToken token) + public async Task ScanFolder(string folder) { - var sw = new Stopwatch(); + var seriesId = await _unitOfWork.SeriesRepository.GetSeriesIdByFolder(folder); + if (seriesId > 0) + { + BackgroundJob.Enqueue(() => ScanSeries(seriesId, true)); + return; + } + + // This is basically rework of what's already done in Library Watcher but is needed if invoked via API + var parentDirectory = _directoryService.GetParentDirectoryName(folder); + if (string.IsNullOrEmpty(parentDirectory)) return; // This should never happen as it's calculated before enqueing + + var libraries = (await _unitOfWork.LibraryRepository.GetLibraryDtosAsync()).ToList(); + var libraryFolders = libraries.SelectMany(l => l.Folders); + var libraryFolder = libraryFolders.Select(Scanner.Parser.Parser.NormalizePath).SingleOrDefault(f => f.Contains(parentDirectory)); + + if (string.IsNullOrEmpty(libraryFolder)) return; + + var library = libraries.FirstOrDefault(l => l.Folders.Select(Scanner.Parser.Parser.NormalizePath).Contains(libraryFolder)); + if (library != null) + { + BackgroundJob.Enqueue(() => ScanLibrary(library.Id, false)); + } + } + + /// + /// + /// + /// + /// Not Used. Scan series will always force + [Queue(TaskScheduler.ScanQueue)] + public async Task ScanSeries(int seriesId, bool bypassFolderOptimizationChecks = true) + { + var sw = Stopwatch.StartNew(); var files = await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId); var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId); var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new[] {seriesId}); - var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); - var folderPaths = library.Folders.Select(f => f.Path).ToList(); - - var seriesFolderPaths = (await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId)) - .Select(f => _directoryService.FileSystem.FileInfo.FromFileName(f.FilePath).Directory.FullName) - .ToList(); + var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(series.LibraryId, LibraryIncludes.Folders); var libraryPaths = library.Folders.Select(f => f.Path).ToList(); - - if (!await CheckMounts(library.Name, seriesFolderPaths)) + if (await ShouldScanSeries(seriesId, library, libraryPaths, series, true) != ScanCancelReason.NoCancel) { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(series.LibraryId, seriesId, false)); + BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(library.Id, seriesId, false)); return; } - if (!await CheckMounts(library.Name, libraryPaths)) + var folderPath = series.FolderPath; + if (string.IsNullOrEmpty(folderPath) || !_directoryService.Exists(folderPath)) { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + // We don't care if it's multiple due to new scan loop enforcing all in one root directory + var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); + if (seriesDirs.Keys.Count == 0) + { + _logger.LogCritical("Scan Series has files spread outside a main series folder. Defaulting to library folder (this is expensive)"); + await _eventHub.SendMessageAsync(MessageFactory.Info, MessageFactory.InfoEvent($"{series.Name} is not organized well and scan series will be expensive!", "Scan Series has files spread outside a main series folder. Defaulting to library folder (this is expensive)")); + seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); + } + + folderPath = seriesDirs.Keys.FirstOrDefault(); + + // We should check if folderPath is a library folder path and if so, return early and tell user to correct their setup. + if (libraryPaths.Contains(folderPath)) + { + _logger.LogCritical("[ScannerSeries] {SeriesName} scan aborted. Files for series are not in a nested folder under library path. Correct this and rescan", series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Files for series are not in a nested folder under library path. Correct this and rescan.")); + return; + } + } + + if (string.IsNullOrEmpty(folderPath)) + { + _logger.LogCritical("[ScannerSeries] Scan Series could not find a single, valid folder root for files"); + await _eventHub.SendMessageAsync(MessageFactory.Error, MessageFactory.ErrorEvent($"{series.Name} scan aborted", "Scan Series could not find a single, valid folder root for files")); return; } - var allPeople = await _unitOfWork.PersonRepository.GetAllPeople(); - var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); - var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync(); + var parsedSeries = new Dictionary>(); + var processTasks = new List(); - // Shouldn't this be libraryPath? - var seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(libraryPaths, files.Select(f => f.FilePath).ToList()); - if (seriesDirs.Keys.Count == 0) + + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); + + await _processSeries.Prime(); + void TrackFiles(Tuple> parsedInfo) { - _logger.LogDebug("Scan Series has files spread outside a main series folder. Defaulting to library folder"); - seriesDirs = _directoryService.FindHighestDirectoriesFromFiles(folderPaths, files.Select(f => f.FilePath).ToList()); + var parsedFiles = parsedInfo.Item2; + if (parsedFiles.Count == 0) return; + + var foundParsedSeries = new ParsedSeries() + { + Name = parsedFiles.First().Series, + NormalizedName = Scanner.Parser.Parser.Normalize(parsedFiles.First().Series), + Format = parsedFiles.First().Format + }; + + if (!foundParsedSeries.NormalizedName.Equals(series.NormalizedName)) + { + return; + } + + processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library)); + parsedSeries.Add(foundParsedSeries, parsedFiles); } _logger.LogInformation("Beginning file scan on {SeriesName}", series.Name); - var (totalFiles, scanElapsedTime, parsedSeries) = await ScanFiles(library, seriesDirs.Keys); + var scanElapsedTime = await ScanFiles(library, new []{folderPath}, false, TrackFiles, true); + _logger.LogInformation("ScanFiles for {Series} took {Time}", series.Name, scanElapsedTime); + //await Task.WhenAll(processTasks); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); // Remove any parsedSeries keys that don't belong to our series. This can occur when users store 2 series in the same folder RemoveParsedInfosNotForSeries(parsedSeries, series); - // If nothing was found, first validate any of the files still exist. If they don't then we have a deletion and can skip the rest of the logic flow - if (parsedSeries.Count == 0) - { - var anyFilesExist = - (await _unitOfWork.SeriesRepository.GetFilesForSeries(series.Id)).Any(m => File.Exists(m.FilePath)); + // If nothing was found, first validate any of the files still exist. If they don't then we have a deletion and can skip the rest of the logic flow + if (parsedSeries.Count == 0) + { + var seriesFiles = (await _unitOfWork.SeriesRepository.GetFilesForSeries(series.Id)); + var anyFilesExist = seriesFiles.Where(f => f.FilePath.Contains(series.FolderPath)).Any(m => File.Exists(m.FilePath)); - if (!anyFilesExist) - { - try - { - _unitOfWork.SeriesRepository.Remove(series); - await CommitAndSend(totalFiles, parsedSeries, sw, scanElapsedTime, series); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "There was an error during ScanSeries to delete the series"); - await _unitOfWork.RollbackAsync(); - } + if (!anyFilesExist) + { + try + { + _unitOfWork.SeriesRepository.Remove(series); + await CommitAndSend(1, sw, scanElapsedTime, series); + await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved, + MessageFactory.SeriesRemovedEvent(seriesId, string.Empty, series.LibraryId), false); + } + catch (Exception ex) + { + _logger.LogCritical(ex, "There was an error during ScanSeries to delete the series as no files could be found. Aborting scan"); + await _unitOfWork.RollbackAsync(); + return; + } + } + else + { + // I think we should just fail and tell user to fix their setup. This is extremely expensive for an edge case + _logger.LogCritical("We weren't able to find any files in the series scan, but there should be. Please correct your naming convention or put Series in a dedicated folder. Aborting scan"); + await _eventHub.SendMessageAsync(MessageFactory.Error, + MessageFactory.ErrorEvent($"Error scanning {series.Name}", "We weren't able to find any files in the series scan, but there should be. Please correct your naming convention or put Series in a dedicated folder. Aborting scan")); + await _unitOfWork.RollbackAsync(); + return; + } + // At this point, parsedSeries will have at least one key and we can perform the update. If it still doesn't, just return and don't do anything + if (parsedSeries.Count == 0) return; + } - } - else - { - // We need to do an additional check for an edge case: If the scan ran and the files do not match the existing Series name, then it is very likely, - // the files have crap naming and if we don't correct, the series will get deleted due to the parser not being able to fallback onto folder parsing as the root - // is the series folder. - var existingFolder = seriesDirs.Keys.FirstOrDefault(key => key.Contains(series.OriginalName)); - if (seriesDirs.Keys.Count == 1 && !string.IsNullOrEmpty(existingFolder)) - { - seriesDirs = new Dictionary(); - var path = Directory.GetParent(existingFolder)?.FullName; - if (!folderPaths.Contains(path) || !folderPaths.Any(p => p.Contains(path ?? string.Empty))) - { - _logger.LogCritical("[ScanService] Aborted: {SeriesName} has bad naming convention and sits at root of library. Cannot scan series without deletion occuring. Correct file names to have Series Name within it or perform Scan Library", series.OriginalName); - await _eventHub.SendMessageAsync(MessageFactory.Error, - MessageFactory.ErrorEvent($"Scan of {series.Name} aborted", $"{series.OriginalName} has bad naming convention and sits at root of library. Cannot scan series without deletion occuring. Correct file names to have Series Name within it or perform Scan Library")); - return; - } - if (!string.IsNullOrEmpty(path)) - { - seriesDirs[path] = string.Empty; - } - } - var (totalFiles2, scanElapsedTime2, parsedSeries2) = await ScanFiles(library, seriesDirs.Keys); - _logger.LogInformation("{SeriesName} has bad naming convention, forcing rescan at a higher directory", series.OriginalName); - totalFiles += totalFiles2; - scanElapsedTime += scanElapsedTime2; - parsedSeries = parsedSeries2; - RemoveParsedInfosNotForSeries(parsedSeries, series); - } - } - - // At this point, parsedSeries will have at least one key and we can perform the update. If it still doesn't, just return and don't do anything - if (parsedSeries.Count == 0) return; - - // Merge any series together that might have different ParsedSeries but belong to another group of ParsedSeries - try - { - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - - await CommitAndSend(totalFiles, parsedSeries, sw, scanElapsedTime, series); - await RemoveAbandonedMetadataKeys(); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "There was an error during ScanSeries to update the series"); - await _unitOfWork.RollbackAsync(); - } + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); // Tell UI that this series is done await _eventHub.SendMessageAsync(MessageFactory.ScanSeries, - MessageFactory.ScanSeriesEvent(libraryId, seriesId, series.Name)); - await CleanupDbEntities(); + MessageFactory.ScanSeriesEvent(library.Id, seriesId, series.Name)); + + await _metadataService.RemoveAbandonedMetadataKeys(); BackgroundJob.Enqueue(() => _cacheService.CleanupChapters(chapterIds)); BackgroundJob.Enqueue(() => _directoryService.ClearDirectory(_directoryService.TempDirectory)); - BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForSeries(libraryId, series.Id, false)); - BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanSeries(libraryId, series.Id, false)); } - private static void RemoveParsedInfosNotForSeries(Dictionary> parsedSeries, Series series) + private async Task ShouldScanSeries(int seriesId, Library library, IList libraryPaths, Series series, bool bypassFolderChecks = false) + { + var seriesFolderPaths = (await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId)) + .Select(f => _directoryService.FileSystem.FileInfo.FromFileName(f.FilePath).Directory.FullName) + .Distinct() + .ToList(); + + if (!await CheckMounts(library.Name, seriesFolderPaths)) + { + _logger.LogCritical( + "Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + return ScanCancelReason.FolderMount; + } + + if (!await CheckMounts(library.Name, libraryPaths)) + { + _logger.LogCritical( + "Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); + return ScanCancelReason.FolderMount; + } + + // If all series Folder paths haven't been modified since last scan, abort (NOTE: This flow never happens as ScanSeries will always bypass) + if (!bypassFolderChecks) + { + + var allFolders = seriesFolderPaths.SelectMany(path => _directoryService.GetDirectories(path)).ToList(); + allFolders.AddRange(seriesFolderPaths); + + try + { + if (allFolders.All(folder => _directoryService.GetLastWriteTime(folder) <= series.LastFolderScanned)) + { + _logger.LogInformation( + "[ScannerService] {SeriesName} scan has no work to do. All folders have not been changed since last scan", + series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.InfoEvent($"{series.Name} scan has no work to do", + $"All folders have not been changed since last scan ({series.LastFolderScanned.ToString(CultureInfo.CurrentCulture)}). Scan will be aborted.")); + return ScanCancelReason.NoChange; + } + } + catch (IOException ex) + { + // If there is an exception it means that the folder doesn't exist. So we should delete the series + _logger.LogError(ex, "[ScannerService] Scan series for {SeriesName} found the folder path no longer exists", + series.Name); + await _eventHub.SendMessageAsync(MessageFactory.Info, + MessageFactory.ErrorEvent($"{series.Name} scan has no work to do", + "The folder the series was in is missing. Delete series manually or perform a library scan.")); + return ScanCancelReason.NoCancel; + } + } + + + return ScanCancelReason.NoCancel; + } + + private static void RemoveParsedInfosNotForSeries(Dictionary> parsedSeries, Series series) { var keys = parsedSeries.Keys; - foreach (var key in keys.Where(key => !SeriesHelper.FindSeries(series, key))) // series.Format != key.Format || + foreach (var key in keys.Where(key => !SeriesHelper.FindSeries(series, key))) { parsedSeries.Remove(key); } } - private async Task CommitAndSend(int totalFiles, - Dictionary> parsedSeries, Stopwatch sw, long scanElapsedTime, Series series) + private async Task CommitAndSend(int seriesCount, Stopwatch sw, long scanElapsedTime, Series series) { if (_unitOfWork.HasChanges()) { await _unitOfWork.CommitAsync(); _logger.LogInformation( - "Processed {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {SeriesName}", - totalFiles, parsedSeries.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, series.Name); + "Processed files and {SeriesCount} series in {ElapsedScanTime} milliseconds for {SeriesName}", + seriesCount, sw.ElapsedMilliseconds + scanElapsedTime, series.Name); } } + /// + /// Ensure that all library folders are mounted. In the case that any are empty or non-existent, emit an event to the UI via EventHub and return false + /// + /// + /// + /// private async Task CheckMounts(string libraryName, IList folders) { // Check if any of the folder roots are not available (ie disconnected from network, etc) and fail if any of them are @@ -236,8 +359,6 @@ public class ScannerService : IScannerService // For Docker instances check if any of the folder roots are not available (ie disconnected volumes, etc) and fail if any of them are if (folders.Any(f => _directoryService.IsDirectoryEmpty(f))) { - // NOTE: Food for thought, move this to throw an exception and let a middleware inform the UI to keep the code clean. (We can throw a custom exception which - // will always propagate to the UI) // That way logging and UI informing is all in one place with full context _logger.LogError("Some of the root folders for the library are empty. " + "Either your mount has been disconnected or you are trying to delete all series in the library. " + @@ -255,13 +376,13 @@ public class ScannerService : IScannerService return true; } + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] public async Task ScanLibraries() { _logger.LogInformation("Starting Scan of All Libraries"); - var libraries = await _unitOfWork.LibraryRepository.GetLibrariesAsync(); - foreach (var lib in libraries) + foreach (var lib in await _unitOfWork.LibraryRepository.GetLibrariesAsync()) { await ScanLibrary(lib.Id); } @@ -275,50 +396,108 @@ public class ScannerService : IScannerService /// ie) all entities will be rechecked for new cover images and comicInfo.xml changes /// /// + [Queue(TaskScheduler.ScanQueue)] [DisableConcurrentExecution(60 * 60 * 60)] [AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)] - public async Task ScanLibrary(int libraryId) + public async Task ScanLibrary(int libraryId, bool forceUpdate = false) { - Library library; - try - { - library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); - } - catch (Exception ex) - { - // This usually only fails if user is not authenticated. - _logger.LogError(ex, "[ScannerService] There was an issue fetching Library {LibraryId}", libraryId); - return; - } - - if (!await CheckMounts(library.Name, library.Folders.Select(f => f.Path).ToList())) - { - _logger.LogCritical("Some of the root folders for library are not accessible. Please check that drives are connected and rescan. Scan will be aborted"); - - return; - } + var sw = Stopwatch.StartNew(); + var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.Folders); + var libraryFolderPaths = library.Folders.Select(fp => fp.Path).ToList(); + if (!await CheckMounts(library.Name, libraryFolderPaths)) return; + // Validations are done, now we can start actual scan _logger.LogInformation("[ScannerService] Beginning file scan on {LibraryName}", library.Name); - var (totalFiles, scanElapsedTime, series) = await ScanFiles(library, library.Folders.Select(fp => fp.Path)); - _logger.LogInformation("[ScannerService] Finished file scan. Updating database"); + // This doesn't work for something like M:/Manga/ and a series has library folder as root + var shouldUseLibraryScan = !(await _unitOfWork.LibraryRepository.DoAnySeriesFoldersMatch(libraryFolderPaths)); + if (!shouldUseLibraryScan) + { + _logger.LogError("Library {LibraryName} consists of one or more Series folders, using series scan", library.Name); + } + + var totalFiles = 0; + var seenSeries = new List(); + + + await _processSeries.Prime(); + var processTasks = new List(); + void TrackFiles(Tuple> parsedInfo) + { + var skippedScan = parsedInfo.Item1; + var parsedFiles = parsedInfo.Item2; + if (parsedFiles.Count == 0) return; + + var foundParsedSeries = new ParsedSeries() + { + Name = parsedFiles.First().Series, + NormalizedName = Scanner.Parser.Parser.Normalize(parsedFiles.First().Series), + Format = parsedFiles.First().Format + }; + + if (skippedScan) + { + seenSeries.AddRange(parsedFiles.Select(pf => new ParsedSeries() + { + Name = pf.Series, + NormalizedName = Scanner.Parser.Parser.Normalize(pf.Series), + Format = pf.Format + })); + return; + } + + totalFiles += parsedFiles.Count; + + + seenSeries.Add(foundParsedSeries); + processTasks.Add(_processSeries.ProcessSeriesAsync(parsedFiles, library)); + } + + + var scanElapsedTime = await ScanFiles(library, libraryFolderPaths, shouldUseLibraryScan, TrackFiles); + + + await Task.WhenAll(processTasks); + + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.FileScanProgressEvent(string.Empty, library.Name, ProgressEventType.Ended)); + + _logger.LogInformation("[ScannerService] Finished file scan in {ScanAndUpdateTime} milliseconds. Updating database", scanElapsedTime); + + var time = DateTime.Now; foreach (var folderPath in library.Folders) { - folderPath.LastScanned = DateTime.Now; + folderPath.LastScanned = time; } - var sw = Stopwatch.StartNew(); - await UpdateLibrary(library, series); + library.LastScanned = time; + + // Could I delete anything in a Library's Series where the LastScan date is before scanStart? + // NOTE: This implementation is expensive + var removedSeries = await _unitOfWork.SeriesRepository.RemoveSeriesNotInList(seenSeries, library.Id); - library.LastScanned = DateTime.Now; _unitOfWork.LibraryRepository.Update(library); if (await _unitOfWork.CommitAsync()) { - _logger.LogInformation( - "[ScannerService] Finished scan of {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}", - totalFiles, series.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, library.Name); + if (totalFiles == 0) + { + _logger.LogInformation( + "[ScannerService] Finished library scan of {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}. There were no changes", + totalFiles, seenSeries.Count, sw.ElapsedMilliseconds, library.Name); + } + else + { + _logger.LogInformation( + "[ScannerService] Finished library scan of {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}", + totalFiles, seenSeries.Count, sw.ElapsedMilliseconds, library.Name); + } + + foreach (var s in removedSeries) + { + await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved, + MessageFactory.SeriesRemovedEvent(s.Id, s.Name, s.LibraryId), false); + } } else { @@ -326,22 +505,24 @@ public class ScannerService : IScannerService "[ScannerService] There was a critical error that resulted in a failed scan. Please check logs and rescan"); } - await CleanupDbEntities(); + await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, string.Empty)); + await _metadataService.RemoveAbandonedMetadataKeys(); - BackgroundJob.Enqueue(() => _metadataService.GenerateCoversForLibrary(libraryId, false)); - BackgroundJob.Enqueue(() => _wordCountAnalyzerService.ScanLibrary(libraryId, false)); BackgroundJob.Enqueue(() => _directoryService.ClearDirectory(_directoryService.TempDirectory)); } - private async Task>>> ScanFiles(Library library, IEnumerable dirs) + private async Task ScanFiles(Library library, IEnumerable dirs, + bool isLibraryScan, Action>> processSeriesInfos = null, bool forceChecks = false) { var scanner = new ParseScannedFiles(_logger, _directoryService, _readingItemService, _eventHub); - var scanWatch = new Stopwatch(); - var parsedSeries = await scanner.ScanLibrariesForSeries(library.Type, dirs, library.Name); - var totalFiles = parsedSeries.Keys.Sum(key => parsedSeries[key].Count); + var scanWatch = Stopwatch.StartNew(); + + await scanner.ScanLibrariesForSeries(library.Type, dirs, library.Name, + isLibraryScan, await _unitOfWork.SeriesRepository.GetFolderPathMap(library.Id), processSeriesInfos, forceChecks); + var scanElapsedTime = scanWatch.ElapsedMilliseconds; - return new Tuple>>(totalFiles, scanElapsedTime, parsedSeries); + return scanElapsedTime; } /// @@ -364,707 +545,8 @@ public class ScannerService : IScannerService _logger.LogInformation("Removed {Count} abandoned collection tags", cleanedUp); } - private async Task UpdateLibrary(Library library, Dictionary> parsedSeries) - { - if (parsedSeries == null) return; - - // Library contains no Series, so we need to fetch series in groups of ChunkSize - var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id); - var stopwatch = Stopwatch.StartNew(); - var totalTime = 0L; - - var allPeople = await _unitOfWork.PersonRepository.GetAllPeople(); - var allGenres = await _unitOfWork.GenreRepository.GetAllGenresAsync(); - var allTags = await _unitOfWork.TagRepository.GetAllTagsAsync(); - - // Update existing series - _logger.LogInformation("[ScannerService] Updating existing series for {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", - library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize); - for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++) - { - if (chunkInfo.TotalChunks == 0) continue; - totalTime += stopwatch.ElapsedMilliseconds; - stopwatch.Restart(); - _logger.LogInformation("[ScannerService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd}", - chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize); - var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id, new UserParams() - { - PageNumber = chunk, - PageSize = chunkInfo.ChunkSize - }); - - // First, remove any series that are not in parsedSeries list - var missingSeries = FindSeriesNotOnDisk(nonLibrarySeries, parsedSeries).ToList(); - - foreach (var missing in missingSeries) - { - _unitOfWork.SeriesRepository.Remove(missing); - } - - var cleanedSeries = SeriesHelper.RemoveMissingSeries(nonLibrarySeries, missingSeries, out var removeCount); - if (removeCount > 0) - { - _logger.LogInformation("[ScannerService] Removed {RemoveMissingSeries} series that are no longer on disk:", removeCount); - foreach (var s in missingSeries) - { - _logger.LogDebug("[ScannerService] Removed {SeriesName} ({Format})", s.Name, s.Format); - } - } - - // Now, we only have to deal with series that exist on disk. Let's recalculate the volumes for each series - var librarySeries = cleanedSeries.ToList(); - - foreach (var series in librarySeries) - { - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Started, series.Name)); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); - } - - try - { - await _unitOfWork.CommitAsync(); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "[ScannerService] There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB", chunk); - foreach (var series in nonLibrarySeries) - { - _logger.LogCritical("[ScannerService] There may be a constraint issue with {SeriesName}", series.OriginalName); - } - - await _eventHub.SendMessageAsync(MessageFactory.Error, - MessageFactory.ErrorEvent("There was an issue writing to the DB. Chunk {ChunkNumber} did not save to DB", - "The following series had constraint issues: " + string.Join(",", nonLibrarySeries.Select(s => s.OriginalName)))); - - continue; - } - _logger.LogInformation( - "[ScannerService] Processed {SeriesStart} - {SeriesEnd} series in {ElapsedScanTime} milliseconds for {LibraryName}", - chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, totalTime, library.Name); - - // Emit any series removed - foreach (var missing in missingSeries) - { - await _eventHub.SendMessageAsync(MessageFactory.SeriesRemoved, MessageFactory.SeriesRemovedEvent(missing.Id, missing.Name, library.Id)); - } - - foreach (var series in librarySeries) - { - // This is something more like, the series has finished updating in the backend. It may or may not have been modified. - await _eventHub.SendMessageAsync(MessageFactory.ScanSeries, MessageFactory.ScanSeriesEvent(library.Id, series.Id, series.Name)); - } - } - - - // Add new series that have parsedInfos - _logger.LogDebug("[ScannerService] Adding new series"); - var newSeries = new List(); - var allSeries = (await _unitOfWork.SeriesRepository.GetSeriesForLibraryIdAsync(library.Id)).ToList(); - _logger.LogDebug("[ScannerService] Fetched {AllSeriesCount} series for comparing new series with. There should be {DeltaToParsedSeries} new series", - allSeries.Count, parsedSeries.Count - allSeries.Count); - // TODO: Once a parsedSeries is processed, remove the key to free up some memory - foreach (var (key, infos) in parsedSeries) - { - // Key is normalized already - Series existingSeries; - try - { - existingSeries = allSeries.SingleOrDefault(s => SeriesHelper.FindSeries(s, key)); - } - catch (Exception e) - { - // NOTE: If I ever want to put Duplicates table, this is where it can go - _logger.LogCritical(e, "[ScannerService] There are multiple series that map to normalized key {Key}. You can manually delete the entity via UI and rescan to fix it. This will be skipped", key.NormalizedName); - var duplicateSeries = allSeries.Where(s => SeriesHelper.FindSeries(s, key)); - foreach (var series in duplicateSeries) - { - _logger.LogCritical("[ScannerService] Duplicate Series Found: {Key} maps with {Series}", key.Name, series.OriginalName); - - } - - continue; - } - - if (existingSeries != null) continue; - - var s = DbFactory.Series(infos[0].Series); - if (!s.SortNameLocked && !string.IsNullOrEmpty(infos[0].SeriesSort)) - { - s.SortName = infos[0].SeriesSort; - } - if (!s.LocalizedNameLocked && !string.IsNullOrEmpty(infos[0].LocalizedSeries)) - { - s.LocalizedName = infos[0].LocalizedSeries; - } - s.Format = key.Format; - s.LibraryId = library.Id; // We have to manually set this since we aren't adding the series to the Library's series. - newSeries.Add(s); - } - - - foreach(var series in newSeries) - { - _logger.LogDebug("[ScannerService] Processing series {SeriesName}", series.OriginalName); - await UpdateSeries(series, parsedSeries, allPeople, allTags, allGenres, library); - _unitOfWork.SeriesRepository.Attach(series); - try - { - await _unitOfWork.CommitAsync(); - _logger.LogInformation( - "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}", - newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name); - - // Inform UI of new series added - await _eventHub.SendMessageAsync(MessageFactory.SeriesAdded, MessageFactory.SeriesAddedEvent(series.Id, series.Name, library.Id)); - } - catch (Exception ex) - { - _logger.LogCritical(ex, "[ScannerService] There was a critical exception adding new series entry for {SeriesName} with a duplicate index key: {IndexKey} ", - series.Name, $"{series.Name}_{series.NormalizedName}_{series.LocalizedName}_{series.LibraryId}_{series.Format}"); - } - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended)); - - _logger.LogInformation( - "[ScannerService] Added {NewSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}", - newSeries.Count, stopwatch.ElapsedMilliseconds, library.Name); - } - - private async Task UpdateSeries(Series series, Dictionary> parsedSeries, - ICollection allPeople, ICollection allTags, ICollection allGenres, Library library) - { - try - { - _logger.LogInformation("[ScannerService] Processing series {SeriesName}", series.OriginalName); - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - - // Get all associated ParsedInfos to the series. This includes infos that use a different filename that matches Series LocalizedName - var parsedInfos = ParseScannedFiles.GetInfosByName(parsedSeries, series); - UpdateVolumes(series, parsedInfos, allPeople, allTags, allGenres); - series.Pages = series.Volumes.Sum(v => v.Pages); - - series.NormalizedName = Parser.Parser.Normalize(series.Name); - series.Metadata ??= DbFactory.SeriesMetadata(new List()); - if (series.Format == MangaFormat.Unknown) - { - series.Format = parsedInfos[0].Format; - } - series.OriginalName ??= parsedInfos[0].Series; - if (string.IsNullOrEmpty(series.SortName)) - { - series.SortName = series.Name; - } - if (!series.SortNameLocked) - { - series.SortName = series.Name; - if (!string.IsNullOrEmpty(parsedInfos[0].SeriesSort)) - { - series.SortName = parsedInfos[0].SeriesSort; - } - } - - // parsedInfos[0] is not the first volume or chapter. We need to find it - var localizedSeries = parsedInfos.Select(p => p.LocalizedSeries).FirstOrDefault(p => !string.IsNullOrEmpty(p)); - if (!series.LocalizedNameLocked && !string.IsNullOrEmpty(localizedSeries)) - { - series.LocalizedName = localizedSeries; - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - - UpdateSeriesMetadata(series, allPeople, allGenres, allTags, library.Type); - } - catch (Exception ex) - { - _logger.LogError(ex, "[ScannerService] There was an exception updating volumes for {SeriesName}", series.Name); - } - - await _eventHub.SendMessageAsync(MessageFactory.NotificationProgress, MessageFactory.LibraryScanProgressEvent(library.Name, ProgressEventType.Ended, series.Name)); - } - - public static IEnumerable FindSeriesNotOnDisk(IEnumerable existingSeries, Dictionary> parsedSeries) + public static IEnumerable FindSeriesNotOnDisk(IEnumerable existingSeries, Dictionary> parsedSeries) { return existingSeries.Where(es => !ParserInfoHelpers.SeriesHasMatchingParserInfoFormat(es, parsedSeries)); } - - private async Task RemoveAbandonedMetadataKeys() - { - await _unitOfWork.TagRepository.RemoveAllTagNoLongerAssociated(); - await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated(); - await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated(); - } - - - private static void UpdateSeriesMetadata(Series series, ICollection allPeople, ICollection allGenres, ICollection allTags, LibraryType libraryType) - { - var isBook = libraryType == LibraryType.Book; - var firstChapter = SeriesService.GetFirstChapterForMetadata(series, isBook); - - var firstFile = firstChapter?.Files.FirstOrDefault(); - if (firstFile == null) return; - if (Parser.Parser.IsPdf(firstFile.FilePath)) return; - - var chapters = series.Volumes.SelectMany(volume => volume.Chapters).ToList(); - - // Update Metadata based on Chapter metadata - series.Metadata.ReleaseYear = chapters.Min(c => c.ReleaseDate.Year); - - if (series.Metadata.ReleaseYear < 1000) - { - // Not a valid year, default to 0 - series.Metadata.ReleaseYear = 0; - } - - // Set the AgeRating as highest in all the comicInfos - if (!series.Metadata.AgeRatingLocked) series.Metadata.AgeRating = chapters.Max(chapter => chapter.AgeRating); - - series.Metadata.TotalCount = chapters.Max(chapter => chapter.TotalCount); - series.Metadata.MaxCount = chapters.Max(chapter => chapter.Count); - // To not have to rely completely on ComicInfo, try to parse out if the series is complete by checking parsed filenames as well. - if (series.Metadata.MaxCount != series.Metadata.TotalCount) - { - var maxVolume = series.Volumes.Max(v => (int) Parser.Parser.MaxNumberFromRange(v.Name)); - var maxChapter = chapters.Max(c => (int) Parser.Parser.MaxNumberFromRange(c.Range)); - if (maxVolume == series.Metadata.TotalCount) series.Metadata.MaxCount = maxVolume; - else if (maxChapter == series.Metadata.TotalCount) series.Metadata.MaxCount = maxChapter; - } - - - if (!series.Metadata.PublicationStatusLocked) - { - series.Metadata.PublicationStatus = PublicationStatus.OnGoing; - if (series.Metadata.MaxCount >= series.Metadata.TotalCount && series.Metadata.TotalCount > 0) - { - series.Metadata.PublicationStatus = PublicationStatus.Completed; - } else if (series.Metadata.TotalCount > 0 && series.Metadata.MaxCount > 0) - { - series.Metadata.PublicationStatus = PublicationStatus.Ended; - } - } - - if (!string.IsNullOrEmpty(firstChapter.Summary) && !series.Metadata.SummaryLocked) - { - series.Metadata.Summary = firstChapter.Summary; - } - - if (!string.IsNullOrEmpty(firstChapter.Language) && !series.Metadata.LanguageLocked) - { - series.Metadata.Language = firstChapter.Language; - } - - - void HandleAddPerson(Person person) - { - PersonHelper.AddPersonIfNotExists(series.Metadata.People, person); - allPeople.Add(person); - } - - // Handle People - foreach (var chapter in chapters) - { - if (!series.Metadata.WriterLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Writer).Select(p => p.Name), PersonRole.Writer, - HandleAddPerson); - } - - if (!series.Metadata.CoverArtistLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.CoverArtist).Select(p => p.Name), PersonRole.CoverArtist, - HandleAddPerson); - } - - if (!series.Metadata.PublisherLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Publisher).Select(p => p.Name), PersonRole.Publisher, - HandleAddPerson); - } - - if (!series.Metadata.CharacterLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Character).Select(p => p.Name), PersonRole.Character, - HandleAddPerson); - } - - if (!series.Metadata.ColoristLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Colorist).Select(p => p.Name), PersonRole.Colorist, - HandleAddPerson); - } - - if (!series.Metadata.EditorLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Editor).Select(p => p.Name), PersonRole.Editor, - HandleAddPerson); - } - - if (!series.Metadata.InkerLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Inker).Select(p => p.Name), PersonRole.Inker, - HandleAddPerson); - } - - if (!series.Metadata.LettererLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Letterer).Select(p => p.Name), PersonRole.Letterer, - HandleAddPerson); - } - - if (!series.Metadata.PencillerLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Penciller).Select(p => p.Name), PersonRole.Penciller, - HandleAddPerson); - } - - if (!series.Metadata.TranslatorLocked) - { - PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Translator).Select(p => p.Name), PersonRole.Translator, - HandleAddPerson); - } - - if (!series.Metadata.TagsLocked) - { - TagHelper.UpdateTag(allTags, chapter.Tags.Select(t => t.Title), false, (tag, _) => - { - TagHelper.AddTagIfNotExists(series.Metadata.Tags, tag); - allTags.Add(tag); - }); - } - - if (!series.Metadata.GenresLocked) - { - GenreHelper.UpdateGenre(allGenres, chapter.Genres.Select(t => t.Title), false, genre => - { - GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre); - allGenres.Add(genre); - }); - } - } - - // NOTE: The issue here is that people is just from chapter, but series metadata might already have some people on it - // I might be able to filter out people that are in locked fields? - var people = chapters.SelectMany(c => c.People).ToList(); - PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People, - people, person => - { - switch (person.Role) - { - case PersonRole.Writer: - if (!series.Metadata.WriterLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Penciller: - if (!series.Metadata.PencillerLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Inker: - if (!series.Metadata.InkerLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Colorist: - if (!series.Metadata.ColoristLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Letterer: - if (!series.Metadata.LettererLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.CoverArtist: - if (!series.Metadata.CoverArtistLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Editor: - if (!series.Metadata.EditorLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Publisher: - if (!series.Metadata.PublisherLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Character: - if (!series.Metadata.CharacterLocked) series.Metadata.People.Remove(person); - break; - case PersonRole.Translator: - if (!series.Metadata.TranslatorLocked) series.Metadata.People.Remove(person); - break; - default: - series.Metadata.People.Remove(person); - break; - } - }); - } - - - - private void UpdateVolumes(Series series, IList parsedInfos, ICollection allPeople, ICollection allTags, ICollection allGenres) - { - var startingVolumeCount = series.Volumes.Count; - // Add new volumes and update chapters per volume - var distinctVolumes = parsedInfos.DistinctVolumes(); - _logger.LogDebug("[ScannerService] Updating {DistinctVolumes} volumes on {SeriesName}", distinctVolumes.Count, series.Name); - foreach (var volumeNumber in distinctVolumes) - { - var volume = series.Volumes.SingleOrDefault(s => s.Name == volumeNumber); - if (volume == null) - { - volume = DbFactory.Volume(volumeNumber); - series.Volumes.Add(volume); - _unitOfWork.VolumeRepository.Add(volume); - } - - volume.Name = volumeNumber; - - _logger.LogDebug("[ScannerService] Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name); - var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray(); - UpdateChapters(series, volume, infos); - volume.Pages = volume.Chapters.Sum(c => c.Pages); - - // Update all the metadata on the Chapters - foreach (var chapter in volume.Chapters) - { - var firstFile = chapter.Files.MinBy(x => x.Chapter); - if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) continue; - try - { - var firstChapterInfo = infos.SingleOrDefault(i => i.FullFilePath.Equals(firstFile.FilePath)); - UpdateChapterFromComicInfo(chapter, allPeople, allTags, allGenres, firstChapterInfo?.ComicInfo); - } - catch (Exception ex) - { - _logger.LogError(ex, "There was some issue when updating chapter's metadata"); - } - } - } - - // Remove existing volumes that aren't in parsedInfos - var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList(); - if (series.Volumes.Count != nonDeletedVolumes.Count) - { - _logger.LogDebug("[ScannerService] Removed {Count} volumes from {SeriesName} where parsed infos were not mapping with volume name", - (series.Volumes.Count - nonDeletedVolumes.Count), series.Name); - var deletedVolumes = series.Volumes.Except(nonDeletedVolumes); - foreach (var volume in deletedVolumes) - { - var file = volume.Chapters.FirstOrDefault()?.Files?.FirstOrDefault()?.FilePath ?? ""; - if (!string.IsNullOrEmpty(file) && File.Exists(file)) - { - _logger.LogError( - "[ScannerService] Volume cleanup code was trying to remove a volume with a file still existing on disk. File: {File}", - file); - } - - _logger.LogDebug("[ScannerService] Removed {SeriesName} - Volume {Volume}: {File}", series.Name, volume.Name, file); - } - - series.Volumes = nonDeletedVolumes; - } - - _logger.LogDebug("[ScannerService] Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}", - series.Name, startingVolumeCount, series.Volumes.Count); - } - - private void UpdateChapters(Series series, Volume volume, IList parsedInfos) - { - // Add new chapters - foreach (var info in parsedInfos) - { - // Specials go into their own chapters with Range being their filename and IsSpecial = True. Non-Specials with Vol and Chap as 0 - // also are treated like specials for UI grouping. - Chapter chapter; - try - { - chapter = volume.Chapters.GetChapterByRange(info); - } - catch (Exception ex) - { - _logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters); - continue; - } - - if (chapter == null) - { - _logger.LogDebug( - "[ScannerService] Adding new chapter, {Series} - Vol {Volume} Ch {Chapter}", info.Series, info.Volumes, info.Chapters); - chapter = DbFactory.Chapter(info); - volume.Chapters.Add(chapter); - series.LastChapterAdded = DateTime.Now; - } - else - { - chapter.UpdateFrom(info); - } - - if (chapter == null) continue; - // Add files - var specialTreatment = info.IsSpecialInfo(); - AddOrUpdateFileForChapter(chapter, info); - chapter.Number = Parser.Parser.MinNumberFromRange(info.Chapters) + string.Empty; - chapter.Range = specialTreatment ? info.Filename : info.Chapters; - } - - - // Remove chapters that aren't in parsedInfos or have no files linked - var existingChapters = volume.Chapters.ToList(); - foreach (var existingChapter in existingChapters) - { - if (existingChapter.Files.Count == 0 || !parsedInfos.HasInfo(existingChapter)) - { - _logger.LogDebug("[ScannerService] Removed chapter {Chapter} for Volume {VolumeNumber} on {SeriesName}", existingChapter.Range, volume.Name, parsedInfos[0].Series); - volume.Chapters.Remove(existingChapter); - } - else - { - // Ensure we remove any files that no longer exist AND order - existingChapter.Files = existingChapter.Files - .Where(f => parsedInfos.Any(p => p.FullFilePath == f.FilePath)) - .OrderByNatural(f => f.FilePath).ToList(); - existingChapter.Pages = existingChapter.Files.Sum(f => f.Pages); - } - } - } - - private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info) - { - chapter.Files ??= new List(); - var existingFile = chapter.Files.SingleOrDefault(f => f.FilePath == info.FullFilePath); - if (existingFile != null) - { - existingFile.Format = info.Format; - if (!_fileService.HasFileBeenModifiedSince(existingFile.FilePath, existingFile.LastModified) && existingFile.Pages != 0) return; - existingFile.Pages = _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format); - // We skip updating DB here with last modified time so that metadata refresh can do it - } - else - { - var file = DbFactory.MangaFile(info.FullFilePath, info.Format, _readingItemService.GetNumberOfPages(info.FullFilePath, info.Format)); - if (file == null) return; - - chapter.Files.Add(file); - } - } - - private void UpdateChapterFromComicInfo(Chapter chapter, ICollection allPeople, ICollection allTags, ICollection allGenres, ComicInfo? info) - { - var firstFile = chapter.Files.MinBy(x => x.Chapter); - if (firstFile == null || - _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, firstFile)) return; - - var comicInfo = info; - if (info == null) - { - comicInfo = _readingItemService.GetComicInfo(firstFile.FilePath); - } - - if (comicInfo == null) return; - _logger.LogDebug("[ScannerService] Read ComicInfo for {File}", firstFile.FilePath); - - chapter.AgeRating = ComicInfo.ConvertAgeRatingToEnum(comicInfo.AgeRating); - - if (!string.IsNullOrEmpty(comicInfo.Title)) - { - chapter.TitleName = comicInfo.Title.Trim(); - } - - if (!string.IsNullOrEmpty(comicInfo.Summary)) - { - chapter.Summary = comicInfo.Summary; - } - - if (!string.IsNullOrEmpty(comicInfo.LanguageISO)) - { - chapter.Language = comicInfo.LanguageISO; - } - - if (comicInfo.Count > 0) - { - chapter.TotalCount = comicInfo.Count; - } - - // This needs to check against both Number and Volume to calculate Count - if (!string.IsNullOrEmpty(comicInfo.Number) && float.Parse(comicInfo.Number) > 0) - { - chapter.Count = (int) Math.Floor(float.Parse(comicInfo.Number)); - } - if (!string.IsNullOrEmpty(comicInfo.Volume) && float.Parse(comicInfo.Volume) > 0) - { - chapter.Count = Math.Max(chapter.Count, (int) Math.Floor(float.Parse(comicInfo.Volume))); - } - - - if (comicInfo.Year > 0) - { - var day = Math.Max(comicInfo.Day, 1); - var month = Math.Max(comicInfo.Month, 1); - chapter.ReleaseDate = DateTime.Parse($"{month}/{day}/{comicInfo.Year}"); - } - - var people = GetTagValues(comicInfo.Colorist); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Colorist, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Characters); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Character); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Character, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Translator); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Translator); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Translator, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Writer); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Writer, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Editor); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Editor, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Inker); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Inker, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Letterer); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Letterer, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - - people = GetTagValues(comicInfo.Penciller); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Penciller, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.CoverArtist); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.CoverArtist, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - people = GetTagValues(comicInfo.Publisher); - PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher); - PersonHelper.UpdatePeople(allPeople, people, PersonRole.Publisher, - person => PersonHelper.AddPersonIfNotExists(chapter.People, person)); - - var genres = GetTagValues(comicInfo.Genre); - GenreHelper.KeepOnlySameGenreBetweenLists(chapter.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList()); - GenreHelper.UpdateGenre(allGenres, genres, false, - genre => chapter.Genres.Add(genre)); - - var tags = GetTagValues(comicInfo.Tags); - TagHelper.KeepOnlySameTagBetweenLists(chapter.Tags, tags.Select(t => DbFactory.Tag(t, false)).ToList()); - TagHelper.UpdateTag(allTags, tags, false, - (tag, _) => - { - chapter.Tags.Add(tag); - }); - } - - private static IList GetTagValues(string comicInfoTagSeparatedByComma) - { - - if (!string.IsNullOrEmpty(comicInfoTagSeparatedByComma)) - { - return comicInfoTagSeparatedByComma.Split(",").Select(s => s.Trim()).ToList(); - } - return ImmutableList.Empty; - } } diff --git a/API/Services/Tasks/SiteThemeService.cs b/API/Services/Tasks/SiteThemeService.cs index aa52215c8..8de7f879c 100644 --- a/API/Services/Tasks/SiteThemeService.cs +++ b/API/Services/Tasks/SiteThemeService.cs @@ -55,8 +55,8 @@ public class ThemeService : IThemeService _directoryService.ExistOrCreate(_directoryService.SiteThemeDirectory); var reservedNames = Seed.DefaultThemes.Select(t => t.NormalizedName).ToList(); var themeFiles = _directoryService - .GetFilesWithExtension(Parser.Parser.NormalizePath(_directoryService.SiteThemeDirectory), @"\.css") - .Where(name => !reservedNames.Contains(Parser.Parser.Normalize(name))).ToList(); + .GetFilesWithExtension(Scanner.Parser.Parser.NormalizePath(_directoryService.SiteThemeDirectory), @"\.css") + .Where(name => !reservedNames.Contains(Scanner.Parser.Parser.Normalize(name))).ToList(); var allThemes = (await _unitOfWork.SiteThemeRepository.GetThemes()).ToList(); @@ -64,7 +64,7 @@ public class ThemeService : IThemeService var userThemes = allThemes.Where(t => t.Provider == ThemeProvider.User).ToList(); foreach (var userTheme in userThemes) { - var filepath = Parser.Parser.NormalizePath( + var filepath = Scanner.Parser.Parser.NormalizePath( _directoryService.FileSystem.Path.Join(_directoryService.SiteThemeDirectory, userTheme.FileName)); if (_directoryService.FileSystem.File.Exists(filepath)) continue; @@ -78,7 +78,7 @@ public class ThemeService : IThemeService foreach (var themeFile in themeFiles) { var themeName = - Parser.Parser.Normalize(_directoryService.FileSystem.Path.GetFileNameWithoutExtension(themeFile)); + Scanner.Parser.Parser.Normalize(_directoryService.FileSystem.Path.GetFileNameWithoutExtension(themeFile)); if (allThemeNames.Contains(themeName)) continue; _unitOfWork.SiteThemeRepository.Add(new SiteTheme() diff --git a/API/SignalR/MessageFactory.cs b/API/SignalR/MessageFactory.cs index 47aa07f02..74ee4cc0f 100644 --- a/API/SignalR/MessageFactory.cs +++ b/API/SignalR/MessageFactory.cs @@ -108,7 +108,10 @@ namespace API.SignalR /// When files are being scanned to calculate word count /// private const string WordCountAnalyzerProgress = "WordCountAnalyzerProgress"; - + /// + /// A generic message that can occur in background processing to inform user, but no direct action is needed + /// + public const string Info = "Info"; public static SignalRMessage ScanSeriesEvent(int libraryId, int seriesId, string seriesName) @@ -261,9 +264,7 @@ namespace API.SignalR }; } - /** - * A generic error that will show on events widget in the UI - */ + public static SignalRMessage ErrorEvent(string title, string subtitle) { return new SignalRMessage @@ -281,6 +282,23 @@ namespace API.SignalR }; } + public static SignalRMessage InfoEvent(string title, string subtitle) + { + return new SignalRMessage + { + Name = Info, + Title = title, + SubTitle = subtitle, + Progress = ProgressType.None, + EventType = ProgressEventType.Single, + Body = new + { + Title = title, + SubTitle = subtitle, + } + }; + } + public static SignalRMessage LibraryModifiedEvent(int libraryId, string action) { return new SignalRMessage @@ -319,35 +337,42 @@ namespace API.SignalR /// Represents a file being scanned by Kavita for processing and grouping /// /// Does not have a progress as it's unknown how many files there are. Instead sends -1 to represent indeterminate - /// + /// /// /// /// - public static SignalRMessage FileScanProgressEvent(string filename, string libraryName, string eventType) + public static SignalRMessage FileScanProgressEvent(string folderPath, string libraryName, string eventType) { return new SignalRMessage() { Name = FileScanProgress, Title = $"Scanning {libraryName}", - SubTitle = Path.GetFileName(filename), + SubTitle = folderPath, EventType = eventType, Progress = ProgressType.Indeterminate, Body = new { Title = $"Scanning {libraryName}", - Subtitle = filename, - Filename = filename, + Subtitle = folderPath, + Filename = folderPath, EventTime = DateTime.Now, } }; } + /// + /// This informs the UI with details about what is being processed by the Scanner + /// + /// + /// + /// + /// public static SignalRMessage LibraryScanProgressEvent(string libraryName, string eventType, string seriesName = "") { return new SignalRMessage() { Name = ScanProgress, - Title = $"Scanning {libraryName}", + Title = $"Processing {seriesName}", SubTitle = seriesName, EventType = eventType, Progress = ProgressType.Indeterminate, diff --git a/API/SignalR/Presence/PresenceTracker.cs b/API/SignalR/Presence/PresenceTracker.cs index 45118aa8d..40cec42d0 100644 --- a/API/SignalR/Presence/PresenceTracker.cs +++ b/API/SignalR/Presence/PresenceTracker.cs @@ -38,6 +38,7 @@ namespace API.SignalR.Presence public async Task UserConnected(string username, string connectionId) { var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(username); + if (user == null) return; var isAdmin = await _unitOfWork.UserRepository.IsUserAdminAsync(user); lock (OnlineUsers) { diff --git a/API/Startup.cs b/API/Startup.cs index 31342e7d9..6297453a0 100644 --- a/API/Startup.cs +++ b/API/Startup.cs @@ -17,6 +17,7 @@ using API.Services.Tasks; using API.SignalR; using Hangfire; using Hangfire.MemoryStorage; +using Hangfire.Storage.SQLite; using Kavita.Common; using Kavita.Common.EnvironmentInfo; using Microsoft.AspNetCore.Builder; @@ -111,6 +112,12 @@ namespace API } }); + c.AddServer(new OpenApiServer() + { + Description = "Custom Url", + Url = "/" + }); + c.AddServer(new OpenApiServer() { Description = "Local Server", @@ -149,11 +156,13 @@ namespace API services.AddHangfire(configuration => configuration .UseSimpleAssemblyNameTypeSerializer() .UseRecommendedSerializerSettings() - .UseMemoryStorage()); + .UseMemoryStorage()); // UseSQLiteStorage - SQLite has some issues around resuming jobs when aborted // Add the processing server as IHostedService - services.AddHangfireServer(); - + services.AddHangfireServer(options => + { + options.Queues = new[] {TaskScheduler.ScanQueue, TaskScheduler.DefaultQueue}; + }); // Add IHostedService for startup tasks // Any services that should be bootstrapped go here services.AddHostedService(); @@ -174,15 +183,17 @@ namespace API var logger = serviceProvider.GetRequiredService>(); var userManager = serviceProvider.GetRequiredService>(); var themeService = serviceProvider.GetRequiredService(); + var dataContext = serviceProvider.GetRequiredService(); - await MigrateBookmarks.Migrate(directoryService, unitOfWork, - logger, cacheService); // Only run this if we are upgrading await MigrateChangePasswordRoles.Migrate(unitOfWork, userManager); await MigrateRemoveExtraThemes.Migrate(unitOfWork, themeService); + // Only needed for v0.5.5.x and v0.5.6 + await MigrateNormalizedLocalizedName.Migrate(unitOfWork, dataContext, logger); + // Update the version in the DB after all migrations are run var installVersion = await unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.InstallVersion); installVersion.Value = BuildInfo.Version.ToString(); @@ -260,13 +271,14 @@ namespace API app.Use(async (context, next) => { - context.Response.GetTypedHeaders().CacheControl = - new Microsoft.Net.Http.Headers.CacheControlHeaderValue() - { - Public = false, - MaxAge = TimeSpan.FromSeconds(10), - }; - context.Response.Headers[Microsoft.Net.Http.Headers.HeaderNames.Vary] = + // Note: I removed this as I caught Chrome caching api responses when it shouldn't have + // context.Response.GetTypedHeaders().CacheControl = + // new CacheControlHeaderValue() + // { + // Public = false, + // MaxAge = TimeSpan.FromSeconds(10), + // }; + context.Response.Headers[HeaderNames.Vary] = new[] { "Accept-Encoding" }; // Don't let the site be iframed outside the same origin (clickjacking) diff --git a/API/config/appsettings.Development.json b/API/config/appsettings.Development.json index 78d892e05..bd19064c4 100644 --- a/API/config/appsettings.Development.json +++ b/API/config/appsettings.Development.json @@ -5,11 +5,11 @@ "TokenKey": "super secret unguessable key", "Logging": { "LogLevel": { - "Default": "Critical", - "Microsoft": "Information", + "Default": "Debug", + "Microsoft": "Error", "Microsoft.Hosting.Lifetime": "Error", - "Hangfire": "Information", - "Microsoft.AspNetCore.Hosting.Internal.WebHost": "Information" + "Hangfire": "Error", + "Microsoft.AspNetCore.Hosting.Internal.WebHost": "Error" }, "File": { "Path": "config//logs/kavita.log", diff --git a/API/config/appsettings.json b/API/config/appsettings.json new file mode 100644 index 000000000..19637b881 --- /dev/null +++ b/API/config/appsettings.json @@ -0,0 +1,22 @@ +{ + "ConnectionStrings": { + "DefaultConnection": "Data source=config/kavita.db" + }, + "TokenKey": "super secret unguessable key", + "Logging": { + "LogLevel": { + "Default": "Information", + "Microsoft": "Error", + "Microsoft.Hosting.Lifetime": "Error", + "Hangfire": "Error", + "Microsoft.AspNetCore.Hosting.Internal.WebHost": "Error" + }, + "File": { + "Path": "config/logs/kavita.log", + "Append": "True", + "FileSizeLimitBytes": 10485760, + "MaxRollingFiles": 1 + } + }, + "Port": 5000 +} diff --git a/Dockerfile b/Dockerfile index 7233214f6..c8e090534 100644 --- a/Dockerfile +++ b/Dockerfile @@ -20,7 +20,7 @@ COPY --from=copytask /files/wwwroot /kavita/wwwroot #Installs program dependencies RUN apt-get update \ - && apt-get install -y libicu-dev libssl1.1 libgdiplus curl\ + && apt-get install -y libicu-dev libssl1.1 libgdiplus curl \ && rm -rf /var/lib/apt/lists/* COPY entrypoint.sh /entrypoint.sh diff --git a/Kavita.Common/Helpers/GlobMatcher.cs b/Kavita.Common/Helpers/GlobMatcher.cs new file mode 100644 index 000000000..3abd06f22 --- /dev/null +++ b/Kavita.Common/Helpers/GlobMatcher.cs @@ -0,0 +1,64 @@ +using System.Collections.Generic; +using System.Linq; +using DotNet.Globbing; + +namespace Kavita.Common.Helpers; + +/** + * Matches against strings using Glob syntax + */ +public class GlobMatcher +{ + private readonly IList _includes = new List(); + private readonly IList _excludes = new List(); + + public void AddInclude(string pattern) + { + _includes.Add(Glob.Parse(pattern)); + } + + public void AddExclude(string pattern) + { + _excludes.Add(Glob.Parse(pattern)); + } + + public bool ExcludeMatches(string file) + { + // NOTE: Glob.IsMatch() returns the opposite of what you'd expect + return _excludes.Any(p => p.IsMatch(file)); + } + + + /// + /// + /// + /// + /// + /// True if any + public bool IsMatch(string file, bool mustMatchIncludes = false) + { + // NOTE: Glob.IsMatch() returns the opposite of what you'd expect + if (_excludes.Any(p => p.IsMatch(file))) return true; + if (mustMatchIncludes) + { + return _includes.Any(p => p.IsMatch(file)); + } + + return false; + } + + public void Merge(GlobMatcher matcher) + { + if (matcher == null) return; + foreach (var glob in matcher._excludes) + { + _excludes.Add(glob); + } + + foreach (var glob in matcher._includes) + { + _includes.Add(glob); + } + + } +} diff --git a/Kavita.Common/Kavita.Common.csproj b/Kavita.Common/Kavita.Common.csproj index c0db9f43f..db5f58646 100644 --- a/Kavita.Common/Kavita.Common.csproj +++ b/Kavita.Common/Kavita.Common.csproj @@ -4,11 +4,12 @@ net6.0 kavitareader.com Kavita - 0.5.5.1 + 0.5.6.0 en + @@ -19,4 +20,4 @@ - \ No newline at end of file + diff --git a/UI/Web/src/app/_models/events/info-event.ts b/UI/Web/src/app/_models/events/info-event.ts new file mode 100644 index 000000000..9ad5a1616 --- /dev/null +++ b/UI/Web/src/app/_models/events/info-event.ts @@ -0,0 +1,32 @@ +import { EVENTS } from "src/app/_services/message-hub.service"; + +export interface InfoEvent { + /** + * Payload of the event subtype + */ + body: any; + /** + * Subtype event + */ + name: EVENTS.Info; + /** + * Title to display in events widget + */ + title: string; + /** + * Optional subtitle to display. Defaults to empty string + */ + subTitle: string; + /** + * Type of event. Helps events widget to understand how to handle said event + */ + eventType: 'single'; + /** + * Type of progress. Helps widget understand how to display spinner + */ + progress: 'none'; + /** + * When event was sent + */ + eventTime: string; +} \ No newline at end of file diff --git a/UI/Web/src/app/_models/series.ts b/UI/Web/src/app/_models/series.ts index 8ceda4fc3..9c3c9bd7e 100644 --- a/UI/Web/src/app/_models/series.ts +++ b/UI/Web/src/app/_models/series.ts @@ -48,6 +48,10 @@ export interface Series { * DateTime representing last time a chapter was added to the Series */ lastChapterAdded: string; + /** + * DateTime representing last time the series folder was scanned + */ + lastFolderScanned: string; /** * Number of words in the series */ @@ -55,4 +59,8 @@ export interface Series { minHoursToRead: number; maxHoursToRead: number; avgHoursToRead: number; + /** + * Highest level folder containing this series + */ + folderPath: string; } diff --git a/UI/Web/src/app/_services/account.service.ts b/UI/Web/src/app/_services/account.service.ts index f59583a52..3675e818e 100644 --- a/UI/Web/src/app/_services/account.service.ts +++ b/UI/Web/src/app/_services/account.service.ts @@ -165,7 +165,7 @@ export class AccountService implements OnDestroy { } confirmResetPasswordEmail(model: {email: string, token: string, password: string}) { - return this.httpClient.post(this.baseUrl + 'account/confirm-password-reset', model); + return this.httpClient.post(this.baseUrl + 'account/confirm-password-reset', model, {responseType: 'json' as 'text'}); } resetPassword(username: string, password: string, oldPassword: string) { @@ -228,8 +228,7 @@ export class AccountService implements OnDestroy { private refreshToken() { if (this.currentUser === null || this.currentUser === undefined) return of(); - //console.log('refreshing token and updating user account'); - + return this.httpClient.post<{token: string, refreshToken: string}>(this.baseUrl + 'account/refresh-token', {token: this.currentUser.token, refreshToken: this.currentUser.refreshToken}).pipe(map(user => { if (this.currentUser) { this.currentUser.token = user.token; diff --git a/UI/Web/src/app/_services/action-factory.service.ts b/UI/Web/src/app/_services/action-factory.service.ts index 9223c57ac..6b38dbaa4 100644 --- a/UI/Web/src/app/_services/action-factory.service.ts +++ b/UI/Web/src/app/_services/action-factory.service.ts @@ -18,9 +18,9 @@ export enum Action { */ MarkAsUnread = 1, /** - * Invoke a Scan Library + * Invoke a Scan on Series/Library */ - ScanLibrary = 2, + Scan = 2, /** * Delete the entity */ @@ -129,7 +129,7 @@ export class ActionFactoryService { }); this.seriesActions.push({ - action: Action.ScanLibrary, + action: Action.Scan, title: 'Scan Series', callback: this.dummyCallback, requiresAdmin: true @@ -171,7 +171,7 @@ export class ActionFactoryService { }); this.libraryActions.push({ - action: Action.ScanLibrary, + action: Action.Scan, title: 'Scan Library', callback: this.dummyCallback, requiresAdmin: true diff --git a/UI/Web/src/app/_services/action.service.ts b/UI/Web/src/app/_services/action.service.ts index d863887a4..ba905174c 100644 --- a/UI/Web/src/app/_services/action.service.ts +++ b/UI/Web/src/app/_services/action.service.ts @@ -52,11 +52,15 @@ export class ActionService implements OnDestroy { * @param callback Optional callback to perform actions after API completes * @returns */ - scanLibrary(library: Partial, callback?: LibraryActionCallback) { + async scanLibrary(library: Partial, callback?: LibraryActionCallback) { if (!library.hasOwnProperty('id') || library.id === undefined) { return; } - this.libraryService.scan(library?.id).pipe(take(1)).subscribe((res: any) => { + + // Prompt user if we should do a force or not + const force = false; // await this.promptIfForce(); + + this.libraryService.scan(library.id, force).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + library.name); if (callback) { callback(library); @@ -83,7 +87,9 @@ export class ActionService implements OnDestroy { return; } - this.libraryService.refreshMetadata(library?.id).pipe(take(1)).subscribe((res: any) => { + const forceUpdate = true; //await this.promptIfForce(); + + this.libraryService.refreshMetadata(library?.id, forceUpdate).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + library.name); if (callback) { callback(library); @@ -152,7 +158,7 @@ export class ActionService implements OnDestroy { * @param series Series, must have libraryId and name populated * @param callback Optional callback to perform actions after API completes */ - scanSeries(series: Series, callback?: SeriesActionCallback) { + async scanSeries(series: Series, callback?: SeriesActionCallback) { this.seriesService.scan(series.libraryId, series.id).pipe(take(1)).subscribe((res: any) => { this.toastr.info('Scan queued for ' + series.name); if (callback) { @@ -545,4 +551,16 @@ export class ActionService implements OnDestroy { } }); } + + private async promptIfForce(extraContent: string = '') { + // Prompt user if we should do a force or not + const config = this.confirmService.defaultConfirm; + config.header = 'Force Scan'; + config.buttons = [ + {text: 'Yes', type: 'secondary'}, + {text: 'No', type: 'primary'}, + ]; + const msg = 'Do you want to force this scan? This is will ignore optimizations that reduce processing and I/O. ' + extraContent; + return !await this.confirmService.confirm(msg, config); // Not because primary is the false state + } } diff --git a/UI/Web/src/app/_services/jumpbar.service.ts b/UI/Web/src/app/_services/jumpbar.service.ts index e5c22597c..43dc5ed3d 100644 --- a/UI/Web/src/app/_services/jumpbar.service.ts +++ b/UI/Web/src/app/_services/jumpbar.service.ts @@ -31,7 +31,7 @@ export class JumpbarService { const jumpBarKeysToRender: Array = []; const targetNumberOfKeys = parseInt(Math.floor(currentSize / keySize) + '', 10); const removeCount = jumpBarKeys.length - targetNumberOfKeys - 3; - if (removeCount <= 0) return jumpBarKeysToRender; + if (removeCount <= 0) return [...jumpBarKeys]; const removalTimes = Math.ceil(removeCount / 2); const midPoint = Math.floor(jumpBarKeys.length / 2); diff --git a/UI/Web/src/app/_services/library.service.ts b/UI/Web/src/app/_services/library.service.ts index ce03c2666..5aac12cfd 100644 --- a/UI/Web/src/app/_services/library.service.ts +++ b/UI/Web/src/app/_services/library.service.ts @@ -76,16 +76,16 @@ export class LibraryService { return this.httpClient.post(this.baseUrl + 'library/grant-access', {username, selectedLibraries}); } - scan(libraryId: number) { - return this.httpClient.post(this.baseUrl + 'library/scan?libraryId=' + libraryId, {}); + scan(libraryId: number, force = false) { + return this.httpClient.post(this.baseUrl + 'library/scan?libraryId=' + libraryId + '&force=' + force, {}); } analyze(libraryId: number) { return this.httpClient.post(this.baseUrl + 'library/analyze?libraryId=' + libraryId, {}); } - refreshMetadata(libraryId: number) { - return this.httpClient.post(this.baseUrl + 'library/refresh-metadata?libraryId=' + libraryId, {}); + refreshMetadata(libraryId: number, forceUpdate = false) { + return this.httpClient.post(this.baseUrl + 'library/refresh-metadata?libraryId=' + libraryId + '&force=' + forceUpdate, {}); } create(model: {name: string, type: number, folders: string[]}) { diff --git a/UI/Web/src/app/_services/message-hub.service.ts b/UI/Web/src/app/_services/message-hub.service.ts index 5ceb31e50..961afd6cb 100644 --- a/UI/Web/src/app/_services/message-hub.service.ts +++ b/UI/Web/src/app/_services/message-hub.service.ts @@ -71,7 +71,11 @@ export enum EVENTS { /** * When files are being scanned to calculate word count */ - WordCountAnalyzerProgress = 'WordCountAnalyzerProgress' + WordCountAnalyzerProgress = 'WordCountAnalyzerProgress', + /** + * When the user needs to be informed, but it's not a big deal + */ + Info = 'Info', } export interface Message { @@ -217,6 +221,13 @@ export class MessageHubService { }); }); + this.hubConnection.on(EVENTS.Info, resp => { + this.messagesSource.next({ + event: EVENTS.Info, + payload: resp.body + }); + }); + this.hubConnection.on(EVENTS.SeriesAdded, resp => { this.messagesSource.next({ event: EVENTS.SeriesAdded, diff --git a/UI/Web/src/app/_services/series.service.ts b/UI/Web/src/app/_services/series.service.ts index 2c7cbe71c..cc9c4ef60 100644 --- a/UI/Web/src/app/_services/series.service.ts +++ b/UI/Web/src/app/_services/series.service.ts @@ -153,8 +153,8 @@ export class SeriesService { return this.httpClient.post(this.baseUrl + 'series/refresh-metadata', {libraryId: series.libraryId, seriesId: series.id}); } - scan(libraryId: number, seriesId: number) { - return this.httpClient.post(this.baseUrl + 'series/scan', {libraryId: libraryId, seriesId: seriesId}); + scan(libraryId: number, seriesId: number, force = false) { + return this.httpClient.post(this.baseUrl + 'series/scan', {libraryId: libraryId, seriesId: seriesId, forceUpdate: force}); } analyzeFiles(libraryId: number, seriesId: number) { diff --git a/UI/Web/src/app/_services/theme.service.ts b/UI/Web/src/app/_services/theme.service.ts index 23dc8e90c..18e3764b9 100644 --- a/UI/Web/src/app/_services/theme.service.ts +++ b/UI/Web/src/app/_services/theme.service.ts @@ -2,7 +2,8 @@ import { DOCUMENT } from '@angular/common'; import { HttpClient } from '@angular/common/http'; import { Inject, Injectable, OnDestroy, Renderer2, RendererFactory2, SecurityContext } from '@angular/core'; import { DomSanitizer } from '@angular/platform-browser'; -import { map, ReplaySubject, Subject, takeUntil } from 'rxjs'; +import { ToastrService } from 'ngx-toastr'; +import { map, ReplaySubject, Subject, takeUntil, take } from 'rxjs'; import { environment } from 'src/environments/environment'; import { ConfirmService } from '../shared/confirm.service'; import { NotificationProgressEvent } from '../_models/events/notification-progress-event'; @@ -35,7 +36,7 @@ export class ThemeService implements OnDestroy { constructor(rendererFactory: RendererFactory2, @Inject(DOCUMENT) private document: Document, private httpClient: HttpClient, - messageHub: MessageHubService, private domSantizer: DomSanitizer, private confirmService: ConfirmService) { + messageHub: MessageHubService, private domSantizer: DomSanitizer, private confirmService: ConfirmService, private toastr: ToastrService) { this.renderer = rendererFactory.createRenderer(null, null); this.getThemes(); @@ -47,7 +48,9 @@ export class ThemeService implements OnDestroy { if (notificationEvent.name !== EVENTS.SiteThemeProgress) return; if (notificationEvent.eventType === 'ended') { - if (notificationEvent.name === EVENTS.SiteThemeProgress) this.getThemes().subscribe(() => {}); + if (notificationEvent.name === EVENTS.SiteThemeProgress) this.getThemes().subscribe(() => { + + }); } }); } @@ -73,6 +76,12 @@ export class ThemeService implements OnDestroy { return this.httpClient.get(this.baseUrl + 'theme').pipe(map(themes => { this.themeCache = themes; this.themesSource.next(themes); + this.currentTheme$.pipe(take(1)).subscribe(theme => { + if (!themes.includes(theme)) { + this.setTheme(this.defaultTheme); + this.toastr.info('The active theme no longer exists. Please refresh the page.'); + } + }); return themes; })); } diff --git a/UI/Web/src/app/admin/_modals/directory-picker/directory-picker.component.html b/UI/Web/src/app/admin/_modals/directory-picker/directory-picker.component.html index d7c9b08f0..758922079 100644 --- a/UI/Web/src/app/admin/_modals/directory-picker/directory-picker.component.html +++ b/UI/Web/src/app/admin/_modals/directory-picker/directory-picker.component.html @@ -57,7 +57,7 @@ diff --git a/UI/Web/src/app/admin/_modals/library-editor-modal/library-editor-modal.component.ts b/UI/Web/src/app/admin/_modals/library-editor-modal/library-editor-modal.component.ts index 9b71f4e2a..710d0e4d7 100644 --- a/UI/Web/src/app/admin/_modals/library-editor-modal/library-editor-modal.component.ts +++ b/UI/Web/src/app/admin/_modals/library-editor-modal/library-editor-modal.component.ts @@ -1,5 +1,5 @@ import { Component, Input, OnInit } from '@angular/core'; -import { UntypedFormControl, UntypedFormGroup, Validators } from '@angular/forms'; +import { FormControl, FormGroup, Validators } from '@angular/forms'; import { NgbActiveModal, NgbModal } from '@ng-bootstrap/ng-bootstrap'; import { ToastrService } from 'ngx-toastr'; import { ConfirmService } from 'src/app/shared/confirm.service'; @@ -17,9 +17,9 @@ export class LibraryEditorModalComponent implements OnInit { @Input() library: Library | undefined = undefined; - libraryForm: UntypedFormGroup = new UntypedFormGroup({ - name: new UntypedFormControl('', [Validators.required]), - type: new UntypedFormControl(0, [Validators.required]) + libraryForm: FormGroup = new FormGroup({ + name: new FormControl('', [Validators.required]), + type: new FormControl(0, [Validators.required]) }); selectedFolders: string[] = []; diff --git a/UI/Web/src/app/admin/_models/server-settings.ts b/UI/Web/src/app/admin/_models/server-settings.ts index 736cd39f2..72438a431 100644 --- a/UI/Web/src/app/admin/_models/server-settings.ts +++ b/UI/Web/src/app/admin/_models/server-settings.ts @@ -12,4 +12,5 @@ export interface ServerSettings { convertBookmarkToWebP: boolean; enableSwaggerUi: boolean; totalBackups: number; + enableFolderWatching: boolean; } diff --git a/UI/Web/src/app/admin/edit-user/edit-user.component.ts b/UI/Web/src/app/admin/edit-user/edit-user.component.ts index e002d7661..29b61caa5 100644 --- a/UI/Web/src/app/admin/edit-user/edit-user.component.ts +++ b/UI/Web/src/app/admin/edit-user/edit-user.component.ts @@ -1,5 +1,5 @@ import { Component, Input, OnInit } from '@angular/core'; -import { UntypedFormGroup, UntypedFormControl, Validators } from '@angular/forms'; +import { FormGroup, FormControl, Validators } from '@angular/forms'; import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap'; import { Library } from 'src/app/_models/library'; import { Member } from 'src/app/_models/member'; @@ -19,7 +19,7 @@ export class EditUserComponent implements OnInit { selectedLibraries: Array = []; isSaving: boolean = false; - userForm: UntypedFormGroup = new UntypedFormGroup({}); + userForm: FormGroup = new FormGroup({}); public get email() { return this.userForm.get('email'); } public get username() { return this.userForm.get('username'); } @@ -28,8 +28,8 @@ export class EditUserComponent implements OnInit { constructor(public modal: NgbActiveModal, private accountService: AccountService) { } ngOnInit(): void { - this.userForm.addControl('email', new UntypedFormControl(this.member.email, [Validators.required, Validators.email])); - this.userForm.addControl('username', new UntypedFormControl(this.member.username, [Validators.required])); + this.userForm.addControl('email', new FormControl(this.member.email, [Validators.required, Validators.email])); + this.userForm.addControl('username', new FormControl(this.member.username, [Validators.required])); this.userForm.get('email')?.disable(); } diff --git a/UI/Web/src/app/admin/invite-user/invite-user.component.html b/UI/Web/src/app/admin/invite-user/invite-user.component.html index 1164627a7..64d97438e 100644 --- a/UI/Web/src/app/admin/invite-user/invite-user.component.html +++ b/UI/Web/src/app/admin/invite-user/invite-user.component.html @@ -6,7 +6,7 @@