Mirror of https://github.com/Kareadita/Kavita.git
v0.5.0 Release (#960)
* Bump versions by dotnet-bump-version.
* Color Theme applies to scrollbars (#793)
* Moved more scss to the @use syntax to reduce css size
* Hooked in color-scheme to help the browser render scroll bars appropriately depending on the color scheme the user selects
* Bump versions by dotnet-bump-version.
* UI Updates + New Events (#806)
* Implemented the ability to see downloads users are performing on the events widget.
* Fixed a bug where the version update task was calling the wrong code
* Fixed a bug where, when checking for updates, the event wouldn't be pushed to the server with the correct name. Added the update check to the event widget rather than opening a modal on the user.
* Relaxed password requirements to only be 6-32 characters and inform the user on the register form about the requirements
* Removed a ton of duplicate logic for series cards where the logic was already defined in the action service
* Fixed OPDS total items giving a rounded number rather than total items.
* Fixed an off-by-one issue in OPDS pagination
* Bump versions by dotnet-bump-version.
* Update GA .NET version (#818)
* Local Metadata Integration Part 1 (#817)
* Started with some basic plumbing with ComicInfo parsing updating Series/Volume (see the sketch below).
* We can now get the chapter title from ComicInfo.xml
* Hooked in the ability to store people in the chapter metadata.
* Removed no-longer-used imports; fixed up some foreign key constraints on deleting a series with a person linked.
* Refactored Summary out of the UI for Series into SeriesMetadata. Updated the application to .NET 6. There is a bug in the metadata update code.
* Replaced the Parallel.ForEach with a normal foreach, which lets us use async. For I/O-heavy code, this shouldn't change much.
* Refactored scan code to only check extensions with comic info; fixed a bug on scan events not using the correct method name; removed the summary field (still buggy)
* Fixed a bug where, on cancelling a metadata request in the modal, the underlying button would get stuck in a disabled state.
* Changed how metadata selects the first volume to read summary info from. It will now select the first non-special volume rather than Volume 1.
* More debugging and found more bugs to fix
* Redid all the migrations as one single one. Fixed a bug with GetChapterInfo returning null when ChapterMetadata didn't exist for that Chapter. Fixed an issue with the mapper failing on GetChapterMetadata. Started work on adding people and a design for people.
* Checking whether a file is modified now takes into account whether the file has been processed at least once. Introduced a bug in saving people to series.
* Just made the code compilable again
* Fixed up the code. Now people for series and chapters add correctly without any db issues.
* Things are working, but I'm not happy with how the management of Person is. I need to take into account that one person needs to map to an image and role is arbitrary.
* Started adding UI code to showcase chapter metadata
* Updated the workflow to be .NET 6
* WIP of updating card detail to show the information more clearly and without so many if statements
* Removed ChapterMetadata and store it on the Chapter itself. Much easier to use and fewer joins.
* Implemented Genre on the SeriesMetadata level
* Genres and People are now removed from the Series level if they are no longer in ComicInfo
* PeopleHelper is done with unit tests. Everything is working.
* Unit tests in place for GenreHelper
* Starting on CacheHelper
* Finished tests for ShouldUpdateCoverImage. Fixed and added tests in ArchiveService/ScannerService.
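The local metadata work above centers on reading ComicInfo.xml out of an archive. As a rough illustration only (not Kavita's actual implementation; this ComicInfo shape is trimmed to a few fields), deserializing the XML with XmlSerializer looks like this:

    using System.IO;
    using System.Xml.Serialization;

    // Trimmed-down ComicInfo shape for illustration; the real schema has many more fields.
    public class ComicInfo
    {
        public string Title { get; set; }
        public string Summary { get; set; }
        public string Writer { get; set; }
        public int Number { get; set; }
    }

    public static class ComicInfoReader
    {
        // Deserialize a ComicInfo.xml entry that has already been extracted from the archive.
        public static ComicInfo Read(Stream xmlStream)
        {
            var serializer = new XmlSerializer(typeof(ComicInfo));
            return (ComicInfo) serializer.Deserialize(xmlStream);
        }
    }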
* CacheHelper is fully tested
* Some DI cleanup
* Scanner Service now calls GetComicInfo for books. Added the ability to update Series sort name from metadata files (mainly epub, as ComicInfo doesn't have a field)
* Forgot to move a line of code
* SortName now populates from metadata (epub only; ComicInfo has no tags)
* Cards now show the chapter title name on hover if it's set, else default back to the title.
* Fixed a major issue with how MangaFiles were being updated with LastModified, which messed up our logic for avoiding refreshes.
* Woohoo, more tests and some refactors to be able to test more services with a mock filesystem. Fixed an issue where SortName was getting set as the first chapter, but the Series was in a group.
* Refactored the MangaFile creation code into the DbFactory, where we also set up the first LastModified update.
* The has-file-changed bug is now finally fixed
* Remove dead genres; refactor genre to use title instead of name.
* Refactored a directory out of ShouldUpdateCoverImage() to keep the code clean
* Unit tests for ComicInfo on BookService.
* Refactored series detail into its own component
* Series-detail now receives refresh metadata events to refresh what's on screen
* Removed references to Artist on PersonRole as it has no metadata mapping
* Security audit
* Fixed a benchmark
* Updated the JWT token generator to use new methods in .NET 6
* Updated all the docker and build commands to use net6.0
* Commented out the sonar scan since it's not set up for net6.0 yet.
* Don't rely on test (#819)
* Bump versions by dotnet-bump-version.
* Update monorepo-build.sh: updated build to use net6.0
* Bump versions by dotnet-bump-version.
* Update sonar-scan.yml: more updates to 6.0
* Update sonar-scan.yml: please work
* Bump versions by dotnet-bump-version.
* Please work
* Bump versions by dotnet-bump-version.
* Local Metadata Integration Part 1 (#820): same change list as #817 above, plus:
* Removed some directives
* Removed my test db
* Bump versions by dotnet-bump-version.
* .NET 6 Coding Patterns + Unit Tests (#823)
* Refactored all files to have interfaces within the same file. Started moving over to file-scoped namespaces.
* Refactored common methods for getting an underlying file's cover, pages, and extraction into one interface.
* More refactoring around removing dependence on explicit filetype testing for getting information.
* Code is buildable, tests are broken. Huge refactor (not completed) which makes most of DirectoryService testable with a mock filesystem, and thus the services that utilize it (see the sketch below).
* Finished porting DirectoryService to use the mocked filesystem implementation.
* Added a null check
* Added a null check
* Finished all unit tests for DirectoryService.
* Some misc cleanup on the code
* Fixed up some bugs from refactoring the scan loop.
* Implemented CleanupService testing and refactored more of DirectoryService to be non-static. Fixed a bug where cover file cleanup wasn't properly finding files due to a regex bug.
* Fixed an issue in CleanupBackup() where we weren't properly selecting database files older than 30 days. Finished CleanupService tests.
* Refactored Flatten and RemoveNonImages to DirectoryService to allow CacheService to be testable.
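The mock-filesystem testability mentioned above comes from System.IO.Abstractions (its TestingHelpers package also appears in the API.Tests csproj diff below): services take an IFileSystem instead of calling the static File/Directory APIs, and tests substitute an in-memory MockFileSystem. A minimal sketch, with a hypothetical service and paths rather than Kavita's actual classes:

    using System.Collections.Generic;
    using System.IO.Abstractions;
    using System.IO.Abstractions.TestingHelpers;

    // Hypothetical service: depends on IFileSystem rather than static File/Directory calls.
    public class LibraryScanner
    {
        private readonly IFileSystem _fileSystem;
        public LibraryScanner(IFileSystem fileSystem) => _fileSystem = fileSystem;

        public bool HasFiles(string folder) =>
            _fileSystem.Directory.Exists(folder) &&
            _fileSystem.Directory.GetFiles(folder).Length > 0;
    }

    // In a test, the whole disk is faked in memory:
    // var fs = new MockFileSystem(new Dictionary<string, MockFileData>
    // {
    //     { @"c:\manga\vol 1.cbz", new MockFileData(string.Empty) }
    // });
    // Assert.True(new LibraryScanner(fs).HasFiles(@"c:\manga"));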
* Finally have CacheService tested. Rewrote GetCachedPagePath() to be much more straightforward & performant.
* Updated DefaultParserTests.cs to contain all existing tests and follow the new test layout format.
* All tests fixed up
* Bump versions by dotnet-bump-version.
* Bump docnet version to support x64 ARM pdfium binaries. (#826)
* Bump versions by dotnet-bump-version.
* Fixed bad build (#828)
* Bump versions by dotnet-bump-version.
* Bugfix/bad build (#829)
* Fixed bad build
* More directory service issues in Program
* Bump versions by dotnet-bump-version.
* Feature/local metadata more tags (#832)
* Stashing code
* Removed some debug code on the series detail page. Now detail is collapsed by default.
* Added AgeRating
* Fixed a crash when NetVips tries to write a cover file and the cover directory does not exist.
* When a card is selected for bulk actions, show an outline in addition to the select box
* Added AgeRating into the metadata parsing. Added a hack where ComicInfo uses Number in ComicInfo rather than Volume. This is to test out the effects on users' libraries.
* Added AgeRating and ReleaseDate to the metadata implementation.
* Bump versions by dotnet-bump-version.
* Misc Fixes (#839)
* Fixed a case where a chapter was being parsed incorrectly when the series title ends in a number.
* Updated Kavita to support the Tome/T notation found in French comics
* Added support for identifying European specials and expanded support for cleaning some tags used in European comics. During cleaning, if a series starts with - or a comma, remove it.
* Fixed an issue where add-to-collection for a single series wasn't calling the bulk action handler
* Fixed an NPE on AgeRating conversion. Fixed a bug where, when looking for the cover image, file extensions were throwing off the sort code.
* Refactored Natural Sort ordering to better follow how Windows behaves. This is a departure from how the original code executes.
* GetCachedPagePath now uses natural sorting to pick the images for reading in a more correct order.
* Updated the parser to handle a case where there was more than one space as a separator
* Bump versions by dotnet-bump-version.
* Beefed up ToC generation to handle new ways of packing epubs. (#840)
* Bump versions by dotnet-bump-version.
* Tachiyomi Enhancements (#845)
* Added a new endpoint to get all Series with Progress info.
* Fixed up some potential NPEs during scan
* Commented out filter code; not ready for it.
* Fixed up a parsing case for European comics
* Refactored FilterDto to allow for specifying multiple formats to return (see the sketch below).
* Refactored the UI to show OPDS as 3rd Party Clients since Tachiyomi now uses the OPDS url scheme for authentication.
* Bump versions by dotnet-bump-version.
* In-Depth Filtering (#850)
* Laying the foundation for the filter rework
* Filtering by Genre is now possible.
* Cleaned up code and prepared for People filtering
* People filtering is hooked up for the frontend
* Filtering now works. On Deck does not work with filtering currently due to a unique implementation.
* More cleanup
* Implemented the ability to reset the filters
* Added a mobile drawer for filtering
* Added some additional cases for NaturalSortComparer. Filter now uses a drawer on smaller screens.
* Fixed a bug where the backup service was not pointing to the correct directory.
* Undid the fix; it's working as expected
* Bump versions by dotnet-bump-version.
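The FilterDto format handling is exercised by FilterDtoExtensionsTests further down in this diff. A sketch consistent with those assertions — the DTO and enum here are abbreviated stand-ins, not the full Kavita types:

    using System;
    using System.Collections.Generic;

    // Abbreviated; the real enum and DTO carry more members.
    public enum MangaFormat { Image, Archive, Unknown, Epub, Pdf }

    public class FilterDto
    {
        // Null or empty means "no format restriction".
        public IList<MangaFormat> Formats { get; init; }
    }

    public static class FilterDtoExtensions
    {
        // Matches the behavior asserted in FilterDtoExtensionsTests below:
        // a missing/empty Formats list expands to every MangaFormat.
        public static IList<MangaFormat> GetSqlFilter(this FilterDto filter)
        {
            if (filter.Formats == null || filter.Formats.Count == 0)
            {
                return Enum.GetValues<MangaFormat>();
            }
            return filter.Formats;
        }
    }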
* More Filtering and Support for ComicInfo v2.1 (draft) Tags (#851)
* Added a recurring task to clean up db entries that might be abandoned. On the library page, the Library in question will be prepopulated.
* Laid out the foundation for customized sorting. Added an all-series page to the UI when clicking the Libraries section header on the home page, so the user can apply any filtering they like.
* When filtering, the current library filter will now automatically filter out the options for people and genres.
* Implemented Sorting controls
* Clear now clears sorting and read progress. Sorting is disabled on On Deck and Recently Added.
* Fixed an issue where the all-series page couldn't click to open a series
* Don't let the user unselect the last read progress. Added new ComicInfo v2.1 draft tags.
* Hooked the Translator tag into backend and UI.
* Fixed an issue where you could open multiple typeaheads at the same time
* Integrated Translator and Tags ComicInfo extension fields. Started work on a badge expander.
* Reworked the badge expander a bit more. Added the UI code for Age Rating and Tags
* Integrated backend for Tags, Translator, and Age Rating
* Metadata tags now collapse if more than 4 are present
* Some code cleanup
* Made the not-read badge slightly smaller
* Bump versions by dotnet-bump-version.
* More Filtering and Support for ComicInfo v2.1 (draft) Tags (#853): same change list as #851 above, plus:
* Implemented Language filter. Fixed up some logic around Release Year not being set when the month is missing.
* Added a missing file and updated Filter to use a different design for layout
* Fixed up some alignment issues
* Refined the styles further
* Bump versions by dotnet-bump-version.
* Fixes v0.4.19! (#855)
* Fixed OPDS urls to work with the new Filtering schema
* Fixed a rendering issue with the Language tag when it's null
* Fixed a bug where locked covers were resetting during refresh metadata.
* Redid all the migrations and put some extra checks in due to a bad migration from the previous release (EF Core was producing an error).
* Fixed a bug which didn't take sort direction into account when not changing the sort field
* Default installs now back up daily
* Bump versions by dotnet-bump-version.
* Fixed a bug in how to determine whether the volume or the series updates the cover image. (#857)
* Bump versions by dotnet-bump-version.
* More fixes (again) (#858)
* Send stack trace to the UI in prod mode
* Pdfs will now generate cover images. I missed something a few releases ago.
* Bump versions by dotnet-bump-version.
* I can't believe it's more fixes! (#863)
* Send stack trace to the UI in prod mode
* Pdfs will now generate cover images. I missed something a few releases ago.
* Ignore @Recently-Snapshot directories for QNAP.
* Refactored Bitmap code to use ImageSharp so it's truly cross-platform (see the sketch below).
* Updated pdf extraction to use a multi-threaded approach to greatly speed up pdf image extraction
* Hooked in the Characters tag from ComicInfo.xml
* Bump versions by dotnet-bump-version.
* Added check to see if mount folder is empty (#871)
* Added check for if the mount folder is empty
* Updating log message
* Adding unit test
* Bump versions by dotnet-bump-version.
* Reader Fixes and Enhancements (#880)
* Don't show an exception when bookmarking doesn't have anything to change.
* Cleaned up the bookmark code a bit.
* Implemented fullscreen mode in the web reader. Refactored User Settings to move Password and 3rd Party Clients to a tab rather than an accordion. Removed color filters for the web reader.
* Implemented fullscreen mode into the book reader
* Added some code for toggling fullscreen which re-renders the screen to ensure the fitting works optimally
* Fixed an issue where moving from FitToScreen -> Split (L/R) wouldn't render the screen correctly due to the canvas not being reset.
* Fixed bad optimization and scaling when drawing fit-to-screen
* Removed left/right highlights on page direction change in favor of icons. The double arrow will dictate the page change.
* Reduced the overlay auto-close time to 3 seconds
* Updated the paging direction overlay to use icons and colors. Added a blur effect on menus
* Removed debug flags
* Bump versions by dotnet-bump-version.
* Fixed a bug with OPDS feeds not returning results due to improperly setting up FilterDto. (#882)
* Bump versions by dotnet-bump-version.
* Bookmark Refactor (#893)
* Fixed a bug which didn't take sort direction into account when not changing the sort field
* Added the foundation for the Bookmark refactor
* Code broken, need to take a break. The issue is that getting the bookmark image needs authentication but the UI doesn't send it.
* Implemented the ability to send bookmarked files to the web. Implemented the ability to clear bookmarks on disk on a recurring basis.
* Updated the bookmark design to have its own self-contained card. The view-bookmarks modal has been updated to better lay out the cards.
* Refactored the download bookmark code to select files from the bookmark directory directly rather than open underlying files.
* Wrote the basic logic to kick-start the bookmark migration. Added Installed Version into the DB to let us know more accurately when to run migrations
* Implemented the ability to change the bookmarks directory
* Updated all references to BookmarkDirectory to use the setting from the DB. Updated the Server Settings page to use 2 columns for some rows.
* Refactored some code to DirectoryService (hasWriteAccess) and fixed up some unit tests from a previous PR.
* Treat folders that start with ._ as blacklisted.
* Implemented Reset User Preferences. Some extra code to prep for the migration.
* Implemented a migration for existing bookmarks to the new filesystem-based bookmarks
* Bump versions by dotnet-bump-version.
* Only admins should be able to create new users (#896)
* Bump versions by dotnet-bump-version.
* Backup on Migrations (#898)
* Refactored how the migrations are run.
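The Bitmap-to-ImageSharp move flagged above swaps the Windows-centric System.Drawing stack for a fully managed, cross-platform library. A minimal sketch of what an ImageSharp-based cover resize can look like — the method calls are from the ImageSharp API, the file names are placeholders:

    using SixLabors.ImageSharp;
    using SixLabors.ImageSharp.Processing;

    // Load a page image, scale it down to a cover-sized thumbnail, and save it.
    // These calls behave the same on Windows, Linux, and macOS.
    using var image = Image.Load("page-001.png");
    image.Mutate(ctx => ctx.Resize(width: 320, height: 0)); // height 0 keeps the aspect ratio
    image.Save("cover.png");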
* A backup will be performed before any migrations. Added additional guards before a sub-module is loaded.
* Bump versions by dotnet-bump-version.
* Fixed a critical bug where registration was broken for the first-time flow. Refactored how backup-before-migrations occurred such that it now puts the db in temp. The db will be deleted automatically that night. (#900)
* Bump versions by dotnet-bump-version.
* Fixed lack of ability to scroll with fullscreen mode (#903)
* Bump versions by dotnet-bump-version.
* Book Reader Issues (#906)
* Refactored the Font Escaping Regex with new unit tests.
* Fonts are now properly escaped; somehow a regression was introduced.
* Refactored most of the book page loading for the reader into the service.
* Fixed a bug where going into fullscreen in non-dark mode would cause the background of the reader to go black. Fixed a rendering issue with margin left/right screwing up the html. Fixed an issue where line-height: 100% would break a book's css; now we remove the styles if they are non-valuable.
* Changed how I fixed the black mode in fullscreen
* Fixed an issue where anchors wouldn't be colored blue in white mode
* Fixed a bug in the code that checks whether a filename is a cover, where it would choose "backcover" as a cover despite it not being a valid case.
* Validate whether ReleaseYear is a valid year and, if not, set it to 0 to disable it.
* Fixed an issue where some large images could blow out the screen when reading on mobile. Now images are forced to a max of the browser width
* Put my hack back in for fullscreen putting the background color to black
* Changed forwarded headers from All to explicit names (see the sketch below)
* Fixed an issue where Scheme was not https when it should have been. Now the browser will handle which scheme to request.
* Cleaned up the user preferences to stack multiple controls onto one row
* Fixed the fullscreen scroll issue with progress, but now the sticky top is missing.
* Corrected the element on which we fullscreen
* Bump versions by dotnet-bump-version.
* Fixed a bug where loading a page in the book reader wouldn't scroll to the last position (#907)
* Bump versions by dotnet-bump-version.
* Metadata Optimizations (#910)
* Added a tooltip to inform the user that format and collection filter selections do not only show for the selected library.
* Refactored a lot of code around when we update chapter cover images. Applied an optimization for when we re-calculate volume/series covers, such that it only occurs when the first chapter's image updates.
* Updated code to ensure only LastModified gets refreshed in metadata, since it always follows a scan
* Optimized how metadata is populated on the series. Instead of re-reading the ComicInfos, I read the data from the underlying chapter entities. This reduces N additional reads AND enables the ability in the future to show/edit chapter-level metadata.
* Spelling mistake
* Fixed a concurrency issue by not selecting Genres from the DB. Added a test for long paths.
* Fixed a bug in filter where the collection tag wasn't populating on load
* Cleaned up the logic for the changelog to better compare against the installed version. For nightly users, show the last stable as installed.
* Removed some demo code
* SplitQuery to allow loading tags much faster for series metadata load.
* Bump versions by dotnet-bump-version.
* Fixed the book reader off-by-one issue with loading the last page (#911)
* Bump versions by dotnet-bump-version.
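The forwarded-headers change flagged above is standard ASP.NET Core middleware configuration: naming the headers explicitly, rather than trusting ForwardedHeaders.All, limits what a reverse proxy is allowed to override. A sketch of the pattern using .NET 6 minimal hosting, not Kavita's exact startup code:

    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.HttpOverrides;

    var builder = WebApplication.CreateBuilder(args);
    var app = builder.Build();

    // Trust only the client-IP and scheme headers from the proxy,
    // instead of ForwardedHeaders.All.
    app.UseForwardedHeaders(new ForwardedHeadersOptions
    {
        ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
    });

    app.Run();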
* Misc Fixes (#914)
* Fixed the book reader off-by-one issue with loading the last page
* Fixed a case where the scanner would not delete a series if another series with the same name but a different format was added in that same scan.
* Added some missing tag generation (chapter language and summary)
* Bump versions by dotnet-bump-version.
* Implemented Publication Status in SeriesMetadata and the ability to filter it. (#915)
* Bump versions by dotnet-bump-version.
* Book Reader Issue Take 2 (#916)
* Implemented Publication Status in SeriesMetadata and the ability to filter it.
* Updated the docs for Language on metadata to specify it's a BCP-47 code, to match the Anansi Project. Fixed a bug with the reader from the previous PR.
* Bump versions by dotnet-bump-version.
* Don't tag a series as completed if count is 0. (#917)
* Bump versions by dotnet-bump-version.
* Last Page Rendering Twice on Web Reader Fix (#920)
* Don't tag a series as completed if count is 0.
* Removed some dead code and added some spacers for when certain fields are disabled so the filter section still looks good.
* Fixed a bug where the last page of the manga reader would be rendered twice when paging backwards.
* Bump versions by dotnet-bump-version.
* Metadata Performance Scan (#921)
* Refactored updating chapter metadata from ComicInfo into the scan loop. This lets us avoid an additional N file reads (expensive) in the metadata service, as we already have to read them in the scan loop.
* Refactored series-level metadata aggregation into the scan loop. This allows the batching of DB updates to be much smaller, thus faster, without much GC overhead.
* Refactored some of the code for ProcessFile to remove a few redundant if statements
* Fixed broken build (#922)
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Misc Fixes and Changes (#927)
* Cleaned up a ton of warnings/suggestions from the IDE.
* Fixed a bug where clearing the filters could undo some presets.
* Renamed a class in the OPDS spec
* Simplified the logic for when Fit To Screen rendering occurs. It now works always rather than only on cover images.
* Give some additional info to the user on what the differences between Library Types are
* Don't scan .qpkg folders (QNAP devices)
* Refactored some code to enable testing CoverImage. This is a broken test; test.zip is waiting on an issue in NetVips.
* Fixed an issue where Extra might get flagged as a special too early, if in a word like Extraordinary
* Cleaned up the regex for the Extra issue to be more flexible
* Bump versions by dotnet-bump-version.
* Bump follow-redirects from 1.13.0 to 1.14.7 in /UI/Web (#929)
  Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.13.0 to 1.14.7.
  - [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
  - [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.13.0...v1.14.7)
  updated-dependencies:
  - dependency-name: follow-redirects
    dependency-type: indirect
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
* Fixed a bug in CleanupBookmarks where the Except was deleting all files because the path separators didn't match. (#932)
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Feature/parse scanned files tests (#934)
* Fixed a bug in CleanupBookmarks where the Except was deleting all files because the path separators didn't match.
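The CleanupBookmarks bug above is a classic path-comparison pitfall: Except compares raw strings, so c:\bookmarks\file.png and c:/bookmarks/file.png never match and everything looks deletable. The Parser.NormalizePath added in the next PR group addresses exactly this. A minimal sketch of the idea, not Kavita's exact implementation:

    using System.Linq;

    public static class PathHelpers
    {
        // Normalize every path to forward slashes before comparing.
        public static string NormalizePath(string path) => path.Replace('\\', '/');
    }

    // Usage: compare normalized forms so Windows- and Unix-style paths match.
    // var toDelete = allFiles.Select(PathHelpers.NormalizePath)
    //                        .Except(bookmarkedFiles.Select(PathHelpers.NormalizePath));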
* Added unit tests for ParseScannedFiles.cs.
* Fixed some unit tests. The parser will now clear out multiple spaces in a row and replace them with a single one.
* Bump versions by dotnet-bump-version.
* Performance Enhancements (#937)
* Bump versions by dotnet-bump-version.
* Unit Tests & New Natural Sort (#941)
* Added a lot of tests
* More tests! Added a Parser.NormalizePath to normalize all paths within Kavita.
* Fixed a bug where the MarkChaptersAsUnread implementation wasn't consistent between different files and led to extra row generation for no reason.
* Added more unit tests
* Found a better implementation for Natural Sorting. Added tests and validated it works. The next commit will swap out Natural Sort for the new extension.
* Replaced NaturalSortComparer with OrderByNatural (see the sketch below).
* Drastically simplified and sped up FindFirstEntry for finding cover images in archives
* Initial fix for an epub bug where metadata defines a key as an absolute path but the document uses a relative path. We now have a hack to correct for the epub.
* Bump versions by dotnet-bump-version.
* Fix image aspect ratio in rare case (#935)
* Bump versions by dotnet-bump-version.
* Fixed missing handlers for adding a chapter to a reading list from the card details modal and adding a series to a collection from series detail. (#942)
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Metadata Tags (#947)
* Implemented the ability to click a metadata tag (in series detail) and load a pre-filtered view. Apply still needs to be implemented (preset load is out of sync with the external filter)
* Refactored people to properly use typeahead so duplicates don't happen, and use an observable chain so we can update the screen correctly
* Many refactorings to ensure that the timings for filtering always work
* Bump versions by dotnet-bump-version.
* Removed a hack that was put in when users complained about a tool improperly tagging. This is not the case for most tools. (#949)
* Scanner not merging with series that has LocalizedName match (#950)
* When performing a scan, series should group if they share the same localized name as a pre-existing series.
* Fixed a bug where a series with a different name and localized name wasn't merging with a different set of files named the same as the localized name.
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Bump versions by dotnet-bump-version.
* Reader Fixes (#951)
* Normalized paths on the download controller; and when a scan is killed due to missing or empty folders, log a critical error.
* Tweaked the query for On Deck to better promote recently added chapters in a series with read progress, but it's still not perfect.
* Fixed an issue where the up/down keys weren't working unless you clicked on the book explicitly
* Fixed an issue where the infinite scroller was broken in fullscreen mode
* When toggling fullscreen mode on the infinite scroller, the current page is retained as the current position
* Fixed an issue where a double render would occur when we didn't need to render as fit split
* Stop showing the loader when not using fit split
* Bump versions by dotnet-bump-version.
* Shakeout testing Fixes (#952)
* Cleaned up some old code in download bookmark that could create pointless temp folders.
* Fixed a bad http call on reading-list remove-read and cleaned up the messaging
* Undid an optimization in finding the cover image because it performed depth-first rather than breadth-first.
* Updated CleanComicInfo to have Translators and CoverArtists, which were previously missing.
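OrderByNatural, flagged above and exercised by the EnumerableExtensionsTests in the diff below, makes "x3.jpg" sort before "x10.jpg". One common way to implement it, and a plausible reading of the approach (a sketch, not necessarily Kavita's exact code), is to zero-pad every digit run before an ordinary ordinal string sort:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text.RegularExpressions;

    public static class EnumerableExtensions
    {
        // Natural ("Windows Explorer"-style) ordering: digit runs compare numerically.
        public static IEnumerable<T> OrderByNatural<T>(this IEnumerable<T> items, Func<T, string> selector)
        {
            var list = items.ToList();
            // The widest digit run in any key decides how far to zero-pad.
            var maxDigits = list
                .SelectMany(i => Regex.Matches(selector(i), @"\d+").Cast<Match>().Select(m => m.Value.Length))
                .DefaultIfEmpty(0)
                .Max();
            return list.OrderBy(
                i => Regex.Replace(selector(i), @"\d+", m => m.Value.PadLeft(maxDigits, '0')),
                StringComparer.Ordinal);
        }
    }

    // e.g. new[] {"x1.jpg", "x10.jpg", "x3.jpg"}.OrderByNatural(x => x)
    //      => x1.jpg, x3.jpg, x10.jpg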
* Renamed Refresh Metadata to Refresh Covers on the UI, given metadata refresh is done in Scan.
* Library detail will now retain the search query in the UI. Reduced the number of api calls to the backend on load.
* Reverted allowing the filter to reside in the UI (even though it does work).
* Updated the Age Rating to match the v2.1 spec.
* Fixed a bug where progress wasn't being saved
* Fixed line height not having any effect due to not applying to children elements in the reader
* Fixed some wording for the Refresh Covers confirmation
* Delete Series will now send an event to the UI informing that the series was deleted.
* Changed the Progress widget to show "Refreshing Covers for"
* When we exit early due to potential missing folders/drives in a scan, tell the UI that the scan is 100% done.
* Fixed manage library not suppressing the scan loader when a complete event came in
* Fixed a spelling difference for Publication Status between filter and series detail
* Fixed a bug where the collection detail page would flash on first load due to duplicate load events
* Added bookmarks to backups
* Fixed issues where fullscreen mode would break the infinite scroller's continuous reader
* Bump versions by dotnet-bump-version.
* Fixed GetTags having the wrong return type defined (#954)
* Bump versions by dotnet-bump-version.
* Missing Age Ratings (#955)
* Fixed GetTags having the wrong return type defined
* Added missing Age Rating tags
* Bump versions by dotnet-bump-version.
* Version bump for release (#953)

Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Mackrodt <andrewmackrodt@gmail.com>
This commit is contained in:
parent 7fb41f0945
commit 41096d6dc5
2 changes: .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -9,7 +9,7 @@ assignees: ''

**If this is a feature request, request [here](https://feats.kavitareader.com/) instead. Feature requests will be deleted from Github.**

Please put as much information as possible to help me understand your issue. OS, browser, version are very important!

**Describe the bug**
A clear and concise description of what the bug is.
133 changes: .github/workflows/sonar-scan.yml (vendored)
@@ -20,7 +20,8 @@ jobs:
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 5.0.100
          include-prerelease: True
          dotnet-version: '6.0'

      - name: Install dependencies
        run: dotnet restore
@@ -35,67 +36,68 @@ jobs:
          name: csproj
          path: Kavita.Common/Kavita.Common.csproj

  test:
    name: Install Sonar & Test
    needs: build
    runs-on: windows-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@v2
        with:
          fetch-depth: 0

      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 5.0.100

      - name: Install dependencies
        run: dotnet restore

      - name: Set up JDK 11
        uses: actions/setup-java@v1
        with:
          java-version: 1.11

      - name: Cache SonarCloud packages
        uses: actions/cache@v1
        with:
          path: ~\sonar\cache
          key: ${{ runner.os }}-sonar
          restore-keys: ${{ runner.os }}-sonar

      - name: Cache SonarCloud scanner
        id: cache-sonar-scanner
        uses: actions/cache@v1
        with:
          path: .\.sonar\scanner
          key: ${{ runner.os }}-sonar-scanner
          restore-keys: ${{ runner.os }}-sonar-scanner

      - name: Install SonarCloud scanner
        if: steps.cache-sonar-scanner.outputs.cache-hit != 'true'
        shell: powershell
        run: |
          New-Item -Path .\.sonar\scanner -ItemType Directory
          dotnet tool update dotnet-sonarscanner --tool-path .\.sonar\scanner

      - name: Sonar Scan
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        shell: powershell
        run: |
          .\.sonar\scanner\dotnet-sonarscanner begin /k:"Kareadita_Kavita" /o:"kareadita" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="https://sonarcloud.io"
          dotnet build --configuration Release
          .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"

      - name: Test
        run: dotnet test --no-restore --verbosity normal

#  test:
#    name: Install Sonar & Test
#    needs: build
#    runs-on: windows-latest
#    steps:
#      - name: Checkout Repo
#        uses: actions/checkout@v2
#        with:
#          fetch-depth: 0
#
#      - name: Setup .NET Core
#        uses: actions/setup-dotnet@v1
#        with:
#          include-prerelease: True
#          dotnet-version: '6.0'
#
#      - name: Install dependencies
#        run: dotnet restore
#
#      - name: Set up JDK 11
#        uses: actions/setup-java@v1
#        with:
#          java-version: 1.11
#
#      - name: Cache SonarCloud packages
#        uses: actions/cache@v1
#        with:
#          path: ~\sonar\cache
#          key: ${{ runner.os }}-sonar
#          restore-keys: ${{ runner.os }}-sonar
#
#      - name: Cache SonarCloud scanner
#        id: cache-sonar-scanner
#        uses: actions/cache@v1
#        with:
#          path: .\.sonar\scanner
#          key: ${{ runner.os }}-sonar-scanner
#          restore-keys: ${{ runner.os }}-sonar-scanner
#
#      - name: Install SonarCloud scanner
#        if: steps.cache-sonar-scanner.outputs.cache-hit != 'true'
#        shell: powershell
#        run: |
#          New-Item -Path .\.sonar\scanner -ItemType Directory
#          dotnet tool update dotnet-sonarscanner --tool-path .\.sonar\scanner
#
#      - name: Sonar Scan
#        env:
#          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
#          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
#        shell: powershell
#        run: |
#          .\.sonar\scanner\dotnet-sonarscanner begin /k:"Kareadita_Kavita" /o:"kareadita" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="https://sonarcloud.io"
#          dotnet build --configuration Release
#          .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"
#
#      - name: Test
#        run: dotnet test --no-restore --verbosity normal

  version:
    name: Bump version on Develop push
    needs: [ build, test ]
    needs: [ build ]
    runs-on: ubuntu-latest
    if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/develop' }}
    steps:
@@ -106,7 +108,8 @@ jobs:
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 5.0.100
          include-prerelease: True
          dotnet-version: '6.0'

      - name: Install dependencies
        run: dotnet restore
@@ -122,7 +125,7 @@

  develop:
    name: Build Nightly Docker if Develop push
    needs: [ build, test, version ]
    needs: [ build, version ]
    runs-on: ubuntu-latest
    if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/develop' }}
    steps:
@@ -186,7 +189,8 @@ jobs:
      - name: Compile dotnet app
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'
          include-prerelease: True
          dotnet-version: '6.0'
      - run: ./monorepo-build.sh

      - name: Login to Docker Hub
@@ -225,7 +229,7 @@

  stable:
    name: Build Stable Docker if Main push
    needs: [ build, test ]
    needs: [ build ]
    runs-on: ubuntu-latest
    if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/main' }}
    steps:
@@ -299,7 +303,8 @@ jobs:
      - name: Compile dotnet app
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'
          include-prerelease: True
          dotnet-version: '6.0'
      - run: ./monorepo-build.sh

      - name: Login to Docker Hub
6 changes: .gitignore (vendored)
@@ -508,6 +508,7 @@ UI/Web/dist/
/API/config/cache/
/API/config/temp/
/API/config/stats/
/API/config/bookmarks/
/API/config/kavita.db
/API/config/kavita.db-shm
/API/config/kavita.db-wal
@@ -517,5 +518,8 @@ API/config/covers/
API/config/*.db
API/config/stats/*
API/config/stats/app_stats.json

API/config/pre-metadata/
API/config/post-metadata/
API.Tests/TestResults/
UI/Web/.vscode/settings.json
/API.Tests/Services/Test Data/ArchiveService/CoverImages/output/*
@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
    <TargetFramework>net6.0</TargetFramework>
    <OutputType>Exe</OutputType>
  </PropertyGroup>

@@ -1,6 +1,6 @@
using System.IO;
using System.IO.Abstractions;
using API.Entities.Enums;
using API.Interfaces.Services;
using API.Parser;
using API.Services;
using API.Services.Tasks.Scanner;
@@ -20,11 +20,15 @@ namespace API.Benchmark
    private readonly ParseScannedFiles _parseScannedFiles;
    private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
    private readonly ILogger<BookService> _bookLogger = Substitute.For<ILogger<BookService>>();
    private readonly IArchiveService _archiveService = Substitute.For<ArchiveService>();

    public ParseScannedFilesBenchmarks()
    {
        IBookService bookService = new BookService(_bookLogger);
        _parseScannedFiles = new ParseScannedFiles(bookService, _logger);
        var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());
        _parseScannedFiles = new ParseScannedFiles(
            Substitute.For<ILogger>(),
            directoryService,
            new ReadingItemService(_archiveService, new BookService(_bookLogger, directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), directoryService)), Substitute.For<ImageService>(), directoryService));
    }

    // [Benchmark]
@@ -3,6 +3,7 @@ using System.Collections.Generic;
using System.Linq;
using API.Comparators;
using API.DTOs;
using API.Extensions;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Order;

@@ -16,9 +17,6 @@ namespace API.Benchmark
    [RankColumn]
    public class TestBenchmark
    {
        private readonly NaturalSortComparer _naturalSortComparer = new ();

        private static IEnumerable<VolumeDto> GenerateVolumes(int max)
        {
            var random = new Random();
@@ -50,11 +48,11 @@ namespace API.Benchmark
            return list;
        }

        private void SortSpecialChapters(IEnumerable<VolumeDto> volumes)
        private static void SortSpecialChapters(IEnumerable<VolumeDto> volumes)
        {
            foreach (var v in volumes.Where(vDto => vDto.Number == 0))
            {
                v.Chapters = v.Chapters.OrderBy(x => x.Range, _naturalSortComparer).ToList();
                v.Chapters = v.Chapters.OrderByNatural(x => x.Range).ToList();
            }
        }

@@ -1,15 +1,16 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
    <TargetFramework>net6.0</TargetFramework>

    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="5.0.10" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.11.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="6.0.1" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.0.0" />
    <PackageReference Include="NSubstitute" Version="4.2.2" />
    <PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="16.1.2" />
    <PackageReference Include="xunit" Version="2.4.1" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.4.3">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
17 changes: API.Tests/Comparers/ChapterSortComparerZeroFirstTests.cs (new file)
@@ -0,0 +1,17 @@
using System.Linq;
using API.Comparators;
using Xunit;

namespace API.Tests.Comparers;

public class ChapterSortComparerZeroFirstTests
{
    [Theory]
    [InlineData(new[] {1, 2, 0}, new[] {0, 1, 2,})]
    [InlineData(new[] {3, 1, 2}, new[] {1, 2, 3})]
    [InlineData(new[] {1, 0, 0}, new[] {0, 0, 1})]
    public void ChapterSortComparerZeroFirstTest(int[] input, int[] expected)
    {
        Assert.Equal(expected, input.OrderBy(f => f, new ChapterSortComparerZeroFirst()).ToArray());
    }
}
26 changes: API.Tests/Comparers/NumericComparerTests.cs (new file)
@@ -0,0 +1,26 @@
using System;
using API.Comparators;
using Xunit;

namespace API.Tests.Comparers;

public class NumericComparerTests
{
    [Theory]
    [InlineData(
        new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
        new[] {"x1.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
    )]
    public void NumericComparerTest(string[] input, string[] expected)
    {
        var nc = new NumericComparer();
        Array.Sort(input, nc);

        var i = 0;
        foreach (var s in input)
        {
            Assert.Equal(s, expected[i]);
            i++;
        }
    }
}
@@ -8,13 +8,20 @@ namespace API.Tests.Comparers
{
    [Theory]
    [InlineData(
        new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
        new[] {"x1.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
    )]
    public void TestLogicalComparer(string[] input, string[] expected)
    [InlineData(
        new[] {"a.jpg", "aaa.jpg", "1.jpg", },
        new[] {"1.jpg", "a.jpg", "aaa.jpg"}
    )]
    [InlineData(
        new[] {"a.jpg", "aaa.jpg", "1.jpg", "!cover.png"},
        new[] {"!cover.png", "1.jpg", "a.jpg", "aaa.jpg"}
    )]
    public void StringComparer(string[] input, string[] expected)
    {
        NumericComparer nc = new NumericComparer();
        Array.Sort(input, nc);
        Array.Sort(input, StringLogicalComparer.Compare);

        var i = 0;
        foreach (var s in input)
@@ -24,4 +31,4 @@ namespace API.Tests.Comparers
        }
    }
}
}
}
@@ -9,9 +9,11 @@ namespace API.Tests.Converters
    [InlineData("daily", "0 0 * * *")]
    [InlineData("disabled", "0 0 31 2 *")]
    [InlineData("weekly", "0 0 * * 1")]
    [InlineData("", "0 0 31 2 *")]
    [InlineData("sdfgdf", "")]
    public void ConvertTest(string input, string expected)
    {
        Assert.Equal(expected, CronConverter.ConvertToCronNotation(input));
    }
}
}
}
38 changes: API.Tests/Entities/ComicInfoTests.cs (new file)
@@ -0,0 +1,38 @@
using API.Data.Metadata;
using API.Entities.Enums;
using Xunit;

namespace API.Tests.Entities;

public class ComicInfoTests
{
    #region ConvertAgeRatingToEnum

    [Theory]
    [InlineData("G", AgeRating.G)]
    [InlineData("Everyone", AgeRating.Everyone)]
    [InlineData("Teen", AgeRating.Teen)]
    [InlineData("Adults Only 18+", AgeRating.AdultsOnly)]
    [InlineData("Early Childhood", AgeRating.EarlyChildhood)]
    [InlineData("Everyone 10+", AgeRating.Everyone10Plus)]
    [InlineData("M", AgeRating.Mature)]
    [InlineData("MA 15+", AgeRating.Mature15Plus)]
    [InlineData("Mature 17+", AgeRating.Mature17Plus)]
    [InlineData("Rating Pending", AgeRating.RatingPending)]
    [InlineData("X18+", AgeRating.X18Plus)]
    [InlineData("Kids to Adults", AgeRating.KidsToAdults)]
    [InlineData("NotValid", AgeRating.Unknown)]
    [InlineData("PG", AgeRating.PG)]
    [InlineData("R18+", AgeRating.R18Plus)]
    public void ConvertAgeRatingToEnum_ShouldConvertCorrectly(string input, AgeRating expected)
    {
        Assert.Equal(expected, ComicInfo.ConvertAgeRatingToEnum(input));
    }

    [Fact]
    public void ConvertAgeRatingToEnum_ShouldCompareCaseInsensitive()
    {
        Assert.Equal(AgeRating.RatingPending, ComicInfo.ConvertAgeRatingToEnum("rating pending"));
    }
    #endregion
}
@@ -1,4 +1,5 @@
using System.Collections.Generic;
using System.Linq;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
@@ -9,7 +10,7 @@ namespace API.Tests.Extensions
{
    public class ChapterListExtensionsTests
    {
        private Chapter CreateChapter(string range, string number, MangaFile file, bool isSpecial)
        private static Chapter CreateChapter(string range, string number, MangaFile file, bool isSpecial)
        {
            return new Chapter()
            {
@@ -20,7 +21,7 @@ namespace API.Tests.Extensions
            };
        }

        private MangaFile CreateFile(string file, MangaFormat format)
        private static MangaFile CreateFile(string file, MangaFormat format)
        {
            return new MangaFile()
            {
@@ -28,7 +29,7 @@ namespace API.Tests.Extensions
                Format = format
            };
        }

        [Fact]
        public void GetAnyChapterByRange_Test_ShouldBeNull()
        {
@@ -51,11 +52,11 @@ namespace API.Tests.Extensions
            };

            var actualChapter = chapterList.GetChapterByRange(info);

            Assert.NotEqual(chapterList[0], actualChapter);

        }

        [Fact]
        public void GetAnyChapterByRange_Test_ShouldBeNotNull()
        {
@@ -78,9 +79,39 @@ namespace API.Tests.Extensions
            };

            var actualChapter = chapterList.GetChapterByRange(info);

            Assert.Equal(chapterList[0], actualChapter);
        }

        #region GetFirstChapterWithFiles

        [Fact]
        public void GetFirstChapterWithFiles_ShouldReturnAllChapters()
        {
            var chapterList = new List<Chapter>()
            {
                CreateChapter("darker than black", "0", CreateFile("/manga/darker than black.cbz", MangaFormat.Archive), true),
                CreateChapter("darker than black", "1", CreateFile("/manga/darker than black.cbz", MangaFormat.Archive), false),
            };

            Assert.Equal(chapterList.First(), chapterList.GetFirstChapterWithFiles());
        }

        [Fact]
        public void GetFirstChapterWithFiles_ShouldReturnSecondChapter()
        {
            var chapterList = new List<Chapter>()
            {
                CreateChapter("darker than black", "0", CreateFile("/manga/darker than black.cbz", MangaFormat.Archive), true),
                CreateChapter("darker than black", "1", CreateFile("/manga/darker than black.cbz", MangaFormat.Archive), false),
            };

            chapterList.First().Files = new List<MangaFile>();

            Assert.Equal(chapterList.Last(), chapterList.GetFirstChapterWithFiles());
        }

        #endregion
    }
}
}
@@ -1,15 +1,12 @@
using System;
using System.Linq;
using API.Comparators;
using System.Linq;
using API.Extensions;
using Xunit;

namespace API.Tests.Comparers
{
    public class NaturalSortComparerTest
    {
        private readonly NaturalSortComparer _nc = new NaturalSortComparer();
namespace API.Tests.Extensions;

    [Theory]
public class EnumerableExtensionsTests
{
    [Theory]
    [InlineData(
        new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
        new[] {"x1.jpg", "x3.jpg", "x4.jpg", "x10.jpg", "x11.jpg"}
@@ -46,19 +43,43 @@ namespace API.Tests.Comparers
        new[] {"Solo Leveling - c000 (v01) - p000 [Cover] [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p001 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p002 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p003 [dig] [Yen Press] [LuCaZ].jpg"},
        new[] {"Solo Leveling - c000 (v01) - p000 [Cover] [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p001 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p002 [dig] [Yen Press] [LuCaZ].jpg", "Solo Leveling - c000 (v01) - p003 [dig] [Yen Press] [LuCaZ].jpg"}
    )]
    public void TestNaturalSortComparer(string[] input, string[] expected)
    [InlineData(
        new[] {"Marvel2In1-7", "Marvel2In1-7-01", "Marvel2In1-7-02"},
        new[] {"Marvel2In1-7", "Marvel2In1-7-01", "Marvel2In1-7-02"}
    )]
    [InlineData(
        new[] {"001", "002", "!001"},
        new[] {"!001", "001", "002"}
    )]
    [InlineData(
        new[] {"001.jpg", "002.jpg", "!001.jpg"},
        new[] {"!001.jpg", "001.jpg", "002.jpg"}
    )]
    [InlineData(
        new[] {"001", "002", "!002"},
        new[] {"!002", "001", "002"}
    )]
    [InlineData(
        new[] {"001", ""},
        new[] {"", "001"}
    )]
    [InlineData(
        new[] {"Honzuki no Gekokujou_ Part 2/_Ch.019/002.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.019/001.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.020/001.jpg"},
        new[] {"Honzuki no Gekokujou_ Part 2/_Ch.019/001.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.019/002.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.020/001.jpg"}
    )]
    [InlineData(
        new[] {@"F:\/Anime_Series_Pelis/MANGA/Mangahere (EN)\Kirara Fantasia\_Ch.001\001.jpg", @"F:\/Anime_Series_Pelis/MANGA/Mangahere (EN)\Kirara Fantasia\_Ch.001\002.jpg"},
        new[] {@"F:\/Anime_Series_Pelis/MANGA/Mangahere (EN)\Kirara Fantasia\_Ch.001\001.jpg", @"F:\/Anime_Series_Pelis/MANGA/Mangahere (EN)\Kirara Fantasia\_Ch.001\002.jpg"}
    )]
    [InlineData(
        new[] {"01/001.jpg", "001.jpg"},
        new[] {"001.jpg", "01/001.jpg"}
    )]
    public void TestNaturalSort(string[] input, string[] expected)
    {
        Array.Sort(input, _nc);

        var i = 0;
        foreach (var s in input)
        {
            Assert.Equal(s, expected[i]);
            i++;
        }
        Assert.Equal(expected, input.OrderByNatural(x => x).ToArray());
    }

    [Theory]
    [InlineData(
        new[] {"x1.jpg", "x10.jpg", "x3.jpg", "x4.jpg", "x11.jpg"},
@@ -84,6 +105,10 @@ namespace API.Tests.Comparers
        new[] {"001.jpg", "10.jpg",},
        new[] {"001.jpg", "10.jpg",}
    )]
    [InlineData(
        new[] {"001", "002", "!001"},
        new[] {"!001", "001", "002"}
    )]
    [InlineData(
        new[] {"10/001.jpg", "10.jpg",},
        new[] {"10.jpg", "10/001.jpg",}
@@ -92,9 +117,13 @@ namespace API.Tests.Comparers
        new[] {"Batman - Black white vol 1 #04.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr"},
        new[] {"Batman - Black white vol 1 #01.cbr", "Batman - Black white vol 1 #02.cbr", "Batman - Black white vol 1 #03.cbr", "Batman - Black white vol 1 #04.cbr"}
    )]
    public void TestNaturalSortComparerLinq(string[] input, string[] expected)
    [InlineData(
        new[] {"Honzuki no Gekokujou_ Part 2/_Ch.019/002.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.019/001.jpg"},
        new[] {"Honzuki no Gekokujou_ Part 2/_Ch.019/001.jpg", "Honzuki no Gekokujou_ Part 2/_Ch.019/002.jpg"}
    )]
    public void TestNaturalSortLinq(string[] input, string[] expected)
    {
        var output = input.OrderBy(c => c, _nc);
        var output = input.OrderByNatural(x => x);

        var i = 0;
        foreach (var s in output)
@@ -103,5 +132,4 @@ namespace API.Tests.Comparers
            i++;
        }
    }
}
}
48 changes: API.Tests/Extensions/FilterDtoExtensionsTests.cs (new file)
@ -0,0 +1,48 @@
|
||||
using System;
using System.Collections.Generic;
using API.DTOs.Filtering;
using API.Entities.Enums;
using API.Extensions;
using Xunit;

namespace API.Tests.Extensions;

public class FilterDtoExtensionsTests
{
[Fact]
public void GetSqlFilter_ShouldReturnAllFormats()
{
var filter = new FilterDto()
{
Formats = null
};

Assert.Equal(Enum.GetValues<MangaFormat>(), filter.GetSqlFilter());
}

[Fact]
public void GetSqlFilter_ShouldReturnAllFormats2()
{
var filter = new FilterDto()
{
Formats = new List<MangaFormat>()
};

Assert.Equal(Enum.GetValues<MangaFormat>(), filter.GetSqlFilter());
}

[Fact]
public void GetSqlFilter_ShouldReturnJust2()
{
var formats = new List<MangaFormat>()
{
MangaFormat.Archive, MangaFormat.Epub
};
var filter = new FilterDto()
{
Formats = formats
};

Assert.Equal(formats, filter.GetSqlFilter());
}
}
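Taken together, the three tests pin down GetSqlFilter's contract: a null or empty Formats list means "no restriction" and expands to every MangaFormat; otherwise the supplied list is returned as-is. A sketch of that contract, inferred from the assertions rather than copied from Kavita's FilterDtoExtensions:

using System;
using System.Collections.Generic;

public static class FilterDtoExtensionsSketch
{
    // Hypothetical restatement of the tested behaviour, not the project's code.
    public static IList<MangaFormat> GetSqlFilterSketch(this FilterDto filter)
    {
        if (filter.Formats == null || filter.Formats.Count == 0)
        {
            return Enum.GetValues<MangaFormat>(); // no filter -> all formats
        }
        return filter.Formats;
    }
}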
@@ -1,15 +1,27 @@
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using API.Entities.Enums;
using API.Extensions;
using API.Parser;
using API.Services;
using API.Tests.Helpers;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Extensions
{
public class ParserInfoListExtensions
{
private readonly DefaultParser _defaultParser;
public ParserInfoListExtensions()
{
_defaultParser =
new DefaultParser(new DirectoryService(Substitute.For<ILogger<DirectoryService>>(),
new MockFileSystem()));
}

[Theory]
[InlineData(new[] {"1", "1", "3-5", "5", "8", "0", "0"}, new[] {"1", "3-5", "5", "8", "0"})]
public void DistinctVolumesTest(string[] volumeNumbers, string[] expectedNumbers)
@@ -17,7 +29,7 @@ namespace API.Tests.Extensions
var infos = volumeNumbers.Select(n => new ParserInfo() {Volumes = n}).ToList();
Assert.Equal(expectedNumbers, infos.DistinctVolumes());
}

[Theory]
[InlineData(new[] {@"Cynthia The Mission - c000-006 (v06) [Desudesu&Brolen].zip"}, new[] {@"E:\Manga\Cynthia the Mission\Cynthia The Mission - c000-006 (v06) [Desudesu&Brolen].zip"}, true)]
[InlineData(new[] {@"Cynthia The Mission - c000-006 (v06-07) [Desudesu&Brolen].zip"}, new[] {@"E:\Manga\Cynthia the Mission\Cynthia The Mission - c000-006 (v06) [Desudesu&Brolen].zip"}, true)]
@@ -27,7 +39,7 @@ namespace API.Tests.Extensions
var infos = new List<ParserInfo>();
foreach (var filename in inputInfos)
{
-infos.Add(API.Parser.Parser.Parse(
+infos.Add(_defaultParser.Parse(
filename,
string.Empty));
}
@@ -38,4 +50,4 @@ namespace API.Tests.Extensions
Assert.Equal(expectedHasInfo, infos.HasInfo(chapter));
}
}
}
}
API.Tests/Extensions/PathExtensionsTests.cs (new file, 20 lines)
@@ -0,0 +1,20 @@
using System.IO;
using Xunit;
using API.Extensions;

namespace API.Tests.Extensions;

public class PathExtensionsTests
{
#region GetFullPathWithoutExtension

[Theory]
[InlineData("joe.png", "joe")]
[InlineData("c:/directory/joe.png", "c:/directory/joe")]
public void GetFullPathWithoutExtension_Test(string input, string expected)
{
Assert.Equal(Path.GetFullPath(expected), input.GetFullPathWithoutExtension());
}

#endregion
}
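The two cases amount to: resolve the input to an absolute path, then drop the extension. A sketch consistent with the assertions (an assumption; Kavita's actual extension method may differ internally):

using System.IO;

public static class PathExtensionsSketch
{
    public static string GetFullPathWithoutExtensionSketch(this string filepath)
    {
        // Path.GetFullPath resolves relative inputs against the working directory,
        // matching the Path.GetFullPath(expected) used in the assertion above.
        var fullPath = Path.GetFullPath(filepath);
        return Path.Combine(Path.GetDirectoryName(fullPath) ?? string.Empty,
            Path.GetFileNameWithoutExtension(fullPath));
    }
}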
@@ -1,6 +1,11 @@
-using API.Entities;
+using System.Linq;
+using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Extensions;
using API.Parser;
using API.Services.Tasks.Scanner;
using API.Tests.Helpers;
using Xunit;

namespace API.Tests.Extensions
@@ -31,6 +36,38 @@ namespace API.Tests.Extensions
Assert.Equal(expected, series.NameInList(list));
}

[Theory]
[InlineData(new [] {"Darker than Black", "Darker Than Black", "Darker than Black"}, new [] {"Darker than Black"}, MangaFormat.Archive, true)]
[InlineData(new [] {"Darker than Black", "Darker Than Black", "Darker than Black"}, new [] {"Darker_than_Black"}, MangaFormat.Archive, true)]
[InlineData(new [] {"Darker than Black", "Darker Than Black", "Darker than Black"}, new [] {"Darker then Black!"}, MangaFormat.Archive, false)]
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot"}, new [] {"Salem's Lot"}, MangaFormat.Archive, true)]
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot"}, new [] {"salems lot"}, MangaFormat.Archive, true)]
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot"}, new [] {"salem's lot"}, MangaFormat.Archive, true)]
// Different normalizations pass as we check normalization against an on-the-fly calculation so we don't delete series just because we change how normalization works
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot", "salems lot"}, new [] {"salem's lot"}, MangaFormat.Archive, true)]
[InlineData(new [] {"Rent-a-Girlfriend", "Rent-a-Girlfriend", "Kanojo, Okarishimasu", "rentagirlfriend"}, new [] {"Kanojo, Okarishimasu"}, MangaFormat.Archive, true)]
public void NameInListParserInfoTest(string[] seriesInput, string[] list, MangaFormat format, bool expected)
{
var series = new Series()
{
Name = seriesInput[0],
LocalizedName = seriesInput[1],
OriginalName = seriesInput[2],
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata(),
};

var parserInfos = list.Select(s => new ParsedSeries()
{
Name = s,
NormalizedName = API.Parser.Parser.Normalize(s),
}).ToList();

// This doesn't do any checks against format
Assert.Equal(expected, series.NameInList(parserInfos));
}


[Theory]
[InlineData(new [] {"Darker than Black", "Darker Than Black", "Darker than Black"}, "Darker than Black", true)]
[InlineData(new [] {"Rent-a-Girlfriend", "Rent-a-Girlfriend", "Kanojo, Okarishimasu", "rentagirlfriend"}, "Kanojo, Okarishimasu", true)]
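The normalization these cases lean on is, in spirit, "lowercase and strip everything that is not a letter or digit", which is why "Darker_than_Black" and "Darker than Black" collapse to the same key. A sketch of that idea (Parser.Normalize is Kavita's; this regex is an assumption, not its exact implementation):

using System.Text.RegularExpressions;

public static class NormalizeSketch
{
    // Assumed behaviour: lowercase, then drop anything outside a-z0-9.
    public static string Normalize(string name) =>
        Regex.Replace(name.ToLower(), @"[^a-z0-9]", string.Empty);
}

Under this sketch Normalize("Salem's Lot") and Normalize("salems lot") both yield "salemslot", matching the true cases in the theory data above, while "Darker then Black!" stays distinct from "Darker than Black", matching the false case.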
API.Tests/Extensions/VolumeListExtensionsTests.cs (new file, 177 lines)
@@ -0,0 +1,177 @@
using System.Collections.Generic;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Tests.Helpers;
using Xunit;

namespace API.Tests.Extensions;

public class VolumeListExtensionsTests
{
#region FirstWithChapters

[Fact]
public void FirstWithChapters_ReturnsVolumeWithChapters()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("0", new List<Chapter>()),
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("2", false),
}),
};

Assert.Equal(volumes[1].Number, volumes.FirstWithChapters(false).Number);
Assert.Equal(volumes[1].Number, volumes.FirstWithChapters(true).Number);
}

[Fact]
public void FirstWithChapters_Book()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[0].Number, volumes.FirstWithChapters(true).Number);
}

[Fact]
public void FirstWithChapters_NonBook()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[0].Number, volumes.FirstWithChapters(false).Number);
}

#endregion

#region GetCoverImage

[Fact]
public void GetCoverImage_ArchiveFormat()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[0].Number, volumes.GetCoverImage(MangaFormat.Archive).Number);
}

[Fact]
public void GetCoverImage_EpubFormat()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[1].Name, volumes.GetCoverImage(MangaFormat.Epub).Name);
}

[Fact]
public void GetCoverImage_PdfFormat()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[1].Name, volumes.GetCoverImage(MangaFormat.Pdf).Name);
}

[Fact]
public void GetCoverImage_ImageFormat()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("0", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[0].Name, volumes.GetCoverImage(MangaFormat.Image).Name);
}

[Fact]
public void GetCoverImage_ImageFormat_NoSpecials()
{
var volumes = new List<Volume>()
{
EntityFactory.CreateVolume("2", new List<Chapter>()
{
EntityFactory.CreateChapter("3", false),
EntityFactory.CreateChapter("4", false),
}),
EntityFactory.CreateVolume("1", new List<Chapter>()
{
EntityFactory.CreateChapter("1", false),
EntityFactory.CreateChapter("0", true),
}),
};

Assert.Equal(volumes[1].Name, volumes.GetCoverImage(MangaFormat.Image).Name);
}

#endregion
}
API.Tests/Helpers/CacheHelperTests.cs (new file, 293 lines)
@@ -0,0 +1,293 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using API.Entities;
using API.Helpers;
using API.Services;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Helpers;

public class CacheHelperTests
{
private const string TestCoverImageDirectory = @"c:\";
private const string TestCoverImageFile = "thumbnail.jpg";
private readonly string _testCoverPath = Path.Join(TestCoverImageDirectory, TestCoverImageFile);
private const string TestCoverArchive = @"file in folder.zip";
private readonly ICacheHelper _cacheHelper;

public CacheHelperTests()
{
var file = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), file },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), file }
});

var fileService = new FileService(fileSystem);
_cacheHelper = new CacheHelper(fileService);
}

[Theory]
[InlineData("", false)]
[InlineData("C:/", false)]
[InlineData(null, false)]
public void CoverImageExists_DoesFileExist(string coverImage, bool exists)
{
Assert.Equal(exists, _cacheHelper.CoverImageExists(coverImage));
}

[Fact]
public void CoverImageExists_FileExists()
{
Assert.True(_cacheHelper.CoverImageExists(TestCoverArchive));
}

[Fact]
public void ShouldUpdateCoverImage_OnFirstRun()
{
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.True(_cacheHelper.ShouldUpdateCoverImage(null, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, false));
}

[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetNotLocked()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, false));
}

[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetNotLocked_2()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now,
false, false));
}

[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetLocked()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, true));
}

[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetLocked_Modified()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, true));
}

[Fact]
public void ShouldUpdateCoverImage_CoverImageSetAndReplaced_Modified()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var created = DateTime.Now.Subtract(TimeSpan.FromHours(1));
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now.Subtract(TimeSpan.FromMinutes(1))
};
Assert.True(cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, created,
false, false));
}

[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceCreated()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};

var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.True(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}

[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceLastModified()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};

var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.True(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}

[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceLastModified_ForceUpdate()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};

var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, true, file));
}

[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_ModifiedSinceLastScan()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10)),
LastModified = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10))
};

var file = new MangaFile()
{
FilePath = Path.Join(TestCoverImageDirectory, TestCoverArchive),
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}

[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_ModifiedSinceLastScan_ButLastModifiedSame()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});

var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);

var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10)),
LastModified = filesystemFile.LastWriteTime.DateTime
};

var file = new MangaFile()
{
FilePath = Path.Join(TestCoverImageDirectory, TestCoverArchive),
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}

}
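Condensed, the ShouldUpdateCoverImage assertions above amount to a small decision table. The sketch below is inferred purely from the tests, not from CacheHelper's source; the real method also takes the MangaFile and a forceUpdate flag (omitted here), and "file modified" means the on-disk write time is newer than the entity's Created timestamp:

static bool ShouldUpdateCoverSketch(bool hasCover, bool coverImageLocked, bool fileModifiedSinceCreated)
{
    if (coverImageLocked) return false;   // user-locked covers are never regenerated
    if (!hasCover) return true;           // first run: nothing cached yet
    return fileModifiedSinceCreated;      // otherwise regenerate only for newer source files
}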
@@ -1,6 +1,8 @@
-using System.Collections.Generic;
+using System;
+using System.Collections.Generic;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;

namespace API.Tests.Helpers
{
@@ -27,6 +29,7 @@ namespace API.Tests.Helpers
return new Volume()
{
Name = volumeNumber,
Number = int.Parse(volumeNumber),
Pages = 0,
Chapters = chapters ?? new List<Chapter>()
};
@@ -75,4 +78,4 @@ namespace API.Tests.Helpers
};
}
}
}
}
API.Tests/Helpers/GenreHelperTests.cs (new file, 110 lines)
@@ -0,0 +1,110 @@
using System.Collections.Generic;
using API.Data;
using API.Entities;
using API.Helpers;
using Xunit;

namespace API.Tests.Helpers;

public class GenreHelperTests
{
[Fact]
public void UpdateGenre_ShouldAddNewGenre()
{
var allGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
var genreAdded = new List<Genre>();

GenreHelper.UpdateGenre(allGenres, new[] {"Action", "Adventure"}, false, genre =>
{
genreAdded.Add(genre);
});

Assert.Equal(2, genreAdded.Count);
Assert.Equal(4, allGenres.Count);
}

[Fact]
public void UpdateGenre_ShouldNotAddDuplicateGenre()
{
var allGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
var genreAdded = new List<Genre>();

GenreHelper.UpdateGenre(allGenres, new[] {"Action", "Scifi"}, false, genre =>
{
genreAdded.Add(genre);
});

Assert.Equal(3, allGenres.Count);
}

[Fact]
public void AddGenre_ShouldAddOnlyNonExistingGenre()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};

GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Action", false));
Assert.Equal(3, existingGenres.Count);

GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("action", false));
Assert.Equal(3, existingGenres.Count);

GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Shonen", false));
Assert.Equal(4, existingGenres.Count);
}

[Fact]
public void AddGenre_ShouldNotAddSameNameAndExternal()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};

GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Action", true));
Assert.Equal(3, existingGenres.Count);
}

[Fact]
public void KeepOnlySamePeopleBetweenLists()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("Sci-fi", false),
};

var peopleFromChapters = new List<Genre>
{
DbFactory.Genre("Action", false),
};

var genreRemoved = new List<Genre>();
GenreHelper.KeepOnlySameGenreBetweenLists(existingGenres,
peopleFromChapters, genre =>
{
genreRemoved.Add(genre);
});

Assert.Equal(1, genreRemoved.Count);
}
}
@@ -1,6 +1,10 @@
-using System.IO;
+using System.Collections.Concurrent;
+using System.Collections.Generic;
+using System.IO;
using System.Linq;
using API.Entities.Enums;
using API.Parser;
using API.Services.Tasks.Scanner;

namespace API.Tests.Helpers
{
@@ -21,5 +25,49 @@ namespace API.Tests.Helpers
Volumes = volumes
};
}

public static void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
{
var existingKey = collectedSeries.Keys.FirstOrDefault(ps =>
ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series));
existingKey ??= new ParsedSeries()
{
Format = info.Format,
Name = info.Series,
NormalizedName = API.Parser.Parser.Normalize(info.Series)
};
if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>))
{
((ConcurrentDictionary<ParsedSeries, List<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
{
oldValue.Add(info);
}

return oldValue;
});
}
else
{
if (!collectedSeries.ContainsKey(existingKey))
{
collectedSeries.Add(existingKey, new List<ParserInfo>() {info});
}
else
{
var list = collectedSeries[existingKey];
if (!list.Contains(info))
{
list.Add(info);
}

collectedSeries[existingKey] = list;
}

}

}
}
}
}
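A quick usage note on the helper above: ParsedSeries keys are matched by Format plus normalized Series name, and re-adding the same ParserInfo is a no-op. For example:

// Both calls target the same bucket; the second is ignored.
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
var info = new ParserInfo {Series = "Btooom!", Volumes = "1", Format = MangaFormat.Archive};
ParserInfoFactory.AddToParsedInfo(infos, info);
ParserInfoFactory.AddToParsedInfo(infos, info); // duplicate, list stays at one entry

Worth noting: the typeof(ConcurrentDictionary&lt;,&gt;) comparison tests against the open generic type, which an instance's GetType() (a constructed type) never equals, so the concurrent branch appears effectively unreachable; the tests in this PR only pass plain Dictionary instances, so the else branch is the one exercised.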
API.Tests/Helpers/ParserInfoHelperTests.cs (new file, 75 lines)
@@ -0,0 +1,75 @@
using System.Collections.Generic;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Helpers;
using API.Parser;
using API.Services.Tasks.Scanner;
using Xunit;

namespace API.Tests.Helpers;

public class ParserInfoHelperTests
{
#region SeriesHasMatchingParserInfoFormat

[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeFalse()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});

var series = new Series()
{
Name = "Darker Than Black",
LocalizedName = "Darker Than Black",
OriginalName = "Darker Than Black",
Volumes = new List<Volume>()
{
new Volume()
{
Number = 1,
Name = "1"
}
},
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Epub
};

Assert.False(ParserInfoHelpers.SeriesHasMatchingParserInfoFormat(series, infos));
}

[Fact]
public void SeriesHasMatchingParserInfoFormat_ShouldBeTrue()
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();

ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});

var series = new Series()
{
Name = "Darker Than Black",
LocalizedName = "Darker Than Black",
OriginalName = "Darker Than Black",
Volumes = new List<Volume>()
{
new Volume()
{
Number = 1,
Name = "1"
}
},
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
Metadata = new SeriesMetadata(),
Format = MangaFormat.Epub
};

Assert.True(ParserInfoHelpers.SeriesHasMatchingParserInfoFormat(series, infos));
}

#endregion
}
API.Tests/Helpers/PersonHelperTests.cs (new file, 140 lines)
@@ -0,0 +1,140 @@
using System.Collections.Generic;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using Xunit;

namespace API.Tests.Helpers;

public class PersonHelperTests
{
[Fact]
public void UpdatePeople_ShouldAddNewPeople()
{
var allPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleAdded = new List<Person>();

PersonHelper.UpdatePeople(allPeople, new[] {"Joseph Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleAdded.Add(person);
});

Assert.Equal(2, peopleAdded.Count);
Assert.Equal(4, allPeople.Count);
}

[Fact]
public void UpdatePeople_ShouldNotAddDuplicatePeople()
{
var allPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally Ann", PersonRole.CoverArtist),
};
var peopleAdded = new List<Person>();

PersonHelper.UpdatePeople(allPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.CoverArtist, person =>
{
peopleAdded.Add(person);
});

Assert.Equal(3, allPeople.Count);
}

[Fact]
public void RemovePeople_ShouldRemovePeopleOfSameRole()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleRemoved = new List<Person>();
PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleRemoved.Add(person);
});

Assert.NotEqual(existingPeople, peopleRemoved);
Assert.Equal(1, peopleRemoved.Count);
}

[Fact]
public void RemovePeople_ShouldRemovePeopleFromBothRoles()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleRemoved = new List<Person>();
PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleRemoved.Add(person);
});

Assert.NotEqual(existingPeople, peopleRemoved);
Assert.Equal(1, peopleRemoved.Count);

PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo"}, PersonRole.CoverArtist, person =>
{
peopleRemoved.Add(person);
});

Assert.Equal(0, existingPeople.Count);
Assert.Equal(2, peopleRemoved.Count);
}

[Fact]
public void KeepOnlySamePeopleBetweenLists()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally", PersonRole.Writer),
};

var peopleFromChapters = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
};

var peopleRemoved = new List<Person>();
PersonHelper.KeepOnlySamePeopleBetweenLists(existingPeople,
peopleFromChapters, person =>
{
peopleRemoved.Add(person);
});

Assert.Equal(2, peopleRemoved.Count);
}

[Fact]
public void AddPeople_ShouldAddOnlyNonExistingPeople()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally", PersonRole.Writer),
};

PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo", PersonRole.CoverArtist));
Assert.Equal(3, existingPeople.Count);

PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo", PersonRole.Writer));
Assert.Equal(3, existingPeople.Count);

PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo Two", PersonRole.CoverArtist));
Assert.Equal(4, existingPeople.Count);
}
}
API.Tests/Helpers/SeriesHelperTests.cs (new file, 160 lines)
@@ -0,0 +1,160 @@
using System.Collections.Generic;
using System.Linq;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Services.Tasks.Scanner;
using Xunit;

namespace API.Tests.Helpers;

public class SeriesHelperTests
{
#region FindSeries
[Fact]
public void FindSeries_ShouldFind_SameFormat()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Archive;
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
}

[Fact]
public void FindSeries_ShouldNotFind_WrongFormat()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Archive;
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));

Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));

Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
}

[Fact]
public void FindSeries_ShouldFind_UsingOriginalName()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Image;
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random",
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "SomethingRandom".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("SomethingRandom")
}));
}

[Fact]
public void FindSeries_ShouldFind_UsingLocalizedName()
{
var series = DbFactory.Series("Darker than Black");
series.LocalizedName = "Something Random";
series.Format = MangaFormat.Image;
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random",
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));

Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "SomethingRandom".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("SomethingRandom")
}));
}
#endregion

[Fact]
public void RemoveMissingSeries_Should_RemoveSeries()
{
var existingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
EntityFactory.CreateSeries("Darker than Black"),
EntityFactory.CreateSeries("Beastars"),
};
var missingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
};
existingSeries = SeriesHelper.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();

Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
Assert.Equal(missingSeries.Count, removeCount);
}
}
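Read together, the FindSeries cases establish the matching rule: formats must agree, and the parsed name may match the series Name, OriginalName, or LocalizedName either verbatim or after normalization. As a sketch inferred from the assertions (not SeriesHelper's actual source):

static bool FindSeriesSketch(Series series, ParsedSeries parsed)
{
    // Format gates everything, per FindSeries_ShouldNotFind_WrongFormat.
    if (series.Format != parsed.Format) return false;
    var normalized = API.Parser.Parser.Normalize(parsed.Name);
    return parsed.Name == series.Name
           || normalized == series.NormalizedName
           || normalized == API.Parser.Parser.Normalize(series.OriginalName ?? string.Empty)
           || normalized == API.Parser.Parser.Normalize(series.LocalizedName ?? string.Empty);
}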
API.Tests/Helpers/TagHelperTests.cs (new file, 119 lines)
@@ -0,0 +1,119 @@
using System.Collections.Generic;
using API.Data;
using API.Entities;
using API.Helpers;
using Xunit;

namespace API.Tests.Helpers;

public class TagHelperTests
{
[Fact]
public void UpdateTag_ShouldAddNewTag()
{
var allTags = new List<Tag>
{
DbFactory.Tag("Action", false),
DbFactory.Tag("action", false),
DbFactory.Tag("Sci-fi", false),
};
var tagAdded = new List<Tag>();

TagHelper.UpdateTag(allTags, new[] {"Action", "Adventure"}, false, (tag, added) =>
{
if (added)
{
tagAdded.Add(tag);
}

});

Assert.Equal(1, tagAdded.Count);
Assert.Equal(4, allTags.Count);
}

[Fact]
public void UpdateTag_ShouldNotAddDuplicateTag()
{
var allTags = new List<Tag>
{
DbFactory.Tag("Action", false),
DbFactory.Tag("action", false),
DbFactory.Tag("Sci-fi", false),
};
var tagAdded = new List<Tag>();

TagHelper.UpdateTag(allTags, new[] {"Action", "Scifi"}, false, (tag, added) =>
{
if (added)
{
tagAdded.Add(tag);
}
TagHelper.AddTagIfNotExists(allTags, tag);
});

Assert.Equal(3, allTags.Count);
Assert.Empty(tagAdded);
}

[Fact]
public void AddTag_ShouldAddOnlyNonExistingTag()
{
var existingTags = new List<Tag>
{
DbFactory.Tag("Action", false),
DbFactory.Tag("action", false),
DbFactory.Tag("Sci-fi", false),
};

TagHelper.AddTagIfNotExists(existingTags, DbFactory.Tag("Action", false));
Assert.Equal(3, existingTags.Count);

TagHelper.AddTagIfNotExists(existingTags, DbFactory.Tag("action", false));
Assert.Equal(3, existingTags.Count);

TagHelper.AddTagIfNotExists(existingTags, DbFactory.Tag("Shonen", false));
Assert.Equal(4, existingTags.Count);
}

[Fact]
public void AddTag_ShouldNotAddSameNameAndExternal()
{
var existingTags = new List<Tag>
{
DbFactory.Tag("Action", false),
DbFactory.Tag("action", false),
DbFactory.Tag("Sci-fi", false),
};

TagHelper.AddTagIfNotExists(existingTags, DbFactory.Tag("Action", true));
Assert.Equal(3, existingTags.Count);
}

[Fact]
public void KeepOnlySamePeopleBetweenLists()
{
var existingTags = new List<Tag>
{
DbFactory.Tag("Action", false),
DbFactory.Tag("Sci-fi", false),
};

var peopleFromChapters = new List<Tag>
{
DbFactory.Tag("Action", false),
};

var tagRemoved = new List<Tag>();
TagHelper.KeepOnlySameTagBetweenLists(existingTags,
peopleFromChapters, tag =>
{
tagRemoved.Add(tag);
});

Assert.Equal(1, tagRemoved.Count);
}
}
@@ -1,7 +1,10 @@
using System;
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;
using Xunit.Abstractions;

@@ -10,10 +13,14 @@ namespace API.Tests.Parser
public class ComicParserTests
{
private readonly ITestOutputHelper _testOutputHelper;
private readonly DefaultParser _defaultParser;

public ComicParserTests(ITestOutputHelper testOutputHelper)
{
_testOutputHelper = testOutputHelper;
_defaultParser =
new DefaultParser(new DirectoryService(Substitute.For<ILogger<DirectoryService>>(),
new MockFileSystem()));
}

[Theory]
@@ -61,8 +68,17 @@ namespace API.Tests.Parser
[InlineData("Demon 012 (Sep 1973) c2c", "Demon")]
[InlineData("Dragon Age - Until We Sleep 01 (of 03)", "Dragon Age - Until We Sleep")]
[InlineData("Green Lantern v2 017 - The Spy-Eye that doomed Green Lantern v2", "Green Lantern")]
[InlineData("Green Lantern - Circle of Fire Special - Adam Strange (2000)", "Green Lantern - Circle of Fire - Adam Strange")]
[InlineData("Identity Crisis Extra - Rags Morales Sketches (2005)", "Identity Crisis - Rags Morales Sketches")]
[InlineData("Green Lantern - Circle of Fire Special - Adam Strange (2000)", "Green Lantern - Circle of Fire - Adam Strange")]
[InlineData("Identity Crisis Extra - Rags Morales Sketches (2005)", "Identity Crisis - Rags Morales Sketches")]
[InlineData("Daredevil - t6 - 10 - (2019)", "Daredevil")]
[InlineData("Batgirl T2000 #57", "Batgirl")]
[InlineData("Teen Titans t1 001 (1966-02) (digital) (OkC.O.M.P.U.T.O.-Novus)", "Teen Titans")]
[InlineData("Conquistador_-Tome_2", "Conquistador")]
[InlineData("Max_l_explorateur-_Tome_0", "Max l explorateur")]
[InlineData("Chevaliers d'Héliopolis T3 - Rubedo, l'oeuvre au rouge (Jodorowsky & Jérémy)", "Chevaliers d'Héliopolis")]
[InlineData("Bd Fr-Aldebaran-Antares-t6", "Aldebaran-Antares")]
[InlineData("Tintin - T22 Vol 714 pour Sydney", "Tintin")]
[InlineData("Fables 2010 Vol. 1 Legends in Exile", "Fables 2010")]
public void ParseComicSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicSeries(filename));
@@ -101,6 +117,15 @@ namespace API.Tests.Parser
[InlineData("Cyberpunk 2077 - Trauma Team 04.cbz", "0")]
[InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "0")]
[InlineData("Daredevil - v6 - 10 - (2019)", "6")]
// Tome Tests
[InlineData("Daredevil - t6 - 10 - (2019)", "6")]
[InlineData("Batgirl T2000 #57", "2000")]
[InlineData("Teen Titans t1 001 (1966-02) (digital) (OkC.O.M.P.U.T.O.-Novus)", "1")]
[InlineData("Conquistador_Tome_2", "2")]
[InlineData("Max_l_explorateur-_Tome_0", "0")]
[InlineData("Chevaliers d'Héliopolis T3 - Rubedo, l'oeuvre au rouge (Jodorowsky & Jérémy)", "3")]
[InlineData("Adventure Time (2012)/Adventure Time #1 (2012)", "0")]
[InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "1")]
public void ParseComicVolumeTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicVolume(filename));
@@ -144,6 +169,8 @@ namespace API.Tests.Parser
[InlineData("2000 AD 0366 [1984-04-28] (flopbie)", "366")]
[InlineData("Daredevil - v6 - 10 - (2019)", "10")]
[InlineData("Batman Beyond 2016 - Chapter 001.cbz", "1")]
[InlineData("Adventure Time (2012)/Adventure Time #1 (2012)", "1")]
[InlineData("Adventure Time TPB (2012)/Adventure Time v01 (2012).cbz", "0")]
public void ParseComicChapterTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseComicChapter(filename));
@@ -155,76 +182,13 @@ namespace API.Tests.Parser
[InlineData("Zombie Tramp vs. Vampblade TPB (2016) (Digital) (TheArchivist-Empire)", true)]
[InlineData("Baldwin the Brave & Other Tales Special SP1.cbr", true)]
[InlineData("Mouse Guard Specials - Spring 1153 - Fraggle Rock FCBD 2010", true)]
[InlineData("Boule et Bill - THS -Bill à disparu", true)]
[InlineData("Asterix - HS - Les 12 travaux d'Astérix", true)]
[InlineData("Sillage Hors Série - Le Collectionneur - Concordance-DKFR", true)]
[InlineData("laughs", false)]
public void ParseComicSpecialTest(string input, bool expected)
{
Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseComicSpecial(input)));
}

[Fact]
public void ParseInfoTest()
{
const string rootPath = @"E:/Comics/";
var expected = new Dictionary<string, ParserInfo>();
var filepath = @"E:/Comics/Teen Titans/Teen Titans v1 Annual 01 (1967) SP01.cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Teen Titans", Volumes = "0",
Chapters = "0", Filename = "Teen Titans v1 Annual 01 (1967) SP01.cbr", Format = MangaFormat.Archive,
FullFilePath = filepath
});

// Fallback test with bad naming
filepath = @"E:\Comics\Comics\Babe\Babe Vol.1 #1-4\Babe 01.cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Babe", Volumes = "0", Edition = "",
Chapters = "1", Filename = "Babe 01.cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});

filepath = @"E:\Comics\Comics\Publisher\Batman the Detective (2021)\Batman the Detective - v6 - 11 - (2021).cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Batman the Detective", Volumes = "6", Edition = "",
Chapters = "11", Filename = "Batman the Detective - v6 - 11 - (2021).cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});

filepath = @"E:\Comics\Comics\Batman - The Man Who Laughs #1 (2005)\Batman - The Man Who Laughs #1 (2005).cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Batman - The Man Who Laughs", Volumes = "0", Edition = "",
Chapters = "1", Filename = "Batman - The Man Who Laughs #1 (2005).cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});

foreach (var file in expected.Keys)
{
var expectedInfo = expected[file];
var actual = API.Parser.Parser.Parse(file, rootPath, LibraryType.Comic);
if (expectedInfo == null)
{
Assert.Null(actual);
return;
}
Assert.NotNull(actual);
_testOutputHelper.WriteLine($"Validating {file}");
Assert.Equal(expectedInfo.Format, actual.Format);
_testOutputHelper.WriteLine("Format ✓");
Assert.Equal(expectedInfo.Series, actual.Series);
_testOutputHelper.WriteLine("Series ✓");
Assert.Equal(expectedInfo.Chapters, actual.Chapters);
_testOutputHelper.WriteLine("Chapters ✓");
Assert.Equal(expectedInfo.Volumes, actual.Volumes);
_testOutputHelper.WriteLine("Volumes ✓");
Assert.Equal(expectedInfo.Edition, actual.Edition);
_testOutputHelper.WriteLine("Edition ✓");
Assert.Equal(expectedInfo.Filename, actual.Filename);
_testOutputHelper.WriteLine("Filename ✓");
Assert.Equal(expectedInfo.FullFilePath, actual.FullFilePath);
_testOutputHelper.WriteLine("FullFilePath ✓");
}
}

}
}
API.Tests/Parser/DefaultParserTests.cs (new file, 312 lines)
@@ -0,0 +1,312 @@
using System.Collections.Generic;
|
||||
using System.IO.Abstractions.TestingHelpers;
|
||||
using API.Entities.Enums;
|
||||
using API.Parser;
|
||||
using API.Services;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using NSubstitute;
|
||||
using Xunit;
|
||||
using Xunit.Abstractions;
|
||||
|
||||
namespace API.Tests.Parser;
|
||||
|
||||
public class DefaultParserTests
|
||||
{
|
||||
private readonly ITestOutputHelper _testOutputHelper;
|
||||
private readonly DefaultParser _defaultParser;
|
||||
|
||||
public DefaultParserTests(ITestOutputHelper testOutputHelper)
|
||||
{
|
||||
_testOutputHelper = testOutputHelper;
|
||||
var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new MockFileSystem());
|
||||
_defaultParser = new DefaultParser(directoryService);
|
||||
}
|
||||
|
||||
|
||||
#region ParseFromFallbackFolders
|
||||
[Theory]
|
||||
[InlineData("C:/", "C:/Love Hina/Love Hina - Special.cbz", "Love Hina")]
|
||||
[InlineData("C:/", "C:/Love Hina/Specials/Ani-Hina Art Collection.cbz", "Love Hina")]
|
||||
[InlineData("C:/", "C:/Mujaki no Rakuen Something/Mujaki no Rakuen Vol12 ch76.cbz", "Mujaki no Rakuen")]
|
||||
[InlineData("C:/", "C:/Something Random/Mujaki no Rakuen SP01.cbz", "Something Random")]
|
||||
public void ParseFromFallbackFolders_FallbackShouldParseSeries(string rootDir, string inputPath, string expectedSeries)
|
||||
{
|
||||
var actual = _defaultParser.Parse(inputPath, rootDir);
|
||||
if (actual == null)
|
||||
{
|
||||
Assert.NotNull(actual);
|
||||
return;
|
||||
}
|
||||
|
||||
Assert.Equal(expectedSeries, actual.Series);
|
||||
}
|
||||
|
||||
[Theory]
|
||||
[InlineData("/manga/Btooom!/Vol.1/Chapter 1/1.cbz", "Btooom!~1~1")]
|
||||
[InlineData("/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Btooom!~1~2")]
|
||||
[InlineData("/manga/Monster #8/Ch. 001-016 [MangaPlus] [Digital] [amit34521]/Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]/13.jpg", "Monster #8~0~1")]
|
||||
public void ParseFromFallbackFolders_ShouldParseSeriesVolumeAndChapter(string inputFile, string expectedParseInfo)
|
||||
{
|
||||
const string rootDirectory = "/manga/";
|
||||
var tokens = expectedParseInfo.Split("~");
|
||||
var actual = new ParserInfo {Chapters = "0", Volumes = "0"};
|
||||
_defaultParser.ParseFromFallbackFolders(inputFile, rootDirectory, LibraryType.Manga, ref actual);
|
||||
Assert.Equal(tokens[0], actual.Series);
|
||||
Assert.Equal(tokens[1], actual.Volumes);
|
||||
Assert.Equal(tokens[2], actual.Chapters);
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
|
||||
#region Parse

[Fact]
public void Parse_ParseInfo_Manga()
{
    const string rootPath = @"E:/Manga/";
    var expected = new Dictionary<string, ParserInfo>();
    var filepath = @"E:/Manga/Mujaki no Rakuen/Mujaki no Rakuen Vol12 ch76.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Mujaki no Rakuen", Volumes = "12",
        Chapters = "76", Filename = "Mujaki no Rakuen Vol12 ch76.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:/Manga/Shimoneta to Iu Gainen ga Sonzai Shinai Taikutsu na Sekai Man-hen/Vol 1.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Shimoneta to Iu Gainen ga Sonzai Shinai Taikutsu na Sekai Man-hen", Volumes = "1",
        Chapters = "0", Filename = "Vol 1.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Beelzebub\Beelzebub_01_[Noodles].zip";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Beelzebub", Volumes = "0",
        Chapters = "1", Filename = "Beelzebub_01_[Noodles].zip", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Ichinensei ni Nacchattara\Ichinensei_ni_Nacchattara_v01_ch01_[Taruby]_v1.1.zip";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Ichinensei ni Nacchattara", Volumes = "1",
        Chapters = "1", Filename = "Ichinensei_ni_Nacchattara_v01_ch01_[Taruby]_v1.1.zip", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Tenjo Tenge (Color)\Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Tenjo Tenge", Volumes = "1", Edition = "Full Contact Edition",
        Chapters = "0", Filename = "Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Akame ga KILL! ZERO (2016-2019) (Digital) (LuCaZ)\Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Akame ga KILL! ZERO", Volumes = "1", Edition = "",
        Chapters = "0", Filename = "Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Dorohedoro\Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Dorohedoro", Volumes = "1", Edition = "",
        Chapters = "0", Filename = "Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\APOSIMZ\APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "APOSIMZ", Volumes = "0", Edition = "",
        Chapters = "40", Filename = "APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Corpse Party Musume\Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Kedouin Makoto - Corpse Party Musume", Volumes = "0", Edition = "",
        Chapters = "9", Filename = "Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Goblin Slayer\Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Goblin Slayer - Brand New Day", Volumes = "0", Edition = "",
        Chapters = "6.5", Filename = "Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Summer Time Rendering\Specials\Record 014 (between chapter 083 and ch084) SP11.cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Summer Time Rendering", Volumes = "0", Edition = "",
        Chapters = "0", Filename = "Record 014 (between chapter 083 and ch084) SP11.cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = true
    });

    filepath = @"E:\Manga\Seraph of the End\Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Seraph of the End - Vampire Reign", Volumes = "0", Edition = "",
        Chapters = "93", Filename = "Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Kono Subarashii Sekai ni Bakuen wo!\Vol. 00 Ch. 000.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Kono Subarashii Sekai ni Bakuen wo!", Volumes = "0", Edition = "",
        Chapters = "0", Filename = "Vol. 00 Ch. 000.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Toukyou Akazukin\Vol. 01 Ch. 001.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Toukyou Akazukin", Volumes = "1", Edition = "",
        Chapters = "1", Filename = "Vol. 01 Ch. 001.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Harrison, Kim - The Good, The Bad, and the Undead - Hollows Vol 2.5.epub";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Harrison, Kim - The Good, The Bad, and the Undead - Hollows", Volumes = "2.5", Edition = "",
        Chapters = "0", Filename = "Harrison, Kim - The Good, The Bad, and the Undead - Hollows Vol 2.5.epub", Format = MangaFormat.Epub,
        FullFilePath = filepath, IsSpecial = false
    });

    // If an image is cover exclusively, ignore it
    filepath = @"E:\Manga\Seraph of the End\cover.png";
    expected.Add(filepath, null);

    filepath = @"E:\Manga\The Beginning After the End\Chapter 001.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "The Beginning After the End", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "Chapter 001.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Monster #8\Ch. 001-016 [MangaPlus] [Digital] [amit34521]\Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]\13.jpg";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Monster #8", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "13.jpg", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Air Gear\Air Gear Omnibus v01 (2016) (Digital) (Shadowcat-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Air Gear", Volumes = "1", Edition = "Omnibus",
        Chapters = "0", Filename = "Air Gear Omnibus v01 (2016) (Digital) (Shadowcat-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    foreach (var file in expected.Keys)
    {
        var expectedInfo = expected[file];
        var actual = _defaultParser.Parse(file, rootPath);
        if (expectedInfo == null)
        {
            Assert.Null(actual);
            continue; // keep validating the remaining files rather than exiting the test early
        }
        Assert.NotNull(actual);
        _testOutputHelper.WriteLine($"Validating {file}");
        Assert.Equal(expectedInfo.Format, actual.Format);
        _testOutputHelper.WriteLine("Format ✓");
        Assert.Equal(expectedInfo.Series, actual.Series);
        _testOutputHelper.WriteLine("Series ✓");
        Assert.Equal(expectedInfo.Chapters, actual.Chapters);
        _testOutputHelper.WriteLine("Chapters ✓");
        Assert.Equal(expectedInfo.Volumes, actual.Volumes);
        _testOutputHelper.WriteLine("Volumes ✓");
        Assert.Equal(expectedInfo.Edition, actual.Edition);
        _testOutputHelper.WriteLine("Edition ✓");
        Assert.Equal(expectedInfo.Filename, actual.Filename);
        _testOutputHelper.WriteLine("Filename ✓");
        Assert.Equal(expectedInfo.FullFilePath, actual.FullFilePath);
        _testOutputHelper.WriteLine("FullFilePath ✓");
    }
}
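
// Editor's sketch (hypothetical helper, not part of this change): the property-by-property
// assertion block repeated in these Parse tests could be collapsed into one method.
private static void AssertParserInfoMatches(ITestOutputHelper output, string file, ParserInfo expected, ParserInfo actual)
{
    output.WriteLine($"Validating {file}");
    Assert.Equal(expected.Format, actual.Format);
    Assert.Equal(expected.Series, actual.Series);
    Assert.Equal(expected.Chapters, actual.Chapters);
    Assert.Equal(expected.Volumes, actual.Volumes);
    Assert.Equal(expected.Edition, actual.Edition);
    Assert.Equal(expected.Filename, actual.Filename);
    Assert.Equal(expected.FullFilePath, actual.FullFilePath);
}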

[Fact]
public void Parse_ParseInfo_Comic()
{
    const string rootPath = @"E:/Comics/";
    var expected = new Dictionary<string, ParserInfo>();
    var filepath = @"E:/Comics/Teen Titans/Teen Titans v1 Annual 01 (1967) SP01.cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Teen Titans", Volumes = "0",
        Chapters = "0", Filename = "Teen Titans v1 Annual 01 (1967) SP01.cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    // Fallback test with bad naming
    filepath = @"E:\Comics\Comics\Babe\Babe Vol.1 #1-4\Babe 01.cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Babe", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "Babe 01.cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Comics\Comics\Publisher\Batman the Detective (2021)\Batman the Detective - v6 - 11 - (2021).cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Batman the Detective", Volumes = "6", Edition = "",
        Chapters = "11", Filename = "Batman the Detective - v6 - 11 - (2021).cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Comics\Comics\Batman - The Man Who Laughs #1 (2005)\Batman - The Man Who Laughs #1 (2005).cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Batman - The Man Who Laughs", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "Batman - The Man Who Laughs #1 (2005).cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    foreach (var file in expected.Keys)
    {
        var expectedInfo = expected[file];
        var actual = _defaultParser.Parse(file, rootPath, LibraryType.Comic);
        if (expectedInfo == null)
        {
            Assert.Null(actual);
            continue; // keep validating the remaining files rather than exiting the test early
        }
        Assert.NotNull(actual);
        _testOutputHelper.WriteLine($"Validating {file}");
        Assert.Equal(expectedInfo.Format, actual.Format);
        _testOutputHelper.WriteLine("Format ✓");
        Assert.Equal(expectedInfo.Series, actual.Series);
        _testOutputHelper.WriteLine("Series ✓");
        Assert.Equal(expectedInfo.Chapters, actual.Chapters);
        _testOutputHelper.WriteLine("Chapters ✓");
        Assert.Equal(expectedInfo.Volumes, actual.Volumes);
        _testOutputHelper.WriteLine("Volumes ✓");
        Assert.Equal(expectedInfo.Edition, actual.Edition);
        _testOutputHelper.WriteLine("Edition ✓");
        Assert.Equal(expectedInfo.Filename, actual.Filename);
        _testOutputHelper.WriteLine("Filename ✓");
        Assert.Equal(expectedInfo.FullFilePath, actual.FullFilePath);
        _testOutputHelper.WriteLine("FullFilePath ✓");
    }
}
#endregion
}
@@ -167,6 +167,9 @@ namespace API.Tests.Parser
[InlineData("Great_Teacher_Onizuka_v16[TheSpectrum]", "Great Teacher Onizuka")]
[InlineData("[Renzokusei]_Kimi_wa_Midara_na_Boku_no_Joou_Ch5_Final_Chapter", "Kimi wa Midara na Boku no Joou")]
[InlineData("Battle Royale, v01 (2000) [TokyoPop] [Manga-Sketchbook]", "Battle Royale")]
[InlineData("Kaiju No. 8 036 (2021) (Digital)", "Kaiju No. 8")]
[InlineData("Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz", "Seraph of the End - Vampire Reign")]
[InlineData("Love Hina - Volume 01 [Scans].pdf", "Love Hina")]
public void ParseSeriesTest(string filename, string expected)
{
    Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
@@ -240,6 +243,8 @@ namespace API.Tests.Parser
[InlineData("Deku_&_Bakugo_-_Rising_v1_c1.1.cbz", "1.1")]
[InlineData("Chapter 63 - The Promise Made for 520 Cenz.cbr", "63")]
[InlineData("Harrison, Kim - The Good, The Bad, and the Undead - Hollows Vol 2.5.epub", "0")]
[InlineData("Kaiju No. 8 036 (2021) (Digital)", "36")]
[InlineData("Samurai Jack Vol. 01 - The threads of Time", "0")]
public void ParseChaptersTest(string filename, string expected)
{
    Assert.Equal(expected, API.Parser.Parser.ParseChapter(filename));
@@ -255,6 +260,7 @@ namespace API.Tests.Parser
[InlineData("Chobits Omnibus Edition v01 [Dark Horse]", "Omnibus Edition")]
[InlineData("[dmntsf.net] One Piece - Digital Colored Comics Vol. 20 Ch. 177 - 30 Million vs 81 Million.cbz", "")]
[InlineData("AKIRA - c003 (v01) [Full Color] [Darkhorse].cbz", "Full Color")]
[InlineData("Love Hina Omnibus v05 (2015) (Digital-HD) (Asgard-Empire).cbz", "Omnibus")]
public void ParseEditionTest(string input, string expected)
{
    Assert.Equal(expected, API.Parser.Parser.ParseEdition(input));
@@ -272,6 +278,8 @@ namespace API.Tests.Parser
[InlineData("Yuki Merry - 4-Komga Anthology", false)]
[InlineData("Beastars - SP01", false)]
[InlineData("Beastars SP01", false)]
[InlineData("The League of Extraordinary Gentlemen", false)]
[InlineData("The League of Extra-ordinary Gentlemen", false)]
public void ParseMangaSpecialTest(string input, bool expected)
{
    Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseMangaSpecial(input)));
@@ -294,194 +302,6 @@ namespace API.Tests.Parser
}

[Theory]
[InlineData("/manga/Btooom!/Vol.1/Chapter 1/1.cbz", "Btooom!~1~1")]
[InlineData("/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Btooom!~1~2")]
[InlineData("/manga/Monster #8/Ch. 001-016 [MangaPlus] [Digital] [amit34521]/Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]/13.jpg", "Monster #8~0~1")]
public void ParseFromFallbackFoldersTest(string inputFile, string expectedParseInfo)
{
    const string rootDirectory = "/manga/";
    var tokens = expectedParseInfo.Split("~");
    var actual = new ParserInfo {Chapters = "0", Volumes = "0"};
    API.Parser.Parser.ParseFromFallbackFolders(inputFile, rootDirectory, LibraryType.Manga, ref actual);
    Assert.Equal(tokens[0], actual.Series);
    Assert.Equal(tokens[1], actual.Volumes);
    Assert.Equal(tokens[2], actual.Chapters);
}

[Fact]
public void ParseInfoTest()
{
    const string rootPath = @"E:/Manga/";
    var expected = new Dictionary<string, ParserInfo>();
    var filepath = @"E:/Manga/Mujaki no Rakuen/Mujaki no Rakuen Vol12 ch76.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Mujaki no Rakuen", Volumes = "12",
        Chapters = "76", Filename = "Mujaki no Rakuen Vol12 ch76.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:/Manga/Shimoneta to Iu Gainen ga Sonzai Shinai Taikutsu na Sekai Man-hen/Vol 1.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Shimoneta to Iu Gainen ga Sonzai Shinai Taikutsu na Sekai Man-hen", Volumes = "1",
        Chapters = "0", Filename = "Vol 1.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Beelzebub\Beelzebub_01_[Noodles].zip";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Beelzebub", Volumes = "0",
        Chapters = "1", Filename = "Beelzebub_01_[Noodles].zip", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Ichinensei ni Nacchattara\Ichinensei_ni_Nacchattara_v01_ch01_[Taruby]_v1.1.zip";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Ichinensei ni Nacchattara", Volumes = "1",
        Chapters = "1", Filename = "Ichinensei_ni_Nacchattara_v01_ch01_[Taruby]_v1.1.zip", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Tenjo Tenge (Color)\Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Tenjo Tenge", Volumes = "1", Edition = "Full Contact Edition",
        Chapters = "0", Filename = "Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Akame ga KILL! ZERO (2016-2019) (Digital) (LuCaZ)\Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Akame ga KILL! ZERO", Volumes = "1", Edition = "",
        Chapters = "0", Filename = "Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Dorohedoro\Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Dorohedoro", Volumes = "1", Edition = "",
        Chapters = "0", Filename = "Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\APOSIMZ\APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "APOSIMZ", Volumes = "0", Edition = "",
        Chapters = "40", Filename = "APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Corpse Party Musume\Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Kedouin Makoto - Corpse Party Musume", Volumes = "0", Edition = "",
        Chapters = "9", Filename = "Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Goblin Slayer\Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Goblin Slayer - Brand New Day", Volumes = "0", Edition = "",
        Chapters = "6.5", Filename = "Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath
    });

    filepath = @"E:\Manga\Summer Time Rendering\Specials\Record 014 (between chapter 083 and ch084) SP11.cbr";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Summer Time Rendering", Volumes = "0", Edition = "",
        Chapters = "0", Filename = "Record 014 (between chapter 083 and ch084) SP11.cbr", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = true
    });

    filepath = @"E:\Manga\Seraph of the End\Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Seraph of the End - Vampire Reign", Volumes = "0", Edition = "",
        Chapters = "93", Filename = "Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Kono Subarashii Sekai ni Bakuen wo!\Vol. 00 Ch. 000.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Kono Subarashii Sekai ni Bakuen wo!", Volumes = "0", Edition = "",
        Chapters = "0", Filename = "Vol. 00 Ch. 000.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Toukyou Akazukin\Vol. 01 Ch. 001.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Toukyou Akazukin", Volumes = "1", Edition = "",
        Chapters = "1", Filename = "Vol. 01 Ch. 001.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Harrison, Kim - The Good, The Bad, and the Undead - Hollows Vol 2.5.epub";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Harrison, Kim - The Good, The Bad, and the Undead - Hollows", Volumes = "2.5", Edition = "",
        Chapters = "0", Filename = "Harrison, Kim - The Good, The Bad, and the Undead - Hollows Vol 2.5.epub", Format = MangaFormat.Epub,
        FullFilePath = filepath, IsSpecial = false
    });

    // If an image is cover exclusively, ignore it
    filepath = @"E:\Manga\Seraph of the End\cover.png";
    expected.Add(filepath, null);

    filepath = @"E:\Manga\The Beginning After the End\Chapter 001.cbz";
    expected.Add(filepath, new ParserInfo
    {
        Series = "The Beginning After the End", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "Chapter 001.cbz", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    filepath = @"E:\Manga\Monster #8\Ch. 001-016 [MangaPlus] [Digital] [amit34521]\Monster #8 Ch. 001 [MangaPlus] [Digital] [amit34521]\13.jpg";
    expected.Add(filepath, new ParserInfo
    {
        Series = "Monster #8", Volumes = "0", Edition = "",
        Chapters = "1", Filename = "13.jpg", Format = MangaFormat.Archive,
        FullFilePath = filepath, IsSpecial = false
    });

    foreach (var file in expected.Keys)
    {
        var expectedInfo = expected[file];
        var actual = API.Parser.Parser.Parse(file, rootPath);
        if (expectedInfo == null)
        {
            Assert.Null(actual);
            return;
        }
        Assert.NotNull(actual);
        _testOutputHelper.WriteLine($"Validating {file}");
        Assert.Equal(expectedInfo.Format, actual.Format);
        _testOutputHelper.WriteLine("Format ✓");
        Assert.Equal(expectedInfo.Series, actual.Series);
        _testOutputHelper.WriteLine("Series ✓");
        Assert.Equal(expectedInfo.Chapters, actual.Chapters);
        _testOutputHelper.WriteLine("Chapters ✓");
        Assert.Equal(expectedInfo.Volumes, actual.Volumes);
        _testOutputHelper.WriteLine("Volumes ✓");
        Assert.Equal(expectedInfo.Edition, actual.Edition);
        _testOutputHelper.WriteLine("Edition ✓");
        Assert.Equal(expectedInfo.Filename, actual.Filename);
        _testOutputHelper.WriteLine("Filename ✓");
        Assert.Equal(expectedInfo.FullFilePath, actual.FullFilePath);
        _testOutputHelper.WriteLine("FullFilePath ✓");
    }
}
}
}

@@ -20,7 +20,7 @@ namespace API.Tests.Parser
    Title = "darker than black",
    Volumes = "0"
};

var p2 = new ParserInfo()
{
    Chapters = "1",
@@ -32,7 +32,7 @@ namespace API.Tests.Parser
    Title = "Darker Than Black",
    Volumes = "0"
};

var expected = new ParserInfo()
{
    Chapters = "1",
@@ -45,11 +45,11 @@ namespace API.Tests.Parser
    Volumes = "0"
};
p1.Merge(p2);

AssertSame(expected, p1);

}

[Fact]
public void MergeFromTest2()
{
@@ -64,7 +64,7 @@ namespace API.Tests.Parser
    Title = "darker than black",
    Volumes = "0"
};

var p2 = new ParserInfo()
{
    Chapters = "0",
@@ -76,7 +76,7 @@ namespace API.Tests.Parser
    Title = "Darker Than Black",
    Volumes = "1"
};

var expected = new ParserInfo()
{
    Chapters = "1",
@@ -93,9 +93,9 @@ namespace API.Tests.Parser
AssertSame(expected, p1);

}

private void AssertSame(ParserInfo expected, ParserInfo actual)
private static void AssertSame(ParserInfo expected, ParserInfo actual)
{
    Assert.Equal(expected.Chapters, actual.Chapters);
    Assert.Equal(expected.Volumes, actual.Volumes);
@@ -107,4 +107,4 @@ namespace API.Tests.Parser
    Assert.Equal(expected.FullFilePath, actual.FullFilePath);
}
}
}

@@ -1,3 +1,5 @@
using System.Linq;
using API.Entities.Enums;
using Xunit;
using static API.Parser.Parser;

@@ -5,6 +7,14 @@ namespace API.Tests.Parser
{
public class ParserTests
{
    [Theory]
    [InlineData("Joe Shmo, Green Blue", "Joe Shmo, Green Blue")]
    [InlineData("Shmo, Joe", "Shmo, Joe")]
    [InlineData(" Joe Shmo ", "Joe Shmo")]
    public void CleanAuthorTest(string input, string expected)
    {
        Assert.Equal(expected, CleanAuthor(input));
    }

    [Theory]
    [InlineData("Beastars - SP01", true)]
@@ -40,6 +50,8 @@ namespace API.Tests.Parser
    [InlineData("Hello_I_am_here ", false, "Hello I am here")]
    [InlineData("[ReleaseGroup] The Title", false, "The Title")]
    [InlineData("[ReleaseGroup]_The_Title", false, "The Title")]
    [InlineData("-The Title", false, "The Title")]
    [InlineData("- The Title", false, "The Title")]
    [InlineData("[Suihei Kiki]_Kasumi_Otoko_no_Ko_[Taruby]_v1.1", false, "Kasumi Otoko no Ko v1.1")]
    [InlineData("Batman - Detective Comics - Rebirth Deluxe Edition Book 04 (2019) (digital) (Son of Ultron-Empire)", true, "Batman - Detective Comics - Rebirth Deluxe Edition")]
    public void CleanTitleTest(string input, bool isComic, string expected)
@@ -47,20 +59,28 @@ namespace API.Tests.Parser
        Assert.Equal(expected, CleanTitle(input, isComic));
    }

    [Theory]
    [InlineData("src: url(fonts/AvenirNext-UltraLight.ttf)", true)]
    [InlineData("src: url(ideal-sans-serif.woff)", true)]
    [InlineData("src: local(\"Helvetica Neue Bold\")", true)]
    [InlineData("src: url(\"/fonts/OpenSans-Regular-webfont.woff2\")", true)]
    [InlineData("src: local(\"/fonts/OpenSans-Regular-webfont.woff2\")", true)]
    [InlineData("src: url(data:application/x-font-woff", false)]
    public void FontCssRewriteMatches(string input, bool expectedMatch)
    {
        Assert.Equal(expectedMatch, FontSrcUrlRegex.Matches(input).Count > 0);
    }

    // [Theory]
    // //[InlineData("@font-face{font-family:\"PaytoneOne\";src:url(\"..\\/Fonts\\/PaytoneOne.ttf\")}", "@font-face{font-family:\"PaytoneOne\";src:url(\"PaytoneOne.ttf\")}")]
    // [InlineData("@font-face{font-family:\"PaytoneOne\";src:url(\"..\\/Fonts\\/PaytoneOne.ttf\")}", "..\\/Fonts\\/PaytoneOne.ttf")]
    // //[InlineData("@font-face{font-family:'PaytoneOne';src:url('..\\/Fonts\\/PaytoneOne.ttf')}", "@font-face{font-family:'PaytoneOne';src:url('PaytoneOne.ttf')}")]
    // //[InlineData("@font-face{\r\nfont-family:'PaytoneOne';\r\nsrc:url('..\\/Fonts\\/PaytoneOne.ttf')\r\n}", "@font-face{font-family:'PaytoneOne';src:url('PaytoneOne.ttf')}")]
    // public void ReplaceStyleUrlTest(string input, string expected)
    // {
    //     var replacementStr = "PaytoneOne.ttf";
    //     // Use Match to validate since replace is weird
    //     //Assert.Equal(expected, FontSrcUrlRegex.Replace(input, "$1" + replacementStr + "$2" + "$3"));
    //     var match = FontSrcUrlRegex.Match(input);
    //     Assert.Equal(!string.IsNullOrEmpty(expected), FontSrcUrlRegex.Match(input).Success);
    // }
    [Theory]
    [InlineData("src: url(fonts/AvenirNext-UltraLight.ttf)", new [] {"src: url(", "fonts/AvenirNext-UltraLight.ttf", ")"})]
    [InlineData("src: url(ideal-sans-serif.woff)", new [] {"src: url(", "ideal-sans-serif.woff", ")"})]
    [InlineData("src: local(\"Helvetica Neue Bold\")", new [] {"src: local(\"", "Helvetica Neue Bold", "\")"})]
    [InlineData("src: url(\"/fonts/OpenSans-Regular-webfont.woff2\")", new [] {"src: url(\"", "/fonts/OpenSans-Regular-webfont.woff2", "\")"})]
    [InlineData("src: local(\"/fonts/OpenSans-Regular-webfont.woff2\")", new [] {"src: local(\"", "/fonts/OpenSans-Regular-webfont.woff2", "\")"})]
    public void FontCssCorrectlySeparates(string input, string[] expected)
    {
        Assert.Equal(expected, FontSrcUrlRegex.Match(input).Groups.Values.Select(g => g.Value).Where((_, i) => i > 0).ToArray());
    }
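
    // A note on the Groups plumbing above: Match(...).Groups.Values yields group 0 (the whole
    // match) first, so Where((_, i) => i > 0) keeps only the three captures asserted by the
    // InlineData: the "src: url(" / "src: local(" prefix, the font path, and the closing suffix.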

    [Theory]
@@ -132,28 +152,14 @@ namespace API.Tests.Parser
    [InlineData("test.jpeg", true)]
    [InlineData("test.png", true)]
    [InlineData(".test.jpg", false)]
    [InlineData("!test.jpg", false)]
    [InlineData("!test.jpg", true)]
    [InlineData("test.webp", true)]
    public void IsImageTest(string filename, bool expected)
    {
        Assert.Equal(expected, IsImage(filename));
    }

    [Theory]
    [InlineData("C:/", "C:/Love Hina/Love Hina - Special.cbz", "Love Hina")]
    [InlineData("C:/", "C:/Love Hina/Specials/Ani-Hina Art Collection.cbz", "Love Hina")]
    [InlineData("C:/", "C:/Mujaki no Rakuen Something/Mujaki no Rakuen Vol12 ch76.cbz", "Mujaki no Rakuen")]
    public void FallbackTest(string rootDir, string inputPath, string expectedSeries)
    {
        var actual = Parse(inputPath, rootDir);
        if (actual == null)
        {
            Assert.NotNull(actual);
            return;
        }

        Assert.Equal(expectedSeries, actual.Series);
    }

    [Theory]
    [InlineData("Love Hina - Special.jpg", false)]
@@ -164,6 +170,8 @@ namespace API.Tests.Parser
    [InlineData("cover.jpg", true)]
    [InlineData("cover.png", true)]
    [InlineData("ch1/cover.png", true)]
    [InlineData("ch1/backcover.png", false)]
    [InlineData("backcover.png", false)]
    public void IsCoverImageTest(string inputPath, bool expected)
    {
        Assert.Equal(expected, IsCoverImage(inputPath));
@@ -174,9 +182,23 @@ namespace API.Tests.Parser
    [InlineData("TEST/Love Hina - Special.jpg", false)]
    [InlineData("__macosx/Love Hina/", false)]
    [InlineData("MACOSX/Love Hina/", false)]
    [InlineData("._Love Hina/Love Hina/", true)]
    [InlineData("@Recently-Snapshot/Love Hina/", true)]
    public void HasBlacklistedFolderInPathTest(string inputPath, bool expected)
    {
        Assert.Equal(expected, HasBlacklistedFolderInPath(inputPath));
    }

    [Theory]
    [InlineData("/manga/1/1/1", "/manga/1/1/1")]
    [InlineData("/manga/1/1/1.jpg", "/manga/1/1/1.jpg")]
    [InlineData(@"/manga/1/1\1.jpg", @"/manga/1/1/1.jpg")]
    [InlineData("/manga/1/1//1", "/manga/1/1//1")]
    [InlineData("/manga/1\\1\\1", "/manga/1/1/1")]
    [InlineData("C:/manga/1\\1\\1.jpg", "C:/manga/1/1/1.jpg")]
    public void NormalizePathTest(string inputPath, string expected)
    {
        Assert.Equal(expected, NormalizePath(inputPath));
    }
}
}

@@ -1,11 +1,14 @@
using System.Diagnostics;
using System.IO;
using System.IO.Abstractions;
using System.IO.Abstractions.TestingHelpers;
using System.IO.Compression;
using System.Linq;
using API.Archive;
using API.Data.Metadata;
using API.Interfaces.Services;
using API.Services;
using Microsoft.Extensions.Logging;
using NetVips;
using NSubstitute;
using NSubstitute.Extensions;
using Xunit;
@@ -19,12 +22,12 @@ namespace API.Tests.Services
    private readonly ArchiveService _archiveService;
    private readonly ILogger<ArchiveService> _logger = Substitute.For<ILogger<ArchiveService>>();
    private readonly ILogger<DirectoryService> _directoryServiceLogger = Substitute.For<ILogger<DirectoryService>>();
    private readonly IDirectoryService _directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>());
    private readonly IDirectoryService _directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());

    public ArchiveServiceTests(ITestOutputHelper testOutputHelper)
    {
        _testOutputHelper = testOutputHelper;
        _archiveService = new ArchiveService(_logger, _directoryService);
        _archiveService = new ArchiveService(_logger, _directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), _directoryService));
    }

    [Theory]
@@ -107,15 +110,15 @@ namespace API.Tests.Services
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives");
    var extractDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives/Extraction");

    DirectoryService.ClearAndDeleteDirectory(extractDirectory);
    _directoryService.ClearAndDeleteDirectory(extractDirectory);

    Stopwatch sw = Stopwatch.StartNew();
    var sw = Stopwatch.StartNew();
    _archiveService.ExtractArchive(Path.Join(testDirectory, archivePath), extractDirectory);
    var di1 = new DirectoryInfo(extractDirectory);
    Assert.Equal(expectedFileCount, di1.Exists ? di1.GetFiles().Length : 0);
    Assert.Equal(expectedFileCount, di1.Exists ? _directoryService.GetFiles(extractDirectory, searchOption:SearchOption.AllDirectories).Count() : 0);
    _testOutputHelper.WriteLine($"Processed in {sw.ElapsedMilliseconds} ms");

    DirectoryService.ClearAndDeleteDirectory(extractDirectory);
    _directoryService.ClearAndDeleteDirectory(extractDirectory);
}

@@ -128,7 +131,7 @@ namespace API.Tests.Services
[InlineData(new [] {"Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c060 (v10) - p200 [Digital] [LuCaZ].jpg", "folder.jpg"}, "folder.jpg")]
public void FindFolderEntry(string[] files, string expected)
{
    var foundFile = _archiveService.FindFolderEntry(files);
    var foundFile = ArchiveService.FindFolderEntry(files);
    Assert.Equal(expected, string.IsNullOrEmpty(foundFile) ? "" : foundFile);
}

@@ -141,6 +144,7 @@ namespace API.Tests.Services
[InlineData(new [] {"__MACOSX/cover.jpg", "vol1/page 01.jpg"}, "vol1/page 01.jpg")]
[InlineData(new [] {"Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg", "Akame ga KILL! ZERO - c060 (v10) - p200 [Digital] [LuCaZ].jpg", "folder.jpg"}, "Akame ga KILL! ZERO - c055 (v10) - p000 [Digital] [LuCaZ].jpg")]
[InlineData(new [] {"001.jpg", "001 - chapter 1/001.jpg"}, "001.jpg")]
[InlineData(new [] {"chapter 1/001.jpg", "chapter 2/002.jpg", "somefile.jpg"}, "somefile.jpg")]
public void FindFirstEntry(string[] files, string expected)
{
    var foundFile = ArchiveService.FirstFileEntry(files, string.Empty);
@@ -157,27 +161,29 @@ namespace API.Tests.Services
[InlineData("macos_native.zip", "macos_native.jpg")]
[InlineData("v10 - duplicate covers.cbz", "v10 - duplicate covers.expected.jpg")]
[InlineData("sorting.zip", "sorting.expected.jpg")]
[InlineData("test.zip", "test.expected.jpg")] // https://github.com/kleisauke/net-vips/issues/155
public void GetCoverImage_Default_Test(string inputFile, string expectedOutputFile)
{
    var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger));
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages");
    var expectedBytes = File.ReadAllBytes(Path.Join(testDirectory, expectedOutputFile));
    var ds = Substitute.For<DirectoryService>(_directoryServiceLogger, new FileSystem());
    var imageService = new ImageService(Substitute.For<ILogger<ImageService>>(), ds);
    var archiveService = Substitute.For<ArchiveService>(_logger, ds, imageService);

    var testDirectory = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages"));
    var expectedBytes = Image.Thumbnail(Path.Join(testDirectory, expectedOutputFile), 320).WriteToBuffer(".png");

    archiveService.Configure().CanOpen(Path.Join(testDirectory, inputFile)).Returns(ArchiveLibrary.Default);
    var sw = Stopwatch.StartNew();

    var outputDir = Path.Join(testDirectory, "output");
    DirectoryService.ClearAndDeleteDirectory(outputDir);
    DirectoryService.ExistOrCreate(outputDir);

    _directoryService.ClearDirectory(outputDir);
    _directoryService.ExistOrCreate(outputDir);

    var coverImagePath = archiveService.GetCoverImage(Path.Join(testDirectory, inputFile),
        Path.GetFileNameWithoutExtension(inputFile) + "_output");
    var actual = File.ReadAllBytes(coverImagePath);
        Path.GetFileNameWithoutExtension(inputFile) + "_output", outputDir);
    var actual = File.ReadAllBytes(Path.Join(outputDir, coverImagePath));

    Assert.Equal(expectedBytes, actual);
    _testOutputHelper.WriteLine($"Processed in {sw.ElapsedMilliseconds} ms");
    DirectoryService.ClearAndDeleteDirectory(outputDir);
    //_directoryService.ClearAndDeleteDirectory(outputDir);
}

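// NSubstitute note on the pattern above: Substitute.For<ArchiveService>(...) builds a partial
// substitute of the concrete class, and Configure() (from NSubstitute.Extensions) lets the test
// override just the virtual CanOpen(...) member while the real GetCoverImage logic still runs.
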
@@ -191,35 +197,69 @@ namespace API.Tests.Services
[InlineData("sorting.zip", "sorting.expected.jpg")]
public void GetCoverImage_SharpCompress_Test(string inputFile, string expectedOutputFile)
{
    var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger));
    var imageService = new ImageService(Substitute.For<ILogger<ImageService>>(), _directoryService);
    var archiveService = Substitute.For<ArchiveService>(_logger,
        new DirectoryService(_directoryServiceLogger, new FileSystem()), imageService);
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages");
    var expectedBytes = File.ReadAllBytes(Path.Join(testDirectory, expectedOutputFile));

    var outputDir = Path.Join(testDirectory, "output");
    _directoryService.ClearDirectory(outputDir);
    _directoryService.ExistOrCreate(outputDir);

    archiveService.Configure().CanOpen(Path.Join(testDirectory, inputFile)).Returns(ArchiveLibrary.SharpCompress);
    Stopwatch sw = Stopwatch.StartNew();
    Assert.Equal(expectedBytes, File.ReadAllBytes(archiveService.GetCoverImage(Path.Join(testDirectory, inputFile), Path.GetFileNameWithoutExtension(inputFile) + "_output")));
    _testOutputHelper.WriteLine($"Processed in {sw.ElapsedMilliseconds} ms");
    var actualBytes = File.ReadAllBytes(archiveService.GetCoverImage(Path.Join(testDirectory, inputFile),
        Path.GetFileNameWithoutExtension(inputFile) + "_output", outputDir));
    Assert.Equal(expectedBytes, actualBytes);

    _directoryService.ClearAndDeleteDirectory(outputDir);
}

// TODO: This is broken on GA due to DirectoryService.CoverImageDirectory
//[Theory]
[Theory]
[InlineData("Archives/macos_native.zip")]
[InlineData("Formats/One File with DB_Supported.zip")]
public void CanParseCoverImage(string inputFile)
{
    var imageService = Substitute.For<IImageService>();
    imageService.WriteCoverThumbnail(Arg.Any<Stream>(), Arg.Any<string>(), Arg.Any<string>()).Returns(x => "cover.jpg");
    var archiveService = new ArchiveService(_logger, _directoryService, imageService);
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/");
    Assert.NotEmpty(File.ReadAllBytes(_archiveService.GetCoverImage(Path.Join(testDirectory, inputFile), Path.GetFileNameWithoutExtension(inputFile) + "_output")));
    var inputPath = Path.GetFullPath(Path.Join(testDirectory, inputFile));
    var outputPath = Path.Join(testDirectory, Path.GetFileNameWithoutExtension(inputFile) + "_output");
    new DirectoryInfo(outputPath).Create();
    var expectedImage = archiveService.GetCoverImage(inputPath, inputFile, outputPath);
    Assert.Equal("cover.jpg", expectedImage);
    new DirectoryInfo(outputPath).Delete();
}

[Fact]
public void ShouldHaveComicInfo()
{
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/ComicInfos");
    var archive = Path.Join(testDirectory, "file in folder.zip");
    var summaryInfo = "By all counts, Ryouta Sakamoto is a loser when he's not holed up in his room, bombing things into oblivion in his favorite online action RPG. But his very own uneventful life is blown to pieces when he's abducted and taken to an uninhabited island, where he soon learns the hard way that he's being pitted against others just like him in a explosives-riddled death match! How could this be happening? Who's putting them up to this? And why!? The name, not to mention the objective, of this very real survival game is eerily familiar to Ryouta, who has mastered its virtual counterpart-BTOOOM! Can Ryouta still come out on top when he's playing for his life!?";

    Assert.Equal(summaryInfo, _archiveService.GetComicInfo(archive).Summary);
}

#region ShouldHaveComicInfo

[Fact]
public void ShouldHaveComicInfo()
{
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/ComicInfos");
    var archive = Path.Join(testDirectory, "ComicInfo.zip");
    const string summaryInfo = "By all counts, Ryouta Sakamoto is a loser when he's not holed up in his room, bombing things into oblivion in his favorite online action RPG. But his very own uneventful life is blown to pieces when he's abducted and taken to an uninhabited island, where he soon learns the hard way that he's being pitted against others just like him in a explosives-riddled death match! How could this be happening? Who's putting them up to this? And why!? The name, not to mention the objective, of this very real survival game is eerily familiar to Ryouta, who has mastered its virtual counterpart-BTOOOM! Can Ryouta still come out on top when he's playing for his life!?";

    var comicInfo = _archiveService.GetComicInfo(archive);
    Assert.NotNull(comicInfo);
    Assert.Equal(summaryInfo, comicInfo.Summary);
}

[Fact]
public void ShouldHaveComicInfo_WithAuthors()
{
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/ComicInfos");
    var archive = Path.Join(testDirectory, "ComicInfo_authors.zip");

    var comicInfo = _archiveService.GetComicInfo(archive);
    Assert.NotNull(comicInfo);
    Assert.Equal("Junya Inoue", comicInfo.Writer);
}

#endregion

#region CanParseComicInfo

[Fact]
public void CanParseComicInfo()
@@ -243,5 +283,38 @@ namespace API.Tests.Services

    Assert.NotStrictEqual(expected, actual);
}

#endregion

#region FindCoverImageFilename

[Theory]
[InlineData(new string[] {}, "", null)]
[InlineData(new [] {"001.jpg", "002.jpg"}, "Test.zip", "001.jpg")]
[InlineData(new [] {"001.jpg", "!002.jpg"}, "Test.zip", "!002.jpg")]
[InlineData(new [] {"001.jpg", "!001.jpg"}, "Test.zip", "!001.jpg")]
[InlineData(new [] {"001.jpg", "cover.jpg"}, "Test.zip", "cover.jpg")]
[InlineData(new [] {"001.jpg", "Chapter 20/cover.jpg", "Chapter 21/0001.jpg"}, "Test.zip", "Chapter 20/cover.jpg")]
[InlineData(new [] {"._/001.jpg", "._/cover.jpg", "010.jpg"}, "Test.zip", "010.jpg")]
[InlineData(new [] {"001.txt", "002.txt", "a.jpg"}, "Test.zip", "a.jpg")]
public void FindCoverImageFilename(string[] filenames, string archiveName, string expected)
{
    Assert.Equal(expected, _archiveService.FindCoverImageFilename(archiveName, filenames));
}

#endregion

#region CreateZipForDownload

//[Fact]
public void CreateZipForDownloadTest()
{
    var fileSystem = new MockFileSystem();
    var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
    //_archiveService.CreateZipForDownload(new []{}, outputDirectory)
}

#endregion
}
}

API.Tests/Services/BackupServiceTests.cs (new file, 143 lines)
@@ -0,0 +1,143 @@
|
||||
using System.Collections.Generic;
|
||||
using System.Data.Common;
|
||||
using System.IO.Abstractions.TestingHelpers;
|
||||
using System.Linq;
|
||||
using System.Threading.Tasks;
|
||||
using API.Data;
|
||||
using API.Entities;
|
||||
using API.Entities.Enums;
|
||||
using API.Extensions;
|
||||
using API.Services;
|
||||
using API.Services.Tasks;
|
||||
using API.SignalR;
|
||||
using AutoMapper;
|
||||
using Microsoft.AspNetCore.SignalR;
|
||||
using Microsoft.Data.Sqlite;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.EntityFrameworkCore.Infrastructure;
|
||||
using Microsoft.Extensions.Configuration;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using NSubstitute;
|
||||
using Xunit;
|
||||
|
||||
namespace API.Tests.Services;
|
||||
|
||||
public class BackupServiceTests
|
||||
{
|
||||
private readonly ILogger<BackupService> _logger = Substitute.For<ILogger<BackupService>>();
|
||||
private readonly IUnitOfWork _unitOfWork;
|
||||
private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
|
||||
private readonly IConfiguration _config;
|
||||
|
||||
private readonly DbConnection _connection;
|
||||
private readonly DataContext _context;
|
||||
|
||||
private const string CacheDirectory = "C:/kavita/config/cache/";
|
||||
private const string CoverImageDirectory = "C:/kavita/config/covers/";
|
||||
private const string BackupDirectory = "C:/kavita/config/backups/";
|
||||
private const string LogDirectory = "C:/kavita/config/logs/";
|
||||
|
||||
public BackupServiceTests()
|
||||
{
|
||||
var contextOptions = new DbContextOptionsBuilder()
|
||||
.UseSqlite(CreateInMemoryDatabase())
|
||||
.Options;
|
||||
_connection = RelationalOptionsExtension.Extract(contextOptions).Connection;
|
||||
|
||||
_context = new DataContext(contextOptions);
|
||||
Task.Run(SeedDb).GetAwaiter().GetResult();
|
||||
|
||||
_unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
|
||||
_config = Substitute.For<IConfiguration>();
|
||||
|
||||
}
|
||||
|
||||
#region Setup
|
||||
|
||||
private static DbConnection CreateInMemoryDatabase()
|
||||
{
|
||||
var connection = new SqliteConnection("Filename=:memory:");
|
||||
|
||||
connection.Open();
|
||||
|
||||
return connection;
|
||||
}
|
||||
|
||||
public void Dispose() => _connection.Dispose();
|
||||
|
||||
private async Task<bool> SeedDb()
|
||||
{
|
||||
await _context.Database.MigrateAsync();
|
||||
var filesystem = CreateFileSystem();
|
||||
|
||||
await Seed.SeedSettings(_context, new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));
|
||||
|
||||
var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
|
||||
setting.Value = CacheDirectory;
|
||||
|
||||
setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
|
||||
setting.Value = BackupDirectory;
|
||||
|
||||
_context.ServerSetting.Update(setting);
|
||||
|
||||
_context.Library.Add(new Library()
|
||||
{
|
||||
Name = "Manga",
|
||||
Folders = new List<FolderPath>()
|
||||
{
|
||||
new FolderPath()
|
||||
{
|
||||
Path = "C:/data/"
|
||||
}
|
||||
}
|
||||
});
|
||||
return await _context.SaveChangesAsync() > 0;
|
||||
}
|
||||
|
||||
private async Task ResetDB()
|
||||
{
|
||||
_context.Series.RemoveRange(_context.Series.ToList());
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
}
|
||||
|
||||
private static MockFileSystem CreateFileSystem()
|
||||
{
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
|
||||
fileSystem.AddDirectory("C:/kavita/config/");
|
||||
fileSystem.AddDirectory(CacheDirectory);
|
||||
fileSystem.AddDirectory(CoverImageDirectory);
|
||||
fileSystem.AddDirectory(BackupDirectory);
|
||||
fileSystem.AddDirectory(LogDirectory);
|
||||
fileSystem.AddDirectory("C:/data/");
|
||||
|
||||
return fileSystem;
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
|
||||
|
||||
#region GetLogFiles
|
||||
|
||||
public void GetLogFiles_ExpectAllFiles_NoRollingFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{LogDirectory}kavita.log", new MockFileData(""));
|
||||
filesystem.AddFile($"{LogDirectory}kavita1.log", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
// You can't mock _config extensions because they are static
|
||||
_config.GetMaxRollingFiles().Returns(1);
|
||||
_config.GetLoggingFileName().Returns(ds.FileSystem.Path.Join(LogDirectory, "kavita.log"));
|
||||
|
||||
var backupService = new BackupService(_logger, _unitOfWork, ds, _config, _messageHub);
|
||||
|
||||
Assert.Single(backupService.GetLogFiles(1, LogDirectory));
|
||||
}
|
||||
|
||||
|
||||
#endregion
|
||||
|
||||
}
|
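
// The MockFileSystem pattern above (System.IO.Abstractions.TestingHelpers) keeps every path in
// memory, so DirectoryService can be exercised without touching disk. A minimal sketch reusing
// this fixture's helpers:
//   var fs = CreateFileSystem();
//   fs.AddFile($"{LogDirectory}kavita.log", new MockFileData(""));
//   var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fs);
//   Assert.True(ds.FileSystem.File.Exists($"{LogDirectory}kavita.log"));
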
@@ -1,5 +1,5 @@
using System.IO;
using API.Interfaces.Services;
using System.IO.Abstractions;
using API.Services;
using Microsoft.Extensions.Logging;
using NSubstitute;
@@ -14,7 +14,8 @@ namespace API.Tests.Services

public BookServiceTests()
{
    _bookService = new BookService(_logger);
    var directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem());
    _bookService = new BookService(_logger, directoryService, new ImageService(Substitute.For<ILogger<ImageService>>(), directoryService));
}

[Theory]
@@ -27,5 +28,40 @@ namespace API.Tests.Services
    Assert.Equal(expectedPages, _bookService.GetNumberOfPages(Path.Join(testDirectory, filePath)));
}

[Fact]
public void ShouldHaveComicInfo()
{
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/BookService/EPUB");
    var archive = Path.Join(testDirectory, "The Golden Harpoon; Or, Lost Among the Floes A Story of the Whaling Grounds.epub");
    const string summaryInfo = "Book Description";

    var comicInfo = _bookService.GetComicInfo(archive);
    Assert.NotNull(comicInfo);
    Assert.Equal(summaryInfo, comicInfo.Summary);
    Assert.Equal("genre1, genre2", comicInfo.Genre);
}

[Fact]
public void ShouldHaveComicInfo_WithAuthors()
{
    var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/BookService/EPUB");
    var archive = Path.Join(testDirectory, "The Golden Harpoon; Or, Lost Among the Floes A Story of the Whaling Grounds.epub");

    var comicInfo = _bookService.GetComicInfo(archive);
    Assert.NotNull(comicInfo);
    Assert.Equal("Roger Starbuck,Junya Inoue", comicInfo.Writer);
}

#region BookEscaping

[Fact]
public void EscapeCSSImportReferencesTest()
{

}

#endregion
}
}

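// As the two EPUB tests above show, BookService surfaces EPUB metadata through the same
// ComicInfo shape used for archives: the package description, subjects and creators come
// back as Summary, Genre and Writer respectively.
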
@@ -1,115 +1,508 @@
namespace API.Tests.Services
using System.Collections.Generic;
using System.Data.Common;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using API.SignalR;
using AutoMapper;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Services
{
internal class MockReadingItemServiceForCacheService : IReadingItemService
{
    private readonly DirectoryService _directoryService;

    public MockReadingItemServiceForCacheService(DirectoryService directoryService)
    {
        _directoryService = directoryService;
    }

    public ComicInfo GetComicInfo(string filePath)
    {
        return null;
    }

    public int GetNumberOfPages(string filePath, MangaFormat format)
    {
        return 1;
    }

    public string GetCoverImage(string fileFilePath, string fileName, MangaFormat format)
    {
        return string.Empty;
    }

    public void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1)
    {
        throw new System.NotImplementedException();
    }

    public ParserInfo Parse(string path, string rootPath, LibraryType type)
    {
        throw new System.NotImplementedException();
    }
}
public class CacheServiceTests
{
    // private readonly CacheService _cacheService;
    // private readonly ILogger<CacheService> _logger = Substitute.For<ILogger<CacheService>>();
    // private readonly IUnitOfWork _unitOfWork = Substitute.For<IUnitOfWork>();
    // private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
    // private readonly IDirectoryService _directoryService = Substitute.For<DirectoryService>();
    //
    // public CacheServiceTests()
    // {
    //     _cacheService = new CacheService(_logger, _unitOfWork, _archiveService, _directoryService);
    // }

    private readonly ILogger<CacheService> _logger = Substitute.For<ILogger<CacheService>>();
    private readonly IUnitOfWork _unitOfWork;
    private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();

    private readonly DbConnection _connection;
    private readonly DataContext _context;

    private const string CacheDirectory = "C:/kavita/config/cache/";
    private const string CoverImageDirectory = "C:/kavita/config/covers/";
    private const string BackupDirectory = "C:/kavita/config/backups/";
    private const string DataDirectory = "C:/data/";

    public CacheServiceTests()
    {
        var contextOptions = new DbContextOptionsBuilder()
            .UseSqlite(CreateInMemoryDatabase())
            .Options;
        _connection = RelationalOptionsExtension.Extract(contextOptions).Connection;

        _context = new DataContext(contextOptions);
        Task.Run(SeedDb).GetAwaiter().GetResult();

        _unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
    }

    #region Setup

    private static DbConnection CreateInMemoryDatabase()
    {
        var connection = new SqliteConnection("Filename=:memory:");

        connection.Open();

        return connection;
    }

    public void Dispose() => _connection.Dispose();

    private async Task<bool> SeedDb()
    {
        await _context.Database.MigrateAsync();
        var filesystem = CreateFileSystem();

        await Seed.SeedSettings(_context, new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));

        var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
        setting.Value = CacheDirectory;

        setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
        setting.Value = BackupDirectory;

        _context.ServerSetting.Update(setting);

        _context.Library.Add(new Library()
        {
            Name = "Manga",
            Folders = new List<FolderPath>()
            {
                new FolderPath()
                {
                    Path = "C:/data/"
                }
            }
        });
        return await _context.SaveChangesAsync() > 0;
    }

    private async Task ResetDB()
    {
        _context.Series.RemoveRange(_context.Series.ToList());

        await _context.SaveChangesAsync();
    }

    private static MockFileSystem CreateFileSystem()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
        fileSystem.AddDirectory("C:/kavita/config/");
        fileSystem.AddDirectory(CacheDirectory);
        fileSystem.AddDirectory(CoverImageDirectory);
        fileSystem.AddDirectory(BackupDirectory);
        fileSystem.AddDirectory(DataDirectory);

        return fileSystem;
    }

    #endregion

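    // Worth noting for the Setup region above: a SQLite "Filename=:memory:" database lives only
    // as long as its connection, so the fixture opens the connection once, holds it in
    // _connection, and tears it down in Dispose().
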
    #region Ensure

    [Fact]
    public async Task Ensure_DirectoryAlreadyExists_DontExtractAnything()
    {
        var filesystem = CreateFileSystem();
        filesystem.AddFile($"{DataDirectory}Test v1.zip", new MockFileData(""));
        filesystem.AddDirectory($"{CacheDirectory}1/");
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
        var cleanupService = new CacheService(_logger, _unitOfWork, ds,
            new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));

        await ResetDB();
        var s = DbFactory.Series("Test");
        var v = DbFactory.Volume("1");
        var c = new Chapter()
        {
            Number = "1",
            Files = new List<MangaFile>()
            {
                new MangaFile()
                {
                    Format = MangaFormat.Archive,
                    FilePath = $"{DataDirectory}Test v1.zip",
                }
            }
        };
        v.Chapters.Add(c);
        s.Volumes.Add(v);
        s.LibraryId = 1;
        _context.Series.Add(s);

        await _context.SaveChangesAsync();

        await cleanupService.Ensure(1);
        Assert.Empty(ds.GetFiles(filesystem.Path.Join(CacheDirectory, "1"), searchOption:SearchOption.AllDirectories));
    }

    // [Fact]
    // public async void Ensure_ShouldExtractArchive(int chapterId)
    // public async Task Ensure_DirectoryAlreadyExists_ExtractsImages()
    // {
    //
    // // CacheDirectory needs to be customized.
    // _unitOfWork.VolumeRepository.GetChapterAsync(chapterId).Returns(new Chapter
    // // TODO: Figure out a way to test this
    // var filesystem = CreateFileSystem();
    // filesystem.AddFile($"{DataDirectory}Test v1.zip", new MockFileData(""));
    // filesystem.AddDirectory($"{CacheDirectory}1/");
    // var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
    // var archiveService = Substitute.For<IArchiveService>();
    // archiveService.ExtractArchive($"{DataDirectory}Test v1.zip",
    //     filesystem.Path.Join(CacheDirectory, "1"));
    // var cleanupService = new CacheService(_logger, _unitOfWork, ds,
    //     new ReadingItemService(archiveService, Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
    //
    // await ResetDB();
    // var s = DbFactory.Series("Test");
    // var v = DbFactory.Volume("1");
    // var c = new Chapter()
    // {
    //     Id = 1,
    //     Number = "1",
    //     Files = new List<MangaFile>()
    //     {
    //         new MangaFile()
    //         {
    //             FilePath = ""
    //             Format = MangaFormat.Archive,
    //             FilePath = $"{DataDirectory}Test v1.zip",
    //         }
    //     }
    // });
    //
    // await _cacheService.Ensure(1);
    //
    // var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/CacheService/Archives");
    // };
    // v.Chapters.Add(c);
    // s.Volumes.Add(v);
    // s.LibraryId = 1;
    // _context.Series.Add(s);
    //
    // await _context.SaveChangesAsync();
    //
    // await cleanupService.Ensure(1);
    // Assert.Empty(ds.GetFiles(filesystem.Path.Join(CacheDirectory, "1"), searchOption:SearchOption.AllDirectories));
    // }

    //string GetCachedPagePath(Volume volume, int page)

    #endregion
||||
|
||||
#region CleanupChapters
|
||||
|
||||
[Fact]
|
||||
public void CleanupChapters_AllFilesShouldBeDeleted()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{CacheDirectory}1/001.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CacheDirectory}1/002.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CacheDirectory}3/003.jpg", new MockFileData(""));
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
cleanupService.CleanupChapters(new []{1, 3});
|
||||
Assert.Empty(ds.GetFiles(CacheDirectory, searchOption:SearchOption.AllDirectories));
|
||||
}
|
||||
|
||||
|
||||
#endregion
|
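The assertion above implies that CleanupChapters removes each chapter's cache folder, laid out as {CacheDirectory}{chapterId}/. A hedged sketch of that shape, reusing the ClearAndDeleteDirectory helper tested later in this diff (the fields are assumptions, not the shipped code):

// Sketch only: delete the per-chapter cache folders. _directoryService and
// _cacheDirectory are assumed fields of the service under test.
public void CleanupChapters(IEnumerable<int> chapterIds)
{
    foreach (var chapterId in chapterIds)
    {
        var dir = _directoryService.FileSystem.Path.Join(_cacheDirectory, chapterId.ToString());
        _directoryService.ClearAndDeleteDirectory(dir);
    }
}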
||||
|
||||
#region GetCachedEpubFile
|
||||
|
||||
[Fact]
|
||||
public void GetCachedEpubFile_ShouldReturnFirstEpub()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{DataDirectory}1.epub", new MockFileData(""));
|
||||
filesystem.AddFile($"{DataDirectory}2.epub", new MockFileData(""));
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cs = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
var c = new Chapter()
|
||||
{
|
||||
Files = new List<MangaFile>()
|
||||
{
|
||||
new MangaFile()
|
||||
{
|
||||
FilePath = $"{DataDirectory}1.epub"
|
||||
},
|
||||
new MangaFile()
|
||||
{
|
||||
FilePath = $"{DataDirectory}2.epub"
|
||||
}
|
||||
}
|
||||
};
|
||||
cs.GetCachedEpubFile(1, c);
|
||||
Assert.Equal($"{DataDirectory}1.epub", cs.GetCachedEpubFile(1, c));
|
||||
}
|
||||
|
||||
#endregion
|
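The epub test above pins down that nothing is extracted for book formats: the cached path for an epub chapter is just the source path of its first file. A minimal reconstruction of that contract (not the shipped implementation, which may also consult the cache directory first):

// Sketch only: epubs are served straight from their library location,
// so the "cached" file is the chapter's first MangaFile.
public string GetCachedEpubFile(int chapterId, Chapter chapter)
{
    return chapter.Files.First().FilePath;
}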
||||
|
||||
#region GetCachedPagePath
|
||||
|
||||
[Fact]
|
||||
public void GetCachedPagePath_ReturnNullIfNoFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
|
||||
filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));
|
||||
|
||||
var c = new Chapter()
|
||||
{
|
||||
Id = 1,
|
||||
Files = new List<MangaFile>()
|
||||
};
|
||||
|
||||
var fileIndex = 0;
|
||||
foreach (var file in c.Files)
|
||||
{
|
||||
for (var i = 0; i < file.Pages - 1; i++)
|
||||
{
|
||||
filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileIndex++;
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cs = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
// Flatten to prepare for how GetFullPath expects
|
||||
ds.Flatten($"{CacheDirectory}1/");
|
||||
|
||||
var path = cs.GetCachedPagePath(c, 11);
|
||||
Assert.Equal(string.Empty, path);
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetCachedPagePath_GetFileFromFirstFile()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
|
||||
filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));
|
||||
|
||||
var c = new Chapter()
|
||||
{
|
||||
Id = 1,
|
||||
Files = new List<MangaFile>()
|
||||
{
|
||||
new MangaFile()
|
||||
{
|
||||
Id = 1,
|
||||
FilePath = $"{DataDirectory}1.zip",
|
||||
Pages = 10
|
||||
|
||||
},
|
||||
new MangaFile()
|
||||
{
|
||||
Id = 2,
|
||||
FilePath = $"{DataDirectory}2.zip",
|
||||
Pages = 5
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
var fileIndex = 0;
|
||||
foreach (var file in c.Files)
|
||||
{
|
||||
for (var i = 0; i < file.Pages; i++)
|
||||
{
|
||||
filesystem.AddFile($"{CacheDirectory}1/00{fileIndex}_00{i+1}.jpg", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileIndex++;
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cs = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
// Flatten to prepare for how GetFullPath expects
|
||||
ds.Flatten($"{CacheDirectory}1/");
|
||||
|
||||
Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/000_001.jpg"), ds.FileSystem.Path.GetFullPath(cs.GetCachedPagePath(c, 0)));
|
||||
|
||||
}
|
||||
|
||||
|
||||
[Fact]
|
||||
public void GetCachedPagePath_GetLastPageFromSingleFile()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
|
||||
|
||||
var c = new Chapter()
|
||||
{
|
||||
Id = 1,
|
||||
Files = new List<MangaFile>()
|
||||
{
|
||||
new MangaFile()
|
||||
{
|
||||
Id = 1,
|
||||
FilePath = $"{DataDirectory}1.zip",
|
||||
Pages = 10
|
||||
|
||||
}
|
||||
}
|
||||
};
|
||||
c.Pages = c.Files.Sum(f => f.Pages);
|
||||
|
||||
var fileIndex = 0;
|
||||
foreach (var file in c.Files)
|
||||
{
|
||||
for (var i = 0; i < file.Pages; i++)
|
||||
{
|
||||
filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileIndex++;
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cs = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
// Flatten to prepare for how GetFullPath expects
|
||||
ds.Flatten($"{CacheDirectory}1/");
|
||||
|
||||
// Remember that we start at 0, so this is the 10th file
|
||||
var path = cs.GetCachedPagePath(c, c.Pages);
|
||||
Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/000_0{c.Pages}.jpg"), ds.FileSystem.Path.GetFullPath(path));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetCachedPagePath_GetFileFromSecondFile()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddDirectory($"{CacheDirectory}1/");
|
||||
filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
|
||||
filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));
|
||||
|
||||
var c = new Chapter()
|
||||
{
|
||||
Id = 1,
|
||||
Files = new List<MangaFile>()
|
||||
{
|
||||
new MangaFile()
|
||||
{
|
||||
Id = 1,
|
||||
FilePath = $"{DataDirectory}1.zip",
|
||||
Pages = 10
|
||||
|
||||
},
|
||||
new MangaFile()
|
||||
{
|
||||
Id = 2,
|
||||
FilePath = $"{DataDirectory}2.zip",
|
||||
Pages = 5
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
var fileIndex = 0;
|
||||
foreach (var file in c.Files)
|
||||
{
|
||||
for (var i = 0; i < file.Pages; i++)
|
||||
{
|
||||
filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileIndex++;
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cs = new CacheService(_logger, _unitOfWork, ds,
|
||||
new ReadingItemService(Substitute.For<IArchiveService>(), Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
|
||||
|
||||
// Flatten to prepare for how GetFullPath expects
|
||||
ds.Flatten($"{CacheDirectory}1/");
|
||||
|
||||
// Remember that we start at 0, so this is the page + 1 file
|
||||
var path = cs.GetCachedPagePath(c, 10);
|
||||
Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/001_001.jpg"), ds.FileSystem.Path.GetFullPath(path));
|
||||
}
|
||||
|
||||
#endregion
|
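Taken together, the GetCachedPagePath tests fix the lookup contract: pages are counted across the chapter's files in order, on-disk names are 1-based and zero-padded ({fileIndex}_{page}.jpg after flattening), a request at or past the end of a single file clamps to the last page, and a chapter with no files yields string.Empty. A self-contained reconstruction of that index arithmetic (illustrative only, not the service's actual code):

using System.Collections.Generic;
using System.Linq;

static class PageMap
{
    // Maps a chapter-level page number to (fileIndex, 1-based page within
    // that file), mirroring the behavior the tests above assert.
    public static (int FileIndex, int PageInFile) Map(IReadOnlyList<int> pagesPerFile, int page)
    {
        var total = pagesPerFile.Sum();
        if (total == 0) return (-1, -1);         // no files: service returns string.Empty
        if (page >= total) page = total - 1;     // clamp, per the "last page" test
        var fileIndex = 0;
        foreach (var pages in pagesPerFile)
        {
            if (page < pages) return (fileIndex, page + 1); // on-disk pages are 1-based
            page -= pages;
            fileIndex++;
        }
        return (-1, -1); // unreachable after the clamp
    }
}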
||||
|
||||
#region ExtractChapterFiles
|
||||
|
||||
// [Fact]
// public void ExtractChapterFiles_ShouldExtractOnlyImages()
// {
//     const string testDirectory = "/manga/";
//     var fileSystem = new MockFileSystem();
//     for (var i = 0; i < 10; i++)
//     {
//         fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
//     }
//
//     fileSystem.AddDirectory(CacheDirectory);
//
//     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
//     var cs = new CacheService(_logger, _unitOfWork, ds,
//         new MockReadingItemServiceForCacheService(ds));
//
//     cs.ExtractChapterFiles(CacheDirectory, new List<MangaFile>()
//     {
//         new MangaFile()
//         {
//             ChapterId = 1,
//             Format = MangaFormat.Archive,
//             Pages = 2,
//             FilePath =
//         }
//     })
// }
|
||||
|
||||
#endregion
|
||||
}
|
||||
}
|
||||
}
|
||||
|
API.Tests/Services/CleanupServiceTests.cs (new file, 433 lines)
@@ -0,0 +1,433 @@
|
||||
using System;
|
||||
using System.Collections.Generic;
|
||||
using System.Data.Common;
|
||||
using System.IO;
|
||||
using System.IO.Abstractions.TestingHelpers;
|
||||
using System.Linq;
|
||||
using System.Threading.Tasks;
|
||||
using API.Data;
|
||||
using API.Entities;
|
||||
using API.Entities.Enums;
|
||||
using API.Services;
|
||||
using API.Services.Tasks;
|
||||
using API.SignalR;
|
||||
using AutoMapper;
|
||||
using Microsoft.AspNetCore.SignalR;
|
||||
using Microsoft.Data.Sqlite;
|
||||
using Microsoft.EntityFrameworkCore;
|
||||
using Microsoft.EntityFrameworkCore.Infrastructure;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using NSubstitute;
|
||||
using Xunit;
|
||||
|
||||
namespace API.Tests.Services;
|
||||
|
||||
public class CleanupServiceTests : IDisposable
|
||||
{
|
||||
private readonly ILogger<CleanupService> _logger = Substitute.For<ILogger<CleanupService>>();
|
||||
private readonly IUnitOfWork _unitOfWork;
|
||||
private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
|
||||
|
||||
private readonly DbConnection _connection;
|
||||
private readonly DataContext _context;
|
||||
|
||||
private const string CacheDirectory = "C:/kavita/config/cache/";
|
||||
private const string CoverImageDirectory = "C:/kavita/config/covers/";
|
||||
private const string BackupDirectory = "C:/kavita/config/backups/";
|
||||
private const string BookmarkDirectory = "C:/kavita/config/bookmarks/";
|
||||
|
||||
|
||||
public CleanupServiceTests()
|
||||
{
|
||||
var contextOptions = new DbContextOptionsBuilder()
|
||||
.UseSqlite(CreateInMemoryDatabase())
|
||||
.Options;
|
||||
_connection = RelationalOptionsExtension.Extract(contextOptions).Connection;
|
||||
|
||||
_context = new DataContext(contextOptions);
|
||||
Task.Run(SeedDb).GetAwaiter().GetResult();
|
||||
|
||||
_unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
|
||||
}
|
||||
|
||||
#region Setup
|
||||
|
||||
private static DbConnection CreateInMemoryDatabase()
|
||||
{
|
||||
var connection = new SqliteConnection("Filename=:memory:");
|
||||
|
||||
connection.Open();
|
||||
|
||||
return connection;
|
||||
}
|
||||
|
||||
public void Dispose() => _connection.Dispose();
|
||||
|
||||
private async Task<bool> SeedDb()
|
||||
{
|
||||
await _context.Database.MigrateAsync();
|
||||
var filesystem = CreateFileSystem();
|
||||
|
||||
await Seed.SeedSettings(_context, new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));
|
||||
|
||||
var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
|
||||
setting.Value = CacheDirectory;
|
||||
|
||||
setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
|
||||
setting.Value = BackupDirectory;
|
||||
|
||||
setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BookmarkDirectory).SingleAsync();
|
||||
setting.Value = BookmarkDirectory;
|
||||
|
||||
_context.ServerSetting.Update(setting);
|
||||
|
||||
_context.Library.Add(new Library()
|
||||
{
|
||||
Name = "Manga",
|
||||
Folders = new List<FolderPath>()
|
||||
{
|
||||
new FolderPath()
|
||||
{
|
||||
Path = "C:/data/"
|
||||
}
|
||||
}
|
||||
});
|
||||
return await _context.SaveChangesAsync() > 0;
|
||||
}
|
||||
|
||||
private async Task ResetDB()
|
||||
{
|
||||
_context.Series.RemoveRange(_context.Series.ToList());
|
||||
_context.Users.RemoveRange(_context.Users.ToList());
|
||||
_context.AppUserBookmark.RemoveRange(_context.AppUserBookmark.ToList());
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
}
|
||||
|
||||
private static MockFileSystem CreateFileSystem()
|
||||
{
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
|
||||
fileSystem.AddDirectory("C:/kavita/config/");
|
||||
fileSystem.AddDirectory(CacheDirectory);
|
||||
fileSystem.AddDirectory(CoverImageDirectory);
|
||||
fileSystem.AddDirectory(BackupDirectory);
|
||||
fileSystem.AddDirectory(BookmarkDirectory);
|
||||
fileSystem.AddDirectory("C:/data/");
|
||||
|
||||
return fileSystem;
|
||||
}
|
||||
|
||||
#endregion
|
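CreateInMemoryDatabase above encodes the one non-obvious rule of SQLite-backed EF Core tests: a `Filename=:memory:` database lives exactly as long as its connection, so the connection is opened once and held until Dispose. A condensed, self-contained version of the same fixture shape (names here are illustrative):

using System;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;

public sealed class SqliteInMemoryFixture : IDisposable
{
    private readonly SqliteConnection _connection;
    public DbContextOptions Options { get; }

    public SqliteInMemoryFixture()
    {
        // Closing this connection drops the database, so hold it open
        // for the fixture's lifetime and dispose it at the very end.
        _connection = new SqliteConnection("Filename=:memory:");
        _connection.Open();

        Options = new DbContextOptionsBuilder()
            .UseSqlite(_connection)
            .Options;
    }

    public void Dispose() => _connection.Dispose();
}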
||||
|
||||
|
||||
#region DeleteSeriesCoverImages
|
||||
|
||||
[Fact]
|
||||
public async Task DeleteSeriesCoverImages_ShouldDeleteAll()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_03.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_1000.jpg", new MockFileData(""));
|
||||
|
||||
// Delete all Series to reset state
|
||||
await ResetDB();
|
||||
|
||||
var s = DbFactory.Series("Test 1");
|
||||
s.CoverImage = "series_01.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
s = DbFactory.Series("Test 2");
|
||||
s.CoverImage = "series_03.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
s = DbFactory.Series("Test 3");
|
||||
s.CoverImage = "series_1000.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
|
||||
await cleanupService.DeleteSeriesCoverImages();
|
||||
|
||||
Assert.Empty(ds.GetFiles(CoverImageDirectory));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public async Task DeleteSeriesCoverImages_ShouldNotDeleteLinkedFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_03.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}series_1000.jpg", new MockFileData(""));
|
||||
|
||||
// Delete all Series to reset state
|
||||
await ResetDB();
|
||||
|
||||
// Add 2 series with cover images
|
||||
var s = DbFactory.Series("Test 1");
|
||||
s.CoverImage = "series_01.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
s = DbFactory.Series("Test 2");
|
||||
s.CoverImage = "series_03.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
|
||||
await cleanupService.DeleteSeriesCoverImages();
|
||||
|
||||
Assert.Equal(2, ds.GetFiles(CoverImageDirectory).Count());
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region DeleteChapterCoverImages
|
||||
[Fact]
|
||||
public async Task DeleteChapterCoverImages_ShouldNotDeleteLinkedFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CoverImageDirectory}v01_c01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}v01_c03.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}v01_c1000.jpg", new MockFileData(""));
|
||||
|
||||
// Delete all Series to reset state
|
||||
await ResetDB();
|
||||
|
||||
// Add 2 series with cover images
|
||||
var s = DbFactory.Series("Test 1");
|
||||
var v = DbFactory.Volume("1");
|
||||
v.Chapters.Add(new Chapter()
|
||||
{
|
||||
CoverImage = "v01_c01.jpg"
|
||||
});
|
||||
v.CoverImage = "v01_c01.jpg";
|
||||
s.Volumes.Add(v);
|
||||
s.CoverImage = "series_01.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
s = DbFactory.Series("Test 2");
|
||||
v = DbFactory.Volume("1");
|
||||
v.Chapters.Add(new Chapter()
|
||||
{
|
||||
CoverImage = "v01_c03.jpg"
|
||||
});
|
||||
v.CoverImage = "v01_c03.jpg";
|
||||
s.Volumes.Add(v);
|
||||
s.CoverImage = "series_03.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
|
||||
await cleanupService.DeleteChapterCoverImages();
|
||||
|
||||
Assert.Equal(2, ds.GetFiles(CoverImageDirectory).Count());
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region DeleteTagCoverImages
|
||||
|
||||
[Fact]
|
||||
public async Task DeleteTagCoverImages_ShouldNotDeleteLinkedFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CoverImageDirectory}tag_01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}tag_02.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CoverImageDirectory}tag_1000.jpg", new MockFileData(""));
|
||||
|
||||
// Delete all Series to reset state
|
||||
await ResetDB();
|
||||
|
||||
// Add 2 series with cover images
|
||||
var s = DbFactory.Series("Test 1");
|
||||
s.Metadata.CollectionTags = new List<CollectionTag>();
|
||||
s.Metadata.CollectionTags.Add(new CollectionTag()
|
||||
{
|
||||
Title = "Something",
|
||||
CoverImage ="tag_01.jpg"
|
||||
});
|
||||
s.CoverImage = "series_01.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
s = DbFactory.Series("Test 2");
|
||||
s.Metadata.CollectionTags = new List<CollectionTag>();
|
||||
s.Metadata.CollectionTags.Add(new CollectionTag()
|
||||
{
|
||||
Title = "Something 2",
|
||||
CoverImage ="tag_02.jpg"
|
||||
});
|
||||
s.CoverImage = "series_03.jpg";
|
||||
s.LibraryId = 1;
|
||||
_context.Series.Add(s);
|
||||
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
|
||||
await cleanupService.DeleteTagCoverImages();
|
||||
|
||||
Assert.Equal(2, ds.GetFiles(CoverImageDirectory).Count());
|
||||
}
|
||||
|
||||
#endregion
|
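All three Delete*CoverImages tests pin the same invariant: a file in the covers folder survives if and only if some entity still references its filename. A hedged sketch of that common shape (the repository query is an assumption; the real methods differ only in which entity they ask):

// Sketch only: remove cover files that no entity references anymore.
// GetAllCoverImagesAsync is an assumed repository call.
public async Task DeleteSeriesCoverImages()
{
    var referenced = new HashSet<string>(await _unitOfWork.SeriesRepository.GetAllCoverImagesAsync());
    foreach (var file in _directoryService.GetFiles(_coverImageDirectory))
    {
        var name = _directoryService.FileSystem.Path.GetFileName(file);
        if (!referenced.Contains(name))
        {
            _directoryService.FileSystem.File.Delete(file);
        }
    }
}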
||||
|
||||
#region CleanupCacheDirectory
|
||||
|
||||
[Fact]
|
||||
public void CleanupCacheDirectory_ClearAllFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CacheDirectory}01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CacheDirectory}02.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
cleanupService.CleanupCacheDirectory();
|
||||
Assert.Empty(ds.GetFiles(CacheDirectory, searchOption: SearchOption.AllDirectories));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CleanupCacheDirectory_ClearAllFilesInSubDirectory()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{CacheDirectory}01.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{CacheDirectory}subdir/02.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
cleanupService.CleanupCacheDirectory();
|
||||
Assert.Empty(ds.GetFiles(CacheDirectory, searchOption: SearchOption.AllDirectories));
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
#region CleanupBackups
|
||||
|
||||
[Fact]
|
||||
public void CleanupBackups_LeaveOneFile_SinceAllAreExpired()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
var filesystemFile = new MockFileData("")
|
||||
{
|
||||
CreationTime = DateTimeOffset.Now.Subtract(TimeSpan.FromDays(31))
|
||||
};
|
||||
filesystem.AddFile($"{BackupDirectory}kavita_backup_11_29_2021_12_00_13 AM.zip", filesystemFile);
|
||||
filesystem.AddFile($"{BackupDirectory}kavita_backup_12_3_2021_9_27_58 AM.zip", filesystemFile);
|
||||
filesystem.AddFile($"{BackupDirectory}randomfile.zip", filesystemFile);
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
cleanupService.CleanupBackups();
|
||||
Assert.Single(ds.GetFiles(BackupDirectory, searchOption: SearchOption.AllDirectories));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CleanupBackups_LeaveLeastExpired()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
var filesystemFile = new MockFileData("")
|
||||
{
|
||||
CreationTime = DateTimeOffset.Now.Subtract(TimeSpan.FromDays(31))
|
||||
};
|
||||
filesystem.AddFile($"{BackupDirectory}kavita_backup_11_29_2021_12_00_13 AM.zip", filesystemFile);
|
||||
filesystem.AddFile($"{BackupDirectory}kavita_backup_12_3_2021_9_27_58 AM.zip", filesystemFile);
|
||||
filesystem.AddFile($"{BackupDirectory}randomfile.zip", new MockFileData("")
|
||||
{
|
||||
CreationTime = DateTimeOffset.Now.Subtract(TimeSpan.FromDays(14))
|
||||
});
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
cleanupService.CleanupBackups();
|
||||
Assert.True(filesystem.File.Exists($"{BackupDirectory}randomfile.zip"));
|
||||
}
|
||||
|
||||
#endregion
|
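The two backup tests encode the retention rule: backups older than 30 days are candidates for deletion, but the directory is never emptied completely. One way to satisfy both tests (a sketch under those assumptions, not the shipped implementation):

// Sketch only: delete expired backups (older than 30 days) but always keep
// the most recent file. _backupDirectory and _directoryService are assumed fields.
public void CleanupBackups()
{
    const int dayThreshold = 30;
    var cutoff = DateTime.Now.Subtract(TimeSpan.FromDays(dayThreshold));
    var backups = _directoryService.GetFiles(_backupDirectory)
        .Select(f => _directoryService.FileSystem.FileInfo.FromFileName(f))
        .OrderByDescending(f => f.CreationTime)
        .ToList();

    // Skip(1) guarantees at least the newest backup survives, even when
    // every file is past the cutoff.
    foreach (var file in backups.Skip(1).Where(f => f.CreationTime < cutoff))
    {
        file.Delete();
    }
}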
||||
|
||||
#region CleanupBookmarks
|
||||
|
||||
[Fact]
|
||||
public async Task CleanupBookmarks_LeaveAllFiles()
|
||||
{
|
||||
var filesystem = CreateFileSystem();
|
||||
filesystem.AddFile($"{BookmarkDirectory}1/1/1/0001.jpg", new MockFileData(""));
|
||||
filesystem.AddFile($"{BookmarkDirectory}1/1/1/0002.jpg", new MockFileData(""));
|
||||
|
||||
// Delete all Series to reset state
|
||||
await ResetDB();
|
||||
|
||||
_context.Series.Add(new Series()
|
||||
{
|
||||
Name = "Test",
|
||||
Library = new Library() {
|
||||
Name = "Test Lib",
|
||||
Type = LibraryType.Manga,
|
||||
},
|
||||
Volumes = new List<Volume>()
|
||||
{
|
||||
new Volume()
|
||||
{
|
||||
Chapters = new List<Chapter>()
|
||||
{
|
||||
new Chapter()
|
||||
{
|
||||
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
|
||||
_context.AppUser.Add(new AppUser()
|
||||
{
|
||||
Bookmarks = new List<AppUserBookmark>()
|
||||
{
|
||||
new AppUserBookmark()
|
||||
{
|
||||
AppUserId = 1,
|
||||
ChapterId = 1,
|
||||
Page = 1,
|
||||
FileName = "1/1/1/0001.jpg",
|
||||
SeriesId = 1,
|
||||
VolumeId = 1
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
await _context.SaveChangesAsync();
|
||||
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
|
||||
var cleanupService = new CleanupService(_logger, _unitOfWork, _messageHub,
|
||||
ds);
|
||||
|
||||
await cleanupService.CleanupBookmarks();
|
||||
|
||||
Assert.Equal(1, ds.GetFiles(BookmarkDirectory, searchOption:SearchOption.AllDirectories).Count());
|
||||
|
||||
}
|
||||
|
||||
#endregion
|
||||
}
|
API.Tests/Services/DirectoryServiceTests.cs
@@ -1,7 +1,11 @@
using System;
|
||||
using System.Collections.Generic;
|
||||
using System.IO;
|
||||
using System.IO.Abstractions;
|
||||
using System.IO.Abstractions.TestingHelpers;
|
||||
using System.Linq;
|
||||
using System.Text;
|
||||
using System.Threading.Tasks;
|
||||
using API.Services;
|
||||
using Microsoft.Extensions.Logging;
|
||||
using NSubstitute;
|
||||
@@ -12,92 +16,658 @@ namespace API.Tests.Services
|
||||
|
||||
public class DirectoryServiceTests
|
||||
{
|
||||
|
||||
private readonly ILogger<DirectoryService> _logger = Substitute.For<ILogger<DirectoryService>>();
|
||||
|
||||
|
||||
|
||||
#region TraverseTreeParallelForEach
|
||||
[Fact]
|
||||
public void TraverseTreeParallelForEach_JustArchives_ShouldBe28()
|
||||
{
|
||||
// ReSharper disable once CollectionNeverQueried.Local
var testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 28; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = new List<string>();
|
||||
var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
|
||||
API.Parser.Parser.ArchiveFileExtensions, _logger);
|
||||
|
||||
Assert.Equal(28, fileCount);
|
||||
Assert.Equal(28, files.Count);
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void TraverseTreeParallelForEach_LongDirectory_ShouldBe1()
{
|
||||
var fileSystem = new MockFileSystem();
|
||||
// Create a super long path
|
||||
var testDirectory = "/manga/";
|
||||
for (var i = 0; i < 200; i++)
|
||||
{
|
||||
testDirectory = fileSystem.FileSystem.Path.Join(testDirectory, "supercalifragilisticexpialidocious");
|
||||
}
|
||||
|
||||
|
||||
fileSystem.AddFile(fileSystem.FileSystem.Path.Join(testDirectory, "file_29.jpg"), new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = new List<string>();
|
||||
try
|
||||
{
|
||||
var fileCount = ds.TraverseTreeParallelForEach("/manga/", s => files.Add(s),
|
||||
API.Parser.Parser.ImageFileExtensions, _logger);
|
||||
Assert.Equal(1, fileCount);
|
||||
}
|
||||
catch (Exception ex)
|
||||
{
|
||||
Assert.False(true, $"TraverseTreeParallelForEach should not have thrown: {ex}");
|
||||
}
|
||||
|
||||
|
||||
Assert.Equal(1, files.Count);
|
||||
}
|
||||
|
||||
|
||||
|
||||
[Fact]
|
||||
public void TraverseTreeParallelForEach_DontCountExcludedDirectories_ShouldBe28()
|
||||
{
|
||||
var testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 28; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{Path.Join(testDirectory, "@eaDir")}file_{29}.jpg", new MockFileData(""));
|
||||
fileSystem.AddFile($"{Path.Join(testDirectory, ".DS_Store")}file_{30}.jpg", new MockFileData(""));
|
||||
fileSystem.AddFile($"{Path.Join(testDirectory, ".qpkg")}file_{30}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = new List<string>();
|
||||
var fileCount = ds.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
|
||||
API.Parser.Parser.ArchiveFileExtensions, _logger);
|
||||
|
||||
Assert.Equal(28, fileCount);
|
||||
Assert.Equal(28, files.Count);
|
||||
}
|
||||
#endregion
|
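The exclusion test above relies on the traversal skipping well-known junk directories. A minimal version of that matcher, with the pattern list taken from the directory names used in the test:

using System.Text.RegularExpressions;

static class DirectoryFilter
{
    // Matches NAS/OS metadata folders that should never be scanned.
    private static readonly Regex ExcludedDirectories = new Regex(
        @"@eaDir|\.DS_Store|\.qpkg",
        RegexOptions.Compiled | RegexOptions.IgnoreCase);

    public static bool ShouldSkip(string path) => ExcludedDirectories.IsMatch(path);
}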
||||
|
||||
#region GetFilesWithCertainExtensions
|
||||
[Fact]
|
||||
public void GetFilesWithCertainExtensions_ShouldBe10()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFilesWithExtension(testDirectory, API.Parser.Parser.ArchiveFileExtensions);
|
||||
|
||||
Assert.Equal(10, files.Length);
|
||||
Assert.All(files, s => Assert.Equal(".zip", fileSystem.Path.GetExtension(s)));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFilesWithCertainExtensions_OnlyArchives()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}file_{29}.rar", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFilesWithExtension(testDirectory, ".zip|.rar");
|
||||
|
||||
Assert.Equal(11, files.Length);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region GetFiles
|
||||
[Fact]
|
||||
public void GetFiles_ArchiveOnly_ShouldBe10()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory, API.Parser.Parser.ArchiveFileExtensions).ToList();
|
||||
|
||||
Assert.Equal(10, files.Count());
|
||||
Assert.All(files, s => Assert.Equal(".zip", fileSystem.Path.GetExtension(s)));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_All_ShouldBe11()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory).ToList();
|
||||
|
||||
Assert.Equal(11, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_All_MixedPathSeparators()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"/manga\\file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory).ToList();
|
||||
|
||||
Assert.Equal(11, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_All_TopDirectoryOnly_ShouldBe10()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}/SubDir/file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory).ToList();
|
||||
|
||||
Assert.Equal(10, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_WithSubDirectories_ShouldCountOnlyTopLevel()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}/SubDir/file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory).ToList();
|
||||
|
||||
Assert.Equal(10, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_ShouldNotReturnFilesThatAreExcluded()
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
fileSystem.AddFile($"{testDirectory}/._file_{29}.jpg", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory).ToList();
|
||||
|
||||
Assert.Equal(10, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_WithCustomRegex_ShouldBe10()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}data-{i}.txt", new MockFileData(""));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}joe.txt", new MockFileData(""));
|
||||
fileSystem.AddFile($"{testDirectory}0d.txt", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory, @".*d.*\.txt");
|
||||
Assert.Equal(11, files.Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void GetFiles_WithCustomRegexThatContainsFolder_ShouldBe10()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file/data-{i}.txt", new MockFileData(""));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}joe.txt", new MockFileData(""));
|
||||
fileSystem.AddFile($"{testDirectory}0d.txt", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var files = ds.GetFiles(testDirectory, @".*d.*\.txt", SearchOption.AllDirectories);
|
||||
Assert.Equal(11, files.Count());
|
||||
}
|
||||
#endregion
|
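As the tests above show, GetFiles accepts either an extension group (".zip|.rar") or an arbitrary regex against file names, and silently returns nothing for a missing directory. A simplified equivalent over IFileSystem that reproduces those observable behaviors (a sketch, not the actual DirectoryService code):

using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions;
using System.Linq;
using System.Text.RegularExpressions;

static class FileQueries
{
    // Simplified stand-in for DirectoryService.GetFiles: regex-filter file
    // names under path; empty result when the directory doesn't exist.
    public static IEnumerable<string> GetFiles(IFileSystem fs, string path,
        string fileNameRegex, SearchOption searchOption = SearchOption.TopDirectoryOnly)
    {
        if (!fs.Directory.Exists(path)) return Enumerable.Empty<string>();
        var regex = new Regex(fileNameRegex, RegexOptions.IgnoreCase);
        return fs.Directory.EnumerateFiles(path, "*", searchOption)
            .Where(file => regex.IsMatch(fs.Path.GetFileName(file)));
    }
}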
||||
|
||||
#region GetTotalSize
|
||||
[Fact]
|
||||
public void GetTotalSize_ShouldBeGreaterThan0()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file/data-{i}.txt", new MockFileData("abc"));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}joe.txt", new MockFileData(""));
|
||||
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var fileSize = ds.GetTotalSize(fileSystem.AllFiles);
|
||||
Assert.True(fileSize > 0);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region CopyFileToDirectory
|
||||
[Fact]
|
||||
public void CopyFileToDirectory_ShouldCopyFileToNonExistentDirectory()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFileToDirectory($"{testDirectory}file/data-0.txt", "/manga/output/");
|
||||
Assert.True(fileSystem.FileExists("/manga/output/data-0.txt"));
Assert.True(fileSystem.FileExists("/manga/file/data-0.txt"));
|
||||
}
|
||||
[Fact]
|
||||
public void CopyFileToDirectory_ShouldCopyFileToExistingDirectoryAndOverwrite()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
|
||||
fileSystem.AddFile($"{testDirectory}output/data-0.txt", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFileToDirectory($"{testDirectory}file/data-0.txt", "/manga/output/");
|
||||
Assert.True(fileSystem.FileExists("/manga/output/data-0.txt"));
|
||||
Assert.True(fileSystem.FileExists("/manga/file/data-0.txt"));
|
||||
Assert.True(fileSystem.FileInfo.FromFileName("/manga/file/data-0.txt").Length == fileSystem.FileInfo.FromFileName("/manga/output/data-0.txt").Length);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region CopyDirectoryToDirectory
|
||||
[Fact]
|
||||
public void CopyDirectoryToDirectory_ShouldThrowWhenSourceDestinationDoesntExist()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
|
||||
fileSystem.AddFile($"{testDirectory}output/data-0.txt", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var ex = Assert.Throws<DirectoryNotFoundException>(() => ds.CopyDirectoryToDirectory("/comics/", "/manga/output/"));
|
||||
Assert.Equal("Source directory does not exist or could not be found: " + "/comics/", ex.Message);
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CopyDirectoryToDirectory_ShouldCopyEmptyDirectory()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
|
||||
fileSystem.AddDirectory($"{testDirectory}empty/");
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyDirectoryToDirectory($"{testDirectory}empty/", "/manga/output/");
|
||||
Assert.Empty(fileSystem.DirectoryInfo.FromDirectoryName("/manga/output/").GetFiles());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CopyDirectoryToDirectory_ShouldCopyAllFileAndNestedDirectoriesOver()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
|
||||
fileSystem.AddFile($"{testDirectory}data-1.txt", new MockFileData("abc"));
|
||||
fileSystem.AddDirectory($"{testDirectory}empty/");
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyDirectoryToDirectory($"{testDirectory}", "/manga/output/");
|
||||
Assert.Equal(2, ds.GetFiles("/manga/output/", searchOption: SearchOption.AllDirectories).Count());
|
||||
}
|
||||
#endregion
|
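The thrown message asserted above comes straight from the classic recursive-copy pattern in the .NET documentation. A compact version of that pattern against IFileSystem, consistent with all three tests (throws for a missing source, creates the destination, copies files, recurses into subdirectories):

using System.IO;
using System.IO.Abstractions;

static class DirectoryCopier
{
    public static void Copy(IFileSystem fs, string sourceDir, string destDir)
    {
        var dir = fs.DirectoryInfo.FromDirectoryName(sourceDir);
        if (!dir.Exists)
            throw new DirectoryNotFoundException(
                "Source directory does not exist or could not be found: " + sourceDir);

        fs.Directory.CreateDirectory(destDir);
        foreach (var file in dir.GetFiles())
            file.CopyTo(fs.Path.Combine(destDir, file.Name), true); // overwrite existing

        foreach (var sub in dir.GetDirectories())
            Copy(fs, sub.FullName, fs.Path.Combine(destDir, sub.Name));
    }
}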
||||
|
||||
#region IsDriveMounted
|
||||
[Fact]
|
||||
public void IsDriveMounted_DriveIsNotMounted()
|
||||
{
|
||||
const string testDirectory = "c:/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}data-0.txt", new MockFileData("abc"));
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
|
||||
Assert.False(ds.IsDriveMounted("d:/manga/"));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void IsDriveMounted_DriveIsMounted()
|
||||
{
|
||||
const string testDirectory = "c:/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}data-0.txt", new MockFileData("abc"));
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
|
||||
Assert.True(ds.IsDriveMounted("c:/manga/file"));
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region IsDirectoryEmpty
|
||||
[Fact]
|
||||
public void IsDirectoryEmpty_DirectoryIsEmpty()
|
||||
{
|
||||
const string testDirectory = "c:/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddDirectory(testDirectory);
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
|
||||
Assert.True(ds.IsDirectoryEmpty("c:/manga/"));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void IsDirectoryEmpty_DirectoryIsNotEmpty()
|
||||
{
|
||||
const string testDirectory = "c:/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}data-0.txt", new MockFileData("abc"));
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
|
||||
Assert.False(ds.IsDirectoryEmpty("c:/manga/"));
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region ExistOrCreate
|
||||
[Fact]
|
||||
public void ExistOrCreate_ShouldCreate()
|
||||
{
|
||||
var fileSystem = new MockFileSystem();
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.ExistOrCreate("c:/manga/output/");
|
||||
|
||||
Assert.True(ds.FileSystem.DirectoryInfo.FromDirectoryName("c:/manga/output/").Exists);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region ClearAndDeleteDirectory
|
||||
[Fact]
|
||||
public void ClearAndDeleteDirectory_ShouldDeleteSelfAndAllFilesAndFolders()
|
||||
{
|
||||
const string testDirectory = "/manga/base/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file/data-{i}.txt", new MockFileData("abc"));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}data-a.txt", new MockFileData("abc"));
|
||||
fileSystem.AddFile($"{testDirectory}data-b.txt", new MockFileData("abc"));
|
||||
fileSystem.AddDirectory($"{testDirectory}empty/");
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.ClearAndDeleteDirectory($"{testDirectory}");
|
||||
Assert.Empty(ds.GetFiles("/manga/", searchOption: SearchOption.AllDirectories));
|
||||
Assert.Empty(ds.FileSystem.DirectoryInfo.FromDirectoryName("/manga/").GetDirectories());
|
||||
Assert.True(ds.FileSystem.DirectoryInfo.FromDirectoryName("/manga/").Exists);
|
||||
Assert.False(ds.FileSystem.DirectoryInfo.FromDirectoryName("/manga/base").Exists);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region ClearDirectory
|
||||
[Fact]
|
||||
public void ClearDirectory_ShouldDeleteAllFilesAndFolders_LeaveSelf()
|
||||
{
|
||||
const string testDirectory = "/manga/base/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file/data-{i}.txt", new MockFileData("abc"));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}data-a.txt", new MockFileData("abc"));
|
||||
fileSystem.AddFile($"{testDirectory}data-b.txt", new MockFileData("abc"));
|
||||
fileSystem.AddDirectory($"{testDirectory}file/empty/");
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.ClearDirectory($"{testDirectory}file/");
|
||||
Assert.Empty(ds.FileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}file/").GetDirectories());
|
||||
Assert.True(ds.FileSystem.DirectoryInfo.FromDirectoryName("/manga/").Exists);
|
||||
Assert.True(ds.FileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}file/").Exists);
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void ClearDirectory_ShouldDeleteFoldersWithOneFileInside()
|
||||
{
|
||||
const string testDirectory = "/manga/base/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file/data-{i}.txt", new MockFileData("abc"));
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.ClearDirectory($"{testDirectory}");
|
||||
Assert.Empty(ds.FileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}").GetDirectories());
|
||||
Assert.True(ds.FileSystem.DirectoryInfo.FromDirectoryName(testDirectory).Exists);
|
||||
Assert.False(ds.FileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}file/").Exists);
|
||||
}
|
||||
#endregion
|
||||
|
||||
#region CopyFilesToDirectory
|
||||
[Fact]
|
||||
public void CopyFilesToDirectory_ShouldMoveAllFiles()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFilesToDirectory(new []{$"{testDirectory}file_{0}.zip", $"{testDirectory}file_{1}.zip"}, "/manga/output/");
|
||||
Assert.Equal(2, ds.GetFiles("/manga/output/").Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CopyFilesToDirectory_ShouldMoveAllFiles_InclFilesInNestedFolders()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
fileSystem.AddFile($"{testDirectory}nested/file_11.zip", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFilesToDirectory(new []{$"{testDirectory}file_{0}.zip", $"{testDirectory}file_{1}.zip", $"{testDirectory}nested/file_11.zip"}, "/manga/output/");
|
||||
Assert.Equal(3, ds.GetFiles("/manga/output/").Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CopyFilesToDirectory_ShouldMoveAllFiles_WithPrepend()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFilesToDirectory(new []{$"{testDirectory}file_{0}.zip", $"{testDirectory}file_{1}.zip", $"{testDirectory}nested/file_11.zip"},
|
||||
"/manga/output/", "mangarocks_");
|
||||
Assert.Equal(2, ds.GetFiles("/manga/output/").Count());
|
||||
Assert.All(ds.GetFiles("/manga/output/"), filepath => Assert.StartsWith("mangarocks_", ds.FileSystem.Path.GetFileName(filepath)));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void CopyFilesToDirectory_ShouldMoveOnlyFilesThatExist()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
for (var i = 0; i < 10; i++)
|
||||
{
|
||||
fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
|
||||
}
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
ds.CopyFilesToDirectory(new []{$"{testDirectory}file_{0}.zip", $"{testDirectory}file_{1}.zip", $"{testDirectory}nested/file_11.zip"},
|
||||
"/manga/output/");
|
||||
Assert.Equal(2, ds.GetFiles("/manga/output/").Count());
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
#region ListDirectory
|
||||
[Fact]
|
||||
public void ListDirectory_EmptyForNonExistent()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file_0.zip", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
Assert.Empty(ds.ListDirectory("/comics/"));
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void ListDirectory_ListsAllDirectories()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddDirectory($"{testDirectory}dir1");
|
||||
fileSystem.AddDirectory($"{testDirectory}dir2");
|
||||
fileSystem.AddDirectory($"{testDirectory}dir3");
|
||||
fileSystem.AddFile($"{testDirectory}file_0.zip", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
Assert.Equal(3, ds.ListDirectory(testDirectory).Count());
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public void ListDirectory_ListsOnlyNonSystemAndHiddenOnly()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddDirectory($"{testDirectory}dir1");
|
||||
var di = fileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}dir1");
|
||||
di.Attributes |= FileAttributes.System;
|
||||
fileSystem.AddDirectory($"{testDirectory}dir2");
|
||||
di = fileSystem.DirectoryInfo.FromDirectoryName($"{testDirectory}dir2");
|
||||
di.Attributes |= FileAttributes.Hidden;
|
||||
fileSystem.AddDirectory($"{testDirectory}dir3");
|
||||
fileSystem.AddFile($"{testDirectory}file_0.zip", new MockFileData(""));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
Assert.Equal(1, ds.ListDirectory(testDirectory).Count());
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
#region ReadFileAsync
|
||||
|
||||
[Fact]
|
||||
public async Task ReadFileAsync_ShouldGetBytes()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file_1.zip", new MockFileData("Hello"));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var bytes = await ds.ReadFileAsync($"{testDirectory}file_1.zip");
|
||||
Assert.Equal(Encoding.UTF8.GetBytes("Hello"), bytes);
|
||||
}
|
||||
|
||||
[Fact]
|
||||
public async Task ReadFileAsync_ShouldReadNothingFromNonExistent()
|
||||
{
|
||||
const string testDirectory = "/manga/";
|
||||
var fileSystem = new MockFileSystem();
|
||||
fileSystem.AddFile($"{testDirectory}file_1.zip", new MockFileData("Hello"));
|
||||
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
var bytes = await ds.ReadFileAsync($"{testDirectory}file_32123.zip");
|
||||
Assert.Empty(bytes);
|
||||
}
|
||||
|
||||
|
||||
#endregion
|
||||
|
||||
#region FindHighestDirectoriesFromFiles
|
||||
|
||||
[Theory]
|
||||
[InlineData(new [] {"C:/Manga/"}, new [] {"C:/Manga/Love Hina/Vol. 01.cbz"}, "C:/Manga/Love Hina")]
[InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/Dir 2/"}, new [] {"C:/Manga/Dir 1/Love Hina/Vol. 01.cbz"}, "C:/Manga/Dir 1/Love Hina")]
[InlineData(new [] {"C:/Manga/Dir 1/", "c://Manga/"}, new [] {"D:/Manga/Love Hina/Vol. 01.cbz", "D:/Manga/Vol. 01.cbz"}, "")]
public void FindHighestDirectoriesFromFilesTest(string[] rootDirectories, string[] files, string expectedDirectory)
|
||||
{
|
||||
var fileSystem = new MockFileSystem();
|
||||
foreach (var directory in rootDirectories)
|
||||
{
|
||||
fileSystem.AddDirectory(directory);
|
||||
}
|
||||
foreach (var f in files)
|
||||
{
|
||||
fileSystem.AddFile(f, new MockFileData(""));
|
||||
}
|
||||
var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
|
||||
|
||||
var actual = ds.FindHighestDirectoriesFromFiles(rootDirectories, files);
|
||||
var expected = new Dictionary<string, string>();
|
||||
if (!string.IsNullOrEmpty(expectedDirectory))
|
||||
{
|
||||
expected = new Dictionary<string, string> {{expectedDirectory, ""}};
|
||||
}
|
||||
|
||||
Assert.Equal(expected, actual);
|
||||
}
|
||||
|
||||
#endregion
|
||||
|
||||
        #region GetFoldersTillRoot

        [Theory]
        [InlineData("C:/Manga/", "C:/Manga/Love Hina/Specials/Omake/", "Omake,Specials,Love Hina")]
        [InlineData("C:/Manga/", "C:/Manga/Love Hina/Specials/Omake", "Omake,Specials,Love Hina")]
@@ -114,12 +684,110 @@ namespace API.Tests.Services
        [InlineData(@"M:\", @"M:\Toukyou Akazukin\Vol. 01 Ch. 005.cbz", @"Toukyou Akazukin")]
        public void GetFoldersTillRoot_Test(string rootPath, string fullpath, string expectedArray)
        {
            var fileSystem = new MockFileSystem();
            fileSystem.AddDirectory(rootPath);
            fileSystem.AddFile(fullpath, new MockFileData(""));

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);

            var expected = expectedArray.Split(",");
            if (expectedArray.Equals(string.Empty))
            {
                expected = Array.Empty<string>();
            }
            Assert.Equal(expected, ds.GetFoldersTillRoot(rootPath, fullpath));
        }

        #endregion

        #region RemoveNonImages

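        // Note (added comment; inferred from the asserts below, not from the original source):
        // RemoveNonImages appears to walk the tree recursively and delete anything that is not an
        // image, so the nested .txt is removed while the jpg/png/webp files survive.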
        [Fact]
        public void RemoveNonImages()
        {
            const string testDirectory = "/manga/";
            var fileSystem = new MockFileSystem();
            fileSystem.AddDirectory(testDirectory);
            fileSystem.AddFile($"{testDirectory}file/data-0.txt", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}data-1.jpg", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}data-2.png", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}data-3.webp", new MockFileData("abc"));

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
            ds.RemoveNonImages($"{testDirectory}");
            Assert.False(fileSystem.FileExists($"{testDirectory}file/data-0.txt"));
            Assert.Equal(3, ds.GetFiles($"{testDirectory}", searchOption: SearchOption.AllDirectories).Count());
        }

        #endregion

        #region Flatten

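        // Note (added comment; inferred from the asserts below, not from the original source):
        // Flatten appears to hoist files from subdirectories into the root of the given folder,
        // leaving the emptied subdirectory itself in place.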
        [Fact]
        public void Flatten_ShouldDoNothing()
        {
            const string testDirectory = "/manga/";
            var fileSystem = new MockFileSystem();
            fileSystem.AddDirectory(testDirectory);
            fileSystem.AddFile($"{testDirectory}data-1.jpg", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}data-2.png", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}data-3.webp", new MockFileData("abc"));

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
            ds.Flatten($"{testDirectory}");
            Assert.True(fileSystem.FileExists($"{testDirectory}data-1.jpg"));
            Assert.True(fileSystem.FileExists($"{testDirectory}data-2.png"));
            Assert.True(fileSystem.FileExists($"{testDirectory}data-3.webp"));
        }

        [Fact]
        public void Flatten_ShouldFlatten()
        {
            const string testDirectory = "/manga/";
            var fileSystem = new MockFileSystem();
            fileSystem.AddDirectory(testDirectory);
            fileSystem.AddFile($"{testDirectory}data-1.jpg", new MockFileData("abc"));
            fileSystem.AddFile($"{testDirectory}subdir/data-3.webp", new MockFileData("abc"));

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
            ds.Flatten($"{testDirectory}");
            Assert.Equal(2, ds.GetFiles(testDirectory).Count());
            Assert.False(fileSystem.FileExists($"{testDirectory}subdir/data-3.webp"));
            Assert.True(fileSystem.Directory.Exists($"{testDirectory}subdir/"));
        }

        #endregion

        #region CheckWriteAccess

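        // Note (added comment; inferred from the asserts below, not from the original source):
        // CheckWriteAccess appears to probe by writing into a temporary "bookmarks" folder and
        // cleaning up after itself, which is why nothing is expected to exist afterwards.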
        [Fact]
        public async Task CheckWriteAccess_ShouldHaveAccess()
        {
            const string testDirectory = "/manga/";
            var fileSystem = new MockFileSystem();

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
            var hasAccess = await ds.CheckWriteAccess(ds.FileSystem.Path.Join(testDirectory, "bookmarks"));
            Assert.True(hasAccess);

            Assert.False(ds.FileSystem.Directory.Exists(ds.FileSystem.Path.Join(testDirectory, "bookmarks")));
            Assert.False(ds.FileSystem.File.Exists(ds.FileSystem.Path.Join(testDirectory, "bookmarks", "test.txt")));
        }

        #endregion

        #region GetHumanReadableBytes

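        // Note (added comment; an arithmetic check, not from the original source): the expected
        // strings are consistent with binary (1024-based) units: 1200 / 1024 ≈ 1.17 KB,
        // 10,000,000 / 1024^2 ≈ 9.54 MB, and 10,000,000,000 / 1024^3 ≈ 9.31 GB.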
        [Theory]
        [InlineData(1200, "1.17 KB")]
        [InlineData(1, "1 B")]
        [InlineData(10000000, "9.54 MB")]
        [InlineData(10000000000, "9.31 GB")]
        public void GetHumanReadableBytesTest(long bytes, string expected)
        {
            Assert.Equal(expected, DirectoryService.GetHumanReadableBytes(bytes));
        }
        #endregion
    }
}

API.Tests/Services/FileSystemTests.cs (new file, 44 lines)
@@ -0,0 +1,44 @@
using System;
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;
using API.Services;
using Xunit;

namespace API.Tests.Services;

public class FileSystemTests
{
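    // Note (added comment; inferred from the two tests below, not from the original source):
    // FileService.HasFileBeenModifiedSince appears to compare the file's LastWriteTime against
    // the supplied timestamp.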
    [Fact]
    public void FileHasNotBeenModifiedSinceCreation()
    {
        var file = new MockFileData("Testing is meh.")
        {
            LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
        };
        var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
        {
            { @"c:\myfile.txt", file }
        });

        var fileService = new FileService(fileSystem);

        Assert.False(fileService.HasFileBeenModifiedSince(@"c:\myfile.txt", DateTime.Now));
    }

    [Fact]
    public void FileHasBeenModifiedSinceCreation()
    {
        var file = new MockFileData("Testing is meh.")
        {
            LastWriteTime = DateTimeOffset.Now
        };
        var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
        {
            { @"c:\myfile.txt", file }
        });

        var fileService = new FileService(fileSystem);

        Assert.True(fileService.HasFileBeenModifiedSince(@"c:\myfile.txt", DateTime.Now.Subtract(TimeSpan.FromMinutes(1))));
    }
}

@@ -1,8 +1,9 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using API.Helpers;
using API.Services;
using Xunit;

namespace API.Tests.Services
{
@@ -10,6 +11,7 @@ namespace API.Tests.Services
    {
        private readonly string _testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives");
        private const string TestCoverImageFile = "thumbnail.jpg";
        private const string TestCoverArchive = @"c:\file in folder.zip";
        private readonly string _testCoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), @"../../../Services/Test Data/ArchiveService/CoverImages");
        //private readonly MetadataService _metadataService;
        // private readonly IUnitOfWork _unitOfWork = Substitute.For<IUnitOfWork>();
@@ -18,116 +20,23 @@ namespace API.Tests.Services
        // private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
        // private readonly ILogger<MetadataService> _logger = Substitute.For<ILogger<MetadataService>>();
        // private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
        private readonly ICacheHelper _cacheHelper;

        public MetadataServiceTests()
        {
            //_metadataService = new MetadataService(_unitOfWork, _logger, _archiveService, _bookService, _imageService, _messageHub);
            var file = new MockFileData("")
            {
                LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
            };
            var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
            {
                { TestCoverArchive, file }
            });

            var fileService = new FileService(fileSystem);
            _cacheHelper = new CacheHelper(fileService);
        }
    }
}
API.Tests/Services/ParseScannedFilesTests.cs (new file, 314 lines)
@@ -0,0 +1,314 @@
using System.Collections.Generic;
using System.Data.Common;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using API.Services.Tasks.Scanner;
using API.SignalR;
using API.Tests.Helpers;
using AutoMapper;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Services;

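// Note (added comment, not from the original source): this stub takes the place of the real
// reading-item service so ParseScannedFiles can be exercised without opening real archive files.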
internal class MockReadingItemService : IReadingItemService
{
    private readonly DefaultParser _defaultParser;

    public MockReadingItemService(DefaultParser defaultParser)
    {
        _defaultParser = defaultParser;
    }

    public ComicInfo GetComicInfo(string filePath)
    {
        return null;
    }

    public int GetNumberOfPages(string filePath, MangaFormat format)
    {
        return 1;
    }

    public string GetCoverImage(string fileFilePath, string fileName, MangaFormat format)
    {
        return string.Empty;
    }

    public void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1)
    {
        throw new System.NotImplementedException();
    }

    public ParserInfo Parse(string path, string rootPath, LibraryType type)
    {
        return _defaultParser.Parse(path, rootPath, type);
    }
}

public class ParseScannedFilesTests
{
    private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
    private readonly IUnitOfWork _unitOfWork;

    private readonly DbConnection _connection;
    private readonly DataContext _context;

    private const string CacheDirectory = "C:/kavita/config/cache/";
    private const string CoverImageDirectory = "C:/kavita/config/covers/";
    private const string BackupDirectory = "C:/kavita/config/backups/";
    private const string DataDirectory = "C:/data/";

    public ParseScannedFilesTests()
    {
        var contextOptions = new DbContextOptionsBuilder()
            .UseSqlite(CreateInMemoryDatabase())
            .Options;
        _connection = RelationalOptionsExtension.Extract(contextOptions).Connection;

        _context = new DataContext(contextOptions);
        Task.Run(SeedDb).GetAwaiter().GetResult();

        _unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);

        // Since ProcessFile relies on _readingItemService, we can implement our own versions of
        // _readingItemService so we have control over how the calls work
    }

    #region Setup

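    // Note (added comment, not from the original source): the fixture runs against an in-memory
    // SQLite connection, so repository calls behave like production EF Core code without a real
    // database file on disk.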
    private static DbConnection CreateInMemoryDatabase()
    {
        var connection = new SqliteConnection("Filename=:memory:");

        connection.Open();

        return connection;
    }

    public void Dispose() => _connection.Dispose();

    private async Task<bool> SeedDb()
    {
        await _context.Database.MigrateAsync();
        var filesystem = CreateFileSystem();

        await Seed.SeedSettings(_context, new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));

        var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
        setting.Value = CacheDirectory;

        setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
        setting.Value = BackupDirectory;

        _context.ServerSetting.Update(setting);

        _context.Library.Add(new Library()
        {
            Name = "Manga",
            Folders = new List<FolderPath>()
            {
                new FolderPath()
                {
                    Path = DataDirectory
                }
            }
        });
        return await _context.SaveChangesAsync() > 0;
    }

    private async Task ResetDB()
    {
        _context.Series.RemoveRange(_context.Series.ToList());

        await _context.SaveChangesAsync();
    }

    private static MockFileSystem CreateFileSystem()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
        fileSystem.AddDirectory("C:/kavita/config/");
        fileSystem.AddDirectory(CacheDirectory);
        fileSystem.AddDirectory(CoverImageDirectory);
        fileSystem.AddDirectory(BackupDirectory);
        fileSystem.AddDirectory(DataDirectory);

        return fileSystem;
    }

    #endregion

    #region GetInfosByName

    [Fact]
    public void GetInfosByName_ShouldReturnGivenMatchingSeriesName()
    {
        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)));

        var infos = new List<ParserInfo>()
        {
            ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
            ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
        };
        var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
        {
            {
                new ParsedSeries()
                {
                    Format = MangaFormat.Archive,
                    Name = "Accel World",
                    NormalizedName = API.Parser.Parser.Normalize("Accel World")
                },
                infos
            },
            {
                new ParsedSeries()
                {
                    Format = MangaFormat.Pdf,
                    Name = "Accel World",
                    NormalizedName = API.Parser.Parser.Normalize("Accel World")
                },
                new List<ParserInfo>()
            }
        };

        var series = DbFactory.Series("Accel World");
        series.Format = MangaFormat.Pdf;

        Assert.Empty(ParseScannedFiles.GetInfosByName(parsedSeries, series));

        series.Format = MangaFormat.Archive;
        Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count());
    }

    [Fact]
    public void GetInfosByName_ShouldReturnGivenMatchingNormalizedSeriesName()
    {
        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)));

        var infos = new List<ParserInfo>()
        {
            ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false),
            ParserInfoFactory.CreateParsedInfo("Accel World", "2", "0", "Accel World v2.cbz", false)
        };
        var parsedSeries = new Dictionary<ParsedSeries, List<ParserInfo>>
        {
            {
                new ParsedSeries()
                {
                    Format = MangaFormat.Archive,
                    Name = "Accel World",
                    NormalizedName = API.Parser.Parser.Normalize("Accel World")
                },
                infos
            },
            {
                new ParsedSeries()
                {
                    Format = MangaFormat.Pdf,
                    Name = "Accel World",
                    NormalizedName = API.Parser.Parser.Normalize("Accel World")
                },
                new List<ParserInfo>()
            }
        };

        var series = DbFactory.Series("accel world");
        series.Format = MangaFormat.Archive;
        Assert.Equal(2, ParseScannedFiles.GetInfosByName(parsedSeries, series).Count());
    }

    #endregion

    #region MergeName

    [Fact]
    public void MergeName_ShouldMergeMatchingFormatAndName()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.AddDirectory("C:/Data/");
        fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));

        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)));

        psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, out _, out _);

        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.cbz", false)));
        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.cbz", false)));
        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accelworld", "1", "0", "Accel World v1.cbz", false)));
    }

    [Fact]
    public void MergeName_ShouldMerge_MismatchedFormatSameName()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.AddDirectory("C:/Data/");
        fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));

        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)));

        psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, out _, out _);

        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("Accel World", "1", "0", "Accel World v1.epub", false)));
        Assert.Equal("Accel World", psf.MergeName(ParserInfoFactory.CreateParsedInfo("accel_world", "1", "0", "Accel World v1.epub", false)));
    }

    #endregion

    #region ScanLibrariesForSeries

    [Fact]
    public void ScanLibrariesForSeries_ShouldFindFiles()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.AddDirectory("C:/Data/");
        fileSystem.AddFile("C:/Data/Accel World v1.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.cbz", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Accel World v2.pdf", new MockFileData(string.Empty));
        fileSystem.AddFile("C:/Data/Nothing.pdf", new MockFileData(string.Empty));

        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var psf = new ParseScannedFiles(Substitute.For<ILogger<ParseScannedFiles>>(), ds,
            new MockReadingItemService(new DefaultParser(ds)));

        var parsedSeries = psf.ScanLibrariesForSeries(LibraryType.Manga, new List<string>() {"C:/Data/"}, out _, out _);

        Assert.Equal(3, parsedSeries.Values.Count);
        Assert.NotEmpty(parsedSeries.Keys.Where(p => p.Format == MangaFormat.Archive && p.Name.Equals("Accel World")));
    }

    #endregion
}
API.Tests/Services/ReaderServiceTests.cs (new file, 814 lines)
@@ -0,0 +1,814 @@
using System.Collections.Generic;
using System.Data.Common;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Services;
using API.SignalR;
using API.Tests.Helpers;
using AutoMapper;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Services;

public class ReaderServiceTests
{
    private readonly ILogger<CacheService> _logger = Substitute.For<ILogger<CacheService>>();
    private readonly IUnitOfWork _unitOfWork;
    private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();

    private readonly DbConnection _connection;
    private readonly DataContext _context;

    private const string CacheDirectory = "C:/kavita/config/cache/";
    private const string CoverImageDirectory = "C:/kavita/config/covers/";
    private const string BackupDirectory = "C:/kavita/config/backups/";
    private const string DataDirectory = "C:/data/";

    public ReaderServiceTests()
    {
        var contextOptions = new DbContextOptionsBuilder().UseSqlite(CreateInMemoryDatabase()).Options;
        _connection = RelationalOptionsExtension.Extract(contextOptions).Connection;

        _context = new DataContext(contextOptions);
        Task.Run(SeedDb).GetAwaiter().GetResult();

        var config = new MapperConfiguration(cfg => cfg.AddProfile<AutoMapperProfiles>());
        var mapper = config.CreateMapper();
        _unitOfWork = new UnitOfWork(_context, mapper, null);
    }

    #region Setup

    private static DbConnection CreateInMemoryDatabase()
    {
        var connection = new SqliteConnection("Filename=:memory:");

        connection.Open();

        return connection;
    }

    public void Dispose() => _connection.Dispose();

    private async Task<bool> SeedDb()
    {
        await _context.Database.MigrateAsync();
        var filesystem = CreateFileSystem();

        await Seed.SeedSettings(_context,
            new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));

        var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
        setting.Value = CacheDirectory;

        setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
        setting.Value = BackupDirectory;

        _context.ServerSetting.Update(setting);

        _context.Library.Add(new Library()
        {
            Name = "Manga", Folders = new List<FolderPath>() {new FolderPath() {Path = "C:/data/"}}
        });
        return await _context.SaveChangesAsync() > 0;
    }

    private async Task ResetDB()
    {
        _context.Series.RemoveRange(_context.Series.ToList());

        await _context.SaveChangesAsync();
    }

    private static MockFileSystem CreateFileSystem()
    {
        var fileSystem = new MockFileSystem();
        fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
        fileSystem.AddDirectory("C:/kavita/config/");
        fileSystem.AddDirectory(CacheDirectory);
        fileSystem.AddDirectory(CoverImageDirectory);
        fileSystem.AddDirectory(BackupDirectory);
        fileSystem.AddDirectory(DataDirectory);

        return fileSystem;
    }

    #endregion

    #region FormatBookmarkFolderPath

    [Theory]
    [InlineData("/manga/", 1, 1, 1, "/manga/1/1/1")]
    [InlineData("C:/manga/", 1, 1, 10001, "C:/manga/1/1/10001")]
    public void FormatBookmarkFolderPathTest(string baseDir, int userId, int seriesId, int chapterId, string expected)
    {
        Assert.Equal(expected, ReaderService.FormatBookmarkFolderPath(baseDir, userId, seriesId, chapterId));
    }

    #endregion

    #region CapPageToChapter

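    // Note (added comment; inferred from the asserts below, not from the original source):
    // CapPageToChapter appears to clamp the requested page into [0, chapter.Pages], so -1
    // becomes 0 and 10 becomes the chapter's single page.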
    [Fact]
    public async Task CapPageToChapterTest()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                new Volume()
                {
                    Chapters = new List<Chapter>()
                    {
                        new Chapter()
                        {
                            Pages = 1
                        }
                    }
                }
            }
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        Assert.Equal(0, await readerService.CapPageToChapter(1, -1));
        Assert.Equal(1, await readerService.CapPageToChapter(1, 10));
    }

    #endregion

    #region SaveReadingProgress

    [Fact]
    public async Task SaveReadingProgress_ShouldCreateNewEntity()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                new Volume()
                {
                    Chapters = new List<Chapter>()
                    {
                        new Chapter()
                        {
                            Pages = 1
                        }
                    }
                }
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var successful = await readerService.SaveReadingProgress(new ProgressDto()
        {
            ChapterId = 1,
            PageNum = 1,
            SeriesId = 1,
            VolumeId = 1,
            BookScrollId = null
        }, 1);

        Assert.True(successful);
        Assert.NotNull(await _unitOfWork.AppUserProgressRepository.GetUserProgressAsync(1, 1));
    }

    [Fact]
    public async Task SaveReadingProgress_ShouldUpdateExisting()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                new Volume()
                {
                    Chapters = new List<Chapter>()
                    {
                        new Chapter()
                        {
                            Pages = 1
                        }
                    }
                }
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var successful = await readerService.SaveReadingProgress(new ProgressDto()
        {
            ChapterId = 1,
            PageNum = 1,
            SeriesId = 1,
            VolumeId = 1,
            BookScrollId = null
        }, 1);

        Assert.True(successful);
        Assert.NotNull(await _unitOfWork.AppUserProgressRepository.GetUserProgressAsync(1, 1));

        Assert.True(await readerService.SaveReadingProgress(new ProgressDto()
        {
            ChapterId = 1,
            PageNum = 1,
            SeriesId = 1,
            VolumeId = 1,
            BookScrollId = "/h1/"
        }, 1));

        Assert.Equal("/h1/", (await _unitOfWork.AppUserProgressRepository.GetUserProgressAsync(1, 1)).BookScrollId);
    }

    #endregion

    #region MarkChaptersAsRead

    [Fact]
    public async Task MarkChaptersAsReadTest()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                new Volume()
                {
                    Chapters = new List<Chapter>()
                    {
                        new Chapter()
                        {
                            Pages = 1
                        },
                        new Chapter()
                        {
                            Pages = 2
                        }
                    }
                }
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var volumes = await _unitOfWork.VolumeRepository.GetVolumes(1);
        readerService.MarkChaptersAsRead(await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress), 1, volumes.First().Chapters);
        await _context.SaveChangesAsync();

        Assert.Equal(2, (await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress)).Progresses.Count);
    }
    #endregion

    #region MarkChapterAsUnread

    [Fact]
    public async Task MarkChapterAsUnreadTest()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                new Volume()
                {
                    Chapters = new List<Chapter>()
                    {
                        new Chapter()
                        {
                            Pages = 1
                        },
                        new Chapter()
                        {
                            Pages = 2
                        }
                    }
                }
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var volumes = (await _unitOfWork.VolumeRepository.GetVolumes(1)).ToList();
        readerService.MarkChaptersAsRead(await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress), 1, volumes.First().Chapters);

        await _context.SaveChangesAsync();
        Assert.Equal(2, (await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress)).Progresses.Count);

        readerService.MarkChaptersAsUnread(await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress), 1, volumes.First().Chapters);
        await _context.SaveChangesAsync();

        var progresses = (await _unitOfWork.UserRepository.GetUserByIdAsync(1, AppUserIncludes.Progress)).Progresses;
        Assert.Equal(0, progresses.Max(p => p.PagesRead));
        Assert.Equal(2, progresses.Count);
    }

    #endregion

    #region GetNextChapterIdAsync

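    // Note (added comment; an assumption, not confirmed by the original source): the four ids
    // passed to GetNextChapterIdAsync below appear to be the seeded auto-increment values,
    // presumably (seriesId, volumeId, currentChapterId, userId).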
    [Fact]
    public async Task GetNextChapterIdAsync_ShouldGetNextVolume()
    {
        // V1 -> V2
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("2", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("21", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("22", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("3", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("31", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("32", false, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var nextChapter = await readerService.GetNextChapterIdAsync(1, 1, 1, 1);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(nextChapter);
        Assert.Equal("2", actualChapter.Range);
    }

    [Fact]
    public async Task GetNextChapterIdAsync_ShouldRollIntoNextVolume()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("2", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("21", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("22", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("3", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("31", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("32", false, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var nextChapter = await readerService.GetNextChapterIdAsync(1, 1, 2, 1);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(nextChapter);
        Assert.Equal("21", actualChapter.Range);
    }

    [Fact]
    public async Task GetNextChapterIdAsync_ShouldNotMoveFromVolumeToSpecial()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("0", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("A.cbz", true, new List<MangaFile>()),
                    EntityFactory.CreateChapter("B.cbz", true, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var nextChapter = await readerService.GetNextChapterIdAsync(1, 1, 2, 1);
        Assert.Equal(-1, nextChapter);
    }

    [Fact]
    public async Task GetNextChapterIdAsync_ShouldMoveFromSpecialToSpecial()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("0", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("A.cbz", true, new List<MangaFile>()),
                    EntityFactory.CreateChapter("B.cbz", true, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var nextChapter = await readerService.GetNextChapterIdAsync(1, 2, 3, 1);
        Assert.NotEqual(-1, nextChapter);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(nextChapter);
        Assert.Equal("B.cbz", actualChapter.Range);
    }

    #endregion

    #region GetPrevChapterIdAsync

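    // Note (added comment; inferred from the asserts below, not from the original source):
    // specials live in volume "0", and paging backward from the first chapter of volume 1
    // lands on the last special ("B.cbz"), while paging forward never enters the specials.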
    [Fact]
    public async Task GetPrevChapterIdAsync_ShouldGetPrevVolume()
    {
        // V1 -> V2
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("2", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("21", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("22", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("3", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("31", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("32", false, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var prevChapter = await readerService.GetPrevChapterIdAsync(1, 1, 2, 1);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(prevChapter);
        Assert.Equal("1", actualChapter.Range);
    }

    [Fact]
    public async Task GetPrevChapterIdAsync_ShouldRollIntoPrevVolume()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("2", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("21", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("22", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("3", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("31", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("32", false, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var prevChapter = await readerService.GetPrevChapterIdAsync(1, 2, 3, 1);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(prevChapter);
        Assert.Equal("2", actualChapter.Range);
    }

    [Fact]
    public async Task GetPrevChapterIdAsync_ShouldMoveFromVolumeToSpecial()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("0", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("A.cbz", true, new List<MangaFile>()),
                    EntityFactory.CreateChapter("B.cbz", true, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var prevChapter = await readerService.GetPrevChapterIdAsync(1, 1, 1, 1);
        Assert.NotEqual(-1, prevChapter);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(prevChapter);
        Assert.Equal("B.cbz", actualChapter.Range);
    }

    [Fact]
    public async Task GetPrevChapterIdAsync_ShouldMoveFromSpecialToSpecial()
    {
        await ResetDB();

        _context.Series.Add(new Series()
        {
            Name = "Test",
            Library = new Library() {
                Name = "Test LIb",
                Type = LibraryType.Manga,
            },
            Volumes = new List<Volume>()
            {
                EntityFactory.CreateVolume("1", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("1", false, new List<MangaFile>()),
                    EntityFactory.CreateChapter("2", false, new List<MangaFile>()),
                }),
                EntityFactory.CreateVolume("0", new List<Chapter>()
                {
                    EntityFactory.CreateChapter("A.cbz", true, new List<MangaFile>()),
                    EntityFactory.CreateChapter("B.cbz", true, new List<MangaFile>()),
                }),
            }
        });

        _context.AppUser.Add(new AppUser()
        {
            UserName = "majora2007"
        });

        await _context.SaveChangesAsync();

        var fileSystem = new MockFileSystem();
        var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
        var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);

        var prevChapter = await readerService.GetPrevChapterIdAsync(1, 2, 4, 1);
        Assert.NotEqual(-1, prevChapter);
        var actualChapter = await _unitOfWork.ChapterRepository.GetChapterAsync(prevChapter);
        Assert.Equal("A.cbz", actualChapter.Range);
    }

    #endregion

    // #region GetNumberOfPages
    //
    // [Fact]
    // public void GetNumberOfPages_EPUB()
    // {
    //     const string testDirectory = "/manga/";
    //     var fileSystem = new MockFileSystem();
    //
    //     var actualFile = Path.Join(Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/BookService/EPUB"), "The Golden Harpoon; Or, Lost Among the Floes A Story of the Whaling Grounds.epub")
    //     fileSystem.File.WriteAllBytes("${testDirectory}test.epub", File.ReadAllBytes(actualFile));
    //
    //     fileSystem.AddDirectory(CacheDirectory);
    //
    //     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
    //     var cs = new CacheService(_logger, _unitOfWork, ds, new MockReadingItemServiceForCacheService(ds));
    //     var readerService = new ReaderService(_unitOfWork, Substitute.For<ILogger<ReaderService>>(), ds, cs);
    //
    // }
    //
    // #endregion
}
@ -3,13 +3,14 @@ using System.Collections.Concurrent;
|
||||
using System.Collections.Generic;
|
||||
using System.Data.Common;
|
||||
using System.IO;
|
||||
using System.IO.Abstractions.TestingHelpers;
|
||||
using System.Linq;
|
||||
using System.Threading.Tasks;
|
||||
using API.Data;
|
||||
using API.Entities;
|
||||
using API.Entities.Enums;
|
||||
using API.Interfaces;
|
||||
using API.Interfaces.Services;
|
||||
using API.Entities.Metadata;
|
||||
using API.Helpers;
|
||||
using API.Parser;
|
||||
using API.Services;
|
||||
using API.Services.Tasks;
|
||||
@ -27,55 +28,38 @@ using Xunit;
|
||||
|
||||
namespace API.Tests.Services
|
||||
{
|
||||
public class ScannerServiceTests : IDisposable
|
||||
public class ScannerServiceTests
|
||||
{
|
||||
private readonly ScannerService _scannerService;
|
||||
private readonly ILogger<ScannerService> _logger = Substitute.For<ILogger<ScannerService>>();
|
||||
private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
|
||||
private readonly IBookService _bookService = Substitute.For<IBookService>();
|
||||
private readonly IImageService _imageService = Substitute.For<IImageService>();
|
||||
private readonly ILogger<MetadataService> _metadataLogger = Substitute.For<ILogger<MetadataService>>();
|
||||
private readonly ICacheService _cacheService = Substitute.For<ICacheService>();
|
||||
private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
|
||||
|
||||
private readonly DbConnection _connection;
|
||||
private readonly DataContext _context;
|
||||
|
||||
|
||||
public ScannerServiceTests()
|
||||
[Fact]
|
||||
public void FindSeriesNotOnDisk_Should_Remove1()
|
||||
{
|
||||
var contextOptions = new DbContextOptionsBuilder()
|
||||
.UseSqlite(CreateInMemoryDatabase())
|
||||
.Options;
|
||||
_connection = RelationalOptionsExtension.Extract(contextOptions).Connection;
|
||||
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();
|
||||
|
||||
_context = new DataContext(contextOptions);
|
||||
Task.Run(SeedDb).GetAwaiter().GetResult();
|
||||
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Archive});
|
||||
//AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Volumes = "1", Format = MangaFormat.Epub});
|
||||
|
||||
IUnitOfWork unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
|
||||
|
||||
|
||||
IMetadataService metadataService = Substitute.For<MetadataService>(unitOfWork, _metadataLogger, _archiveService, _bookService, _imageService, _messageHub);
|
||||
_scannerService = new ScannerService(unitOfWork, _logger, _archiveService, metadataService, _bookService, _cacheService, _messageHub);
|
||||
}
|
||||
|
||||
private async Task<bool> SeedDb()
|
||||
{
|
||||
await _context.Database.MigrateAsync();
|
||||
await Seed.SeedSettings(_context);
|
||||
|
||||
_context.Library.Add(new Library()
|
||||
var existingSeries = new List<Series>
|
||||
{
|
||||
Name = "Manga",
|
||||
Folders = new List<FolderPath>()
|
||||
new Series()
|
||||
{
|
||||
new FolderPath()
|
||||
Name = "Darker Than Black",
|
||||
LocalizedName = "Darker Than Black",
|
||||
OriginalName = "Darker Than Black",
|
||||
Volumes = new List<Volume>()
|
||||
{
|
||||
Path = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ScannerService/Manga")
|
||||
}
|
||||
new Volume()
|
||||
{
|
||||
Number = 1,
|
||||
Name = "1"
|
||||
}
|
||||
},
|
||||
NormalizedName = API.Parser.Parser.Normalize("Darker Than Black"),
|
||||
Metadata = new SeriesMetadata(),
|
||||
Format = MangaFormat.Epub
|
||||
}
|
||||
});
|
||||
return await _context.SaveChangesAsync() > 0;
|
||||
};
|
||||
|
||||
Assert.Equal(1, ScannerService.FindSeriesNotOnDisk(existingSeries, infos).Count());
|
||||
}
[Fact]
@ -83,9 +67,9 @@ namespace API.Tests.Services
{
var infos = new Dictionary<ParsedSeries, List<ParserInfo>>();

AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive});
AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive});
AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "10", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1", Format = MangaFormat.Archive});
ParserInfoFactory.AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "10", Format = MangaFormat.Archive});

var existingSeries = new List<Series>
{
@ -138,78 +122,25 @@ namespace API.Tests.Services
// Assert.Equal(expected, actualName);
// }

[Fact]
public void RemoveMissingSeries_Should_RemoveSeries()
{
var existingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
EntityFactory.CreateSeries("Darker than Black"),
EntityFactory.CreateSeries("Beastars"),
};
var missingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
};
existingSeries = ScannerService.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();

Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
Assert.Equal(missingSeries.Count, removeCount);
}

// [Fact]
// public void RemoveMissingSeries_Should_RemoveSeries()
// {
// var existingSeries = new List<Series>()
// {
// EntityFactory.CreateSeries("Darker than Black Vol 1"),
// EntityFactory.CreateSeries("Darker than Black"),
// EntityFactory.CreateSeries("Beastars"),
// };
// var missingSeries = new List<Series>()
// {
// EntityFactory.CreateSeries("Darker than Black Vol 1"),
// };
// existingSeries = ScannerService.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();
//
// Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
// Assert.Equal(missingSeries.Count, removeCount);
// }

private void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
{
var existingKey = collectedSeries.Keys.FirstOrDefault(ps =>
ps.Format == info.Format && ps.NormalizedName == API.Parser.Parser.Normalize(info.Series));
existingKey ??= new ParsedSeries()
{
Format = info.Format,
Name = info.Series,
NormalizedName = API.Parser.Parser.Normalize(info.Series)
};
if (collectedSeries.GetType() == typeof(ConcurrentDictionary<,>))
{
((ConcurrentDictionary<ParsedSeries, List<ParserInfo>>) collectedSeries).AddOrUpdate(existingKey, new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
if (!oldValue.Contains(info))
{
oldValue.Add(info);
}

return oldValue;
});
}
else
{
if (!collectedSeries.ContainsKey(existingKey))
{
collectedSeries.Add(existingKey, new List<ParserInfo>() {info});
}
else
{
var list = collectedSeries[existingKey];
if (!list.Contains(info))
{
list.Add(info);
}

collectedSeries[existingKey] = list;
}

}

}

private static DbConnection CreateInMemoryDatabase()
{
var connection = new SqliteConnection("Filename=:memory:");

connection.Open();

return connection;
}

public void Dispose() => _connection.Dispose();
}
}
Binary files not shown (changed cover-image test data; new images of 90 KiB and 1.1 MiB).
API.Tests/Services/Test Data/ArchiveService/CoverImages/test.zip (new binary file, not shown)
API.Tests/Services/Test Data/BookService/EPUB/content.opf (new file, 81 lines)
@ -0,0 +1,81 @@
<?xml version='1.0' encoding='UTF-8'?>

<package xmlns:opf="http://www.idpf.org/2007/opf" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="id">
<metadata>
<dc:rights>Public domain in the USA.</dc:rights>
<dc:identifier opf:scheme="URI" id="id">http://www.gutenberg.org/64999</dc:identifier>
<dc:creator opf:file-as="Starbuck, Roger">Roger Starbuck</dc:creator>
<dc:title>The Golden Harpoon / Lost Among the Floes</dc:title>
<dc:language xsi:type="dcterms:RFC4646">en</dc:language>
<dc:date opf:event="publication">2021-04-05</dc:date>
<dc:date opf:event="conversion">2021-04-05T23:00:07.039989+00:00</dc:date>
<dc:source>https://www.gutenberg.org/files/64999/64999-h/64999-h.htm</dc:source>
<dc:description>Book Description</dc:description>
<dc:subject>Genre1, Genre2</dc:subject>
<dc:creator id="creator">Junya Inoue</dc:creator>
<meta refines="#creator" property="file-as">Inoue, Junya</meta>
<meta refines="#creator" property="role" scheme="marc:relators">aut</meta>
<meta name="cover" content="item1"/>
</metadata>
<manifest>
<!--Image: 1000 x 1572 size=145584 -->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@images@cover.jpg" id="item1" media-type="image/jpeg"/>
<item href="pgepub.css" id="item2" media-type="text/css"/>
<item href="0.css" id="item3" media-type="text/css"/>
<item href="1.css" id="item4" media-type="text/css"/>
<!--Chunk: size=3477 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-0.htm.html" id="cover" media-type="application/xhtml+xml"/>
<!--Chunk: size=3242 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-1.htm.html" id="item5" media-type="application/xhtml+xml"/>
<!--Chunk: size=27802 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-2.htm.html" id="item6" media-type="application/xhtml+xml"/>
<!--Chunk: size=13387 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-3.htm.html" id="item7" media-type="application/xhtml+xml"/>
<!--Chunk: size=25774 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-4.htm.html" id="item8" media-type="application/xhtml+xml"/>
<!--Chunk: size=17053 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-5.htm.html" id="item9" media-type="application/xhtml+xml"/>
<!--Chunk: size=19590 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-6.htm.html" id="item10" media-type="application/xhtml+xml"/>
<!--Chunk: size=16645 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-7.htm.html" id="item11" media-type="application/xhtml+xml"/>
<!--Chunk: size=19768 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-8.htm.html" id="item12" media-type="application/xhtml+xml"/>
<!--Chunk: size=30397 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-9.htm.html" id="item13" media-type="application/xhtml+xml"/>
<!--Chunk: size=41064 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-10.htm.html" id="item14" media-type="application/xhtml+xml"/>
<!--Chunk: size=35745 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-11.htm.html" id="item15" media-type="application/xhtml+xml"/>
<!--Chunk: size=3277 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-12.htm.html" id="item16" media-type="application/xhtml+xml"/>
<!--Chunk: size=5983 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-13.htm.html" id="item17" media-type="application/xhtml+xml"/>
<!--Chunk: size=22066-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-14.htm.html" id="item18" media-type="application/xhtml+xml"/>
<item href="toc.ncx" id="ncx" media-type="application/x-dtbncx+xml"/>
<item href="wrap0000.html" id="coverpage-wrapper" media-type="application/xhtml+xml"/>
</manifest>
<spine toc="ncx">
<itemref idref="coverpage-wrapper" linear="yes"/>
<itemref idref="cover" linear="yes"/>
<itemref idref="item5" linear="yes"/>
<itemref idref="item6" linear="yes"/>
<itemref idref="item7" linear="yes"/>
<itemref idref="item8" linear="yes"/>
<itemref idref="item9" linear="yes"/>
<itemref idref="item10" linear="yes"/>
<itemref idref="item11" linear="yes"/>
<itemref idref="item12" linear="yes"/>
<itemref idref="item13" linear="yes"/>
<itemref idref="item14" linear="yes"/>
<itemref idref="item15" linear="yes"/>
<itemref idref="item16" linear="yes"/>
<itemref idref="item17" linear="yes"/>
<itemref idref="item18" linear="yes"/>
</spine>
<guide>
<reference type="toc" title="CONTENTS" href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-1.htm.html#pgepubid00001"/>
<reference type="cover" title="Cover" href="wrap0000.html"/>
</guide>
</package>
@ -1,80 +0,0 @@
""" This script should be run on a directory which will generate a test case file
that can be loaded into the renametest.py"""
import os
from pathlib import Path
import shutil

verbose = False

def print_log(val):
    if verbose:
        print(val)


def create_test_base(file, root_dir):
    """ Creates and returns a new base directory for data creation for a given testcase."""
    base_dir = os.path.split(file.split('-testcase.txt')[0])[-1]
    print_log('base_dir: {0}'.format(base_dir))
    new_dir = os.path.join(root_dir, base_dir)
    print_log('new dir: {0}'.format(new_dir))
    p = Path(new_dir)
    if not p.exists():
        os.mkdir(new_dir)

    return new_dir



def generate_data(file, root_dir):
    ''' Generates directories and fake files for testing against '''

    base_dir = ''
    if file.endswith('-testcase.txt'):
        base_dir = create_test_base(file, root_dir)

    files_to_create = []
    with open(file, 'r') as in_file:
        files_to_create = in_file.read().splitlines()

    for filepath in files_to_create:
        for part in os.path.split(filepath):
            part_path = os.path.join(base_dir, part)
            print_log('Checking if {0} exists '.format(part_path))
            p = Path(part_path)

            if not p.exists():
                print_log('Creating: {0}'.format(part))

                if p.suffix != '':
                    with open(os.path.join(root_dir, base_dir + '/' + filepath), 'w+') as f:
                        f.write('')
                else:
                    os.mkdir(part_path)

def clean_up_generated_data(root_dir):
    for root, dirs, files in os.walk(root_dir):
        for dir in dirs:
            shutil.rmtree(os.path.join(root, dir))
        for file in files:
            if not file.endswith('-testcase.txt'):
                print_log('Removing {0}'.format(os.path.join(root, file)))
                os.remove(os.path.join(root, file))


def generate_test_file():
    root_dir = os.path.abspath('.')
    current_folder = os.path.split(root_dir)[-1]
    out_files = []
    for root, _, files in os.walk(root_dir):
        for file in files:
            if not file.endswith('-testcase.txt'):
                filename = os.path.join(root.replace(root_dir, ''), file)  # root_dir or root_dir + '//'?
                out_files.append(filename)

    with open(os.path.join(root_dir, current_folder + '-testcase.txt'), 'w+') as f:
        for filename in out_files:
            f.write(filename + '\n')

if __name__ == '__main__':
    verbose = True
    generate_test_file()
@ -2,7 +2,7 @@

<PropertyGroup>
<AnalysisMode>Default</AnalysisMode>
<TargetFramework>net5.0</TargetFramework>
<TargetFramework>net6.0</TargetFramework>
<EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
@ -36,39 +36,41 @@
</PropertyGroup>

<ItemGroup>
<PackageReference Include="AutoMapper.Extensions.Microsoft.DependencyInjection" Version="8.1.1" />
<PackageReference Include="Docnet.Core" Version="2.4.0-alpha.1" />
<PackageReference Include="AutoMapper.Extensions.Microsoft.DependencyInjection" Version="11.0.0" />
<PackageReference Include="Docnet.Core" Version="2.4.0-alpha.2" />
<PackageReference Include="ExCSS" Version="4.1.0" />
<PackageReference Include="Flurl" Version="3.0.2" />
<PackageReference Include="Flurl.Http" Version="3.2.0" />
<PackageReference Include="Hangfire" Version="1.7.25" />
<PackageReference Include="Hangfire.AspNetCore" Version="1.7.25" />
<PackageReference Include="Hangfire" Version="1.7.27" />
<PackageReference Include="Hangfire.AspNetCore" Version="1.7.27" />
<PackageReference Include="Hangfire.MaximumConcurrentExecutions" Version="1.1.0" />
<PackageReference Include="Hangfire.MemoryStorage.Core" Version="1.4.0" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.37" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.38" />
<PackageReference Include="MarkdownDeep.NET.Core" Version="1.5.0.4" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.OpenIdConnect" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="6.0.1" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.OpenIdConnect" Version="6.0.1" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="6.0.0" />
<PackageReference Include="Microsoft.AspNetCore.SignalR" Version="1.1.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="5.0.10">
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="6.0.0">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="5.0.10" />
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="5.0.2" />
<PackageReference Include="Microsoft.IO.RecyclableMemoryStream" Version="2.1.3" />
<PackageReference Include="NetVips" Version="2.0.1" />
<PackageReference Include="NetVips.Native" Version="8.11.4" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="6.0.0" />
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="6.0.0" />
<PackageReference Include="Microsoft.IO.RecyclableMemoryStream" Version="2.2.0" />
<PackageReference Include="NetVips" Version="2.1.0" />
<PackageReference Include="NetVips.Native" Version="8.12.1" />
<PackageReference Include="NReco.Logging.File" Version="1.1.2" />
<PackageReference Include="SharpCompress" Version="0.30.0" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.29.0.36737">
<PackageReference Include="SharpCompress" Version="0.30.1" />
<PackageReference Include="SixLabors.ImageSharp" Version="1.0.4" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.33.0.40503">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.2.2" />
<PackageReference Include="System.Drawing.Common" Version="5.0.2" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="6.12.2" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.2.3" />
<PackageReference Include="System.Drawing.Common" Version="6.0.0" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="6.15.0" />
<PackageReference Include="System.IO.Abstractions" Version="16.1.2" />
<PackageReference Include="VersOne.Epub" Version="3.0.3.1" />
</ItemGroup>

@ -1,2 +1,3 @@
<wpf:ResourceDictionary xml:space="preserve" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:s="clr-namespace:System;assembly=mscorlib" xmlns:ss="urn:shemas-jetbrains-com:settings-storage-xaml" xmlns:wpf="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
<s:Boolean x:Key="/Default/CodeInspection/NamespaceProvider/NamespaceFoldersToSkip/=covers/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>
<s:Boolean x:Key="/Default/CodeInspection/NamespaceProvider/NamespaceFoldersToSkip/=covers/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/CodeInspection/NamespaceProvider/NamespaceFoldersToSkip/=wwwroot/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>
@ -8,7 +8,7 @@ namespace API.Comparators
public class ChapterSortComparer : IComparer<double>
{
/// <summary>
/// Normal sort for 2 doubles. 0 always comes before anything else
/// Normal sort for 2 doubles. 0 always comes last
/// </summary>
/// <param name="x"></param>
/// <param name="y"></param>
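A minimal sketch of the new ordering contract (my illustration, not part of this diff; it assumes Compare treats 0 as greater than any non-zero chapter number, per the updated summary above):

// using System.Collections.Generic; using API.Comparators;
var chapters = new List<double> { 2, 0, 1, 3 };
chapters.Sort(new ChapterSortComparer());
// Expected result: 1, 2, 3, 0 (chapter "0", conventionally a full-volume file, now sorts after all numbered chapters).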
@ -1,106 +0,0 @@
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;
using static System.GC;
using static System.String;

namespace API.Comparators
{
/// <summary>
/// Attempts to emulate Windows explorer sorting
/// </summary>
/// <remarks>This is not thread-safe</remarks>
public sealed class NaturalSortComparer : IComparer<string>, IDisposable
{
private readonly bool _isAscending;
private Dictionary<string, string[]> _table = new();

private bool _disposed;


public NaturalSortComparer(bool inAscendingOrder = true)
{
_isAscending = inAscendingOrder;
}

int IComparer<string>.Compare(string x, string y)
{
if (x == y) return 0;

if (!_table.TryGetValue(x ?? Empty, out var x1))
{
x1 = Regex.Split(x ?? Empty, "([0-9]+)");
_table.Add(x ?? Empty, x1);
}

if (!_table.TryGetValue(y ?? Empty, out var y1))
{
y1 = Regex.Split(y ?? Empty, "([0-9]+)");
_table.Add(y ?? Empty, y1);
}

int returnVal;

for (var i = 0; i < x1.Length && i < y1.Length; i++)
{
if (x1[i] == y1[i]) continue;
returnVal = PartCompare(x1[i], y1[i]);
return _isAscending ? returnVal : -returnVal;
}

if (y1.Length > x1.Length)
{
returnVal = 1;
}
else if (x1.Length > y1.Length)
{
returnVal = -1;
}
else
{
returnVal = 0;
}


return _isAscending ? returnVal : -returnVal;
}

private static int PartCompare(string left, string right)
{
if (!int.TryParse(left, out var x))
return Compare(left, right, StringComparison.Ordinal);

if (!int.TryParse(right, out var y))
return Compare(left, right, StringComparison.Ordinal);

return x.CompareTo(y);
}

private void Dispose(bool disposing)
{
if (!_disposed)
{
if (disposing)
{
// called via myClass.Dispose().
_table.Clear();
_table = null;
}
// Release unmanaged resources.
// Set large fields to null.
_disposed = true;
}
}

public void Dispose()
{
Dispose(true);
SuppressFinalize(this);
}

~NaturalSortComparer() // the finalizer
{
Dispose(false);
}
}
}
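For reference, a minimal usage sketch of the comparer deleted above (illustrative only; the filenames are made up):

// using System; using API.Comparators;
var pages = new[] { "page10.jpg", "page2.jpg", "page1.jpg" };
Array.Sort(pages, new NaturalSortComparer());
// Natural order: page1.jpg, page2.jpg, page10.jpg. Numeric runs compare as integers,
// where a plain ordinal sort would give page1.jpg, page10.jpg, page2.jpg.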
@ -4,12 +4,11 @@ using System.Linq;
using System.Reflection;
using System.Threading.Tasks;
using API.Constants;
using API.Data;
using API.DTOs;
using API.DTOs.Account;
using API.Entities;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using AutoMapper;
using Kavita.Common;
@ -92,7 +91,7 @@ namespace API.Controllers
if (registerDto.IsAdmin)
{
var firstTimeFlow = !(await _userManager.GetUsersInRoleAsync("Admin")).Any();
if (!firstTimeFlow && !await _unitOfWork.UserRepository.IsUserAdmin(
if (!firstTimeFlow && !await _unitOfWork.UserRepository.IsUserAdminAsync(
await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername())))
{
return BadRequest("You are not permitted to create an admin account")
@ -167,7 +166,7 @@ namespace API.Controllers

if (user == null) return Unauthorized("Invalid username");

var isAdmin = await _unitOfWork.UserRepository.IsUserAdmin(user);
var isAdmin = await _unitOfWork.UserRepository.IsUserAdminAsync(user);
var settings = await _unitOfWork.SettingsRepository.GetSettingsDtoAsync();
if (!settings.EnableAuthentication && !isAdmin)
{
@ -1,15 +1,16 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.Reader;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using HtmlAgilityPack;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using VersOne.Epub;

@ -21,10 +22,11 @@ namespace API.Controllers
private readonly IBookService _bookService;
private readonly IUnitOfWork _unitOfWork;
private readonly ICacheService _cacheService;
private static readonly string BookApiUrl = "book-resources?file=";
private const string BookApiUrl = "book-resources?file=";


public BookController(ILogger<BookController> logger, IBookService bookService, IUnitOfWork unitOfWork, ICacheService cacheService)
public BookController(ILogger<BookController> logger, IBookService bookService,
IUnitOfWork unitOfWork, ICacheService cacheService)
{
_logger = logger;
_bookService = bookService;
@ -110,32 +112,12 @@ namespace API.Controllers
}
}

if (navigationItem.Link == null)
{
var item = new BookChapterItem()
{
Title = navigationItem.Title,
Children = nestedChapters
};
if (nestedChapters.Count > 0)
{
item.Page = nestedChapters[0].Page;
}
chaptersList.Add(item);
}
else
{
var groupKey = BookService.CleanContentKeys(navigationItem.Link.ContentFileName);
if (mappings.ContainsKey(groupKey))
{
chaptersList.Add(new BookChapterItem()
{
Title = navigationItem.Title,
Page = mappings[groupKey],
Children = nestedChapters
});
}
}
CreateToCChapter(navigationItem, nestedChapters, chaptersList, mappings);
}

if (navigationItem.NestedItems.Count == 0)
{
CreateToCChapter(navigationItem, Array.Empty<BookChapterItem>(), chaptersList, mappings);
}
}

@ -188,6 +170,38 @@ namespace API.Controllers
return Ok(chaptersList);
}

private static void CreateToCChapter(EpubNavigationItemRef navigationItem, IList<BookChapterItem> nestedChapters, IList<BookChapterItem> chaptersList,
IReadOnlyDictionary<string, int> mappings)
{
if (navigationItem.Link == null)
{
var item = new BookChapterItem()
{
Title = navigationItem.Title,
Children = nestedChapters
};
if (nestedChapters.Count > 0)
{
item.Page = nestedChapters[0].Page;
}

chaptersList.Add(item);
}
else
{
var groupKey = BookService.CleanContentKeys(navigationItem.Link.ContentFileName);
if (mappings.ContainsKey(groupKey))
{
chaptersList.Add(new BookChapterItem()
{
Title = navigationItem.Title,
Page = mappings[groupKey],
Children = nestedChapters
});
}
}
}

[HttpGet("{chapterId}/book-page")]
public async Task<ActionResult<string>> GetBookPage(int chapterId, [FromQuery] int page)
{
@ -200,146 +214,40 @@ namespace API.Controllers

var counter = 0;
var doc = new HtmlDocument {OptionFixNestedTags = true};
var baseUrl = Request.Scheme + "://" + Request.Host + Request.PathBase + "/api/";

var baseUrl = "//" + Request.Host + Request.PathBase + "/api/";
var apiBase = baseUrl + "book/" + chapterId + "/" + BookApiUrl;
var bookPages = await book.GetReadingOrderAsync();
foreach (var contentFileRef in bookPages)
{
if (page == counter)
if (page != counter)
{
var content = await contentFileRef.ReadContentAsync();
if (contentFileRef.ContentType != EpubContentType.XHTML_1_1) return Ok(content);

// In more cases than not, due to this being XML not HTML, we need to escape the script tags.
content = BookService.EscapeTags(content);

doc.LoadHtml(content);
var body = doc.DocumentNode.SelectSingleNode("//body");

if (body == null)
{
if (doc.ParseErrors.Any())
{
LogBookErrors(book, contentFileRef, doc);
return BadRequest("The file is malformed! Cannot read.");
}
_logger.LogError("{FilePath} has no body tag! Generating one for support. Book may be skewed", book.FilePath);
doc.DocumentNode.SelectSingleNode("/html").AppendChild(HtmlNode.CreateNode("<body></body>"));
body = doc.DocumentNode.SelectSingleNode("/html/body");
}

var inlineStyles = doc.DocumentNode.SelectNodes("//style");
if (inlineStyles != null)
{
foreach (var inlineStyle in inlineStyles)
{
var styleContent = await _bookService.ScopeStyles(inlineStyle.InnerHtml, apiBase, "", book);
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}

var styleNodes = doc.DocumentNode.SelectNodes("/html/head/link");
if (styleNodes != null)
{
foreach (var styleLinks in styleNodes)
{
var key = BookService.CleanContentKeys(styleLinks.Attributes["href"].Value);
// Some epubs are malformed the key in content.opf might be: content/resources/filelist_0_0.xml but the actual html links to resources/filelist_0_0.xml
// In this case, we will do a search for the key that ends with
if (!book.Content.Css.ContainsKey(key))
{
var correctedKey = book.Content.Css.Keys.SingleOrDefault(s => s.EndsWith(key));
if (correctedKey == null)
{
_logger.LogError("Epub is Malformed, key: {Key} is not matching OPF file", key);
continue;
}

key = correctedKey;
}

var styleContent = await _bookService.ScopeStyles(await book.Content.Css[key].ReadContentAsync(), apiBase, book.Content.Css[key].FileName, book);
if (styleContent != null)
{
body.PrependChild(HtmlNode.CreateNode($"<style>{styleContent}</style>"));
}
}
}

var anchors = doc.DocumentNode.SelectNodes("//a");
if (anchors != null)
{
foreach (var anchor in anchors)
{
BookService.UpdateLinks(anchor, mappings, page);
}
}

var images = doc.DocumentNode.SelectNodes("//img");
if (images != null)
{
foreach (var image in images)
{
if (image.Name != "img") continue;

// Need to do for xlink:href
if (image.Attributes["src"] != null)
{
var imageFile = image.Attributes["src"].Value;
if (!book.Content.Images.ContainsKey(imageFile))
{
var correctedKey = book.Content.Images.Keys.SingleOrDefault(s => s.EndsWith(imageFile));
if (correctedKey != null)
{
imageFile = correctedKey;
}
}
image.Attributes.Remove("src");
image.Attributes.Add("src", $"{apiBase}" + imageFile);
}
}
}

images = doc.DocumentNode.SelectNodes("//image");
if (images != null)
{
foreach (var image in images)
{
if (image.Name != "image") continue;

if (image.Attributes["xlink:href"] != null)
{
var imageFile = image.Attributes["xlink:href"].Value;
if (!book.Content.Images.ContainsKey(imageFile))
{
var correctedKey = book.Content.Images.Keys.SingleOrDefault(s => s.EndsWith(imageFile));
if (correctedKey != null)
{
imageFile = correctedKey;
}
}
image.Attributes.Remove("xlink:href");
image.Attributes.Add("xlink:href", $"{apiBase}" + imageFile);
}
}
}

// Check if any classes on the html node (some r2l books do this) and move them to body tag for scoping
var htmlNode = doc.DocumentNode.SelectSingleNode("//html");
if (htmlNode != null && htmlNode.Attributes.Contains("class"))
{
var bodyClasses = body.Attributes.Contains("class") ? body.Attributes["class"].Value : string.Empty;
var classes = htmlNode.Attributes["class"].Value + " " + bodyClasses;
body.Attributes.Add("class", $"{classes}");
// I actually need the body tag itself for the classes, so i will create a div and put the body stuff there.
return Ok($"<div class=\"{body.Attributes["class"].Value}\">{body.InnerHtml}</div>");
}


return Ok(body.InnerHtml);
counter++;
continue;
}

counter++;
var content = await contentFileRef.ReadContentAsync();
if (contentFileRef.ContentType != EpubContentType.XHTML_1_1) return Ok(content);

// In more cases than not, due to this being XML not HTML, we need to escape the script tags.
content = BookService.EscapeTags(content);

doc.LoadHtml(content);
var body = doc.DocumentNode.SelectSingleNode("//body");

if (body == null)
{
if (doc.ParseErrors.Any())
{
LogBookErrors(book, contentFileRef, doc);
return BadRequest("The file is malformed! Cannot read.");
}
_logger.LogError("{FilePath} has no body tag! Generating one for support. Book may be skewed", book.FilePath);
doc.DocumentNode.SelectSingleNode("/html").AppendChild(HtmlNode.CreateNode("<body></body>"));
body = doc.DocumentNode.SelectSingleNode("/html/body");
}

return Ok(await _bookService.ScopePage(doc, book, apiBase, body, mappings, page));
}

return BadRequest("Could not find the appropriate html for that page");
@ -3,11 +3,9 @@ using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Entities.Metadata;
using API.Extensions;
using API.Interfaces;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

@ -34,7 +32,7 @@ namespace API.Controllers
public async Task<IEnumerable<CollectionTagDto>> GetAllTags()
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
var isAdmin = await _unitOfWork.UserRepository.IsUserAdmin(user);
var isAdmin = await _unitOfWork.UserRepository.IsUserAdminAsync(user);
if (isAdmin)
{
return await _unitOfWork.CollectionTagRepository.GetAllTagDtosAsync();
@ -4,16 +4,17 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.DTOs.Downloads;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using API.SignalR;
using Kavita.Common;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

namespace API.Controllers
{
@ -25,16 +26,19 @@ namespace API.Controllers
private readonly IDirectoryService _directoryService;
private readonly ICacheService _cacheService;
private readonly IDownloadService _downloadService;
private readonly IHubContext<MessageHub> _messageHub;
private readonly NumericComparer _numericComparer;
private const string DefaultContentType = "application/octet-stream";

public DownloadController(IUnitOfWork unitOfWork, IArchiveService archiveService, IDirectoryService directoryService, ICacheService cacheService, IDownloadService downloadService)
public DownloadController(IUnitOfWork unitOfWork, IArchiveService archiveService, IDirectoryService directoryService,
ICacheService cacheService, IDownloadService downloadService, IHubContext<MessageHub> messageHub)
{
_unitOfWork = unitOfWork;
_archiveService = archiveService;
_directoryService = directoryService;
_cacheService = cacheService;
_downloadService = downloadService;
_messageHub = messageHub;
_numericComparer = new NumericComparer();
}

@ -42,21 +46,21 @@ namespace API.Controllers
public async Task<ActionResult<long>> GetVolumeSize(int volumeId)
{
var files = await _unitOfWork.VolumeRepository.GetFilesForVolume(volumeId);
return Ok(DirectoryService.GetTotalSize(files.Select(c => c.FilePath)));
return Ok(_directoryService.GetTotalSize(files.Select(c => c.FilePath)));
}

[HttpGet("chapter-size")]
public async Task<ActionResult<long>> GetChapterSize(int chapterId)
{
var files = await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId);
return Ok(DirectoryService.GetTotalSize(files.Select(c => c.FilePath)));
return Ok(_directoryService.GetTotalSize(files.Select(c => c.FilePath)));
}

[HttpGet("series-size")]
public async Task<ActionResult<long>> GetSeriesSize(int seriesId)
{
var files = await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId);
return Ok(DirectoryService.GetTotalSize(files.Select(c => c.FilePath)));
return Ok(_directoryService.GetTotalSize(files.Select(c => c.FilePath)));
}

[HttpGet("volume")]
@ -67,13 +71,7 @@ namespace API.Controllers
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(volume.SeriesId);
try
{
if (files.Count == 1)
{
return await GetFirstFileDownload(files);
}
var (fileBytes, _) = await _archiveService.CreateZipForDownload(files.Select(c => c.FilePath),
$"download_{User.GetUsername()}_v{volumeId}");
return File(fileBytes, DefaultContentType, $"{series.Name} - Volume {volume.Number}.zip");
return await DownloadFiles(files, $"download_{User.GetUsername()}_v{volumeId}", $"{series.Name} - Volume {volume.Number}.zip");
}
catch (KavitaException ex)
{
@ -96,13 +94,7 @@ namespace API.Controllers
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(volume.SeriesId);
try
{
if (files.Count == 1)
{
return await GetFirstFileDownload(files);
}
var (fileBytes, _) = await _archiveService.CreateZipForDownload(files.Select(c => c.FilePath),
$"download_{User.GetUsername()}_c{chapterId}");
return File(fileBytes, DefaultContentType, $"{series.Name} - Chapter {chapter.Number}.zip");
return await DownloadFiles(files, $"download_{User.GetUsername()}_c{chapterId}", $"{series.Name} - Chapter {chapter.Number}.zip");
}
catch (KavitaException ex)
{
@ -110,6 +102,21 @@ namespace API.Controllers
}
}

private async Task<ActionResult> DownloadFiles(ICollection<MangaFile> files, string tempFolder, string downloadName)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.DownloadProgress,
MessageFactory.DownloadProgressEvent(User.GetUsername(), Path.GetFileNameWithoutExtension(downloadName), 0F));
if (files.Count == 1)
{
return await GetFirstFileDownload(files);
}
var (fileBytes, _) = await _archiveService.CreateZipForDownload(files.Select(c => c.FilePath),
tempFolder);
await _messageHub.Clients.All.SendAsync(SignalREvents.DownloadProgress,
MessageFactory.DownloadProgressEvent(User.GetUsername(), Path.GetFileNameWithoutExtension(downloadName), 1F));
return File(fileBytes, DefaultContentType, downloadName);
}
[HttpGet("series")]
public async Task<ActionResult> DownloadSeries(int seriesId)
{
@ -117,13 +124,7 @@ namespace API.Controllers
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(seriesId);
try
{
if (files.Count == 1)
{
return await GetFirstFileDownload(files);
}
var (fileBytes, _) = await _archiveService.CreateZipForDownload(files.Select(c => c.FilePath),
$"download_{User.GetUsername()}_s{seriesId}");
return File(fileBytes, DefaultContentType, $"{series.Name}.zip");
return await DownloadFiles(files, $"download_{User.GetUsername()}_s{seriesId}", $"{series.Name}.zip");
}
catch (KavitaException ex)
{
@ -135,57 +136,20 @@ namespace API.Controllers
public async Task<ActionResult> DownloadBookmarkPages(DownloadBookmarkDto downloadBookmarkDto)
{
// We know that all bookmarks will be for one single seriesId
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(downloadBookmarkDto.Bookmarks.First().SeriesId);
var totalFilePaths = new List<string>();

var tempFolder = $"download_{series.Id}_bookmarks";
var fullExtractPath = Path.Join(DirectoryService.TempDirectory, tempFolder);
if (new DirectoryInfo(fullExtractPath).Exists)
{
return BadRequest(
"Server is currently processing this exact download. Please try again in a few minutes.");
}
DirectoryService.ExistOrCreate(fullExtractPath);
var bookmarkDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
var files = (await _unitOfWork.UserRepository.GetAllBookmarksByIds(downloadBookmarkDto.Bookmarks
.Select(b => b.Id)
.ToList()))
.Select(b => Parser.Parser.NormalizePath(_directoryService.FileSystem.Path.Join(bookmarkDirectory, b.FileName)));

var uniqueChapterIds = downloadBookmarkDto.Bookmarks.Select(b => b.ChapterId).Distinct().ToList();

foreach (var chapterId in uniqueChapterIds)
{
var chapterExtractPath = Path.Join(fullExtractPath, $"{series.Id}_bookmark_{chapterId}");
var chapterPages = downloadBookmarkDto.Bookmarks.Where(b => b.ChapterId == chapterId)
.Select(b => b.Page).ToList();
var mangaFiles = await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId);
switch (series.Format)
{
case MangaFormat.Image:
DirectoryService.ExistOrCreate(chapterExtractPath);
_directoryService.CopyFilesToDirectory(mangaFiles.Select(f => f.FilePath), chapterExtractPath, $"{chapterId}_");
break;
case MangaFormat.Archive:
case MangaFormat.Pdf:
_cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
var originalFiles = DirectoryService.GetFilesWithExtension(chapterExtractPath,
Parser.Parser.ImageFileExtensions);
_directoryService.CopyFilesToDirectory(originalFiles, chapterExtractPath, $"{chapterId}_");
DirectoryService.DeleteFiles(originalFiles);
break;
case MangaFormat.Epub:
return BadRequest("Series is not in a valid format.");
default:
return BadRequest("Series is not in a valid format. Please rescan series and try again.");
}

var files = DirectoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
// Filter out images that aren't in bookmarks
Array.Sort(files, _numericComparer);
totalFilePaths.AddRange(files.Where((_, i) => chapterPages.Contains(i)));
}


var (fileBytes, _) = await _archiveService.CreateZipForDownload(totalFilePaths,
tempFolder);
DirectoryService.ClearAndDeleteDirectory(fullExtractPath);
var (fileBytes, _) = await _archiveService.CreateZipForDownload(files,
$"download_{user.Id}_{series.Id}_bookmarks");
return File(fileBytes, DefaultContentType, $"{series.Name} - Bookmarks.zip");
}

}
}
@ -1,5 +1,5 @@
using System.IO;
using API.Interfaces;
using API.Services;
using Microsoft.AspNetCore.Mvc;

namespace API.Controllers
@ -12,7 +12,7 @@ namespace API.Controllers

public FallbackController(ITaskScheduler taskScheduler)
{
// This is used to load TaskScheduler on startup without having to navigate to a Controller that uses.
_taskScheduler = taskScheduler;
}

@ -21,4 +21,4 @@ namespace API.Controllers
return PhysicalFile(Path.Combine(Directory.GetCurrentDirectory(), "wwwroot", "index.html"), "text/HTML");
}
}
}
}
|
@ -1,7 +1,8 @@
|
||||
using System.IO;
|
||||
using System.Threading.Tasks;
|
||||
using API.Data;
|
||||
using API.Entities.Enums;
|
||||
using API.Extensions;
|
||||
using API.Interfaces;
|
||||
using API.Services;
|
||||
using Microsoft.AspNetCore.Mvc;
|
||||
|
||||
@ -13,11 +14,13 @@ namespace API.Controllers
|
||||
public class ImageController : BaseApiController
|
||||
{
|
||||
private readonly IUnitOfWork _unitOfWork;
|
||||
private readonly IDirectoryService _directoryService;
|
||||
|
||||
/// <inheritdoc />
|
||||
public ImageController(IUnitOfWork unitOfWork)
|
||||
public ImageController(IUnitOfWork unitOfWork, IDirectoryService directoryService)
|
||||
{
|
||||
_unitOfWork = unitOfWork;
|
||||
_directoryService = directoryService;
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
@ -28,12 +31,12 @@ namespace API.Controllers
|
||||
[HttpGet("chapter-cover")]
|
||||
public async Task<ActionResult> GetChapterCoverImage(int chapterId)
|
||||
{
|
||||
var path = Path.Join(DirectoryService.CoverImageDirectory, await _unitOfWork.ChapterRepository.GetChapterCoverImageAsync(chapterId));
|
||||
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = Path.GetExtension(path).Replace(".", "");
|
||||
var path = Path.Join(_directoryService.CoverImageDirectory, await _unitOfWork.ChapterRepository.GetChapterCoverImageAsync(chapterId));
|
||||
if (string.IsNullOrEmpty(path) || !_directoryService.FileSystem.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = _directoryService.FileSystem.Path.GetExtension(path).Replace(".", "");
|
||||
|
||||
Response.AddCacheHeader(path);
|
||||
return PhysicalFile(path, "image/" + format, Path.GetFileName(path));
|
||||
return PhysicalFile(path, "image/" + format, _directoryService.FileSystem.Path.GetFileName(path));
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
@ -44,12 +47,12 @@ namespace API.Controllers
|
||||
[HttpGet("volume-cover")]
|
||||
public async Task<ActionResult> GetVolumeCoverImage(int volumeId)
|
||||
{
|
||||
var path = Path.Join(DirectoryService.CoverImageDirectory, await _unitOfWork.VolumeRepository.GetVolumeCoverImageAsync(volumeId));
|
||||
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = Path.GetExtension(path).Replace(".", "");
|
||||
var path = Path.Join(_directoryService.CoverImageDirectory, await _unitOfWork.VolumeRepository.GetVolumeCoverImageAsync(volumeId));
|
||||
if (string.IsNullOrEmpty(path) || !_directoryService.FileSystem.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = _directoryService.FileSystem.Path.GetExtension(path).Replace(".", "");
|
||||
|
||||
Response.AddCacheHeader(path);
|
||||
return PhysicalFile(path, "image/" + format, Path.GetFileName(path));
|
||||
return PhysicalFile(path, "image/" + format, _directoryService.FileSystem.Path.GetFileName(path));
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
@ -60,12 +63,12 @@ namespace API.Controllers
|
||||
[HttpGet("series-cover")]
|
||||
public async Task<ActionResult> GetSeriesCoverImage(int seriesId)
|
||||
{
|
||||
var path = Path.Join(DirectoryService.CoverImageDirectory, await _unitOfWork.SeriesRepository.GetSeriesCoverImageAsync(seriesId));
|
||||
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = Path.GetExtension(path).Replace(".", "");
|
||||
var path = Path.Join(_directoryService.CoverImageDirectory, await _unitOfWork.SeriesRepository.GetSeriesCoverImageAsync(seriesId));
|
||||
if (string.IsNullOrEmpty(path) || !_directoryService.FileSystem.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = _directoryService.FileSystem.Path.GetExtension(path).Replace(".", "");
|
||||
|
||||
Response.AddCacheHeader(path);
|
||||
return PhysicalFile(path, "image/" + format, Path.GetFileName(path));
|
||||
return PhysicalFile(path, "image/" + format, _directoryService.FileSystem.Path.GetFileName(path));
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
@ -76,12 +79,34 @@ namespace API.Controllers
|
||||
[HttpGet("collection-cover")]
|
||||
public async Task<ActionResult> GetCollectionCoverImage(int collectionTagId)
|
||||
{
|
||||
var path = Path.Join(DirectoryService.CoverImageDirectory, await _unitOfWork.CollectionTagRepository.GetCoverImageAsync(collectionTagId));
|
||||
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = Path.GetExtension(path).Replace(".", "");
|
||||
var path = Path.Join(_directoryService.CoverImageDirectory, await _unitOfWork.CollectionTagRepository.GetCoverImageAsync(collectionTagId));
|
||||
if (string.IsNullOrEmpty(path) || !_directoryService.FileSystem.File.Exists(path)) return BadRequest($"No cover image");
|
||||
var format = _directoryService.FileSystem.Path.GetExtension(path).Replace(".", "");
|
||||
|
||||
Response.AddCacheHeader(path);
|
||||
return PhysicalFile(path, "image/" + format, Path.GetFileName(path));
|
||||
return PhysicalFile(path, "image/" + format, _directoryService.FileSystem.Path.GetFileName(path));
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
/// Returns image for a given bookmark page
|
||||
/// </summary>
|
||||
/// <remarks>This request is served unauthenticated, but user must be passed via api key to validate</remarks>
|
||||
/// <param name="chapterId"></param>
|
||||
/// <param name="pageNum">Starts at 0</param>
|
||||
/// <param name="apiKey">API Key for user. Needed to authenticate request</param>
|
||||
/// <returns></returns>
|
||||
[HttpGet("bookmark")]
|
||||
public async Task<ActionResult> GetBookmarkImage(int chapterId, int pageNum, string apiKey)
|
||||
{
|
||||
var userId = await _unitOfWork.UserRepository.GetUserIdByApiKeyAsync(apiKey);
|
||||
var bookmark = await _unitOfWork.UserRepository.GetBookmarkForPage(pageNum, chapterId, userId);
|
||||
if (bookmark == null) return BadRequest("Bookmark does not exist");
|
||||
|
||||
var bookmarkDirectory =
|
||||
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
|
||||
var file = new FileInfo(Path.Join(bookmarkDirectory, bookmark.FileName));
|
||||
var format = Path.GetExtension(file.FullName).Replace(".", "");
|
||||
return PhysicalFile(file.FullName, "image/" + format, Path.GetFileName(file.FullName));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
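A hypothetical client call for the new unauthenticated bookmark endpoint (HttpClient usage and the api/image route prefix are my assumptions from the controller's naming conventions; host, ids, and key are made up):

// using System; using System.Net.Http;
using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };
var bytes = await client.GetByteArrayAsync($"/api/image/bookmark?chapterId=42&pageNum=0&apiKey={apiKey}");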
@ -3,13 +3,13 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using AutoMapper;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

API/Controllers/MetadataController.cs (new file, 140 lines)
@ -0,0 +1,140 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.Filtering;
using API.DTOs.Metadata;
using API.Entities.Enums;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Mvc;

namespace API.Controllers;


public class MetadataController : BaseApiController
{
private readonly IUnitOfWork _unitOfWork;

public MetadataController(IUnitOfWork unitOfWork)
{
_unitOfWork = unitOfWork;
}

/// <summary>
/// Fetches genres from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all genres</param>
/// <returns></returns>
[HttpGet("genres")]
public async Task<ActionResult<IList<GenreTagDto>>> GetAllGenres(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.GenreRepository.GetAllGenreDtosForLibrariesAsync(ids));
}

return Ok(await _unitOfWork.GenreRepository.GetAllGenreDtosAsync());
}

/// <summary>
/// Fetches people from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all people</param>
/// <returns></returns>
[HttpGet("people")]
public async Task<ActionResult<IList<PersonDto>>> GetAllPeople(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.PersonRepository.GetAllPeopleDtosForLibrariesAsync(ids));
}
return Ok(await _unitOfWork.PersonRepository.GetAllPeople());
}

/// <summary>
/// Fetches all tags from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all tags</param>
/// <returns></returns>
[HttpGet("tags")]
public async Task<ActionResult<IList<TagDto>>> GetAllTags(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.TagRepository.GetAllTagDtosForLibrariesAsync(ids));
}
return Ok(await _unitOfWork.TagRepository.GetAllTagDtosAsync());
}

/// <summary>
/// Fetches all age ratings from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all ratings</param>
/// <returns></returns>
[HttpGet("age-ratings")]
public async Task<ActionResult<IList<AgeRatingDto>>> GetAllAgeRatings(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.SeriesRepository.GetAllAgeRatingsDtosForLibrariesAsync(ids));
}

return Ok(Enum.GetValues<AgeRating>().Select(t => new AgeRatingDto()
{
Title = t.ToDescription(),
Value = t
}));
}

/// <summary>
/// Fetches all publication status' from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all publication status</param>
/// <returns></returns>
[HttpGet("publication-status")]
public async Task<ActionResult<IList<AgeRatingDto>>> GetAllPublicationStatus(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.SeriesRepository.GetAllPublicationStatusesDtosForLibrariesAsync(ids));
}

return Ok(Enum.GetValues<PublicationStatus>().Select(t => new PublicationStatusDto()
{
Title = t.ToDescription(),
Value = t
}));
}

/// <summary>
/// Fetches all age ratings from the instance
/// </summary>
/// <param name="libraryIds">String separated libraryIds or null for all ratings</param>
/// <returns></returns>
[HttpGet("languages")]
public async Task<ActionResult<IList<LanguageDto>>> GetAllLanguages(string? libraryIds)
{
var ids = libraryIds?.Split(",").Select(int.Parse).ToList();
if (ids != null && ids.Count > 0)
{
return Ok(await _unitOfWork.SeriesRepository.GetAllLanguagesForLibrariesAsync(ids));
}

return Ok(new List<LanguageDto>()
{
new ()
{
Title = CultureInfo.GetCultureInfo("en").DisplayName,
IsoCode = "en"
}
});
}
}
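A hypothetical client call against the new endpoints (the api/metadata route prefix is assumed from BaseApiController's routing convention; host and ids are made up):

// using System; using System.Net.Http;
using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };
// libraryIds is a comma-separated list; omit the query parameter to fetch across all libraries.
var genresJson = await client.GetStringAsync("/api/metadata/genres?libraryIds=1,2");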
File diff suppressed because it is too large
@ -1,7 +1,7 @@
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

@ -3,13 +3,15 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.DTOs.Reader;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using API.Services.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

@ -24,15 +26,21 @@ namespace API.Controllers
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<ReaderController> _logger;
private readonly IReaderService _readerService;
private readonly IDirectoryService _directoryService;
private readonly ICleanupService _cleanupService;

/// <inheritdoc />
public ReaderController(ICacheService cacheService,
IUnitOfWork unitOfWork, ILogger<ReaderController> logger, IReaderService readerService)
IUnitOfWork unitOfWork, ILogger<ReaderController> logger,
IReaderService readerService, IDirectoryService directoryService,
ICleanupService cleanupService)
{
_cacheService = cacheService;
_unitOfWork = unitOfWork;
_logger = logger;
_readerService = readerService;
_directoryService = directoryService;
_cleanupService = cleanupService;
}

/// <summary>
@ -50,7 +58,7 @@ namespace API.Controllers

try
{
var (path, _) = await _cacheService.GetCachedPagePath(chapter, page);
var path = _cacheService.GetCachedPagePath(chapter, page);
if (string.IsNullOrEmpty(path) || !System.IO.File.Exists(path)) return BadRequest($"No such image for page {page}");
var format = Path.GetExtension(path).Replace(".", "");

@ -75,6 +83,7 @@ namespace API.Controllers
if (chapter == null) return BadRequest("Could not find Chapter");

var dto = await _unitOfWork.ChapterRepository.GetChapterInfoDtoAsync(chapterId);
if (dto == null) return BadRequest("Please perform a scan on this series or library and try again");
var mangaFile = (await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId)).First();

return Ok(new ChapterInfoDto()
@ -89,6 +98,7 @@ namespace API.Controllers
LibraryId = dto.LibraryId,
IsSpecial = dto.IsSpecial,
Pages = dto.Pages,
ChapterTitle = dto.ChapterTitle ?? string.Empty
});
}

@ -129,15 +139,7 @@ namespace API.Controllers
user.Progresses ??= new List<AppUserProgress>();
foreach (var volume in volumes)
{
foreach (var chapter in volume.Chapters)
{
var userProgress = ReaderService.GetUserProgressForChapter(user, chapter);

if (userProgress == null) continue;
userProgress.PagesRead = 0;
userProgress.SeriesId = markReadDto.SeriesId;
userProgress.VolumeId = volume.Id;
}
_readerService.MarkChaptersAsUnread(user, markReadDto.SeriesId, volume.Chapters);
}

_unitOfWork.UserRepository.Update(user);
@ -396,6 +398,14 @@ namespace API.Controllers

if (await _unitOfWork.CommitAsync())
{
try
{
await _cleanupService.CleanupBookmarks();
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue cleaning up old bookmarks");
}
return Ok();
}
}
@ -453,6 +463,18 @@ namespace API.Controllers
var userBookmark =
await _unitOfWork.UserRepository.GetBookmarkForPage(bookmarkDto.Page, bookmarkDto.ChapterId, user.Id);

// We need to get the image
var chapter = await _cacheService.Ensure(bookmarkDto.ChapterId);
if (chapter == null) return BadRequest("There was an issue finding image file for reading");
var path = _cacheService.GetCachedPagePath(chapter, bookmarkDto.Page);
var fileInfo = new FileInfo(path);

var bookmarkDirectory =
(await _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory)).Value;
_directoryService.CopyFileToDirectory(path, Path.Join(bookmarkDirectory,
$"{user.Id}", $"{bookmarkDto.SeriesId}", $"{bookmarkDto.ChapterId}"));


if (userBookmark == null)
{
user.Bookmarks ??= new List<AppUserBookmark>();
@ -462,22 +484,21 @@ namespace API.Controllers
VolumeId = bookmarkDto.VolumeId,
SeriesId = bookmarkDto.SeriesId,
ChapterId = bookmarkDto.ChapterId,
FileName = Path.Join($"{user.Id}", $"{bookmarkDto.SeriesId}", $"{bookmarkDto.ChapterId}", fileInfo.Name)

});
_unitOfWork.UserRepository.Update(user);
}


if (await _unitOfWork.CommitAsync())
{
return Ok();
}
await _unitOfWork.CommitAsync();
}
catch (Exception)
|
||||
{
|
||||
await _unitOfWork.RollbackAsync();
|
||||
return BadRequest("Could not save bookmark");
|
||||
}
|
||||
|
||||
return BadRequest("Could not save bookmark");
|
||||
return Ok();
|
||||
}
|
||||
|
||||
/// <summary>
|
||||
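A note on the bookmark write above: the page image is first materialized in the cache, then copied into a stable per-user folder whose segments come straight from the Path.Join call. A tiny sketch of that layout rule (the /config/bookmarks root is an assumed example; the real root comes from ServerSettingKey.BookmarkDirectory):

    using System.IO;

    static string BookmarkFolder(string root, int userId, int seriesId, int chapterId) =>
        Path.Join(root, userId.ToString(), seriesId.ToString(), chapterId.ToString());

    // BookmarkFolder("/config/bookmarks", 3, 12, 45) => "/config/bookmarks/3/12/45"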
@@ -1,13 +1,12 @@
-using System;
-using System.Collections.Generic;
+using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.DTOs.ReadingLists;
using API.Entities;
using API.Extensions;
using API.Helpers;
using API.Interfaces;
using Microsoft.AspNetCore.Mvc;

namespace API.Controllers
@@ -3,15 +3,18 @@ using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Data.Repositories;
using API.DTOs;
using API.DTOs.Filtering;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers;
using API.Interfaces;
using API.Services;
using API.SignalR;
using Kavita.Common;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;
@@ -80,6 +83,8 @@ namespace API.Controllers
    var username = User.GetUsername();
    _logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", seriesId, username);

+   var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(seriesId);
+
    var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
    var result = await _unitOfWork.SeriesRepository.DeleteSeriesAsync(seriesId);

@@ -89,6 +94,8 @@ namespace API.Controllers
    await _unitOfWork.CollectionTagRepository.RemoveTagsWithoutSeries();
    await _unitOfWork.CommitAsync();
    _taskScheduler.CleanupChapters(chapterIds);
+   await _messageHub.Clients.All.SendAsync(SignalREvents.SeriesRemoved,
+       MessageFactory.SeriesRemovedEvent(seriesId, series.Name, series.LibraryId));
}
return Ok(result);
}
@@ -151,7 +158,7 @@ namespace API.Controllers
public async Task<ActionResult> UpdateSeriesRating(UpdateSeriesRatingDto updateSeriesRatingDto)
{
    var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername(), AppUserIncludes.Ratings);
-   var userRating = await _unitOfWork.UserRepository.GetUserRating(updateSeriesRatingDto.SeriesId, user.Id) ??
+   var userRating = await _unitOfWork.UserRepository.GetUserRatingAsync(updateSeriesRatingDto.SeriesId, user.Id) ??
        new AppUserRating();

    userRating.Rating = updateSeriesRatingDto.UserRating;
@@ -187,7 +194,7 @@ namespace API.Controllers
    series.Name = updateSeries.Name.Trim();
    series.LocalizedName = updateSeries.LocalizedName.Trim();
    series.SortName = updateSeries.SortName?.Trim();
-   series.Summary = updateSeries.Summary?.Trim();
+   series.Metadata.Summary = updateSeries.Summary?.Trim();

    var needsRefreshMetadata = false;
    // This is when you hit Reset
@@ -230,6 +237,23 @@ namespace API.Controllers
    return Ok(series);
}

+[HttpPost("all")]
+public async Task<ActionResult<IEnumerable<SeriesDto>>> GetAllSeries(FilterDto filterDto, [FromQuery] UserParams userParams, [FromQuery] int libraryId = 0)
+{
+    var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
+    var series =
+        await _unitOfWork.SeriesRepository.GetSeriesDtoForLibraryIdAsync(libraryId, userId, userParams, filterDto);
+
+    // Apply progress/rating information (I can't work out how to do this in initial query)
+    if (series == null) return BadRequest("Could not get series");
+
+    await _unitOfWork.SeriesRepository.AddSeriesModifiers(userId, series);
+
+    Response.AddPaginationHeader(series.CurrentPage, series.PageSize, series.TotalCount, series.TotalPages);
+
+    return Ok(series);
+}

/// <summary>
/// Fetches series that are on deck, i.e. have progress on them.
/// </summary>
@@ -294,6 +318,7 @@ namespace API.Controllers
else
{
    series.Metadata.CollectionTags ??= new List<CollectionTag>();
+   // TODO: Move this merging logic into reusable code as it can be used for any Tag
    var newTags = new List<CollectionTag>();

    // I want a union of these 2 lists. Return only elements that are in both lists, but the list types are different
@@ -313,7 +338,7 @@ namespace API.Controllers
    var existingTag = allTags.SingleOrDefault(t => t.Title == tag.Title);
    if (existingTag != null)
    {
-       if (!series.Metadata.CollectionTags.Any(t => t.Title == tag.Title))
+       if (series.Metadata.CollectionTags.All(t => t.Title != tag.Title))
        {
            newTags.Add(existingTag);
        }
@@ -392,6 +417,12 @@ namespace API.Controllers
    return Ok(await _unitOfWork.SeriesRepository.GetSeriesDtoForIdsAsync(dto.SeriesIds, userId));
}

+[HttpGet("age-rating")]
+public ActionResult<string> GetAgeRating(int ageRating)
+{
+    var val = (AgeRating) ageRating;
+
+    return Ok(val.ToDescription());
+}
}
}
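Since the new [HttpPost("all")] action takes the filter in the body and paging on the query string, a client call looks roughly like the following. This is a hedged sketch: the host, the api/series route prefix (inferred from the controller name), and the query parameter names for UserParams are assumptions:

    using System;
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    public static class SeriesQueryDemo
    {
        public static async Task Main()
        {
            using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };
            // Empty lists mean "no restriction", matching the FilterDto defaults.
            var response = await client.PostAsJsonAsync(
                "api/series/all?pageNumber=1&pageSize=20&libraryId=0",
                new { formats = Array.Empty<int>(), libraries = Array.Empty<int>() });
            response.EnsureSuccessStatusCode();
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }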
@@ -5,7 +5,7 @@ using System.Threading.Tasks;
using API.DTOs.Stats;
using API.DTOs.Update;
using API.Extensions;
-using API.Interfaces.Services;
+using API.Services;
using API.Services.Tasks;
using Kavita.Common;
using Microsoft.AspNetCore.Authorization;
@@ -27,10 +27,11 @@ namespace API.Controllers
    private readonly ICacheService _cacheService;
    private readonly IVersionUpdaterService _versionUpdaterService;
    private readonly IStatsService _statsService;
+   private readonly ICleanupService _cleanupService;

    public ServerController(IHostApplicationLifetime applicationLifetime, ILogger<ServerController> logger, IConfiguration config,
        IBackupService backupService, IArchiveService archiveService, ICacheService cacheService,
-       IVersionUpdaterService versionUpdaterService, IStatsService statsService)
+       IVersionUpdaterService versionUpdaterService, IStatsService statsService, ICleanupService cleanupService)
    {
        _applicationLifetime = applicationLifetime;
        _logger = logger;
@@ -40,6 +41,7 @@ namespace API.Controllers
        _cacheService = cacheService;
        _versionUpdaterService = versionUpdaterService;
        _statsService = statsService;
+       _cleanupService = cleanupService;
    }

    /// <summary>
@@ -63,7 +65,7 @@ namespace API.Controllers
    public ActionResult ClearCache()
    {
        _logger.LogInformation("{UserName} is clearing cache of server from admin dashboard", User.GetUsername());
-       _cacheService.Cleanup();
+       _cleanupService.CleanupCacheDirectory();

        return Ok();
    }
@@ -94,7 +96,7 @@ namespace API.Controllers
    [HttpGet("logs")]
    public async Task<ActionResult> GetLogs()
    {
-       var files = _backupService.LogFiles(_config.GetMaxRollingFiles(), _config.GetLoggingFileName());
+       var files = _backupService.GetLogFiles(_config.GetMaxRollingFiles(), _config.GetLoggingFileName());
        try
        {
            var (fileBytes, zipPath) = await _archiveService.CreateZipForDownload(files, "logs");
@@ -3,13 +3,13 @@ using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs.Settings;
using API.Entities.Enums;
using API.Extensions;
using API.Helpers.Converters;
-using API.Interfaces;
-using API.Interfaces.Services;
+using API.Services;
+using AutoMapper;
using Kavita.Common;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Authorization;
@@ -24,13 +24,18 @@ namespace API.Controllers
    private readonly IUnitOfWork _unitOfWork;
    private readonly ITaskScheduler _taskScheduler;
    private readonly IAccountService _accountService;
+   private readonly IDirectoryService _directoryService;
+   private readonly IMapper _mapper;

-   public SettingsController(ILogger<SettingsController> logger, IUnitOfWork unitOfWork, ITaskScheduler taskScheduler, IAccountService accountService)
+   public SettingsController(ILogger<SettingsController> logger, IUnitOfWork unitOfWork, ITaskScheduler taskScheduler,
+       IAccountService accountService, IDirectoryService directoryService, IMapper mapper)
    {
        _logger = logger;
        _unitOfWork = unitOfWork;
        _taskScheduler = taskScheduler;
        _accountService = accountService;
+       _directoryService = directoryService;
+       _mapper = mapper;
    }

    [AllowAnonymous]
@@ -46,11 +51,21 @@ namespace API.Controllers
    public async Task<ActionResult<ServerSettingDto>> GetSettings()
    {
        var settingsDto = await _unitOfWork.SettingsRepository.GetSettingsDtoAsync();
+       // TODO: Is this needed as it gets updated in the DB on startup
        settingsDto.Port = Configuration.Port;
        settingsDto.LoggingLevel = Configuration.LogLevel;
        return Ok(settingsDto);
    }

+   [Authorize(Policy = "RequireAdminRole")]
+   [HttpPost("reset")]
+   public async Task<ActionResult<ServerSettingDto>> ResetSettings()
+   {
+       _logger.LogInformation("{UserName} is resetting Server Settings", User.GetUsername());
+
+       return await UpdateSettings(_mapper.Map<ServerSettingDto>(Seed.DefaultSettings));
+   }
+
    [Authorize(Policy = "RequireAdminRole")]
    [HttpPost]
    public async Task<ActionResult<ServerSettingDto>> UpdateSettings(ServerSettingDto updateSettingsDto)
@@ -70,6 +85,20 @@ namespace API.Controllers
    // We do not allow CacheDirectory changes, so we will ignore.
    var currentSettings = await _unitOfWork.SettingsRepository.GetSettingsAsync();
    var updateAuthentication = false;
+   var updateBookmarks = false;
+   var originalBookmarkDirectory = _directoryService.BookmarkDirectory;
+
+   var bookmarkDirectory = updateSettingsDto.BookmarksDirectory;
+   if (!updateSettingsDto.BookmarksDirectory.EndsWith("bookmarks") &&
+       !updateSettingsDto.BookmarksDirectory.EndsWith("bookmarks/"))
+   {
+       bookmarkDirectory = _directoryService.FileSystem.Path.Join(updateSettingsDto.BookmarksDirectory, "bookmarks");
+   }
+
+   if (string.IsNullOrEmpty(updateSettingsDto.BookmarksDirectory))
+   {
+       bookmarkDirectory = _directoryService.BookmarkDirectory;
+   }

    foreach (var setting in currentSettings)
    {
@@ -118,6 +147,22 @@ namespace API.Controllers
        _unitOfWork.SettingsRepository.Update(setting);
    }

+   if (setting.Key == ServerSettingKey.BookmarkDirectory && bookmarkDirectory != setting.Value)
+   {
+       // Validate new directory can be used
+       if (!await _directoryService.CheckWriteAccess(bookmarkDirectory))
+       {
+           return BadRequest("Bookmark Directory does not have correct permissions for Kavita to use");
+       }
+
+       originalBookmarkDirectory = setting.Value;
+       // Normalize the path delimiters. Just to look nice in DB, no functionality
+       setting.Value = _directoryService.FileSystem.Path.GetFullPath(bookmarkDirectory);
+       _unitOfWork.SettingsRepository.Update(setting);
+       updateBookmarks = true;
+   }

    if (setting.Key == ServerSettingKey.EnableAuthentication && updateSettingsDto.EnableAuthentication + string.Empty != setting.Value)
    {
        setting.Value = updateSettingsDto.EnableAuthentication + string.Empty;
@@ -160,6 +205,13 @@ namespace API.Controllers

        _logger.LogInformation("Server authentication changed. Updated all non-admins to default password");
    }

+   if (updateBookmarks)
+   {
+       _directoryService.ExistOrCreate(bookmarkDirectory);
+       _directoryService.CopyDirectoryToDirectory(originalBookmarkDirectory, bookmarkDirectory);
+       _directoryService.ClearAndDeleteDirectory(originalBookmarkDirectory);
+   }
}
catch (Exception ex)
{
@@ -170,7 +222,7 @@ namespace API.Controllers

    _logger.LogInformation("Server Settings updated");
-   _taskScheduler.ScheduleTasks();
+   await _taskScheduler.ScheduleTasks();
    return Ok(updateSettingsDto);
}
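The settings flow above normalizes whatever directory the admin supplies: append a trailing "bookmarks" segment if missing, fall back to the install default when the input is empty, and store the GetFullPath form. A standalone restatement of that rule (illustrative only):

    using System.IO;

    static string NormalizeBookmarkDirectory(string input, string installDefault)
    {
        if (string.IsNullOrEmpty(input)) return installDefault;
        if (!input.EndsWith("bookmarks") && !input.EndsWith("bookmarks/"))
            input = Path.Join(input, "bookmarks");
        return Path.GetFullPath(input); // normalize delimiters, as the controller does before saving
    }

    // NormalizeBookmarkDirectory("/data", "/config/bookmarks") => "/data/bookmarks"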
@@ -1,8 +1,7 @@
using System;
using System.Threading.Tasks;
using API.Data;
using API.DTOs.Uploads;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
@@ -1,10 +1,10 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.Extensions;
using API.Interfaces;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
@@ -1,5 +1,7 @@
using System;
+using System.Collections.Generic;
+using API.DTOs.Metadata;
using API.Entities;

namespace API.DTOs
{
@@ -50,5 +52,41 @@ namespace API.DTOs
    /// When chapter was created
    /// </summary>
    public DateTime Created { get; init; }
+   /// <summary>
+   /// When the chapter was released.
+   /// </summary>
+   /// <remarks>Metadata field</remarks>
+   public DateTime ReleaseDate { get; init; }
+   /// <summary>
+   /// Title of the Chapter/Issue
+   /// </summary>
+   /// <remarks>Metadata field</remarks>
+   public string TitleName { get; set; }
+   /// <summary>
+   /// Summary for the Chapter/Issue
+   /// </summary>
+   public string Summary { get; set; }
+   /// <summary>
+   /// Language for the Chapter/Issue
+   /// </summary>
+   public string Language { get; set; }
+   /// <summary>
+   /// Number in the TotalCount of issues
+   /// </summary>
+   public int Count { get; set; }
+   /// <summary>
+   /// Total number of issues for the series
+   /// </summary>
+   public int TotalCount { get; set; }
+   public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Penciller { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Inker { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Colorist { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Letterer { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> CoverArtist { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Editor { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Publisher { get; set; } = new List<PersonDto>();
+   public ICollection<PersonDto> Translators { get; set; } = new List<PersonDto>();
+   public ICollection<TagDto> Tags { get; set; } = new List<TagDto>();
}
}
@@ -1,13 +1,98 @@
-using API.Entities.Enums;
+using System.Collections.Generic;
+using API.Entities;
+using API.Entities.Enums;

namespace API.DTOs.Filtering
{
    public class FilterDto
    {
        /// <summary>
-       /// Pass null if you want all formats
+       /// The type of Formats you want to be returned. An empty list will return all formats back
        /// </summary>
-       public MangaFormat? MangaFormat { get; init; } = null;
+       public IList<MangaFormat> Formats { get; init; } = new List<MangaFormat>();

+       /// <summary>
+       /// The progress you want to be returned. This can be bitwise manipulated. Defaults to all applicable states.
+       /// </summary>
+       public ReadStatus ReadStatus { get; init; } = new ReadStatus();

+       /// <summary>
+       /// A list of library ids to restrict search to. Defaults to all libraries by passing an empty list
+       /// </summary>
+       public IList<int> Libraries { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Genre ids to restrict search to. Defaults to all genres by passing an empty list
+       /// </summary>
+       public IList<int> Genres { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Writer ids to restrict search to. Defaults to all writers by passing an empty list
+       /// </summary>
+       public IList<int> Writers { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Penciller ids to restrict search to. Defaults to all pencillers by passing an empty list
+       /// </summary>
+       public IList<int> Penciller { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Inker ids to restrict search to. Defaults to all inkers by passing an empty list
+       /// </summary>
+       public IList<int> Inker { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Colorist ids to restrict search to. Defaults to all colorists by passing an empty list
+       /// </summary>
+       public IList<int> Colorist { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Letterer ids to restrict search to. Defaults to all letterers by passing an empty list
+       /// </summary>
+       public IList<int> Letterer { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of CoverArtist ids to restrict search to. Defaults to all cover artists by passing an empty list
+       /// </summary>
+       public IList<int> CoverArtist { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Editor ids to restrict search to. Defaults to all editors by passing an empty list
+       /// </summary>
+       public IList<int> Editor { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Publisher ids to restrict search to. Defaults to all publishers by passing an empty list
+       /// </summary>
+       public IList<int> Publisher { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Character ids to restrict search to. Defaults to all characters by passing an empty list
+       /// </summary>
+       public IList<int> Character { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Translator ids to restrict search to. Defaults to all translators by passing an empty list
+       /// </summary>
+       public IList<int> Translators { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Collection Tag ids to restrict search to. Defaults to all collection tags by passing an empty list
+       /// </summary>
+       public IList<int> CollectionTags { get; init; } = new List<int>();
+       /// <summary>
+       /// A list of Tag ids to restrict search to. Defaults to all tags by passing an empty list
+       /// </summary>
+       public IList<int> Tags { get; init; } = new List<int>();
+       /// <summary>
+       /// Will return everything with this rating and above
+       /// <see cref="AppUserRating.Rating"/>
+       /// </summary>
+       public int Rating { get; init; }
+       /// <summary>
+       /// Sorting Options for a query. Defaults to null, which uses the query's natural sorting order
+       /// </summary>
+       public SortOptions SortOptions { get; init; } = null;
+       /// <summary>
+       /// Age Ratings. Empty list will return everything back
+       /// </summary>
+       public IList<AgeRating> AgeRating { get; init; } = new List<AgeRating>();
+       /// <summary>
+       /// Languages (ISO 639-1 code) to filter by. Empty list will return everything back
+       /// </summary>
+       public IList<string> Languages { get; init; } = new List<string>();
+       /// <summary>
+       /// Publication statuses to filter by. Empty list will return everything back
+       /// </summary>
+       public IList<PublicationStatus> PublicationStatus { get; init; } = new List<PublicationStatus>();
    }
}
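To make the FilterDto shape concrete, here is a hypothetical instance a client might send to the series endpoint: unread, archive-format series from two libraries, sorted ascending by sort name. Types mirror the DTOs above and below; the specific enum member names are assumptions for illustration:

    using System.Collections.Generic;
    using API.DTOs.Filtering;
    using API.Entities.Enums;

    var filter = new FilterDto
    {
        Formats = new List<MangaFormat> { MangaFormat.Archive },  // member name assumed
        ReadStatus = new ReadStatus { NotRead = true, InProgress = false, Read = false },
        Libraries = new List<int> { 1, 2 },
        SortOptions = new SortOptions { SortField = SortField.SortName, IsAscending = true }
    };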
API/DTOs/Filtering/LanguageDto.cs (new file, 7 lines)
@@ -0,0 +1,7 @@
namespace API.DTOs.Filtering;

public class LanguageDto
{
    public string IsoCode { get; set; }
    public string Title { get; set; }
}
API/DTOs/Filtering/ReadStatus.cs (new file, 13 lines)
@@ -0,0 +1,13 @@
using System;

namespace API.DTOs.Filtering;

/// <summary>
/// Represents the Reading Status. This is a flag and allows multiple statuses
/// </summary>
public class ReadStatus
{
    public bool NotRead { get; set; } = true;
    public bool InProgress { get; set; } = true;
    public bool Read { get; set; } = true;
}
API/DTOs/Filtering/SortField.cs (new file, 8 lines)
@@ -0,0 +1,8 @@
namespace API.DTOs.Filtering;

public enum SortField
{
    SortName = 1,
    CreatedDate = 2,
    LastModifiedDate = 3,
}
API/DTOs/Filtering/SortOptions.cs (new file, 10 lines)
@@ -0,0 +1,10 @@
namespace API.DTOs.Filtering;

/// <summary>
/// Sorting Options for a query
/// </summary>
public class SortOptions
{
    public SortField SortField { get; set; }
    public bool IsAscending { get; set; } = true;
}
API/DTOs/Metadata/AgeRatingDto.cs (new file, 9 lines)
@@ -0,0 +1,9 @@
using API.Entities.Enums;

namespace API.DTOs.Metadata;

public class AgeRatingDto
{
    public AgeRating Value { get; set; }
    public string Title { get; set; }
}
API/DTOs/Metadata/ChapterMetadataDto.cs (new file, 19 lines)
@@ -0,0 +1,19 @@
using System.Collections.Generic;

namespace API.DTOs.Metadata
{
    public class ChapterMetadataDto
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Penciller { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Inker { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Colorist { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Letterer { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> CoverArtist { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Editor { get; set; } = new List<PersonDto>();
        public ICollection<PersonDto> Publisher { get; set; } = new List<PersonDto>();
        public int ChapterId { get; set; }
    }
}
API/DTOs/Metadata/GenreTagDto.cs (new file, 8 lines)
@@ -0,0 +1,8 @@
namespace API.DTOs.Metadata
{
    public class GenreTagDto
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }
}
API/DTOs/Metadata/PublicationStatusDto.cs (new file, 9 lines)
@@ -0,0 +1,9 @@
using API.Entities.Enums;

namespace API.DTOs.Metadata;

public class PublicationStatusDto
{
    public PublicationStatus Value { get; set; }
    public string Title { get; set; }
}
API/DTOs/Metadata/TagDto.cs (new file, 7 lines)
@@ -0,0 +1,7 @@
namespace API.DTOs.Metadata;

public class TagDto
{
    public int Id { get; set; }
    public string Title { get; set; }
}
@@ -23,7 +23,7 @@ namespace API.DTOs.OPDS
    public string Icon { get; set; } = "/favicon.ico";

    [XmlElement("author")]
-   public Author Author { get; set; } = new Author()
+   public FeedAuthor Author { get; set; } = new FeedAuthor()
    {
        Name = "Kavita",
        Uri = "https://kavitareader.com"

@@ -2,7 +2,7 @@

namespace API.DTOs.OPDS
{
-   public class Author
+   public class FeedAuthor
    {
        [XmlElement("name")]
        public string Name { get; set; }

@@ -4,7 +4,8 @@ namespace API.DTOs
{
    public class PersonDto
    {
+       public int Id { get; set; }
        public string Name { get; set; }
        public PersonRole Role { get; set; }
    }
}

@@ -14,5 +14,6 @@ namespace API.DTOs.Reader
    public int LibraryId { get; set; }
    public int Pages { get; set; }
    public bool IsSpecial { get; set; }
+   public string ChapterTitle { get; set; }
}
}
@@ -1,4 +1,5 @@
-using API.Entities.Enums;
+using System;
+using API.Entities.Enums;

namespace API.DTOs.Reader
{
@@ -12,7 +13,7 @@ namespace API.DTOs.Reader
    public MangaFormat SeriesFormat { get; set; }
    public int SeriesId { get; set; }
    public int LibraryId { get; set; }
-   public string ChapterTitle { get; set; } = "";
+   public string ChapterTitle { get; set; } = string.Empty;
    public int Pages { get; set; }
    public string FileName { get; set; }
    public bool IsSpecial { get; set; }

@@ -13,6 +13,7 @@ namespace API.DTOs.Reader
    public int LibraryId { get; set; }
    public int Pages { get; set; }
    public bool IsSpecial { get; set; }
+   public string ChapterTitle { get; set; }

}
}
@@ -1,16 +1,61 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
-using API.Entities;
+using API.DTOs.Metadata;
+using API.Entities.Enums;

namespace API.DTOs
{
    public class SeriesMetadataDto
    {
        public int Id { get; set; }
-       public ICollection<string> Genres { get; set; }
-       public ICollection<CollectionTagDto> Tags { get; set; }
-       public ICollection<Person> Persons { get; set; }
-       public string Publisher { get; set; }
        public string Summary { get; set; }
+       /// <summary>
+       /// Collections the Series belongs to
+       /// </summary>
+       public ICollection<CollectionTagDto> CollectionTags { get; set; }
+       /// <summary>
+       /// Genres for the Series
+       /// </summary>
+       public ICollection<GenreTagDto> Genres { get; set; }
+       /// <summary>
+       /// Collection of all Tags from underlying chapters for a Series
+       /// </summary>
+       public ICollection<TagDto> Tags { get; set; }
+       public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> CoverArtists { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Publishers { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Characters { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Pencillers { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Inkers { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Colorists { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Letterers { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Editors { get; set; } = new List<PersonDto>();
+       public ICollection<PersonDto> Translators { get; set; } = new List<PersonDto>();
+       /// <summary>
+       /// Highest Age Rating from all Chapters
+       /// </summary>
+       public AgeRating AgeRating { get; set; } = AgeRating.Unknown;
+       /// <summary>
+       /// Earliest Year from all chapters
+       /// </summary>
+       public int ReleaseYear { get; set; }
+       /// <summary>
+       /// Language of the content (BCP-47 code)
+       /// </summary>
+       public string Language { get; set; } = string.Empty;
+       /// <summary>
+       /// Number in the TotalCount of issues
+       /// </summary>
+       public int Count { get; set; }
+       /// <summary>
+       /// Total number of issues for the series
+       /// </summary>
+       public int TotalCount { get; set; }
+       /// <summary>
+       /// Publication status of the Series
+       /// </summary>
+       public PublicationStatus PublicationStatus { get; set; }
+
+       public int SeriesId { get; set; }
    }
}
@@ -1,4 +1,6 @@
-namespace API.DTOs.Settings
+using API.Services;
+
+namespace API.DTOs.Settings
{
    public class ServerSettingDto
    {
@@ -30,5 +32,10 @@
        /// Base Url for the Kavita instance. Requires restart to take effect.
        /// </summary>
        public string BaseUrl { get; set; }
+       /// <summary>
+       /// Where Bookmarks are stored.
+       /// </summary>
+       /// <remarks>If null or empty string, will default back to the default install setting, aka <see cref="DirectoryService.BookmarkDirectory"/></remarks>
+       public string BookmarksDirectory { get; set; }
    }
}

@@ -1,6 +1,4 @@
-using System;
-
-namespace API.DTOs.Update
+namespace API.DTOs.Update
{
    /// <summary>
    /// Update Notification denoting a new release available for user to update to

@@ -18,4 +18,4 @@ namespace API.DTOs
    public ReadingDirection BookReaderReadingDirection { get; set; }
    public bool SiteDarkMode { get; set; }
}
}
@@ -4,6 +4,7 @@ using System.Threading;
using System.Threading.Tasks;
using API.Entities;
using API.Entities.Interfaces;
+using API.Entities.Metadata;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;
@@ -23,7 +24,6 @@ namespace API.Data

    public DbSet<Library> Library { get; set; }
    public DbSet<Series> Series { get; set; }
-
    public DbSet<Chapter> Chapter { get; set; }
    public DbSet<Volume> Volume { get; set; }
    public DbSet<AppUser> AppUser { get; set; }
@@ -37,6 +37,9 @@ namespace API.Data
    public DbSet<AppUserBookmark> AppUserBookmark { get; set; }
    public DbSet<ReadingList> ReadingList { get; set; }
    public DbSet<ReadingListItem> ReadingListItem { get; set; }
+   public DbSet<Person> Person { get; set; }
+   public DbSet<Genre> Genre { get; set; }
+   public DbSet<Tag> Tag { get; set; }


    protected override void OnModelCreating(ModelBuilder builder)
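With Person, Genre, and Tag now first-class DbSets, metadata lookups can be issued directly against the context. A minimal, assumed usage (standard EF Core calls; "context" stands in for a DataContext instance):

    // e.g. populate a genre filter dropdown (requires using Microsoft.EntityFrameworkCore;)
    var genres = await context.Genre.OrderBy(g => g.Title).ToListAsync();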
@@ -1,7 +1,11 @@
using System;
using System.Collections.Generic;
+using System.IO;
+using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
+using API.Entities.Metadata;
+using API.Extensions;
using API.Parser;
using API.Services.Tasks;

@@ -21,12 +25,16 @@ namespace API.Data
    LocalizedName = name,
    NormalizedName = Parser.Parser.Normalize(name),
    SortName = name,
-   Summary = string.Empty,
    Volumes = new List<Volume>(),
    Metadata = SeriesMetadata(Array.Empty<CollectionTag>())
};
}

+public static SeriesMetadata SeriesMetadata(ComicInfo info)
+{
+    return SeriesMetadata(Array.Empty<CollectionTag>());
+}
+
public static Volume Volume(string volumeNumber)
{
    return new Volume()
@@ -57,7 +65,8 @@ namespace API.Data
{
    return new SeriesMetadata()
    {
-       CollectionTags = collectionTags
+       CollectionTags = collectionTags,
+       Summary = string.Empty
    };
}

@@ -72,5 +81,47 @@ namespace API.Data
        Promoted = promoted
    };
}

+public static Genre Genre(string name, bool external)
+{
+    return new Genre()
+    {
+        Title = name.Trim().SentenceCase(),
+        NormalizedTitle = Parser.Parser.Normalize(name),
+        ExternalTag = external
+    };
+}
+
+public static Tag Tag(string name, bool external)
+{
+    return new Tag()
+    {
+        Title = name.Trim().SentenceCase(),
+        NormalizedTitle = Parser.Parser.Normalize(name),
+        ExternalTag = external
+    };
+}
+
+public static Person Person(string name, PersonRole role)
+{
+    return new Person()
+    {
+        Name = name.Trim(),
+        NormalizedName = Parser.Parser.Normalize(name),
+        Role = role
+    };
+}
+
+public static MangaFile MangaFile(string filePath, MangaFormat format, int pages)
+{
+    return new MangaFile()
+    {
+        FilePath = filePath,
+        Format = format,
+        Pages = pages,
+        LastModified = File.GetLastWriteTime(filePath) // NOTE: Changed this from DateTime.Now
+    };
+}
}
}
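The Genre/Tag/Person factories above pair a display Title with a NormalizedTitle, which is what lets repeat scans dedupe entries that differ only in case or punctuation. A rough illustration, assuming Parser.Parser.Normalize lowercases and strips non-alphanumerics (the real rules live in the parser):

    using System.Text.RegularExpressions;

    static string Normalize(string name) =>
        Regex.Replace(name.ToLowerInvariant(), @"[^\p{L}0-9+]", string.Empty);

    // Normalize("Sci-Fi") == Normalize("sci fi") == "scifi" -> resolved to one Genre row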
@@ -1,51 +1,121 @@
-namespace API.Data.Metadata
+using System;
+using System.Linq;
+using API.Entities.Enums;
+using Kavita.Common.Extensions;
+
+namespace API.Data.Metadata
{
    /// <summary>
    /// A representation of a ComicInfo.xml file
    /// </summary>
-   /// <remarks>See reference of the loose spec here: https://github.com/Kussie/ComicInfoStandard/blob/main/ComicInfo.xsd</remarks>
+   /// <remarks>See reference of the loose spec here: https://anansi-project.github.io/docs/comicinfo/documentation</remarks>
    public class ComicInfo
    {
-       public string Summary { get; set; }
-       public string Title { get; set; }
-       public string Series { get; set; }
-       public string Number { get; set; }
-       public string Volume { get; set; }
-       public string Notes { get; set; }
-       public string Genre { get; set; }
+       public string Summary { get; set; } = string.Empty;
+       public string Title { get; set; } = string.Empty;
+       public string Series { get; set; } = string.Empty;
+       public string Number { get; set; } = string.Empty;
+       /// <summary>
+       /// The total number of items in the series.
+       /// </summary>
+       public int Count { get; set; } = 0;
+       public string Volume { get; set; } = string.Empty;
+       public string Notes { get; set; } = string.Empty;
+       public string Genre { get; set; } = string.Empty;
        public int PageCount { get; set; }
        // ReSharper disable once InconsistentNaming
-       public string LanguageISO { get; set; }
-       public string Web { get; set; }
-       public int Month { get; set; }
-       public int Year { get; set; }
-       /// <summary>
-       /// Rating based on the content. Think PG-13, R for movies
-       /// </summary>
-       public string AgeRating { get; set; }
+       /// <summary>
+       /// ISO 639-1 Code to represent the language of the content
+       /// </summary>
+       public string LanguageISO { get; set; } = string.Empty;
+       /// <summary>
+       /// This is the link to where the data was scraped from
+       /// </summary>
+       public string Web { get; set; } = string.Empty;
+       public int Day { get; set; } = 0;
+       public int Month { get; set; } = 0;
+       public int Year { get; set; } = 0;
+
+       /// <summary>
+       /// Rating based on the content. Think PG-13, R for movies. See <see cref="AgeRating"/> for valid types
+       /// </summary>
+       public string AgeRating { get; set; } = string.Empty;
+       /// <summary>
+       /// User's rating of the content
+       /// </summary>
+       public float UserRating { get; set; }

-       public string AlternateSeries { get; set; }
-       public string StoryArc { get; set; }
-       public string SeriesGroup { get; set; }
-       public string AlternativeSeries { get; set; }
-       public string AlternativeNumber { get; set; }
+       public string AlternateSeries { get; set; } = string.Empty;
+       public string StoryArc { get; set; } = string.Empty;
+       public string SeriesGroup { get; set; } = string.Empty;
+       public string AlternativeSeries { get; set; } = string.Empty;
+       public string AlternativeNumber { get; set; } = string.Empty;
+
+       /// <summary>
+       /// This is Epub only: calibre:title_sort
+       /// Represents the sort order for the title
+       /// </summary>
+       public string TitleSort { get; set; } = string.Empty;
+
+       /// <summary>
+       /// The translator, can be comma separated. This is part of ComicInfo.xml draft v2.1
+       /// </summary>
+       /// See https://github.com/anansi-project/comicinfo/issues/2 for information about this tag
+       public string Translator { get; set; } = string.Empty;
+       /// <summary>
+       /// Misc tags. This is part of ComicInfo.xml draft v2.1
+       /// </summary>
+       /// See https://github.com/anansi-project/comicinfo/issues/1 for information about this tag
+       public string Tags { get; set; } = string.Empty;

        /// <summary>
        /// This is the Author. For Books, we map creator tag in OPF to this field. Comma separated if multiple.
        /// </summary>
-       public string Writer { get; set; } // TODO: Validate if we should make this a list of writers
-       public string Penciller { get; set; }
-       public string Inker { get; set; }
-       public string Colorist { get; set; }
-       public string Letterer { get; set; }
-       public string CoverArtist { get; set; }
-       public string Editor { get; set; }
-       public string Publisher { get; set; }
+       public string Writer { get; set; } = string.Empty;
+       public string Penciller { get; set; } = string.Empty;
+       public string Inker { get; set; } = string.Empty;
+       public string Colorist { get; set; } = string.Empty;
+       public string Letterer { get; set; } = string.Empty;
+       public string CoverArtist { get; set; } = string.Empty;
+       public string Editor { get; set; } = string.Empty;
+       public string Publisher { get; set; } = string.Empty;
+       public string Characters { get; set; } = string.Empty;
+
+       public static AgeRating ConvertAgeRatingToEnum(string value)
+       {
+           if (string.IsNullOrEmpty(value)) return Entities.Enums.AgeRating.Unknown;
+           return Enum.GetValues<AgeRating>()
+               .SingleOrDefault(t => t.ToDescription().ToUpperInvariant().Equals(value.ToUpperInvariant()), Entities.Enums.AgeRating.Unknown);
+       }
+
+       public static void CleanComicInfo(ComicInfo info)
+       {
+           if (info == null) return;
+
+           info.Writer = Parser.Parser.CleanAuthor(info.Writer);
+           info.Colorist = Parser.Parser.CleanAuthor(info.Colorist);
+           info.Editor = Parser.Parser.CleanAuthor(info.Editor);
+           info.Inker = Parser.Parser.CleanAuthor(info.Inker);
+           info.Letterer = Parser.Parser.CleanAuthor(info.Letterer);
+           info.Penciller = Parser.Parser.CleanAuthor(info.Penciller);
+           info.Publisher = Parser.Parser.CleanAuthor(info.Publisher);
+           info.Characters = Parser.Parser.CleanAuthor(info.Characters);
+           info.Translator = Parser.Parser.CleanAuthor(info.Translator);
+           info.CoverArtist = Parser.Parser.CleanAuthor(info.CoverArtist);
+
+           // if (!string.IsNullOrEmpty(info.Web))
+           // {
+           //     // ComicVine stores the Issue number in Number field and does not use Volume.
+           //     if (!info.Web.Contains("https://comicvine.gamespot.com/")) return;
+           //     if (info.Volume.Equals("1"))
+           //     {
+           //         info.Volume = Parser.Parser.DefaultVolume;
+           //     }
+           // }
+       }
    }
}
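Behavior sketch for ConvertAgeRatingToEnum above: the comparison is case-insensitive against each member's description text, and anything unrecognized resolves to AgeRating.Unknown. The literal "Mature 17+" is an assumed description value used purely for illustration:

    var rating = ComicInfo.ConvertAgeRatingToEnum("mature 17+");     // matches despite casing
    var fallback = ComicInfo.ConvertAgeRatingToEnum("not a rating"); // AgeRating.Unknown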
API/Data/MigrateBookmarks.cs (new file, 102 lines)
@@ -0,0 +1,102 @@
using System;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Entities.Enums;
using API.Services;
using Microsoft.Extensions.Logging;

namespace API.Data;

/// <summary>
/// Responsible for migrating existing bookmarks to files
/// </summary>
public static class MigrateBookmarks
{
    private static readonly Version VersionBookmarksChanged = new Version(0, 4, 9, 27);
    /// <summary>
    /// This will migrate existing bookmarks to bookmark folder based
    /// </summary>
    /// <remarks>Bookmark directory is configurable. This will always use the default bookmark directory.</remarks>
    /// <param name="directoryService"></param>
    /// <returns></returns>
    public static async Task Migrate(IDirectoryService directoryService, IUnitOfWork unitOfWork,
        ILogger<Program> logger, ICacheService cacheService)
    {
        var bookmarkDirectory = (await unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.BookmarkDirectory))
            .Value;
        if (string.IsNullOrEmpty(bookmarkDirectory))
        {
            bookmarkDirectory = directoryService.BookmarkDirectory;
        }

        if (directoryService.Exists(bookmarkDirectory)) return;

        logger.LogInformation("Bookmark migration is needed... This may take some time");

        var allBookmarks = (await unitOfWork.UserRepository.GetAllBookmarksAsync()).ToList();

        var uniqueChapterIds = allBookmarks.Select(b => b.ChapterId).Distinct().ToList();
        var uniqueUserIds = allBookmarks.Select(b => b.AppUserId).Distinct().ToList();
        foreach (var userId in uniqueUserIds)
        {
            foreach (var chapterId in uniqueChapterIds)
            {
                var chapterBookmarks = allBookmarks.Where(b => b.ChapterId == chapterId).ToList();
                var chapterPages = chapterBookmarks
                    .Select(b => b.Page).ToList();
                var seriesId = chapterBookmarks
                    .Select(b => b.SeriesId).First();
                var mangaFiles = await unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId);
                var chapterExtractPath = directoryService.FileSystem.Path.Join(directoryService.TempDirectory, $"bookmark_c{chapterId}_u{userId}_s{seriesId}");

                var numericComparer = new NumericComparer();
                if (!mangaFiles.Any()) continue;

                switch (mangaFiles.First().Format)
                {
                    case MangaFormat.Image:
                        directoryService.ExistOrCreate(chapterExtractPath);
                        directoryService.CopyFilesToDirectory(mangaFiles.Select(f => f.FilePath), chapterExtractPath);
                        break;
                    case MangaFormat.Archive:
                    case MangaFormat.Pdf:
                        cacheService.ExtractChapterFiles(chapterExtractPath, mangaFiles.ToList());
                        break;
                    case MangaFormat.Epub:
                        continue;
                    default:
                        continue;
                }

                var files = directoryService.GetFilesWithExtension(chapterExtractPath, Parser.Parser.ImageFileExtensions);
                // Filter out images that aren't in bookmarks
                Array.Sort(files, numericComparer);
                foreach (var chapterPage in chapterPages)
                {
                    var file = files.ElementAt(chapterPage);
                    var bookmark = allBookmarks.FirstOrDefault(b =>
                        b.ChapterId == chapterId && b.SeriesId == seriesId && b.AppUserId == userId &&
                        b.Page == chapterPage);
                    if (bookmark == null) continue;

                    var filename = directoryService.FileSystem.Path.GetFileName(file);
                    var newLocation = directoryService.FileSystem.Path.Join(
                        ReaderService.FormatBookmarkFolderPath(String.Empty, userId, seriesId, chapterId),
                        filename);
                    bookmark.FileName = newLocation;
                    directoryService.CopyFileToDirectory(file,
                        ReaderService.FormatBookmarkFolderPath(bookmarkDirectory, userId, seriesId, chapterId));
                    unitOfWork.UserRepository.Update(bookmark);
                }
            }
            // Clear temp after each user to avoid too much space being eaten
            directoryService.ClearDirectory(directoryService.TempDirectory);
        }

        await unitOfWork.CommitAsync();
        // Run CleanupService as we cache a ton of files
        directoryService.ClearDirectory(directoryService.TempDirectory);
    }
}
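For orientation, the copy above targets ReaderService.FormatBookmarkFolderPath(bookmarkDirectory, userId, seriesId, chapterId). Assuming that helper composes the segments in argument order (which the surrounding Path.Join usage suggests), the migrated layout looks like:

    var target = ReaderService.FormatBookmarkFolderPath("/config/bookmarks", 1, 42, 7);
    // => "/config/bookmarks/1/42/7" (one folder per user/series/chapter, page images inside)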
@@ -1,12 +1,16 @@
using System;
using System.Collections.Generic;
using System.IO;
+using System.IO.Abstractions;
using System.Linq;
+using API.Services;
using Kavita.Common;

namespace API.Data
{
    /// <summary>
    /// A Migration to migrate config related files to the config/ directory for installs prior to v0.4.9.
    /// </summary>
    public static class MigrateConfigFiles
    {
        private static readonly List<string> LooseLeafFiles = new List<string>()
@@ -31,7 +35,7 @@ namespace API.Data
    /// In v0.4.8 we moved all config files to config/ to match with how docker was setup. This will move all config files from current directory
    /// to config/
    /// </summary>
-   public static void Migrate(bool isDocker)
+   public static void Migrate(bool isDocker, IDirectoryService directoryService)
    {
        Console.WriteLine("Checking if migration to config/ is needed");

@@ -46,8 +50,8 @@ namespace API.Data
        Console.WriteLine(
            "Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");

-       CopyAppFolders();
-       DeleteAppFolders();
+       CopyAppFolders(directoryService);
+       DeleteAppFolders(directoryService);

        UpdateConfiguration();

@@ -64,14 +68,14 @@ namespace API.Data
        Console.WriteLine(
            "Migrating files from pre-v0.4.8. All Kavita config files are now located in config/");

-       Console.WriteLine($"Creating {DirectoryService.ConfigDirectory}");
-       DirectoryService.ExistOrCreate(DirectoryService.ConfigDirectory);
+       Console.WriteLine($"Creating {directoryService.ConfigDirectory}");
+       directoryService.ExistOrCreate(directoryService.ConfigDirectory);

        try
        {
-           CopyLooseLeafFiles();
+           CopyLooseLeafFiles(directoryService);

-           CopyAppFolders();
+           CopyAppFolders(directoryService);

            // Then we need to update the config file to point to the new DB file
            UpdateConfiguration();
@@ -84,43 +88,43 @@ namespace API.Data

        // Finally delete everything in the source directory
        Console.WriteLine("Removing old files");
-       DeleteLooseFiles();
-       DeleteAppFolders();
+       DeleteLooseFiles(directoryService);
+       DeleteAppFolders(directoryService);
        Console.WriteLine("Removing old files...DONE");

        Console.WriteLine("Migration complete. All config files are now in config/ directory");
    }

-   private static void DeleteAppFolders()
+   private static void DeleteAppFolders(IDirectoryService directoryService)
    {
        foreach (var folderToDelete in AppFolders)
        {
            if (!new DirectoryInfo(Path.Join(Directory.GetCurrentDirectory(), folderToDelete)).Exists) continue;

-           DirectoryService.ClearAndDeleteDirectory(Path.Join(Directory.GetCurrentDirectory(), folderToDelete));
+           directoryService.ClearAndDeleteDirectory(Path.Join(Directory.GetCurrentDirectory(), folderToDelete));
        }
    }

-   private static void DeleteLooseFiles()
+   private static void DeleteLooseFiles(IDirectoryService directoryService)
    {
        var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
            .Where(f => f.Exists);
-       DirectoryService.DeleteFiles(configFiles.Select(f => f.FullName));
+       directoryService.DeleteFiles(configFiles.Select(f => f.FullName));
    }

-   private static void CopyAppFolders()
+   private static void CopyAppFolders(IDirectoryService directoryService)
    {
        Console.WriteLine("Moving folders to config");

        foreach (var folderToMove in AppFolders)
        {
-           if (new DirectoryInfo(Path.Join(DirectoryService.ConfigDirectory, folderToMove)).Exists) continue;
+           if (new DirectoryInfo(Path.Join(directoryService.ConfigDirectory, folderToMove)).Exists) continue;

            try
            {
-               DirectoryService.CopyDirectoryToDirectory(
-                   Path.Join(Directory.GetCurrentDirectory(), folderToMove),
-                   Path.Join(DirectoryService.ConfigDirectory, folderToMove));
+               directoryService.CopyDirectoryToDirectory(
+                   Path.Join(directoryService.FileSystem.Directory.GetCurrentDirectory(), folderToMove),
+                   Path.Join(directoryService.ConfigDirectory, folderToMove));
            }
            catch (Exception)
            {
@@ -132,9 +136,9 @@ namespace API.Data
        Console.WriteLine("Moving folders to config...DONE");
    }

-   private static void CopyLooseLeafFiles()
+   private static void CopyLooseLeafFiles(IDirectoryService directoryService)
    {
-       var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(Directory.GetCurrentDirectory(), file)))
+       var configFiles = LooseLeafFiles.Select(file => new FileInfo(Path.Join(directoryService.FileSystem.Directory.GetCurrentDirectory(), file)))
            .Where(f => f.Exists);
        // First step is to move all the files
        Console.WriteLine("Moving files to config/");
@@ -142,7 +146,7 @@ namespace API.Data
        {
            try
            {
-               fileInfo.CopyTo(Path.Join(DirectoryService.ConfigDirectory, fileInfo.Name));
+               fileInfo.CopyTo(Path.Join(directoryService.ConfigDirectory, fileInfo.Name));
            }
            catch (Exception)
            {
@ -29,10 +29,10 @@ namespace API.Data
|
||||
/// <summary>
|
||||
/// Run first. Will extract byte[]s from DB and write them to the cover directory.
|
||||
/// </summary>
|
||||
public static void ExtractToImages(DbContext context)
|
||||
public static void ExtractToImages(DbContext context, IDirectoryService directoryService, IImageService imageService)
|
||||
{
|
||||
Console.WriteLine("Migrating Cover Images to disk. Expect delay.");
|
||||
DirectoryService.ExistOrCreate(DirectoryService.CoverImageDirectory);
|
||||
directoryService.ExistOrCreate(directoryService.CoverImageDirectory);
|
||||
|
||||
Console.WriteLine("Extracting cover images for Series");
|
||||
var lockedSeries = SqlHelper.RawSqlQuery(context, "Select Id, CoverImage From Series Where CoverImage IS NOT NULL", x =>
|
||||
@ -45,14 +45,14 @@ namespace API.Data
|
||||
foreach (var series in lockedSeries)
|
||||
{
|
||||
if (series.CoverImage == null || !series.CoverImage.Any()) continue;
|
||||
if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
|
||||
if (File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
|
||||
$"{ImageService.GetSeriesFormat(int.Parse(series.Id))}.png"))) continue;
|
||||
|
||||
try
|
||||
{
|
||||
var stream = new MemoryStream(series.CoverImage);
|
||||
stream.Position = 0;
|
||||
ImageService.WriteCoverThumbnail(stream, ImageService.GetSeriesFormat(int.Parse(series.Id)));
|
||||
imageService.WriteCoverThumbnail(stream, ImageService.GetSeriesFormat(int.Parse(series.Id)), directoryService.CoverImageDirectory);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
@ -71,14 +71,14 @@ namespace API.Data
|
||||
foreach (var chapter in chapters)
|
||||
{
|
||||
if (chapter.CoverImage == null || !chapter.CoverImage.Any()) continue;
|
||||
if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
|
||||
if (directoryService.FileSystem.File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
|
||||
$"{ImageService.GetChapterFormat(int.Parse(chapter.Id), int.Parse(chapter.ParentId))}.png"))) continue;
|
||||
|
||||
try
|
||||
{
|
||||
var stream = new MemoryStream(chapter.CoverImage);
|
||||
stream.Position = 0;
|
||||
ImageService.WriteCoverThumbnail(stream, $"{ImageService.GetChapterFormat(int.Parse(chapter.Id), int.Parse(chapter.ParentId))}");
|
||||
imageService.WriteCoverThumbnail(stream, $"{ImageService.GetChapterFormat(int.Parse(chapter.Id), int.Parse(chapter.ParentId))}", directoryService.CoverImageDirectory);
|
||||
}
|
||||
catch (Exception e)
|
||||
{
|
||||
@@ -97,13 +97,13 @@ namespace API.Data
             foreach (var tag in tags)
             {
                 if (tag.CoverImage == null || !tag.CoverImage.Any()) continue;
-                if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
+                if (directoryService.FileSystem.File.Exists(Path.Join(directoryService.CoverImageDirectory,
                     $"{ImageService.GetCollectionTagFormat(int.Parse(tag.Id))}.png"))) continue;
                 try
                 {
                     var stream = new MemoryStream(tag.CoverImage);
                     stream.Position = 0;
-                    ImageService.WriteCoverThumbnail(stream, $"{ImageService.GetCollectionTagFormat(int.Parse(tag.Id))}");
+                    imageService.WriteCoverThumbnail(stream, $"{ImageService.GetCollectionTagFormat(int.Parse(tag.Id))}", directoryService.CoverImageDirectory);
                 }
                 catch (Exception e)
                 {
@@ -116,13 +116,13 @@ namespace API.Data
         /// Run after <see cref="ExtractToImages"/>. Will update the DB with names of files that were extracted.
         /// </summary>
         /// <param name="context"></param>
-        public static async Task UpdateDatabaseWithImages(DataContext context)
+        public static async Task UpdateDatabaseWithImages(DataContext context, IDirectoryService directoryService)
         {
             Console.WriteLine("Updating Series entities");
             var seriesCovers = await context.Series.Where(s => !string.IsNullOrEmpty(s.CoverImage)).ToListAsync();
             foreach (var series in seriesCovers)
             {
-                if (!File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
+                if (!directoryService.FileSystem.File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
                     $"{ImageService.GetSeriesFormat(series.Id)}.png"))) continue;
                 series.CoverImage = $"{ImageService.GetSeriesFormat(series.Id)}.png";
             }
@@ -131,9 +131,10 @@ namespace API.Data
 
             Console.WriteLine("Updating Chapter entities");
             var chapters = await context.Chapter.ToListAsync();
+            // ReSharper disable once ForeachCanBePartlyConvertedToQueryUsingAnotherGetEnumerator
             foreach (var chapter in chapters)
             {
-                if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
+                if (directoryService.FileSystem.File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
                     $"{ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId)}.png")))
                 {
                     chapter.CoverImage = $"{ImageService.GetChapterFormat(chapter.Id, chapter.VolumeId)}.png";
@@ -149,7 +150,7 @@ namespace API.Data
             {
                 var firstChapter = volume.Chapters.OrderBy(x => double.Parse(x.Number), ChapterSortComparerForInChapterSorting).FirstOrDefault();
                 if (firstChapter == null) continue;
-                if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
+                if (directoryService.FileSystem.File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
                     $"{ImageService.GetChapterFormat(firstChapter.Id, firstChapter.VolumeId)}.png")))
                 {
                     volume.CoverImage = $"{ImageService.GetChapterFormat(firstChapter.Id, firstChapter.VolumeId)}.png";
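The volume-cover pass above orders chapters by double.Parse(x.Number) using ChapterSortComparerForInChapterSorting, whose definition is not part of this diff. A plausible stand-in, assuming the comparer's job is to sort numbered chapters ascending while pushing chapter 0 (commonly a placeholder for specials) to the end:

using System.Collections.Generic;

// Hypothetical stand-in for ChapterSortComparerForInChapterSorting: orders
// chapters numerically but sorts 0 (assumed special marker) last. The real
// comparer's rules may differ.
public class ChapterZeroLastComparer : IComparer<double>
{
    public int Compare(double x, double y)
    {
        if (x == 0.0 && y == 0.0) return 0;
        if (x == 0.0) return 1;  // x sorts after any numbered chapter
        if (y == 0.0) return -1; // y sorts after any numbered chapter
        return x.CompareTo(y);
    }
}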
@@ -161,9 +162,10 @@ namespace API.Data
 
             Console.WriteLine("Updating Collection Tag entities");
             var tags = await context.CollectionTag.ToListAsync();
+            // ReSharper disable once ForeachCanBePartlyConvertedToQueryUsingAnotherGetEnumerator
             foreach (var tag in tags)
             {
-                if (File.Exists(Path.Join(DirectoryService.CoverImageDirectory,
+                if (directoryService.FileSystem.File.Exists(directoryService.FileSystem.Path.Join(directoryService.CoverImageDirectory,
                     $"{ImageService.GetCollectionTagFormat(tag.Id)}.png")))
                 {
                     tag.CoverImage = $"{ImageService.GetCollectionTagFormat(tag.Id)}.png";
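Taken together with the doc comments ("Run first" / "Run after"), the two methods form an ordered, resumable migration: covers are written to disk first, then the DB rows are pointed at the files, and the File.Exists guards let a re-run skip completed work. A minimal caller sketch; the wrapper name and the final SaveChangesAsync are assumptions, since the call site is outside this diff:

using System.Threading.Tasks;

// Hypothetical driver for the two-phase cover migration. The method
// signatures match the diff; the sequencing follows the doc comments.
public static async Task MigrateCoverImagesAsync(DataContext context,
    IDirectoryService directoryService, IImageService imageService)
{
    // Phase 1: write byte[] covers out of the DB onto disk (idempotent).
    ExtractToImages(context, directoryService, imageService);

    // Phase 2: point Series/Volume/Chapter/CollectionTag rows at the files.
    await UpdateDatabaseWithImages(context, directoryService);

    // Persisting the entity updates is assumed to happen here.
    await context.SaveChangesAsync();
}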
API/Data/Migrations/20211127200244_MetadataFoundation.Designer.cs (generated, new file, 1215 lines)
File diff suppressed because it is too large.

API/Data/Migrations/20211127200244_MetadataFoundation.cs (new file, 203 lines)
@@ -0,0 +1,203 @@
+using Microsoft.EntityFrameworkCore.Migrations;
+
+#nullable disable
+
+namespace API.Data.Migrations
+{
+    public partial class MetadataFoundation : Migration
+    {
+        protected override void Up(MigrationBuilder migrationBuilder)
+        {
+            migrationBuilder.DropColumn(
+                name: "Summary",
+                table: "Series");
+
+            migrationBuilder.AddColumn<string>(
+                name: "Summary",
+                table: "SeriesMetadata",
+                type: "TEXT",
+                nullable: true);
+
+            migrationBuilder.CreateTable(
+                name: "ChapterMetadata",
+                columns: table => new
+                {
+                    Id = table.Column<int>(type: "INTEGER", nullable: false)
+                        .Annotation("Sqlite:Autoincrement", true),
+                    Title = table.Column<string>(type: "TEXT", nullable: true),
+                    Year = table.Column<string>(type: "TEXT", nullable: true),
+                    StoryArc = table.Column<string>(type: "TEXT", nullable: true),
+                    ChapterId = table.Column<int>(type: "INTEGER", nullable: false)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_ChapterMetadata", x => x.Id);
+                    table.ForeignKey(
+                        name: "FK_ChapterMetadata_Chapter_ChapterId",
+                        column: x => x.ChapterId,
+                        principalTable: "Chapter",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                });
+
+            migrationBuilder.CreateTable(
+                name: "Genre",
+                columns: table => new
+                {
+                    Id = table.Column<int>(type: "INTEGER", nullable: false)
+                        .Annotation("Sqlite:Autoincrement", true),
+                    Name = table.Column<string>(type: "TEXT", nullable: true),
+                    NormalizedName = table.Column<string>(type: "TEXT", nullable: true)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_Genre", x => x.Id);
+                });
+
+            migrationBuilder.CreateTable(
+                name: "Person",
+                columns: table => new
+                {
+                    Id = table.Column<int>(type: "INTEGER", nullable: false)
+                        .Annotation("Sqlite:Autoincrement", true),
+                    Name = table.Column<string>(type: "TEXT", nullable: true),
+                    NormalizedName = table.Column<string>(type: "TEXT", nullable: true),
+                    Role = table.Column<int>(type: "INTEGER", nullable: false)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_Person", x => x.Id);
+                });
+
+            migrationBuilder.CreateTable(
+                name: "GenreSeriesMetadata",
+                columns: table => new
+                {
+                    GenresId = table.Column<int>(type: "INTEGER", nullable: false),
+                    SeriesMetadatasId = table.Column<int>(type: "INTEGER", nullable: false)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_GenreSeriesMetadata", x => new { x.GenresId, x.SeriesMetadatasId });
+                    table.ForeignKey(
+                        name: "FK_GenreSeriesMetadata_Genre_GenresId",
+                        column: x => x.GenresId,
+                        principalTable: "Genre",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                    table.ForeignKey(
+                        name: "FK_GenreSeriesMetadata_SeriesMetadata_SeriesMetadatasId",
+                        column: x => x.SeriesMetadatasId,
+                        principalTable: "SeriesMetadata",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                });
+
+            migrationBuilder.CreateTable(
+                name: "ChapterMetadataPerson",
+                columns: table => new
+                {
+                    ChapterMetadatasId = table.Column<int>(type: "INTEGER", nullable: false),
+                    PeopleId = table.Column<int>(type: "INTEGER", nullable: false)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_ChapterMetadataPerson", x => new { x.ChapterMetadatasId, x.PeopleId });
+                    table.ForeignKey(
+                        name: "FK_ChapterMetadataPerson_ChapterMetadata_ChapterMetadatasId",
+                        column: x => x.ChapterMetadatasId,
+                        principalTable: "ChapterMetadata",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                    table.ForeignKey(
+                        name: "FK_ChapterMetadataPerson_Person_PeopleId",
+                        column: x => x.PeopleId,
+                        principalTable: "Person",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                });
+
+            migrationBuilder.CreateTable(
+                name: "PersonSeriesMetadata",
+                columns: table => new
+                {
+                    PeopleId = table.Column<int>(type: "INTEGER", nullable: false),
+                    SeriesMetadatasId = table.Column<int>(type: "INTEGER", nullable: false)
+                },
+                constraints: table =>
+                {
+                    table.PrimaryKey("PK_PersonSeriesMetadata", x => new { x.PeopleId, x.SeriesMetadatasId });
+                    table.ForeignKey(
+                        name: "FK_PersonSeriesMetadata_Person_PeopleId",
+                        column: x => x.PeopleId,
+                        principalTable: "Person",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                    table.ForeignKey(
+                        name: "FK_PersonSeriesMetadata_SeriesMetadata_SeriesMetadatasId",
+                        column: x => x.SeriesMetadatasId,
+                        principalTable: "SeriesMetadata",
+                        principalColumn: "Id",
+                        onDelete: ReferentialAction.Cascade);
+                });
+
+            migrationBuilder.CreateIndex(
+                name: "IX_ChapterMetadata_ChapterId",
+                table: "ChapterMetadata",
+                column: "ChapterId",
+                unique: true);
+
+            migrationBuilder.CreateIndex(
+                name: "IX_ChapterMetadataPerson_PeopleId",
+                table: "ChapterMetadataPerson",
+                column: "PeopleId");
+
+            migrationBuilder.CreateIndex(
+                name: "IX_Genre_NormalizedName",
+                table: "Genre",
+                column: "NormalizedName",
+                unique: true);
+
+            migrationBuilder.CreateIndex(
+                name: "IX_GenreSeriesMetadata_SeriesMetadatasId",
+                table: "GenreSeriesMetadata",
+                column: "SeriesMetadatasId");
+
+            migrationBuilder.CreateIndex(
+                name: "IX_PersonSeriesMetadata_SeriesMetadatasId",
+                table: "PersonSeriesMetadata",
+                column: "SeriesMetadatasId");
+        }
+
+        protected override void Down(MigrationBuilder migrationBuilder)
+        {
+            migrationBuilder.DropTable(
+                name: "ChapterMetadataPerson");
+
+            migrationBuilder.DropTable(
+                name: "GenreSeriesMetadata");
+
+            migrationBuilder.DropTable(
+                name: "PersonSeriesMetadata");
+
+            migrationBuilder.DropTable(
+                name: "ChapterMetadata");
+
+            migrationBuilder.DropTable(
+                name: "Genre");
+
+            migrationBuilder.DropTable(
+                name: "Person");
+
+            migrationBuilder.DropColumn(
+                name: "Summary",
+                table: "SeriesMetadata");
+
+            migrationBuilder.AddColumn<string>(
+                name: "Summary",
+                table: "Series",
+                type: "TEXT",
+                nullable: true);
+        }
+    }
+}
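The join tables and FK column names here (GenresId, SeriesMetadatasId, PeopleId, ChapterMetadatasId) match what EF Core generates by convention for many-to-many skip navigations, so the entity shapes behind this migration can be inferred. A reconstruction sketch; the property names are derived from those conventions rather than copied from the source:

using System.Collections.Generic;

// Inferred entity shapes. EF Core turns the collection navigation names into
// the join-table FK columns seen above (GenresId, SeriesMetadatasId, PeopleId).
public class Genre
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string NormalizedName { get; set; } // unique via IX_Genre_NormalizedName
    public ICollection<SeriesMetadata> SeriesMetadatas { get; set; }
}

public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string NormalizedName { get; set; }
    public int Role { get; set; } // INTEGER column; likely an enum in the source
    public ICollection<SeriesMetadata> SeriesMetadatas { get; set; }
    public ICollection<ChapterMetadata> ChapterMetadatas { get; set; }
}

public class ChapterMetadata
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Year { get; set; }
    public string StoryArc { get; set; }
    public int ChapterId { get; set; } // unique FK => one ChapterMetadata per Chapter
    public ICollection<Person> People { get; set; }
}

public class SeriesMetadata // existing entity; this migration moves Summary onto it
{
    public int Id { get; set; }
    public string Summary { get; set; }
    public ICollection<Genre> Genres { get; set; }
    public ICollection<Person> People { get; set; }
}

Two design points fall out of the indexes: the unique index on ChapterMetadata.ChapterId effectively makes the Chapter relationship one-to-one, and the unique IX_Genre_NormalizedName supports de-duplicating genres by their normalized name.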
Some files were not shown because too many files have changed in this diff.