The big one (#396)

* Refactored the library card to have a custom implementation using icons rather than images. In addition, swapped out Font Awesome for the official version.

* Replaced pull-right with float-right due to the updated Bootstrap version.

* Added a new section to admin dashboard

* Added a menu system for the reader: fit to width, height, or original. Temp hack to make the background black.

* Added the ability to turn the nav bar completely off on some pages. Removed a test case that isn't used.

* Restore nav bar after reading

* Implemented ability to delete a series directly and scan from a series.

* Implemented some basic prefetching capabilities (just the next page) and implemented proper reading direction support with a toggle.
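
  The "just the next page" idea can be sketched as below. This is a minimal illustration with an injectable `warm()` callback so it runs outside the browser; `PagePrefetcher`, `urlFor`, and `warm` are hypothetical names, not Kavita's actual API.

  ```typescript
  // Sketch: when a page renders, warm the next page's image exactly once.
  class PagePrefetcher {
    private warmed = new Set<number>();

    constructor(
      private urlFor: (page: number) => string,
      private maxPages: number,
      // In the browser this could be (url) => { new Image().src = url; }
      private warm: (url: string) => void,
    ) {}

    // Called whenever the reader renders a page.
    onPageRendered(page: number): void {
      const next = page + 1;
      if (next < this.maxPages && !this.warmed.has(next)) {
        this.warmed.add(next); // never warm the same page twice
        this.warm(this.urlFor(next));
      }
    }
  }
  ```

  Assigning an `Image` element's `src` is enough to populate the browser cache, so the real fetch is free by the time the user paginates.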

* Added a no-connection route for when the backend goes down. Removed the go-to-page functionality as it isn't really needed and was overly complicated.

* Implemented ability to track progress and view it at a series level

* Read status enhancements, cleaned up the card code a bit, styling changes on the nav bar dropdown, and read/continue functionality for the series detail page.

* Fixed a few bugs around registering and refactored APIs to match backend.

* Lots of cleanup of the code and TODOs. Improved responsiveness on series detail page.

* Missed some changes

* Implemented the ability to rate a series. Review text will come in v0.2.

* Reverted some debug code for reader menu always being open. Added loader to reader as well.

* Setup for building prod for releasing Kavita server.

* After the first-time admin creation flow, refresh the page so they can log in.

* Small change to help the user get to server settings to set up libraries

* Implemented ability to save what tab you are on or link directly to tab for admin dashboard.

* Implemented the ability to reset another user's password. Tweaked how the error interceptor reacted to OK messages.

* Implemented general settings. Have the ability to change the cache directory, but it is disabled on the BE.

* Remove SSL

* Implemented Volume 0's for series detail.

* Compressed image-placeholder and implemented refresh metadata. Refresh metadata will update cover images while scan library will just fix matching, etc.

* Refactored for backend architectural changes. Fixed some bugs around read progress being off by one.

* Fixed some styling around grid layout for volume then chapters.

* On unauthorized, force logout then redirect to login page.

* Don't throw multiple toasts when something goes wrong due to the backend going down.

* Implemented the ability to see and update server settings.

* Implemented user preferences and ability to update them. Fixed a bug in production code such that API requests are made on current domain.

* Small fixes around the app and server setting for port.

* Fixed some styling to look better on mobile devices and overflow text with an ellipsis.

* Cleanup and implemented card details for Volume/Chapters.

* Small tweak to card details

* Mark as Read/unread on Volumes now implemented.

* Cleaned up some code, integrated user settings into manga reader, took care of some todos.

* Forgot to sort chapters

* Fixed an issue in card details with string concatenation

* Updated the Manga Reader to be better looking and simpler (code-wise) on desktop devices.

* Added more responsive breakpoints for overlay to look much better on more screen sizes

* Some changes for tablets. Clear out localStorage entries more than 1 page away from what you're reading.
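
  The localStorage pruning can be sketched as a pure function over key names: keep only cached pages within one page of the current one. The `page-cache-<n>` key format here is an assumption for illustration, not Kavita's real scheme.

  ```typescript
  // Return the keys that should be removed from localStorage.
  function keysToPrune(keys: string[], currentPage: number): string[] {
    const prefix = "page-cache-";
    return keys.filter((key) => {
      if (!key.startsWith(prefix)) return false; // leave unrelated keys alone
      const page = parseInt(key.slice(prefix.length), 10);
      // Prune anything more than one page away from the current page.
      return Number.isFinite(page) && Math.abs(page - currentPage) > 1;
    });
  }

  // Browser usage (sketch):
  // keysToPrune(Object.keys(localStorage), page)
  //   .forEach((k) => localStorage.removeItem(k));
  ```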

* Fix bug for continuing where you last left off.

* Fixed a bug where continue reading didn't take into account chapters.

* Cleaned up imports and added autocomplete skeleton.

* Small UX enhancements

* Moved manga-reader into its own module to lessen default package size

* Removed route guards from reader module as it is handled by parent module.

* Responsive pass through on Series Detail page.

* Cleaned up containers and tooltips.

* Tooltip for icon only buttons

* Library editor modal cleanup

* Implemented nav bar for directory picker.

* Removed console.log

* Implemented a basic search for Kavita. Along the way did some CSS changes and error interceptor messages are better.

* Implemented a re-usable base64 image that can be styled. Not as easy as using inline styling, but easy to use.

* View encapsulation off so we can easily size these images.

* Implemented typeahead search for nav bar.

* Fixed a bug where, when route parameters change, the series detail page wasn't updating with new information

* Implemented page splitting

* Cleaned up Card Details and split into 2 separate versions with unified Look and Feel.

* Implemented ability to mark a series as read/unread.

* Implemented Jump to First/Last page functionality as shortcuts to goToPage.

* Implemented pagination on Library Detail page

* Restore scroll position to the top on page route change

* Not sure if this changes anything, but IDE doesn't complain

* Added a custom favicon and a small tweak on the UI for library pagination controls.

* Bugfix to take into account currently reading chapter for read/continue button

* Implemented user reviews

* Forgot to hook up one click handler

* Only admins can edit a series

* Implemented edit series page. Many fields are not yet supported for modification.

* Hooked in Edit Series into backend. Fixed an ngIf on edit review button.

* Switched over existing series info modal to use the new edit one.

* Partially implemented download logs. Removed some files not needed and trialing css changes on actions menu for card items.

* Integrated Jest for Unit Testing and added one test case. Will expand and integrate into work flow.

* Cleaned up some mobile breakpoint styles. Looks much better on a phone.

* A bit more css around phones to make reader menu useable.

* Removed series-card-detail since it's been replaced with edit-series-modal.

* Implemented save logs

* Small cleanup

* More responsive breakpoint tweaks for nav bar.

* Fetching logs is fixed

* Bugfix: Search bar was visible for non-authenticated users

* Implemented the ability to correct (manually) a series title and for it to persist between scans. Small QoL changes throughout codebase.

* Added some broken test setup.

* Disabled the "comments must start with a space" lint rule.

* Fixed issue where tablets wouldn't be able to render all images.

* Migrated code off localStorage and used one API to get information about a chapter.

* Cleaned up the code now that we are loading images differently.

* Use circular array to cache image requests so that we can ensure next image is instantaneously available.
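
  The circular-array cache can be sketched as a fixed-size ring where each key maps to a slot and newer requests overwrite old ones, so the next image is already resolved when the reader asks for it. `CircularCache` is an illustrative name, not the actual class.

  ```typescript
  // A fixed-size ring buffer keyed by page number.
  class CircularCache<T> {
    private slots: Array<{ key: number; value: T } | undefined>;

    constructor(private size: number) {
      this.slots = new Array(size);
    }

    put(key: number, value: T): void {
      // Each key maps to a fixed slot; a key `size` pages later overwrites it.
      this.slots[key % this.size] = { key, value };
    }

    get(key: number): T | undefined {
      const slot = this.slots[key % this.size];
      // Guard against a stale entry left by an older key in the same slot.
      return slot && slot.key === key ? slot.value : undefined;
    }
  }
  ```

  The appeal of the ring over a plain map is bounded memory: old pages evict themselves with no bookkeeping.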

* Some fixes around ensuring we don't prefetch when going back a page and ensuring prefetch doesn't fetch more pages than there are left.

* Fixed #70: When marking as read from volume level, completion was off by 1 thus series level didn't show as completed.

* Fixed #72. Missing an else statement which allowed another navigate to override the correct code. Refactored hasReadingProgress to be set in setContinuePoint

* Cleaned up the User button in nav bar to be cleaner

* Implemented a custom confirm/alert service so that I have complete control over style.

* Missed LGTM exception

* First pass at removing base64 strings for images and using lazy loaded binary file images.

* Only load images that are within view (scroll port)
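
  The "only load what's in the scroll port" check can be expressed as a pure function over bounding boxes, so it's testable without a DOM. This is the kind of test an `IntersectionObserver` performs for you in the browser; the names here are illustrative.

  ```typescript
  // Just the vertical extent of an element, as getBoundingClientRect gives it.
  interface Rect { top: number; bottom: number; }

  // Visible if any part of the element overlaps [-buffer, viewportHeight + buffer].
  function isInScrollPort(el: Rect, viewportHeight: number, buffer = 0): boolean {
    return el.bottom >= -buffer && el.top <= viewportHeight + buffer;
  }
  ```

  A small `buffer` lets images start loading just before they scroll into view.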

* Not connected message needs more top margin

* Moved image handling to its own service. Added a transition for loading images and some cleanup

* Misc cleanup

* Refactored action items to a factory

* Refactored the card item actionables into factory and moved actionable rendering into one component

* Added an optional btn class input to allow styling menu as a button.

* Implemented the ability to reset your individual password.

* Wrong reset after resetting password

* Don't let user set log level. Not sure it's possible to implement in ASP.NET

* Implemented a carousel for the streams component. Still needs some CSS and tweaking. Added some temp API endpoints for streams. Fixed a bug where, after editing a series name, the underlying card didn't reflect the change.

* Everything but the css is done

* CSS done. Carousel components implemented

* More CSS stuff

* Small css change

* Some cleanup on code

* Add aria-hidden="true" on icons

* Fixed css issue due to missing class

* Made scrolling carousel feel better on more screen sizes

* Fixed bug where confirm default buttons would all be cancel.

* Replaced placeholder image with a kavita placeholder. Added a theme folder for standardizing colors. Cleaned up some css here and there.

* Removed a dependency no longer needed. Implemented history based pagination for library detail page.

* Changed MangaFile numberOfPages to Page to match with new migration

* Fixed issue where if no chapters, we were just doing console.error instead of informing the user (should never happen)

* Add a todo for a future feature

* Implemented loading on the series-detail volume section and fixed an issue where, when the whole series is just one volume and it's a special (i.e., we can't parse volume/chapter from it), it now renders appropriately

* Fixed a rare issue where split pages would quickly flash both sides: that page had previously been fetched via onload, so when render was called, render got recalled.

* Fixed an off by 1 issue due to the fact that reading is 0-based and everything else is 1 based. (#94)

* Fixed an off by 1 issue due to the fact that reading is 0-based and everything else is 1 based. (#94) (#95)
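
  This class of off-by-one bug comes from the reader using 0-based page indices while progress tracking and the UI use 1-based page numbers. A pair of tiny, explicit converters makes the boundary visible; the helper names are hypothetical, not from the codebase.

  ```typescript
  // 0-based reader index -> 1-based page number shown to the user.
  function toDisplayPage(readerIndex: number): number {
    return readerIndex + 1;
  }

  // 1-based page number -> 0-based reader index, clamped so a "go to page"
  // past the end lands on the last real page instead of overflowing.
  function toReaderIndex(displayPage: number, maxPages: number): number {
    return Math.min(Math.max(displayPage, 1), maxPages) - 1;
  }
  ```

  Routing every crossing of the boundary through converters like these is one way to keep the `+1`/`-1` from being scattered ad hoc through the reader.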

* Fixed an issue where the special case that handles no volumes was also showing when chapters existed. Renamed "Chapter 0" as "Specials" (#96)

* Bugfixes! (#99)

* Fixed an issue where the special case that handles no volumes was also showing when chapters existed. Renamed "Chapter 0" as "Specials"

* Fixed a typo resulting in pages not rendering on edit series modal. Ensure chapters are sorted on edit series and card details modal.

* Fixed the date format showing days before months.

* Fixed a bug with scrollable modals for context info modals.

* Fixed a bug where adding a folder to a library added a / before the path, thus breaking on linux. (#101)

* Bugfixes (#103)

* Last modified on chapters didn't really show well and made no sense to show; removed it.

* Preparing for feature

* CSS adjustment for admin dashboard

* First-time flow: ensure we go directly to the library tab

* When a user registers for first time, put them on login instead of relying on home page redirection.

* Fixed an issue with directory picker where the path separators didn't work for both linux and windows systems.
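
  Separator-agnostic path handling for the directory picker can be sketched as below: split on both `/` (Linux) and `\` (Windows) instead of hard-coding one. `splitPath`/`joinPath` are illustrative helpers, not the component's real methods.

  ```typescript
  // Break a path into segments regardless of which separator style it uses.
  function splitPath(path: string): string[] {
    return path.split(/[\\/]/).filter((part) => part.length > 0);
  }

  // Rebuild a path with the separator appropriate for the target platform.
  function joinPath(parts: string[], windows: boolean): string {
    const sep = windows ? "\\" : "/";
    // Linux absolute paths lead with "/"; Windows paths lead with the drive.
    return (windows ? "" : "/") + parts.join(sep);
  }
  ```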

* Implement comic support (#104)

* Implement comic support

* Any disabled controls should show not-allowed pointer.

* Fixed a scroll bug on modal

* On connection lost, restore to previous page (#106)

* If the server goes down between sessions and we go to not-connected page, try to restore previous route when connection regained.

* Fixed an issue where context menus weren't resetting when an admin logs out and a new non-admin logs in. (#108)

* Error Cards (#110)

* Fixed an issue where context menus weren't resetting when an admin logs out and a new non-admin logs in.

* Implemented a marker to inform the user that some archives can't be parsed.

* Don't show scrollbar if we don't have enough height to overflow

* Shows an error card when the underlying archive could not be read at all.

* Changed the card up

* Special grouping (#115)

* Implemented splitting specials into their own section for individual reading. Requires up to date backend for db changes.

* Cleaned up the code

* Replace underscores on specials if they exist. A simple name cleaning.

* Lots of Fixes (#126)

* Fixed: After editing a user's library access, the Sharing details aren't updating without a refresh (#116)

* Fixed: Series Summary & Review do not respect newline characters (#114)

* Default to non-specials tab and don't destroy DOM between tab changes

* Align UI api with backend

* Library icon should be "manga" for comic and Manga

* Fixed: Mark Series as Read in series detail page doesn't update the volume/chapter cards unless the page is refreshed (#118)

* Fixed defect: 2 split pages in a row causes second page to not split (#112)

* Fixed an issue if last page is a splitpage, we wouldn't be able to see the other side of the split.

* When jumping to beginning and end and both first page and last page are split pages, make sure we set the paging direction so the user can view both pages.

* Make sure we take into account splits when we try jump to first page then try to go "back" to the other split.

* Cleaned up split code a bit

* Fixed: Go to Page is off by one (#124)

* Fixed: Read button is showing Continue when a series doesn't have any progress on it (#121)

* Implemented Read more component (Fixes #117)

* Fixed a bug in gotopage where if you went to maxPages or greater, we would always - 1 from page number.

* Forgot to commit this for Readmore component

* tslint cleanup

* Implemented Refactor Review to be bound to the star control rather than having a text review so the user can review without rating. #125

* Fixes #119 - 0 Volumes with 0 chapters were showing as specials, but should be in the special tab.

* Fixed an issue from reverting scanSeries code.

* Handle specials with a little more care

* Fixed #138. Search wasn't showing localizedName due to a rendering issue.

* Fixed an issue where L2R didn't handle multiple split pages in a row.

* Code smells

* Ensure we wipe context actions for library between login/logouts

* Fixed loading series after marking series unread/read (#135)

* Removed isSpecial from volume (#137)

* Bugfix/gotopage (#139)

* Fixed #138

* Fixed #131 - getStem no longer removes the substring if lastPath is same as path.
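
  One plausible reading of the getStem fix: strip the `lastPath` prefix from `path`, but leave `path` untouched when the two are identical (the #131 bug). The real signature and semantics in Kavita may differ; this sketch is purely illustrative.

  ```typescript
  // Return the remainder of `path` after the `lastPath` prefix.
  function getStem(path: string, lastPath: string): string {
    if (path === lastPath) return path; // the fix: don't strip path from itself
    if (lastPath && path.startsWith(lastPath)) {
      // Drop the prefix plus any leading separator left behind.
      return path.slice(lastPath.length).replace(/^[\\/]/, "");
    }
    return path;
  }
  ```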

* Implements Issue #129 - There is now a close book button on the menu

* Book Support (#141)

* Refactored Library Type dropdown to use an API so UI/Backend is in sync.

* Implemented the ability to load the book reader

* Book support but none of this works. Just here to keep track as I switch to other bugs

* Basic iframe implementation is now working

* Needed changes to load the content into native div rather than via iframe.

* We now have the ability to customize how we render the text.

* Removed console.log

* Implemented the ability to load pages from remapped anchors from the backend.

* Removed epubjs references and implemented table of contents api.

* Code now works for chapters with nested chapters

* Lots of changes, most of the reader is half baked, but foundation is there.

* Changed styles up a bit

* Implemented the ability to scroll to a part within a book. Added a custom font to test out.

* Show active page with a bolding when there are nested chapters

* Chapter group titles are now clickable

* Added the ability to set top offset in drawer

* Finally got style overrides to work and some other stuff

* User can now toggle menu with space

* Ensure styles don't leak. Drawer bottom was cut off. On phone devices, default margins should be 0%.

* Use smooth scrolling when navigating between pages with scroll offset

* Added some code for checking when all images on a page are loaded, added a fade-in animation (doesn't work) and some media queries for the top bar.

* Refactored all data structures in application into shared module

* CSS changes

* Fixed part selector query due to improper ids, now we use a more robust query type. Implemented a stack for adhoc clicks, so user can explore a bit but pagination is based on current page.

* Reverted sidenav changes. Fixed scrollTo to be more reliable with how the content comes into view.

* When you make an adhoc link jump, we now restore page and scroll position.

* Hooked in basic preferences for books and force margin settings for mobile devices.

* Book overrides now work all the time. Added a bunch of fonts for users to try out.

* Added all font faces

* A bit hacky, but darkMode now works for the reader.

* Remove styles on destroy

* First time users will have their menu open automatically

* Book format now shows on card details modal

* Changed how margin updates to make more sense

* Fixed flashing by applying an opacity transition on page change.

* Code cleanup

* Reverted changes to unify series-detail page. Added some extra accessibility for book reader.

* Implement the ability to close drawer by clicking into the reader area

* Don't let the user page past where they should be able to

* Allow user to see the underlying values of customizations and when they save them, actually reset to their preferences

* Responsive top for sticky header

* Code smells

* Implemented the ability to update book settings from user settings

* code smells

* Code smells and max/mins on reader should match the user pref area

* Feature/feats and fixes (#144)

* In case a migration is poorly implemented, default on first load of bookreader.

* If there is no table of contents in epub file, inform the user

* Fixed #143 by ensuring we properly flatten the correct property when catching errors.

* Fixed #140. Search bar in nav is now more responsive than ever and properly scales down to even the smallest phone sizes (less than 300px)

* For Cards, moved the action menu into the bottom area and added a link to the Library the series belongs to.

* Added library to the series detail card

* Implemented the ability to automatically scale the manga reader based on screen size.

* Fix code smells

* Feature/feats and fixes (#146)

* Use margin-top instead of top for offsetting top

* Add a little extra spacing just in case

* Updated carousel to use a swiper

* Increased the budget and changed how vendor module is created

* Added some todos

* Implemented the ability to suppress library link on cards

* Fixed an issue with top offset for reading section

* Added the action bar to the bottom when user scrolls all the way down (Feedback)

* Added in a skip to content link for top nav

* After performing an action on library page, refresh the data on page.

* Implemented the ability to refresh metadata of a single series directly

* Implemented a progress bar for reading and a go to page by clicking the progress bar
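
  Click-to-go-to-page on a progress bar boils down to mapping the click's horizontal fraction of the bar onto a 0-based page index. A hypothetical helper (the component would feed it values from a `MouseEvent`):

  ```typescript
  // Map a click at offsetX on a bar of barWidth pixels to a page index.
  function pageFromProgressClick(
    offsetX: number, barWidth: number, maxPages: number,
  ): number {
    // Clamp the fraction so clicks at the very edges stay in range.
    const fraction = Math.min(Math.max(offsetX / barWidth, 0), 1);
    return Math.min(maxPages - 1, Math.floor(fraction * maxPages));
  }
  ```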

* Only show the bottom action bar when there is a scrollbar

* Implemented the ability to tap the sides of book reader to paginate

* Book Feedback and Fixes (#147)

* Cleaned up carousel and fixed breakpoints so we always at least show 2 cards.

* Cleaned up menu for manga reader and changed how automatic scaling works, based on ratio of width to height rather than raw numbers.
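
  Ratio-based automatic scaling can be sketched as comparing the image's aspect ratio to the window's rather than comparing raw pixel sizes. The `FitMode` values mirror the reader's fit-to-width/height options; the function name is illustrative.

  ```typescript
  type FitMode = "width" | "height";

  // Pick a fit mode by aspect ratio: pages wider (proportionally) than the
  // window fit to width; taller pages fit to height.
  function autoFitMode(
    imgWidth: number, imgHeight: number,
    winWidth: number, winHeight: number,
  ): FitMode {
    const imgRatio = imgWidth / imgHeight;
    const winRatio = winWidth / winHeight;
    return imgRatio > winRatio ? "width" : "height";
  }
  ```

  Using ratios makes the decision resolution-independent: a 2000x1000 spread and a 1000x500 spread get the same treatment on any screen.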

* Fixed an issue where using left/right keys on book reader wouldn't behave like clicking left/right pagination buttons.

* Dark mode and click to paginate was conflicting. The hint overlay still doesn't work when dark mode is on.

* Book Feedback (#148)

* Fixed an issue where errors from the login flow would not show a toastr

* Moved the progress bar and go to page into the side drawer to be less distracting when reading

* Removed console.logs

* Cleaned up styles on slider to be closer to size of cards

* Fixed an issue with swiper not allowing use of touch (#149)

* Fixed in-progress tracking: on the last page, increment to maxPages itself, ensuring it matches the total page count. (#151)

* Bugfix/in progress (#156)

* Fixed in-progress tracking: on the last page, increment to maxPages itself, ensuring it matches the total page count.

* Actually fixed in-progress tracking by only incrementing the page number on bookmark when we are on the last page
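
  The bookmark fix can be sketched as: the page number saved for progress equals the current index everywhere except the last page, where it is bumped to `maxPages` so the saved progress can mark the chapter complete. `bookmarkPage` is a hypothetical helper name.

  ```typescript
  // Compute the page number to persist for reading progress.
  function bookmarkPage(currentIndex: number, maxPages: number): number {
    const lastIndex = maxPages - 1;
    // Only on the last page do we report maxPages, signalling completion.
    return currentIndex >= lastIndex ? maxPages : currentIndex;
  }
  ```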

* Implements tap-to-paginate user setting. (#157)

* Feature/manga reader (#160)

* Implemented pressing G to open go to page. Enhanced the dialog to give how many pages you can go to. On page splitting button press, if the current page needs splitting, we will re-render with the new option.

* Added gotopage shortcut key for book reader

* Setup for new feature

* Swiper now respects card sizes

* Fixes #51 and updates dependencies for security vulnerabilities

* Implemented back to top button

* Remove the - 1 hack from series-detail

* Remove hack from card item

* Fixed a regression where the book reader would +1 pageNum for bookmarking on the last page; because books don't start at 0 for page numbers, it isn't necessary

* Implemented the ability to move between volumes automatically

* Additional security fix

* Code smells

* Cleaned up the implementation to properly prevent pagination when loading next chapter/volume

* v0.4 Last touches (#162)

* PurgeCSS integration

* Changed some icons to have titles

* Automatic scaling changes

* Removed 2 font families that didn't make the release cut. Fixed an off by 1 regression with setContinuePoint

* Backed out purge css after testing

* Some cleanup of the package

* Automatic scaling adjustments

* Bugfix/release shakeout (#164)

* Fixed body color not being reset due to capturing it too late

* Removed some dead code

* v0.4 merge to stable (#165)

* Fixed an off by 1 issue due to the fact that reading is 0-based and everything else is 1 based. (#94)

* Fixed an issue where special case that handles no volumes was showing also when chapters also existed. Renamed "Chapter 0" as "Specials" (#96)

* Bugfixes! (#99)

* Fixed an issue where special case that handles no volumes was showing also when chapters also existed. Renamed "Chapter 0" as "Specials"

* Fixed a typo resulting in pages not rendering on edit series modal. Ensure chapters are sorted on edit series and card details modal.

* Fixed the date format showing days before months.

* Fixed a bug with scrollable modals for context info modals.

* Fixed a bug where adding a folder to a library added a / before the path, thus breaking on linux. (#101)

* Bugfixes (#103)

* Fixed an issue where special case that handles no volumes was showing also when chapters also existed. Renamed "Chapter 0" as "Specials"

* Fixed a typo resulting in pages not rendering on edit series modal. Ensure chapters are sorted on edit series and card details modal.

* Fixed the date format showing days before months.

* Fixed a bug with scrollable modals for context info modals.

* Last modified on chapters didn't really s how well and made no sense to show, removed it.

* Preparing for feature

* CSS adjustment for admin dashboard

* First flow, ensure we go directly to library tab

* When a user registers for first time, put them on login instead of relying on home page redirection.

* Fixed an issue with directory picker where the path separators didn't work for both linux and windows systems.

* Implement comic support (#104)

* Implement comic support

* Any disabled controls should show not-allowed pointer.

* Fixed a scroll bug on modal

* On connection lost, restore to previous page (#106)

* Implement comic support

* Any disabled controls should show not-allowed pointer.

* Fixed a scroll bug on modal

* If the server goes down between sessions and we go to not-connected page, try to restore previous route when connection regained.

* Fixed an issue where context menus weren't resetting when an admin logs out and a new non-admin logs in. (#108)

* Error Cards (#110)

* Fixed an issue where context menus weren't resetting when an admin logs out and a new non-admin logs in.

* Implemented a marker to inform the user that some archives can't be parsed.

* Don't show scrollbar if we don't have enough height to overflow

* Shows an error card when the underlying archive could not be read at all.

* Changed the card up

* Special grouping (#115)

* Implemented splitting specials into their own section for individual reading. Requires up to date backend for db changes.

* Cleaned up the code

* Replace underscores on specials if they exist. A simple name cleaning.

* Lots of Fixes (#126)

* Fixed After editing a user's library access, the Sharing details aren't updating without a refresh #116

* Fixed Series Summary & Review do not respect newline characters #114

* Default to non-specials tab and don't destroy DOM between tab changes

* Align UI api with backend

* Library icon should be "manga" for comic and Manga

* Fixed Mark Series as Read in series detail page doesn't update the volume/chapter cards unless page is refreshed. #118

* Fixed Defect: 2 Split pages in a row causes second page to not split #112

* Fixed an issue if last page is a splitpage, we wouldn't be able to see the other side of the split.

* When jumping to begining and end and both first page and last page are splitpages, make sure we set the paging direction so user can view both pages.

* Make sure we take into account splits when we try jump to first page then try to go "back" to the other split.

* Cleaned up split code a bit

* Fixed Go to Page is off by one #124

* Fixed Read button is showing continue when a show doesn't have any progress on it #121

* Implemented Read more component (Fixes #117)

* Fixed a bug in gotopage where if you went to maxPages or greater, we would always - 1 from page number.

* Forgot to commit this for Readmore component

* tslint cleanup

* Implemented Refactor Review to be bound to the star control rather than having a text review so the user can review without rating. #125

* Fixes #119 - 0 Volumes with 0 chapters were showing as specials, but should be in the special tab.

* Fixed an issue from reverting scanSeries code.

* Handle specials with a little more care

* Fixed #138. Search wasn't showing localizedName due to a rendering issue.

* Fixed an issue where L2R didn't handle multiple split pages in a row.

* Code smells

* Ensure we wipe context actions for library between login/logouts

* Fixed loading series after marking searies unread/read (#135)

* Removed isSpecial from volume (#137)

* Bugfix/gotopage (#139)

* Fixed #138

* Fixed #131 - getStem no longer removes the substring if lastPath is same as path.

* Implements Issue #129 - There is now a close book button on the menu

* Book Support (#141)

* Refactored Library Type dropdown to use an API so UI/Backend is in sync.

* Implemented the ability to load the book reader

* Book support but none of this works. Just here to keep track as I switch to other bugs

* Basic iframe implementation is now working

* Needed changes to load the content into native div rather than via iframe.

* We now have the ability to customize how we render the text.

* Removed console.log

* Implemented the ability to loadpages from remapped anchors from backend.

* Removed epubjs references and implemented table of contents api.

* Code now works for chapters with nested chapters

* Lots of changes, most of the reader is half baked, but foundation is there.

* Changed styles up a bit

* Implemented the ability to scroll to a part within a book. Added a custom font to test out.

* Show active page with a bolding when there are nested chapters

* Chapter group titles are now clickable

* Added the ability to set top offset in drawer

* Finally got style overrides to work and some other stuff

* User can now toggle menu with space

* Ensure styles don't leak. Drawer bottom was cutoff. On phone devices, default margins should be 0%.

* Use smooth scrolling when navigating between pages with scroll offset

* Added some code for checking when all images on page are loaded, added a fade in animation (doesnt work) and some media queries for top bar.

* Refactored all data structures in application into shared module

* CSS changes

* Fixed part selector query due to improper ids, now we use a more robust query type. Implemented a stack for adhoc clicks, so user can explore a bit but pagination is based on current page.

* Reverted sidenav changes. Fixed scrollTo to be more reliable with how the content comes into view.

* When you make an adhoc link jump, we now restore page and scroll position.

* Hooked in basic preferences for books and force margin settings for mobile devices.

* Book overrides now work all the time. Added a bunch of fonts for users to try out.

* Added all font faces

* A bit hacky, but darkMode now works for the reader.

* Remove styles on destroy

* First time users will have their menu open automatically

* Book format now shows on card details modal

* changed how margin updates to make more sense

* Fixed flashing by applying an opacity transition on page change.

* Code cleanup

* Reverted changes to unify series-detail page. Added some extra accessibility for book reader.

* Implement the ability to close drawer by clicking into the reader area

* Don't let the user page past where they should be able to

* Allow user to see the underlying values of customizations and when they save them, actually reset to their preferences

* Responsive top for sticky header

* Code smells

* Implemented the ability to update book settings from user settings

* code smells

* Code smells and max/mins on reader should match the user pref area

* Feature/feats and fixes (#144)

* In case a migration is poorly implemented, default on first load of bookreader.

* If there is no table of contents in epub file, inform the user

* Fixed #143 by ensuring we properly flatten the correct property when catching errors.

* Fixed #140. Search bar in nav is now more responsive than ever and properly scales down to even the smallest phone sizes (less than 300px)

* For cards, moved the action menu into the bottom area and added a link to the library the series belongs to.

* Added library to the series detail card

* Implemented the ability to automatically scale the manga reader based on screen size.

* Fix code smells

* Feature/feats and fixes (#146)

* In case a migration is poorly implemented, default on first load of bookreader.

* If there is no table of contents in epub file, inform the user

* Fixed #143 by ensuring we properly flatten the correct property when catching errors.

* Fixed #140. Search bar in nav is now more responsive than ever and properly scales down to even the smallest phone sizes (less than 300px)

* For cards, moved the action menu into the bottom area and added a link to the library the series belongs to.

* Added library to the series detail card

* Implemented the ability to automatically scale the manga reader based on screen size.

* Fix code smells

* Use margin-top instead of top for offsetting top

* Add a little extra spacing just in case

* Updated carousel to use a swiper

* Increased the budget and changed how vendor module is created

* Added some todos

* Implemented the ability to suppress library link on cards

* Fixed an issue with top offset for reading section

* Added the action bar to the bottom when user scrolls all the way down (Feedback)

* Added in a skip to content link for top nav

* After performing an action on library page, refresh the data on page.

* Implemented the ability to refresh metadata of a single series directly

* Implemented a progress bar for reading and a go to page by clicking the progress bar

* Only show the bottom action bar when there is a scrollbar

* Implemented the ability to tap the sides of book reader to paginate

* Book Feedback and Fixes (#147)

* In case a migration is poorly implemented, default on first load of bookreader.

* If there is no table of contents in epub file, inform the user

* Fixed #143 by ensuring we properly flatten the correct property when catching errors.

* Fixed #140. Search bar in nav is now more responsive than ever and properly scales down to even the smallest phone sizes (less than 300px)

* For cards, moved the action menu into the bottom area and added a link to the library the series belongs to.

* Added library to the series detail card

* Implemented the ability to automatically scale the manga reader based on screen size.

* Fix code smells

* Use margin-top instead of top for offsetting top

* Add a little extra spacing just in case

* Updated carousel to use a swiper

* Increased the budget and changed how vendor module is created

* Added some todos

* Implemented the ability to suppress library link on cards

* Fixed an issue with top offset for reading section

* Added the action bar to the bottom when user scrolls all the way down (Feedback)

* Added in a skip to content link for top nav

* After performing an action on library page, refresh the data on page.

* Implemented the ability to refresh metadata of a single series directly

* Implemented a progress bar for reading and a go to page by clicking the progress bar

* Only show the bottom action bar when there is a scrollbar

* Implemented the ability to tap the sides of book reader to paginate

* Cleaned up carousel and fixed breakpoints so we always at least show 2 cards.

* Cleaned up menu for manga reader and changed how automatic scaling works, based on ratio of width to height rather than raw numbers.
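
To illustrate the ratio-based approach, a sketch in TypeScript (names and logic are assumptions for illustration, not the actual Kavita code):

```typescript
// Pick a fit mode by comparing the image's aspect ratio to the screen's,
// rather than comparing raw pixel dimensions.
type FitMode = 'fit-to-width' | 'fit-to-height';

function autoFit(imgW: number, imgH: number, screenW: number, screenH: number): FitMode {
  const imageRatio = imgW / imgH;
  const screenRatio = screenW / screenH;
  // A comparatively wide image is constrained by width; a tall one by height.
  return imageRatio > screenRatio ? 'fit-to-width' : 'fit-to-height';
}
```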

* Fixed an issue where using left/right keys on book reader wouldn't behave like clicking left/right pagination buttons.

* Dark mode and click to paginate were conflicting. The hint overlay still doesn't work when dark mode is on.

* Book Feedback (#148)

* In case a migration is poorly implemented, default on first load of bookreader.

* If there is no table of contents in epub file, inform the user

* Fixed #143 by ensuring we properly flatten the correct property when catching errors.

* Fixed #140. Search bar in nav is now more responsive than ever and properly scales down to even the smallest phone sizes (less than 300px)

* For cards, moved the action menu into the bottom area and added a link to the library the series belongs to.

* Added library to the series detail card

* Implemented the ability to automatically scale the manga reader based on screen size.

* Fix code smells

* Use margin-top instead of top for offsetting top

* Add a little extra spacing just in case

* Updated carousel to use a swiper

* Increased the budget and changed how vendor module is created

* Added some todos

* Implemented the ability to suppress library link on cards

* Fixed an issue with top offset for reading section

* Added the action bar to the bottom when user scrolls all the way down (Feedback)

* Added in a skip to content link for top nav

* After performing an action on library page, refresh the data on page.

* Implemented the ability to refresh metadata of a single series directly

* Implemented a progress bar for reading and a go to page by clicking the progress bar

* Only show the bottom action bar when there is a scrollbar

* Implemented the ability to tap the sides of book reader to paginate
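
Roughly, side-tap pagination maps the tap's x position to a zone; a minimal sketch (the thirds-based split and names are assumptions):

```typescript
// Divide the reader into thirds: left third pages back, right third pages
// forward, and the middle is left free for toggling menus.
type PaginationAction = 'prev' | 'next' | 'none';

function actionForTap(clickX: number, readerWidth: number): PaginationAction {
  if (clickX <= readerWidth / 3) return 'prev';
  if (clickX >= (readerWidth * 2) / 3) return 'next';
  return 'none';
}
```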

* Cleaned up carousel and fixed breakpoints so we always at least show 2 cards.

* Cleaned up menu for manga reader and changed how automatic scaling works, based on ratio of width to height rather than raw numbers.

* Fixed an issue where using left/right keys on book reader wouldn't behave like clicking left/right pagination buttons.

* Dark mode and click to paginate were conflicting. The hint overlay still doesn't work when dark mode is on.

* Fixed issue where errors from login flow would not throw a toastr

* Moved the progress bar and go to page into the side drawer to be less distracting when reading

* Removed console.logs

* Cleaned up styles on slider to be closer to size of cards

* Fixed an issue with swiper not allowing use of touch (#149)

* Fixed In Progress tracking: on the last page, increment to maxPages itself, ensuring it matches the total page count. (#151)

* Bugfix/in progress (#156)

* Fixed In Progress tracking: on the last page, increment to maxPages itself, ensuring it matches the total page count.

* Actually fix in progress by only incrementing page num on bookmark when we are on the last page
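
The bookmarking fix can be sketched as (a simplified model assuming 0-indexed pages; not the actual implementation):

```typescript
// Compute the page number to persist as read progress. Only on the final
// page do we bump past the 0-index so progress reads as maxPages (100%);
// every other page is saved as-is.
function bookmarkPage(currentPage: number, maxPages: number): number {
  return currentPage + 1 >= maxPages ? maxPages : currentPage;
}
```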

* Implements the tap-to-paginate user setting. (#157)

* Feature/manga reader (#160)

* Implemented pressing G to open go to page. Enhanced the dialog to give how many pages you can go to. On page splitting button press, if the current page needs splitting, we will re-render with the new option.

* Added gotopage shortcut key for book reader

* Setup for new feature

* Swiper now respects card sizes

* Fixes #51 and updates dependencies for security vulnerabilities

* Implemented back to top button

* Remove the - 1 hack from series-detail

* Remove hack from card item

* Fix a regression where the book reader would +1 pageNum when bookmarking on the last page; because books don't start at 0 for page numbers, that increment isn't necessary.

* Implemented the ability to move between volumes automatically

* Additional security fix

* Code smells

* Cleaned up the implementation to properly prevent pagination when loading next chapter/volume

* v0.4 Last touches (#162)

* PurgeCSS integration

* Changed some icons to have titles

* Automatic scaling changes

* Removed 2 font families that didn't make the release cut. Fixed an off by 1 regression with setContinuePoint

* Backed out purge css after testing

* Some cleanup of the package

* Automatic scaling adjustments

* Bugfix/release shakeout (#164)

* Fixed body color not being reset due to capturing it too late

* Removed some dead code

* Implemented dark mode (#166)

* Implemented dark mode

* Bump version to v0.4.1, moved dark styles to own stylesheet (some files need dark overrides) and ensured all pages are styled correctly.

* Switched the code over to use bootstrap theme with Kavita color

* Bugfix/manga issues (#169)

* Fixes #168

* Fixed a bug on the manga reader that caused the background color to inherit from body rather than be forced black.
Fixed an issue where a long filename on a phone could make it hard to close menu once open.

* Sentry Integration (#170)

* Basic version of sentry is implemented

* Enhanced continuous reading to show a warning when we couldn't find the next reading point. This will also short circuit after the first warning is shown

* Implemented Sentry. Currently src maps aren't uploading

* Bugfixes/misc (#174)

* Fixes #171

* Ensure btn-information is properly styled in dark mode

* no trace sampling for UI

* Fixed an issue where when we had no read progress, when choosing firs… (#176)

* Fixed an issue where when we had no read progress, when choosing first volume, we'd use first chapter, but sometimes chapters wouldn't be ordered.

* Code smell

* Collection Support (#179)

* Home button should go to library page, so we can use back and return to where we clicked from.

* Implemented Collection Support

* Fixed an issue for search in nav bar in darkmode

* Move loading to the top of the book reader

* Added DOMHelper to help with accessibility

* Implemented a re-usable layout component for all card layout screens. Handles pagination.

* Fixes #175

* Additional RBS check for tags where the tag fragment is invalid or there are no libraries that a user has access to

* Introduced an edit collection tag modal and actionables for collection tags.

* Bump version of Sentry SDK.

* Ability to remove series from a tag in a bulk manner.

* Continue Reading Regression (#186)

* Added a dark placeholder image for dark mode and hooked it up to Image service to load correct placeholder

* Fixed #181. Rewrote the continue logic to only check chapters and removed the concept of volumes (since every volume has a chapter). Opening a volume now does its own check: if there is progress on the volume, it will open to where the user left off; otherwise, it will grab the first chapter and start at the beginning.
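
The rewritten continue logic reduces to "prefer a chapter with partial progress, else the first chapter in reading order"; a hedged sketch (interface and names invented for illustration):

```typescript
interface Chapter {
  number: number;
  pagesRead: number;
  pages: number;
}

// Choose where "Continue" should open: the first chapter with partial
// progress in reading order, falling back to the very first chapter.
function continuePoint(chapters: Chapter[]): Chapter {
  const sorted = [...chapters].sort((a, b) => a.number - b.number);
  const inProgress = sorted.find(c => c.pagesRead > 0 && c.pagesRead < c.pages);
  return inProgress ?? sorted[0];
}
```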

* Added dark error placeholder image (#187)

* Bugfix/misc (#188)

* Fixed an issue where carousel series cards scan library would kick off for wrong library id.

* Refactored the tab code to be dynamic based on the volume/chapter/specials of the data. The correct tab will be selected by default and tabs that don't need to exist won't.

* Some css adjustments for typeaheads

* Move the loader out of the action bar so if settings menu is open when navigating pages, the math doesn't break

* Fixed a bug where highlight wasn't updating correctly when we type or after we add a tag via keyboard

* Fix an exception when tags are null (due to a bug in release)

* Accessibility bugs

* Collection Tweaks (#190)

* Fixed an issue where carousel series cards scan library would kick off for wrong library id.

* Refactored the tab code to be dynamic based on the volume/chapter/specials of the data. The correct tab will be selected by default and tabs that don't need to exist won't.

* Some css adjustments for typeaheads

* Move the loader out of the action bar so if settings menu is open when navigating pages, the math doesn't break

* Fixed a bug where highlight wasn't updating correctly when we type or after we add a tag via keyboard

* Fix an exception when tags are null (due to a bug in release)

* Accessibility bugs

* Fixed #189 and cleaned up series pagination.

* Major cleanup of the typeahead code. One bug remaining

* Fixed highlight issue

* Fixed #183. When using continuous manga reading, moving to another chapter within the reader now updates the url. (#191)

* Book Parity: Reading direction for books (#192)

* Fixed pagination issue on library-detail

* Implemented left to right/right to left reading mode in book reader
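
Reading-direction support largely boils down to mirroring the arrow keys; a small sketch (assumed names, not the real key handler):

```typescript
type ReadingDirection = 'ltr' | 'rtl';

// Translate a left/right arrow press into a page delta. In right-to-left
// mode the arrows are mirrored, so the XOR of "pressed right" and "is RTL"
// decides whether we move forward (+1) or backward (-1).
function pageDelta(key: 'ArrowLeft' | 'ArrowRight', dir: ReadingDirection): number {
  const pressedRight = key === 'ArrowRight';
  return pressedRight !== (dir === 'rtl') ? 1 : -1;
}
```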

* feat: remove Webtoon option from Library Types (#194)

#251

* Book Reading Progress Enhancement (#196)

* Implemented the ability to bookmark and restore reading progress (scroll) for books.

* Check to make sure we have something to search before we perform a querySelectorAll

* Don't reload a page when we've hit the boundaries of min/max pages and are trying to spam left/right key.
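
The boundary guard can be modeled as (illustrative only; 0-indexed pages assumed):

```typescript
// Return the target page for a pagination request, or null when the request
// would run past either boundary -- in which case the caller does nothing
// instead of reloading the current page.
function nextPage(current: number, maxPages: number, direction: 1 | -1): number | null {
  const target = current + direction;
  return target < 0 || target >= maxPages ? null : target;
}
```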

* Fixed a bug where if kavita-part marker was on the same page, the conditional was always true, meaning that when it was on a different one, we wouldn't load it up.

* Bugfix/tab refactor (#197)

* Fixed a logic bug which hid the specials tab too aggressively

* Unsubscribe from observables on destroy of account service

* Recently Added Page (#198)

* Recently Added Page
* Changed default pagination to 30

* Update to CSS for homepage section title links (#201)

* Update to CSS for homepage section title links

* Adding :active and :focus selectors

- :active for accessibility best practice and UX.
- :focus for mobile.

* Fixed #202 - Scope list item hover styles in darkmode to only typeahead (#204)

* Double Flashing Fix (#206)

* Fixed #202 - Scope list item hover styles in darkmode to only typeahead

* Fixed #199 - Flickering when paginating

* Fixed an issue with Continue Reading not working after manually updating a volume with multiple chapters as read/unread (#211)

* Directory Picker UX Enhancements (#214)

* Added a filter and some css to the directory picker to make it more useable

* Fixed a bug where last goBack didn't reload the disks and kept the directories from the last selected node.

* Allow user to change port (#215)

* Allow the admin to configure the log level from the UI. Add a warning informing them restart is required for port and log level. (#217)

* Cleaned up some console.logs and tweaked the logic of scroll position remembering. Now the side nav chapter list will show what part you are on (if applicable). (#220)

* Specials Sort (#223)

* Implemented a natural sort (same as BE) to sort the specials so the order isn't completely random
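
In TypeScript, a natural sort can lean on localeCompare's numeric collation; a sketch (not necessarily how the backend-matching sort is implemented):

```typescript
// Sort filenames so embedded numbers compare numerically:
// "Special 2" comes before "Special 10" (a plain string sort would reverse them).
function naturalSort(names: string[]): string[] {
  return [...names].sort((a, b) =>
    a.localeCompare(b, undefined, { numeric: true, sensitivity: 'base' }));
}
```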

* Added ability to push source maps tagged to a release on push to main and develop (#219)

* Create Library Feedback (#224)

# Added
- Library type and number of folders shared is now visible on Manage Libraries page

# Changed
- Directory Picker will now let you share the current folder from any time in the picker flow
- Headings are now consistent between User Preferences and Admin screen

* Fixing folder structure for sentry github action (#225)

* Updating workflow environment (#226)

Sentry workflow was giving an error: "Error: Container action is only supported on Linux"

* Fixing build dist path for sentry (#227)

* Updating workflow environment

Sentry workflow was giving an error: "Error: Container action is only supported on Linux"

* update build dist path for sentry

* fix: unable to select lib type when creating a new lib (#231)

* fix: unable to select lib type when creating a new lib

fixed #230

* fix: able to change lib type after its creation

* Download Support (#229)

* Implemented the ability to download series/chapter/volume from server. Uses RBS to determine if a user can or cannot download.

* Safety Checks (#233)

* Fixes a safety check from Sentry ANGULAR-1Z

* Fixed a build issue from downloading branch

* Fix/234 235 login redirection and dark theme not working (#236)

* fix: login redirection not happening

#234

* fix: dark theme not working after logout

#235

* Remove SP marker from specials and also remove extension from specials. (#238)

* Remove SP marker from specials and also remove extension from specials.

* Sort first so we can take advantage of the SP number

* Error Handling Rework (#237)

* Updated ngx-toastr version (includes new styles), updated style.scss to be cleaner. Began adding Title service for accessibility.

* Reworked error interceptor and toastr service to reduce duplicates and properly show errors.

Co-authored-by: Milazzo, Joseph (jm520e) <jm520e@us.att.com>
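
Reducing duplicate toasts typically means keying on the message and dropping repeats inside a short window; a hedged sketch (class name and window size are assumptions, not the interceptor's actual logic):

```typescript
// Track when each message was last shown; a repeat arriving within
// `windowMs` is suppressed rather than stacking another toast.
class ToastDeduper {
  private lastShown = new Map<string, number>();

  shouldShow(message: string, now: number, windowMs = 3000): boolean {
    const last = this.lastShown.get(message);
    if (last !== undefined && now - last < windowMs) return false;
    this.lastShown.set(message, now);
    return true;
  }
}
```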

* Fixed a prod only issue due to multi: true for provider (#243)

* Feat/usage stats collection (#245)

* feat: add client anonymous data collection

* fix: sonar issues

* Implemented a server setting to opt-out of usage collection

Co-authored-by: Joseph Milazzo <joseph.v.milazzo@gmail.com>

* Book Progress Enhancements (#250)

* Implemented the ability to bookmark any part of the book, not just the chapter parts

* Added total pages label to the book reader

* Manga Reader Redesign + Webtoon Reader (#247)

# New
- Bottom drawer has a scroller to jump to pages, jump to first/last page, and jump between volume/chapters. 
- Quick actions easily available to change reading direction, change reader mode, color tones, and access extended settings
- Extended settings area for settings unlikely changed
- Ability to auto close menu (setting)
- Ability to apply extra darkness or a sepia tone to reduce blue light
- New reader modes: Left/Right, Up/Down, Webtoon (scroll up and down)
- Information about the volume/chapter you are reading is now shown in the top drawer

# Changed
- When applying reader modes or reading directions, the clickable areas will now show an overlay to help you understand where to click.
- Image scaling and Image splitting now show some contextual icons to help the user understand what they do
- Close book button is now in the top drawer menu

* Bugfix/toastr css updates (#249)

* CSS Updates

- Removed BS4 toastr styles
- Reinstituted default non-BS4 toastr styles
- Centered login (accounting for header)
- Adjusted the carousel section heading font-size.
- Added a small padding (5px) on top of the padding for the nav, so the text isn't so close to the nav.

* Login refresh & toaster styles

- Added new font for login
- Updated login styles
- Hide nav bar on logout
- show nav bar on login
- Added images for new login
- dark styles for login
- dark styles for toastr

* minified images

* sonar bug fix

* updating style url for minified asset

* Fixes and code smells

- fix for login bg image showing up elsewhere
- fix for code smells
- added font family to nav bar

* Fixed missing label/input linking

* resized, compressed, and minified bg image

- change opacity to dark mode login

* Changed Spartan font files to variable weight

* Change requests

- Added font license
- Renamed image used for login bg
- Fixed path in styles where above file was used
- Removed now unused bs4 toastr style import

Co-authored-by: Joseph Milazzo <joseph.v.milazzo@gmail.com>

* Fix a bad version number

* hotfix for docker build issue (#251)

* updating angular.json

changing output folder

* change path

* Fixed build issues (#252)

* Bugs! (#254)

* Fix style issue where bootstrap components weren't taking kavita overrides

* Fixed a bug where after updating certain things within a card on library page, the feeds wouldn't update

* Fixed a bug where chapter sort would not behave the same way as it does on Chrome

* Release candidate bugs (#255)


* Auto Close menu wasn't updating within reader
* (Book Reader) Enhanced scroll part code to limit elements we count for bookmarking, only calculating intersection once fully visible and saving when scroll ends
* Removed Image Smoothing option (Chrome only) from this release. No noticeable difference having it.
* Fixed a page reload when clicking on In Progress section title on home page

* Bugfix/webtoons (#256)

* Fixed issue where first load would not start capturing scroll events due to not knowing the scroll to an element finished.

* Changed how to figure out when to end scrolling event by calculating if the target element is visible in the viewport.
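
A simplified version of that visibility test (plain numbers instead of DOM rects; names invented for illustration):

```typescript
interface Rect {
  top: number;
  bottom: number;
}

// A programmatic scroll is considered finished once the target element's
// bounding box sits entirely inside the viewport.
function isFullyVisible(rect: Rect, viewportHeight: number): boolean {
  return rect.top >= 0 && rect.bottom <= viewportHeight;
}
```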

* Seems to be working pretty well. Cleaned up some of the messages for debugging.

* Simplified the intersection logic drastically

* Fixed a color issue on slider for non-dark mode

* Disable first/last page buttons if we are already on those respective pages

* Added documentation to circular array class

* Some debug code but scrolling no longer results in jank due to scrollToPage getting executed too often

* Backing out ability to use webtoon reader

* Css fix for book reader progress and light mode toastr (#257)

* Changing dark mode to default (#262)

- Changed user-preferences site dark mode to default true

* added logo and css for logo (#260)

* added logo and css for logo

- max-height is to prevent the image from increasing the height of the navbar.
- a middle/middle vertical alignment didn't line up as expected, so top/middle was used instead, based on Chrome and Firefox renderings.

* Adding requested accessibility changes

* Added Kavita-webui repo to UI/Web

* Special parsing issues (#361)

* Update README.md

Added demo link to Readme and tweaked Sentry icon

* Adds some regex cases from manga downloaded through FMD2. For parsing specials, if the marker is found, try to overwrite the series with the folder the manga is in, rather than what we parse from filename.

* Version bump

* Changed company to point to our domain

* Fixed copyright to point to our domain

* Adding test github workflow and test build file (#362)

* Fixing copy fail in monorepo-test workflow

* fixing shell script to be executable

* fixing permission issue

* Folder Parsing (#366)

* New: Ability to parse volume and chapter from directory tree, rather than exclusively from filename. (#313)
* Fixed: Fixed an edge case where GetFoldersTillRoot if given a non-existent root in the file path, would result in an infinite loop.

* Book Reader Bugs (#367)

* Fixed:  Fixed an issue where when tap to paginate is on, clicking off the settings menu doesn't close it.
* Fixed: Fixed the tint color on book reader being different from manga reader.
* Fixed: Reworked the clickable overlay for tap to paginate so links are still clickable when tap to paginate is on.

* Build on monorepo

* Book Reader Intersection Handler not firing  (#369)

* Fixed: Fixed an issue where intersection observer wouldn't be triggered when book page had no images (Book reader bookmark not firing while scrolling #360)

* Raw Image Support (#375)

* New: Ability to add raw image folders to Kavita via the new library types Images (Comic) and Images (Manga). Images must belong to a folder; they cannot exist in the root directory. Put images at least one folder deep, within a volume or chapter folder, otherwise each image will end up as a special, which is not easily readable.
* Changed: When caching raw images, do it much faster and return earlier if the files have already been cached.


Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>

* Fixed a bug in the circular array which would not properly roll index over for applyFor (#377)

* Fixed: Manga reader's prefetching buffer had issues with rolling over index, which would require a manual image load every 7 pages. (#372)
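
The rollover fix amounts to wrapping the buffer index with modulo; a minimal sketch of such a circular array (not the actual class):

```typescript
// A ring buffer whose applyFor visits the next `count` slots, wrapping past
// the end with modulo so a prefetch window never stalls at the boundary.
class CircularArray<T> {
  constructor(private items: T[], private index = 0) {}

  applyFor(fn: (item: T, index: number) => void, count: number): void {
    for (let i = 0; i < count; i++) {
      const idx = (this.index + i) % this.items.length;
      fn(this.items[idx], idx);
    }
  }
}
```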

* Adding new ui dist folder to gitignore

* Added stats folder persistence (#382)

* Added demo link to Readme and tweaked Sentry icon

* Added a symbolic link to persist the stats folder between docker container updates.

Co-authored-by: Joseph Milazzo <joseph.v.milazzo@gmail.com>

* Lots of UI fixes and changes (#383)

* After we flatten, remove any non image files as we shouldn't iterate over them

* Fixed a case for next/prev chapter where if we have a volume then chapters attached afterwards, there would be improper movement due to how sorting works.

* Fixed an issue where no-connection would not resume where the loss of connection occurred

* Fixed an issue where after creating a new user, their Last Active Date would show as a weird date, instead of "Never"

* Sort files in card detail to mimic reading order

* Implemented a single source of executing actions (card or main page) and added actionables on library detail page.

* Refactored series actions into action service

* Implemented common handling for entity actions in a dedicated service

* Fixed build script for new monorepo layout.

* Cleaned up nav header subscriptions

* Updated the favicon/icons to work on different devices as home screen shortcuts


* Fixed: Fixed issue where if you had a volume with 1 volume based file and a chapter file, the next/prev chapters wouldn't work (Fixes #380)
* Fixed: When connection is lost to backend, saving current page url and resuming when connection reestablished was not working (Fixes #379)
* Fixed: When creating a new user, a strange date format was shown in Last Active due to not having been active. Now "Never" shows (Fixes #376)
* Fixed: When showing files for a volume/chapter, the files are now sorted in the order you will read them in (Fixes #378)
* Added: Library detail now has actionable menu next to header, so you can kick off a scan or metadata refresh (Closes #363)
* Changed: When performing actions like marking as read/unread on series detail page, the actionable button will disable until the request finishes. (Closes #381)
* Changed: Favicon and Icons have been updated so when saving webpage to home screen, it should show a proper icon (Closes #356)

* Lots of Bugfixes and Tweaks (#387)

* Fixed: Fixed a bug in how we get images. When dealing with raw images, we need special logic (Monorepo)
* Added: (Manga Reader) When we are within 10 pages of the beginning of a manga, prefetch the prev chapter
* Fixed: (Manga Reader) The slider would sometime skip pages and would have leftover track on last page. 
* Fixed: (Raw Images) When calculating cover image for Raw Image entities, only select image files
* Fixed: Fixed a logic bug where raw image based entities wouldn't send back the correct page (Monorepo)
* Changed: When deleting a library, it can take a long time. Disable delete buttons until the deletion finishes
* Added: (Parser) Added a regex case for "Series - Ch. 20 - Part"
* Changed: Try to show the files in volume/chapter detail modal in the reading order. 
* Fixed: Next/Previous chapter was not working in all cases from previous Monorepo commit.
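
The prefetch trigger described in this list could look like the following (threshold and names are assumptions for illustration):

```typescript
// Decide which adjacent chapter to prefetch: within 10 pages of the start,
// warm up the previous chapter; within 10 pages of the end, the next one.
function chapterToPrefetch(page: number, maxPages: number): 'prev' | 'next' | null {
  const threshold = 10;
  if (page < threshold) return 'prev';
  if (page >= maxPages - threshold) return 'next';
  return null;
}
```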

* Bugfix/locked name reset (#389)

* Fixed: Fixed an issue where if you manually rename a series, then remove/modify an entity related to the series, the series would be deleted and re-created with the original, parsed name.

* Scan Series (#390)

* Refactored Library delete to use a transaction.

* Ensure we parse "Series Name - Chapter XXX" before "Series Name - Vol XXX"

* Ensure if GetFoldersTillRoot is called with a fullPath containing a file, that we ignore the file for returned folders.

* Changed: The series actionable menu now offers "Scan Series" instead of "Scan Library". Rather than kicking off a filesystem scan of the whole library the series belongs to, it scans only the folders represented by that series. If the series has files in the root of the library, the library root is scanned but only that series' files are processed. This can make a refresh occur in under 500 ms (Fixes #371)
* Fixed: Fixed a bad parsing case for "Series Name - Vol.01 Chapter 029 8 Years Ago" where the chapter would parse as "029 8", making the file a special rather than chapter 29.

* Fixes a bug where the root path and the full path share a common word (e.g. root "/Test library" and full path "/Test library/Test"), which caused "/Test" to be stripped out of the root so GetFoldersTillRoot would never finish

* About Section (#394)


* Added: Added an about section with version, links to discord, github, donations, etc.
* Fixed: Fixed some parsing issues that caused "Series Name - Volume X Chapter Y" to parse as "Series Name - Volume X" from a previous change in develop.

* Cleaning up monorepo build files

* Fixing permission issues

Co-authored-by: Leonardo Dias <leo.rock14@gmail.com>
Co-authored-by: Robbie Davis <robbie@therobbiedavis.com>
Co-authored-by: Leonardo Dias <contato.leonardod@yahoo.com>
Co-authored-by: Milazzo, Joseph (jm520e) <jm520e@us.att.com>
Co-authored-by: Chris Plaatjes <kizaing@gmail.com>
This commit is contained in:
Joseph Milazzo 2021-07-17 14:03:11 -05:00 committed by GitHub
parent 2da77da51b
commit 2a34fe4cc7
374 changed files with 53069 additions and 617 deletions

.browserslistrc (new file)

@@ -0,0 +1,17 @@
# This file is used by the build system to adjust CSS and JS output to support the specified browsers below.
# For additional information regarding the format and rule options, please see:
# https://github.com/browserslist/browserslist#queries
# For the full list of supported browsers by the Angular framework, please see:
# https://angular.io/guide/browser-support
# You can see what browsers were selected by your queries by running:
# npx browserslist
last 1 Chrome version
last 1 Firefox version
last 2 Edge major versions
last 2 Safari major versions
last 2 iOS major versions
Firefox ESR
not IE 11 # Angular supports IE 11 only as an opt-in. To opt-in, remove the 'not' prefix on this line.

.editorconfig (new file)

@@ -0,0 +1,21 @@
# Editor configuration, see https://editorconfig.org
root = true
[*]
charset = utf-8
indent_style = space
indent_size = 4
insert_final_newline = true
trim_trailing_whitespace = true
[*.cs]
indent_size = 3
[*.ts]
quote_type = single
indent_size = 2
[*.md]
max_line_length = off
trim_trailing_whitespace = false


@@ -57,4 +57,4 @@ jobs:
       dotnet build --configuration Release
       .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"
     - name: Test
-      run: dotnet test --no-restore --verbosity normal
+      run: dotnet test --no-restore --verbosity normal

.github/workflows/monorepo-test.yml (new file)

@@ -0,0 +1,38 @@
name: Monorepo Build Test
on:
  push:
    branches:
      - 'feature/monorepo'
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Check Out Repo
        uses: actions/checkout@v2
      - name: NodeJS to Compile WebUI
        uses: actions/setup-node@v2.1.5
        with:
          node-version: '14'
      - run: |
          cd UI/Web || exit
          echo 'Installing web dependencies'
          npm install
          echo 'Building UI'
          npm run prod
          echo 'Copying back to Kavita wwwroot'
          rsync -a dist/ ../../API/wwwroot/
          cd ../ || exit
      - name: Compile dotnet app
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'
      - run: ./monorepo-build.sh


@@ -13,20 +13,13 @@ jobs:
       - name: Check Out Repo
         uses: actions/checkout@v2
-      - name: Check Out WebUI
-        uses: actions/checkout@v2
-        with:
-          repository: Kareadita/Kavita-webui
-          ref: develop
-          path: Kavita-webui/
       - name: NodeJS to Compile WebUI
         uses: actions/setup-node@v2.1.5
         with:
           node-version: '14'
       - run: |
-          cd Kavita-webui/ || exit
+          cd UI/Web || exit
          echo 'Installing web dependencies'
          npm install
@@ -34,7 +27,7 @@ jobs:
          npm run prod
          echo 'Copying back to Kavita wwwroot'
-          rsync -a dist/ ../API/wwwroot/
+          rsync -a dist/ ../../API/wwwroot/
          cd ../ || exit
@@ -42,8 +35,8 @@ jobs:
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'
-      - run: ./action-build.sh
+      - run: ./monorepo-build.sh
      - name: Login to Docker Hub
        uses: docker/login-action@v1
        with:

.github/workflows/sentry-release.yml vendored Normal file
View File

@ -0,0 +1,42 @@
name: Sentry Release
on:
push:
branches: [ main, develop, feature/sentry-release ]
jobs:
build:
name: Setup Sentry CLI
runs-on: ubuntu-latest
steps:
- uses: mathieu-bour/setup-sentry-cli@1.2.0
with:
version: latest
token: ${{ secrets.SENTRY_TOKEN }}
organization: kavita-7n
project: angular
- name: Checkout
uses: actions/checkout@v2
- name: Install NodeJS
uses: actions/setup-node@v2
with:
node-version: '12'
- name: Cache dependencies
uses: actions/cache@v2
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Install Dependencies
run: npm install
- name: Build Angular
run: npm run prod
- name: get-npm-version
id: package-version
uses: martinbeentjes/npm-get-version-action@master
- name: Create Release
run: sentry-cli releases new ${{ steps.package-version.outputs.current-version }}
- name: Upload Source Maps
run: sentry-cli releases files ${{ steps.package-version.outputs.current-version }} upload-sourcemaps ./dist
- name: Finalize Release
run: sentry-cli releases finalize ${{ steps.package-version.outputs.current-version }}

View File

@ -13,20 +13,13 @@ jobs:
- name: Check Out Repo
uses: actions/checkout@v2
- name: Check Out WebUI
uses: actions/checkout@v2
with:
repository: Kareadita/Kavita-webui
ref: main
path: Kavita-webui/
- name: NodeJS to Compile WebUI
uses: actions/setup-node@v2.1.5
with:
node-version: '14'
- run: |
cd Kavita-webui/ || exit
cd UI/Web || exit
echo 'Installing web dependencies'
npm install
@ -34,7 +27,7 @@ jobs:
npm run prod
echo 'Copying back to Kavita wwwroot'
rsync -a dist/ ../API/wwwroot/
rsync -a dist/ ../../API/wwwroot/
cd ../ || exit
@ -42,7 +35,7 @@ jobs:
uses: actions/setup-dotnet@v1
with:
dotnet-version: '5.0.x'
- run: ./action-build.sh
- run: ./monorepo-build.sh
- name: Login to Docker Hub
uses: docker/login-action@v1

.gitignore vendored
View File

@ -435,11 +435,54 @@ $RECYCLE.BIN/
##
## Visual Studio Code
##
# See http://help.github.com/ignore-files/ for more about ignoring files.
# compiled output
/UI/Web/dist
/tmp
/out-tsc
# Only exists if Bazel was run
/bazel-out
# dependencies
/node_modules
# profiling files
chrome-profiler-events*.json
speed-measure-plugin*.json
# IDEs and editors
/.idea
.project
.classpath
.c9/
*.launch
.settings/
*.sublime-workspace
# IDE - VSCode
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
.history/*
# misc
/.sass-cache
/connect.lock
/coverage
/libpeerconnection.log
npm-debug.log
yarn-error.log
testem.log
/typings
# System Files
.DS_Store
Thumbs.db
ssl/
# App specific
appsettings.json
@ -454,4 +497,5 @@ cache/
/API/temp/
_temp/
_output/
stats/
UI/Web/dist/

.vscode/settings.json vendored Normal file
View File

@ -0,0 +1,50 @@
{
"better-comments.tags": [
{
"tag": "note",
"color": "#FF2D00",
"strikethrough": false,
"underline": false,
"backgroundColor": "transparent",
"bold": true,
"italic": false
},
{
"tag": "?",
"color": "#3498DB",
"strikethrough": false,
"underline": false,
"backgroundColor": "transparent",
"bold": false,
"italic": false
},
{
"tag": "//",
"color": "#474747",
"strikethrough": true,
"underline": false,
"backgroundColor": "transparent",
"bold": false,
"italic": false
},
{
"tag": "todo",
"color": "#FF8C00",
"strikethrough": false,
"underline": false,
"backgroundColor": "transparent",
"bold": true,
"italic": false
},
{
"tag": "*",
"color": "#98C379",
"strikethrough": false,
"underline": false,
"backgroundColor": "transparent",
"bold": false,
"italic": false
}
]
}

View File

@ -1,5 +1,6 @@
using API.Entities;
using API.Extensions;
using API.Parser;
using Xunit;
namespace API.Tests.Extensions
@ -15,6 +16,7 @@ namespace API.Tests.Extensions
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot"}, new [] {"salem's lot"}, true)]
// Different normalizations pass as we check normalization against an on-the-fly calculation so we don't delete series just because we change how normalization works
[InlineData(new [] {"Salem's Lot", "Salem's Lot", "Salem's Lot", "salems lot"}, new [] {"salem's lot"}, true)]
[InlineData(new [] {"Rent-a-Girlfriend", "Rent-a-Girlfriend", "Kanojo, Okarishimasu", "rentagirlfriend"}, new [] {"Kanojo, Okarishimasu"}, true)]
public void NameInListTest(string[] seriesInput, string[] list, bool expected)
{
var series = new Series()
@ -25,8 +27,30 @@ namespace API.Tests.Extensions
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata()
};
Assert.Equal(expected, series.NameInList(list));
}
[Theory]
[InlineData(new [] {"Darker than Black", "Darker Than Black", "Darker than Black"}, "Darker than Black", true)]
[InlineData(new [] {"Rent-a-Girlfriend", "Rent-a-Girlfriend", "Kanojo, Okarishimasu", "rentagirlfriend"}, "Kanojo, Okarishimasu", true)]
[InlineData(new [] {"Rent-a-Girlfriend", "Rent-a-Girlfriend", "Kanojo, Okarishimasu", "rentagirlfriend"}, "Rent", false)]
public void NameInParserInfoTest(string[] seriesInput, string parserSeries, bool expected)
{
var series = new Series()
{
Name = seriesInput[0],
LocalizedName = seriesInput[1],
OriginalName = seriesInput[2],
NormalizedName = seriesInput.Length == 4 ? seriesInput[3] : API.Parser.Parser.Normalize(seriesInput[0]),
Metadata = new SeriesMetadata()
};
var info = new ParserInfo();
info.Series = parserSeries;
Assert.Equal(expected, series.NameInParserInfo(info));
}
}
}
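The InlineData above matches normalized forms like "rentagirlfriend" against raw names such as "Rent-a-Girlfriend". A minimal TypeScript sketch of the normalization those cases imply (the function name and exact character rules are assumptions; the test's own comment notes that older normalization schemes are tolerated on purpose):

```typescript
// Assumed sketch of Parser.Normalize: lowercase the name and strip any
// character that is not a letter or digit, so "Rent-a-Girlfriend" and
// "rentagirlfriend" compare equal.
function normalize(name: string): string {
  return name.toLowerCase().replace(/[^a-z0-9]/g, '');
}

normalize('Rent-a-Girlfriend'); // 'rentagirlfriend'
```

NameInList can then compare `normalize(candidate)` against both the stored NormalizedName and an on-the-fly normalization of each name, which is why changing the scheme does not orphan existing series.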

View File

@ -9,7 +9,7 @@ namespace API.Tests.Parser
public class MangaParserTests
{
private readonly ITestOutputHelper _testOutputHelper;
public MangaParserTests(ITestOutputHelper testOutputHelper)
{
_testOutputHelper = testOutputHelper;
@ -68,7 +68,7 @@ namespace API.Tests.Parser
{
Assert.Equal(expected, API.Parser.Parser.ParseVolume(filename));
}
[Theory]
[InlineData("Killing Bites Vol. 0001 Ch. 0001 - Galactica Scanlations (gb)", "Killing Bites")]
[InlineData("My Girlfriend Is Shobitch v01 - ch. 09 - pg. 008.png", "My Girlfriend Is Shobitch")]
@ -146,11 +146,19 @@ namespace API.Tests.Parser
[InlineData("Kodoja #001 (March 2016)", "Kodoja")]
[InlineData("Boku No Kokoro No Yabai Yatsu - Chapter 054 I Prayed At The Shrine (V0).cbz", "Boku No Kokoro No Yabai Yatsu")]
[InlineData("Kiss x Sis - Ch.36 - A Cold Home Visit.cbz", "Kiss x Sis")]
[InlineData("Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ)", "Seraph of the End - Vampire Reign")]
[InlineData("Grand Blue Dreaming - SP02 Extra (2019) (Digital) (danke-Empire).cbz", "Grand Blue Dreaming")]
[InlineData("Yuusha Ga Shinda! - Vol.tbd Chapter 27.001 V2 Infection ①.cbz", "Yuusha Ga Shinda!")]
[InlineData("Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz", "Seraph of the End - Vampire Reign")]
[InlineData("Getsuyoubi no Tawawa - Ch. 001 - Ai-chan, Part 1", "Getsuyoubi no Tawawa")]
[InlineData("Please Go Home, Akutsu-San! - Chapter 038.5 - Volume Announcement.cbz", "Please Go Home, Akutsu-San!")]
[InlineData("Killing Bites - Vol 11 Chapter 050 Save Me, Nunupi!.cbz", "Killing Bites")]
[InlineData("Mad Chimera World - Volume 005 - Chapter 026.cbz", "Mad Chimera World")]
public void ParseSeriesTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseSeries(filename));
}
[Theory]
[InlineData("Killing Bites Vol. 0001 Ch. 0001 - Galactica Scanlations (gb)", "1")]
[InlineData("My Girlfriend Is Shobitch v01 - ch. 09 - pg. 008.png", "9")]
@ -206,13 +214,14 @@ namespace API.Tests.Parser
[InlineData("Kiss x Sis - Ch.00 - Let's Start from 0.cbz", "0")]
[InlineData("[Hidoi]_Amaenaideyo_MS_vol01_chp02.rar", "2")]
[InlineData("Okusama wa Shougakusei c003 (v01) [bokuwaNEET]", "3")]
[InlineData("Kiss x Sis - Ch.15 - The Angst of a 15 Year Old Boy.cbz", "15")]
[InlineData("Tomogui Kyoushitsu - Chapter 006 Game 005 - Fingernails On Right Hand (Part 002).cbz", "6")]
[InlineData("Noblesse - Episode 406 (52 Pages).7z", "406")]
[InlineData("X-Men v1 #201 (September 2007).cbz", "201")]
[InlineData("Kodoja #001 (March 2016)", "1")]
[InlineData("Noblesse - Episode 429 (74 Pages).7z", "429")]
[InlineData("Boku No Kokoro No Yabai Yatsu - Chapter 054 I Prayed At The Shrine (V0).cbz", "54")]
[InlineData("Ijousha No Ai - Vol.01 Chapter 029 8 Years Ago", "29")]
[InlineData("Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz", "9")]
public void ParseChaptersTest(string filename, string expected)
{
Assert.Equal(expected, API.Parser.Parser.ParseChapter(filename));
@ -249,7 +258,7 @@ namespace API.Tests.Parser
{
Assert.Equal(expected, !string.IsNullOrEmpty(API.Parser.Parser.ParseMangaSpecial(input)));
}
[Theory]
[InlineData("image.png", MangaFormat.Image)]
[InlineData("image.cbz", MangaFormat.Archive)]
@ -266,6 +275,32 @@ namespace API.Tests.Parser
Assert.Equal(expected, API.Parser.Parser.ParseMangaSpecial(inputFile));
}
private static ParserInfo CreateParserInfo(string series, string chapter, string volume, bool isSpecial = false)
{
return new ParserInfo()
{
Chapters = chapter,
Volumes = volume,
IsSpecial = isSpecial,
Series = series,
};
}
[Theory]
[InlineData("/manga/Btooom!/Vol.1/Chapter 1/1.cbz", "Btooom!~1~1")]
[InlineData("/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Btooom!~1~2")]
public void ParseFromFallbackFoldersTest(string inputFile, string expectedParseInfo)
{
const string rootDirectory = "/manga/";
var tokens = expectedParseInfo.Split("~");
var actual = new ParserInfo {Chapters = "0", Volumes = "0"};
API.Parser.Parser.ParseFromFallbackFolders(inputFile, rootDirectory, LibraryType.Manga, ref actual);
Assert.Equal(tokens[0], actual.Series);
Assert.Equal(tokens[1], actual.Volumes);
Assert.Equal(tokens[2], actual.Chapters);
}
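ParseFromFallbackFolders itself is outside this diff; a hedged TypeScript sketch of the behaviour the two cases above imply (the regexes and the series heuristic are assumptions, not the real parser):

```typescript
// Assumed sketch: walk the folders between the file and the library root,
// pulling volume/chapter numbers out of folder names; a folder that matches
// neither pattern is taken as the series name.
function parseFromFallbackFolders(foldersInnermostFirst: string[]):
    { series: string; volumes: string; chapters: string } {
  let volumes = '0', chapters = '0', series = '';
  for (const folder of foldersInnermostFirst) {
    const vol = folder.match(/Vol\.?\s*(\d+)/i);
    const chp = folder.match(/Chapter\s*(\d+)/i);
    if (vol) volumes = vol[1];
    if (chp) chapters = chp[1];
    if (!vol && !chp) series = folder;
  }
  return { series, volumes, chapters };
}
```

This reproduces both expected parses: "Vol.1" and "Chapter 1" as separate folders, or "Vol.1 Chapter 2" combined in one folder, with "Btooom!" recovered as the series either way.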
[Fact]
public void ParseInfoTest()
{
@ -278,7 +313,7 @@ namespace API.Tests.Parser
Chapters = "76", Filename = "Mujaki no Rakuen Vol12 ch76.cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:/Manga/Shimoneta to Iu Gainen ga Sonzai Shinai Taikutsu na Sekai Man-hen/Vol 1.cbz";
expected.Add(filepath, new ParserInfo
{
@ -286,7 +321,7 @@ namespace API.Tests.Parser
Chapters = "0", Filename = "Vol 1.cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Beelzebub\Beelzebub_01_[Noodles].zip";
expected.Add(filepath, new ParserInfo
{
@ -294,7 +329,7 @@ namespace API.Tests.Parser
Chapters = "1", Filename = "Beelzebub_01_[Noodles].zip", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Ichinensei ni Nacchattara\Ichinensei_ni_Nacchattara_v01_ch01_[Taruby]_v1.1.zip";
expected.Add(filepath, new ParserInfo
{
@ -309,8 +344,8 @@ namespace API.Tests.Parser
Series = "Tenjo Tenge", Volumes = "1", Edition = "Full Contact Edition",
Chapters = "0", Filename = "Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Akame ga KILL! ZERO (2016-2019) (Digital) (LuCaZ)\Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz";
expected.Add(filepath, new ParserInfo
{
@ -318,7 +353,7 @@ namespace API.Tests.Parser
Chapters = "0", Filename = "Akame ga KILL! ZERO v01 (2016) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Dorohedoro\Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz";
expected.Add(filepath, new ParserInfo
{
@ -326,7 +361,7 @@ namespace API.Tests.Parser
Chapters = "0", Filename = "Dorohedoro v01 (2010) (Digital) (LostNerevarine-Empire).cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\APOSIMZ\APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz";
expected.Add(filepath, new ParserInfo
{
@ -334,7 +369,7 @@ namespace API.Tests.Parser
Chapters = "40", Filename = "APOSIMZ 040 (2020) (Digital) (danke-Empire).cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Corpse Party Musume\Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz";
expected.Add(filepath, new ParserInfo
{
@ -342,7 +377,7 @@ namespace API.Tests.Parser
Chapters = "9", Filename = "Kedouin Makoto - Corpse Party Musume, Chapter 09.cbz", Format = MangaFormat.Archive,
FullFilePath = filepath
});
filepath = @"E:\Manga\Goblin Slayer\Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire).cbz";
expected.Add(filepath, new ParserInfo
{
@ -351,6 +386,22 @@ namespace API.Tests.Parser
FullFilePath = filepath
});
filepath = @"E:\Manga\Summer Time Rendering\Specials\Record 014 (between chapter 083 and ch084) SP11.cbr";
expected.Add(filepath, new ParserInfo
{
Series = "Summer Time Rendering", Volumes = "0", Edition = "",
Chapters = "0", Filename = "Record 014 (between chapter 083 and ch084) SP11.cbr", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = true
});
filepath = @"E:\Manga\Seraph of the End\Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz";
expected.Add(filepath, new ParserInfo
{
Series = "Seraph of the End - Vampire Reign", Volumes = "0", Edition = "",
Chapters = "93", Filename = "Seraph of the End - Vampire Reign 093 (2020) (Digital) (LuCaZ).cbz", Format = MangaFormat.Archive,
FullFilePath = filepath, IsSpecial = false
});
foreach (var file in expected.Keys)
{
@ -380,4 +431,4 @@ namespace API.Tests.Parser
}
}
}
}

View File

@ -1,5 +1,5 @@
using System.IO;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using Microsoft.Extensions.Logging;
using NSubstitute;
@ -28,4 +28,4 @@ namespace API.Tests.Services
}
}
}
}

View File

@ -1,4 +1,5 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using API.Services;
@ -26,7 +27,7 @@ namespace API.Tests.Services
var files = new List<string>();
var fileCount = DirectoryService.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
API.Parser.Parser.ArchiveFileExtensions, _logger);
Assert.Equal(28, fileCount);
}
@ -37,7 +38,7 @@ namespace API.Tests.Services
var files = _directoryService.GetFiles(testDirectory, @"file\d*.txt");
Assert.Equal(2, files.Count());
}
[Fact]
public void GetFiles_TopLevel_ShouldBeEmpty_Test()
{
@ -45,7 +46,7 @@ namespace API.Tests.Services
var files = _directoryService.GetFiles(testDirectory);
Assert.Empty(files);
}
[Fact]
public void GetFilesWithExtensions_ShouldBeEmpty_Test()
{
@ -53,7 +54,7 @@ namespace API.Tests.Services
var files = _directoryService.GetFiles(testDirectory, "*.txt");
Assert.Empty(files);
}
[Fact]
public void GetFilesWithExtensions_Test()
{
@ -61,7 +62,7 @@ namespace API.Tests.Services
var files = _directoryService.GetFiles(testDirectory, ".cbz|.rar");
Assert.Equal(3, files.Count());
}
[Fact]
public void GetFilesWithExtensions_BadDirectory_ShouldBeEmpty_Test()
{
@ -78,7 +79,7 @@ namespace API.Tests.Services
Assert.Contains(dirs, s => s.Contains("regex"));
}
[Fact]
public void ListDirectory_NoSubDirectory_Test()
{
@ -93,10 +94,21 @@ namespace API.Tests.Services
[InlineData("C:/Manga", "C:/Manga/Love Hina/Specials/Omake/", "Omake,Specials,Love Hina")]
[InlineData("C:/Manga", @"C:\Manga\Love Hina\Specials\Omake\", "Omake,Specials,Love Hina")]
[InlineData(@"/manga/", @"/manga/Love Hina/Specials/Omake/", "Omake,Specials,Love Hina")]
[InlineData(@"/manga/", @"/manga/", "")]
[InlineData(@"E:\test", @"E:\test\Sweet X Trouble\Sweet X Trouble - Chapter 001.cbz", "Sweet X Trouble")]
[InlineData(@"C:\/mount/gdrive/Library/Test Library/Comics/", @"C:\/mount/gdrive/Library/Test Library/Comics\godzilla rivals vs hedorah\vol 1\", "vol 1,godzilla rivals vs hedorah")]
[InlineData(@"/manga/", @"/manga/Btooom!/Vol.1 Chapter 2/1.cbz", "Vol.1 Chapter 2,Btooom!")]
[InlineData(@"C:/", @"C://Btooom!/Vol.1 Chapter 2/1.cbz", "Vol.1 Chapter 2,Btooom!")]
[InlineData(@"C:\\", @"C://Btooom!/Vol.1 Chapter 2/1.cbz", "Vol.1 Chapter 2,Btooom!")]
[InlineData(@"C://mount/gdrive/Library/Test Library/Comics", @"C://mount/gdrive/Library/Test Library/Comics/Dragon Age/Test", "Test,Dragon Age")]
public void GetFoldersTillRoot_Test(string rootPath, string fullpath, string expectedArray)
{
var expected = expectedArray.Split(",");
if (expectedArray.Equals(string.Empty))
{
expected = Array.Empty<string>();
}
Assert.Equal(expected, DirectoryService.GetFoldersTillRoot(rootPath, fullpath));
}
}
}
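The table-driven cases above pin down GetFoldersTillRoot's contract: return the folder names between a path and the library root, innermost first, with mixed separators tolerated and the file name itself excluded. A TypeScript sketch under that reading (the separator normalization is an assumption):

```typescript
// Assumed sketch of GetFoldersTillRoot: normalize separators, strip the
// root prefix, split into folders, and drop the trailing file name when the
// path does not end in a separator.
function foldersTillRoot(rootPath: string, fullPath: string): string[] {
  const norm = (p: string) => p.replace(/\\/g, '/').replace(/\/+/g, '/');
  const root = norm(rootPath).replace(/\/$/, '');
  let full = norm(fullPath);
  const endsWithSlash = full.endsWith('/');
  full = full.replace(/\/$/, '');
  if (!full.startsWith(root)) return [];
  let parts = full.slice(root.length).split('/').filter(p => p.length > 0);
  if (!endsWithSlash) parts = parts.slice(0, -1); // last segment is the file
  return parts.reverse(); // innermost folder first
}
```

For example, root "/manga/" and path "/manga/Love Hina/Specials/Omake/" yield Omake, Specials, Love Hina, matching the expected array in the test.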

View File

@ -20,24 +20,24 @@ using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;
using Xunit.Abstractions;
namespace API.Tests.Services
{
public class ScannerServiceTests : IDisposable
{
private readonly ITestOutputHelper _testOutputHelper;
private readonly ScannerService _scannerService;
private readonly ILogger<ScannerService> _logger = Substitute.For<ILogger<ScannerService>>();
private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
private readonly IBookService _bookService = Substitute.For<IBookService>();
private readonly IImageService _imageService = Substitute.For<IImageService>();
private readonly ILogger<MetadataService> _metadataLogger = Substitute.For<ILogger<MetadataService>>();
private readonly IDirectoryService _directoryService = Substitute.For<IDirectoryService>();
private readonly DbConnection _connection;
private readonly DataContext _context;
public ScannerServiceTests(ITestOutputHelper testOutputHelper)
public ScannerServiceTests()
{
var contextOptions = new DbContextOptionsBuilder()
.UseSqlite(CreateInMemoryDatabase())
@ -46,21 +46,20 @@ namespace API.Tests.Services
_context = new DataContext(contextOptions);
Task.Run(SeedDb).GetAwaiter().GetResult();
//BackgroundJob.Enqueue is what I need to mock or something (it's static...)
// ICacheService cacheService, ILogger<TaskScheduler> logger, IScannerService scannerService,
// IUnitOfWork unitOfWork, IMetadataService metadataService, IBackupService backupService, ICleanupService cleanupService,
// IBackgroundJobClient jobClient
//var taskScheduler = new TaskScheduler(Substitute.For<ICacheService>(), Substitute.For<ILogger<TaskScheduler>>(), Substitute.For<)
// Substitute.For<UserManager<AppUser>>() - Not needed because only for UserService
IUnitOfWork unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
_testOutputHelper = testOutputHelper;
IMetadataService metadataService = Substitute.For<MetadataService>(unitOfWork, _metadataLogger, _archiveService, _bookService);
IMetadataService metadataService = Substitute.For<MetadataService>(unitOfWork, _metadataLogger, _archiveService, _bookService, _directoryService, _imageService);
_scannerService = new ScannerService(unitOfWork, _logger, _archiveService, metadataService, _bookService);
}
@ -90,12 +89,12 @@ namespace API.Tests.Services
//
// var series = _unitOfWork.LibraryRepository.GetLibraryForIdAsync(1).Result.Series;
// }
[Fact]
public void FindSeriesNotOnDisk_Should_RemoveNothing_Test()
{
var infos = new Dictionary<string, List<ParserInfo>>();
AddToParsedInfo(infos, new ParserInfo() {Series = "Darker than Black"});
AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "1"});
AddToParsedInfo(infos, new ParserInfo() {Series = "Cage of Eden", Volumes = "10"});
@ -140,7 +139,7 @@ namespace API.Tests.Services
{
Series = parsedInfoName
});
Assert.Equal(expected, actualName);
}
@ -158,7 +157,7 @@ namespace API.Tests.Services
EntityFactory.CreateSeries("Darker than Black Vol 1"),
};
existingSeries = ScannerService.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();
Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
Assert.Equal(missingSeries.Count, removeCount);
}
@ -194,12 +193,12 @@ namespace API.Tests.Services
collectedSeries[info.Series] = list;
}
}
}
// [Fact]
// public void ExistingOrDefault_Should_BeFromLibrary()
@ -257,7 +256,7 @@ namespace API.Tests.Services
// _testOutputHelper.WriteLine(_libraryMock.ToString());
Assert.True(true);
}
private static DbConnection CreateInMemoryDatabase()
{
var connection = new SqliteConnection("Filename=:memory:");
@ -269,4 +268,4 @@ namespace API.Tests.Services
public void Dispose() => _connection.Dispose();
}
}
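RemoveMissingSeries is asserted on above but its body is not in this diff; a hedged TypeScript sketch of the name-based filtering the assertions imply (the real method may well compare normalized or localized names too):

```typescript
// Assumed sketch: drop every existing series whose Name appears in the
// missing list, and report how many were removed.
interface Series { name: string; }
function removeMissingSeries(existing: Series[], missing: Series[]):
    { remaining: Series[]; removeCount: number } {
  const missingNames = new Set(missing.map(s => s.name));
  const remaining = existing.filter(s => !missingNames.has(s.name));
  return { remaining, removeCount: existing.length - remaining.length };
}
```

This matches the two assertions in the test: the missing series no longer appears in the result, and removeCount equals the number of series removed.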

View File

@ -16,7 +16,7 @@
<PropertyGroup>
<Product>Kavita</Product>
<Company>kareadita.github.io</Company>
<Copyright>Copyright 2020-$([System.DateTime]::Now.ToString('yyyy')) kareadita.github.io (GNU General Public v3)</Copyright>
<Copyright>Copyright 2020-$([System.DateTime]::Now.ToString('yyyy')) kavitareader.com (GNU General Public v3)</Copyright>
<!-- Should be replaced by CI -->
<AssemblyVersion>0.4.1</AssemblyVersion>
@ -50,9 +50,9 @@
<PackageReference Include="NetVips" Version="2.0.0" />
<PackageReference Include="NetVips.Native" Version="8.10.6" />
<PackageReference Include="NReco.Logging.File" Version="1.1.1" />
<PackageReference Include="Sentry.AspNetCore" Version="3.3.4" />
<PackageReference Include="SharpCompress" Version="0.28.1" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.20.0.28934">
<PackageReference Include="Sentry.AspNetCore" Version="3.7.0" />
<PackageReference Include="SharpCompress" Version="0.28.3" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.25.0.33663">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
@ -61,6 +61,12 @@
<PackageReference Include="VersOne.Epub" Version="3.0.3.1" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Kavita.Common\Kavita.Common.csproj" />
</ItemGroup>
<ItemGroup>
<None Remove="Hangfire-log.db" />
<None Remove="obj\**" />
@ -207,8 +213,4 @@
<_ContentIncludedByDefault Remove="wwwroot\vendor.6b2a0912ae80e6fd297f.js.map" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Kavita.Common\Kavita.Common.csproj" />
</ItemGroup>
</Project>

View File

@ -15,4 +15,25 @@ namespace API.Comparators
return x.CompareTo(y);
}
}
}
/// <summary>
/// This is a special case comparer used exclusively for sorting chapters within a single Volume for reading order.
/// <example>
/// Volume 10 has "Series - Vol 10" and "Series - Vol 10 Chapter 81". In this case, for reading order, the order is Vol 10, Vol 10 Chapter 81.
/// This is represented by Chapter 0, Chapter 81.
/// </example>
/// </summary>
public class ChapterSortComparerZeroFirst : IComparer<double>
{
public int Compare(double x, double y)
{
if (x == 0.0 && y == 0.0) return 0;
// if x is 0, it comes first
if (x == 0.0) return -1;
// if y is 0, it comes first
if (y == 0.0) return 1;
return x.CompareTo(y);
}
}
}
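The zero-first rule above translates directly into a comparator in TypeScript; a small sketch using the reading-order example from the doc comment (Vol 10 full-volume file, then Vol 10 Chapter 81):

```typescript
// Chapter 0 represents a full-volume file, so it must sort before any
// numbered chapter; everything else sorts numerically.
const chapterZeroFirstCompare = (x: number, y: number): number => {
  if (x === 0 && y === 0) return 0;
  if (x === 0) return -1;
  if (y === 0) return 1;
  return x - y;
};

[81, 0].sort(chapterZeroFirstCompare); // [0, 81]
```

Note the contrast with the plain ChapterSortComparer, where 0 sorts last; this variant exists only for in-volume reading order.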

View File

@ -4,6 +4,7 @@ using System.Threading.Tasks;
using API.DTOs;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Services;
using API.Services;
using HtmlAgilityPack;
using Microsoft.AspNetCore.Mvc;

View File

@ -25,8 +25,8 @@ namespace API.Controllers
private readonly ITaskScheduler _taskScheduler;
private readonly IUnitOfWork _unitOfWork;
public LibraryController(IDirectoryService directoryService,
ILogger<LibraryController> logger, IMapper mapper, ITaskScheduler taskScheduler,
IUnitOfWork unitOfWork)
{
_directoryService = directoryService;
@ -35,7 +35,7 @@ namespace API.Controllers
_taskScheduler = taskScheduler;
_unitOfWork = unitOfWork;
}
/// <summary>
/// Creates a new Library. Upon library creation, adds new library to all Admin accounts.
/// </summary>
@ -49,7 +49,7 @@ namespace API.Controllers
{
return BadRequest("Library name already exists. Please choose a name unique to the server.");
}
var library = new Library
{
Name = createLibraryDto.Name,
@ -58,14 +58,14 @@ namespace API.Controllers
};
_unitOfWork.LibraryRepository.Add(library);
var admins = (await _unitOfWork.UserRepository.GetAdminUsersAsync()).ToList();
foreach (var admin in admins)
{
admin.Libraries ??= new List<Library>();
admin.Libraries.Add(library);
}
if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical issue. Please try again.");
@ -92,7 +92,7 @@ namespace API.Controllers
return Ok(_directoryService.ListDirectory(path));
}
[HttpGet]
public async Task<ActionResult<IEnumerable<LibraryDto>>> GetLibraries()
{
@ -105,10 +105,10 @@ namespace API.Controllers
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(updateLibraryForUserDto.Username);
if (user == null) return BadRequest("Could not validate user");
var libraryString = String.Join(",", updateLibraryForUserDto.SelectedLibraries.Select(x => x.Name));
_logger.LogInformation("Granting user {UserName} access to: {Libraries}", updateLibraryForUserDto.Username, libraryString);
var allLibraries = await _unitOfWork.LibraryRepository.GetLibrariesAsync();
foreach (var library in allLibraries)
{
@ -117,16 +117,16 @@ namespace API.Controllers
var libraryIsSelected = updateLibraryForUserDto.SelectedLibraries.Any(l => l.Id == library.Id);
if (libraryContainsUser && !libraryIsSelected)
{
// Remove
library.AppUsers.Remove(user);
}
else if (!libraryContainsUser && libraryIsSelected)
{
library.AppUsers.Add(user);
}
}
}
if (!_unitOfWork.HasChanges())
{
_logger.LogInformation("Added: {SelectedLibraries} to {Username}",libraryString, updateLibraryForUserDto.Username);
@ -138,8 +138,8 @@ namespace API.Controllers
_logger.LogInformation("Added: {SelectedLibraries} to {Username}",libraryString, updateLibraryForUserDto.Username);
return Ok(_mapper.Map<MemberDto>(user));
}
return BadRequest("There was a critical issue. Please try again.");
}
@ -150,7 +150,7 @@ namespace API.Controllers
_taskScheduler.ScanLibrary(libraryId);
return Ok();
}
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("refresh-metadata")]
public ActionResult RefreshMetadata(int libraryId)
@ -164,7 +164,7 @@ namespace API.Controllers
{
return Ok(await _unitOfWork.LibraryRepository.GetLibraryDtosForUsernameAsync(User.GetUsername()));
}
[Authorize(Policy = "RequireAdminRole")]
[HttpDelete("delete")]
public async Task<ActionResult<bool>> DeleteLibrary(int libraryId)
@ -176,13 +176,25 @@ namespace API.Controllers
var chapterIds =
await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(seriesIds);
var result = await _unitOfWork.LibraryRepository.DeleteLibrary(libraryId);
if (result && chapterIds.Any())
try
{
_taskScheduler.CleanupChapters(chapterIds);
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId);
_unitOfWork.LibraryRepository.Delete(library);
await _unitOfWork.CommitAsync();
if (chapterIds.Any())
{
_taskScheduler.CleanupChapters(chapterIds);
}
return Ok(true);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was a critical error trying to delete the library");
await _unitOfWork.RollbackAsync();
return Ok(false);
}
return Ok(result);
}
[Authorize(Policy = "RequireAdminRole")]
@ -204,20 +216,20 @@ namespace API.Controllers
{
_taskScheduler.ScanLibrary(library.Id, true);
}
return Ok();
}
[HttpGet("search")]
public async Task<ActionResult<IEnumerable<SearchResultDto>>> Search(string queryString)
{
queryString = queryString.Replace(@"%", "");
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
// Get libraries user has access to
var libraries = (await _unitOfWork.LibraryRepository.GetLibrariesForUserIdAsync(user.Id)).ToList();
if (!libraries.Any()) return BadRequest("User does not have access to any libraries");
var series = await _unitOfWork.SeriesRepository.SearchSeries(libraries.Select(l => l.Id).ToArray(), queryString);
@ -231,4 +243,4 @@ namespace API.Controllers
return Ok(await _unitOfWork.LibraryRepository.GetLibraryTypeAsync(libraryId));
}
}
}

View File

@ -22,6 +22,7 @@ namespace API.Controllers
private readonly ILogger<ReaderController> _logger;
private readonly IUnitOfWork _unitOfWork;
private readonly ChapterSortComparer _chapterSortComparer = new ChapterSortComparer();
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public ReaderController(IDirectoryService directoryService, ICacheService cacheService,
ILogger<ReaderController> logger, IUnitOfWork unitOfWork)
@ -44,7 +45,7 @@ namespace API.Controllers
var content = await _directoryService.ReadFileAsync(path);
var format = Path.GetExtension(path).Replace(".", "");
// Calculates SHA1 Hash for byte[]
Response.AddCacheHeader(content);
@ -54,6 +55,7 @@ namespace API.Controllers
[HttpGet("chapter-info")]
public async Task<ActionResult<ChapterInfoDto>> GetChapterInfo(int chapterId)
{
// PERF: Write this in one DB call
var chapter = await _cacheService.Ensure(chapterId);
if (chapter == null) return BadRequest("Could not find Chapter");
var volume = await _unitOfWork.SeriesRepository.GetVolumeAsync(chapter.VolumeId);
@ -108,7 +110,7 @@ namespace API.Controllers
foreach (var chapter in volume.Chapters)
{
var userProgress = user.Progresses.SingleOrDefault(x => x.ChapterId == chapter.Id && x.AppUserId == user.Id);
if (userProgress == null) // I need to get all chapters and generate new user progresses for them?
{
user.Progresses.Add(new AppUserProgress
{
@ -126,18 +128,18 @@ namespace API.Controllers
}
}
}
_unitOfWork.UserRepository.Update(user);
if (await _unitOfWork.CommitAsync())
{
return Ok();
}
return BadRequest("There was an issue saving progress");
}
[HttpPost("mark-unread")]
public async Task<ActionResult> MarkUnread(MarkReadDto markReadDto)
{
@ -167,15 +169,15 @@ namespace API.Controllers
}
}
}
_unitOfWork.UserRepository.Update(user);
if (await _unitOfWork.CommitAsync())
{
return Ok();
}
return BadRequest("There was an issue saving progress");
}
@ -183,8 +185,7 @@ namespace API.Controllers
public async Task<ActionResult> MarkVolumeAsRead(MarkVolumeReadDto markVolumeReadDto)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
_logger.LogDebug("Saving {UserName} progress for Volume {VolumeID} to read", user.UserName, markVolumeReadDto.VolumeId);
var chapters = await _unitOfWork.VolumeRepository.GetChaptersAsync(markVolumeReadDto.VolumeId);
foreach (var chapter in chapters)
{
@ -208,7 +209,7 @@ namespace API.Controllers
userProgress.VolumeId = markVolumeReadDto.VolumeId;
}
}
_unitOfWork.UserRepository.Update(user);
if (await _unitOfWork.CommitAsync())
@ -217,14 +218,13 @@ namespace API.Controllers
}
return BadRequest("Could not save progress");
}
}
[HttpPost("bookmark")]
public async Task<ActionResult> Bookmark(BookmarkDto bookmarkDto)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
_logger.LogDebug("Saving {UserName} progress for Chapter {ChapterId} to page {PageNum}", user.UserName, bookmarkDto.ChapterId, bookmarkDto.PageNum);
// Don't let the user bookmark past the total page count.
var chapter = await _unitOfWork.VolumeRepository.GetChapterAsync(bookmarkDto.ChapterId);
if (bookmarkDto.PageNum > chapter.Pages)
@ -236,8 +236,8 @@ namespace API.Controllers
{
return BadRequest("Can't bookmark less than 0");
}
user.Progresses ??= new List<AppUserProgress>();
var userProgress = user.Progresses.SingleOrDefault(x => x.ChapterId == bookmarkDto.ChapterId && x.AppUserId == user.Id);
@ -261,7 +261,7 @@ namespace API.Controllers
userProgress.BookScrollId = bookmarkDto.BookScrollId;
userProgress.LastModified = DateTime.Now;
}
_unitOfWork.UserRepository.Update(user);
if (await _unitOfWork.CommitAsync())
@ -275,6 +275,9 @@ namespace API.Controllers
/// <summary>
/// Returns the next logical chapter from the series.
/// </summary>
/// <example>
/// V1 → V2 → V3 chapter 0 → V3 chapter 10 → SP 01 → SP 02
/// </example>
/// <param name="seriesId"></param>
/// <param name="volumeId"></param>
/// <param name="currentChapterId"></param>
@ -288,6 +291,7 @@ namespace API.Controllers
var currentChapter = await _unitOfWork.VolumeRepository.GetChapterAsync(currentChapterId);
if (currentVolume.Number == 0)
{
// Handle specials
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer), currentChapter.Number);
if (chapterId > 0) return Ok(chapterId);
}
@ -295,14 +299,24 @@ namespace API.Controllers
foreach (var volume in volumes)
{
if (volume.Number == currentVolume.Number && volume.Chapters.Count > 1)
{
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer), currentChapter.Number);
{
// Handle Chapters within current Volume
// In this case, 0 needs to come first because 0 represents a full volume file.
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting), currentChapter.Number);
if (chapterId > 0) return Ok(chapterId);
}
if (volume.Number == currentVolume.Number + 1)
{
return Ok(volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).FirstOrDefault()?.Id);
// Handle Chapters within next Volume
// ! When selecting the chapter for the next volume, we need to make sure a c0 comes before a c1+
var chapters = volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).ToList();
if (currentChapter.Number.Equals("0") && chapters.Last().Number.Equals("0"))
{
return chapters.Last().Id;
}
return Ok(chapters.FirstOrDefault()?.Id);
}
}
return Ok(-1);
@ -311,7 +325,8 @@ namespace API.Controllers
private static int GetNextChapterId(IEnumerable<Chapter> chapters, string currentChapterNumber)
{
var next = false;
foreach (var chapter in chapters)
var chaptersList = chapters.ToList();
foreach (var chapter in chaptersList)
{
if (next)
{
@ -326,6 +341,9 @@ namespace API.Controllers
/// <summary>
/// Returns the previous logical chapter from the series.
/// </summary>
/// <example>
/// V1 ← V2 ← V3 chapter 0 ← V3 chapter 10 ← SP 01 ← SP 02
/// </example>
/// <param name="seriesId"></param>
/// <param name="volumeId"></param>
/// <param name="currentChapterId"></param>
@ -337,7 +355,7 @@ namespace API.Controllers
var volumes = await _unitOfWork.SeriesRepository.GetVolumesDtoAsync(seriesId, user.Id);
var currentVolume = await _unitOfWork.SeriesRepository.GetVolumeAsync(volumeId);
var currentChapter = await _unitOfWork.VolumeRepository.GetChapterAsync(currentChapterId);
if (currentVolume.Number == 0)
{
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).Reverse(), currentChapter.Number);
@ -348,16 +366,16 @@ namespace API.Controllers
{
if (volume.Number == currentVolume.Number)
{
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).Reverse(), currentChapter.Number);
var chapterId = GetNextChapterId(currentVolume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).Reverse(), currentChapter.Number);
if (chapterId > 0) return Ok(chapterId);
}
if (volume.Number == currentVolume.Number - 1)
{
return Ok(volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparer).LastOrDefault()?.Id);
return Ok(volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).LastOrDefault()?.Id);
}
}
return Ok(-1);
}
}
}
}
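The next/previous chapter endpoints above both reduce to a linear scan over a sorted chapter list: walk the chapters in order, and once the current chapter is seen, return the next one. A Python sketch of that scan (dict-shaped chapters are a stand-in for the project's C# entities; sorting is assumed to have been done by the caller, as the C# code does with its two sort comparers that decide whether chapter 0, a full-volume file, goes first or last):

```python
def next_chapter_id(chapters, current_number):
    """Return the id of the chapter after current_number, or -1 if none.

    `chapters` must already be in reading order; for the previous-chapter
    case the C# code simply passes the same list reversed.
    """
    found = False
    for ch in chapters:
        if found:
            return ch["id"]
        if ch["number"] == current_number:
            found = True
    return -1  # mirrors the controller's Ok(-1) "no next chapter" sentinel
```

Reversing the input gives the previous-chapter behavior, which is exactly how the `GetPreviousChapter` hunk reuses `GetNextChapterId` with `.Reverse()`.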

View File

@ -26,24 +26,24 @@ namespace API.Controllers
_taskScheduler = taskScheduler;
_unitOfWork = unitOfWork;
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Series>>> GetSeriesForLibrary(int libraryId, [FromQuery] UserParams userParams)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
var series =
await _unitOfWork.SeriesRepository.GetSeriesDtoForLibraryIdAsync(libraryId, user.Id, userParams);
// Apply progress/rating information (I can't work out how to do this in initial query)
if (series == null) return BadRequest("Could not get series for library");
await _unitOfWork.SeriesRepository.AddSeriesModifiers(user.Id, series);
Response.AddPaginationHeader(series.CurrentPage, series.PageSize, series.TotalCount, series.TotalPages);
return Ok(series);
}
[HttpGet("{seriesId}")]
public async Task<ActionResult<SeriesDto>> GetSeries(int seriesId)
{
@ -59,7 +59,7 @@ namespace API.Controllers
var chapterIds = (await _unitOfWork.SeriesRepository.GetChapterIdsForSeriesAsync(new []{seriesId}));
_logger.LogInformation("Series {SeriesId} is being deleted by {UserName}", seriesId, username);
var result = await _unitOfWork.SeriesRepository.DeleteSeriesAsync(seriesId);
if (result)
{
_taskScheduler.CleanupChapters(chapterIds);
@ -78,22 +78,22 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
return Ok(await _unitOfWork.SeriesRepository.GetVolumesDtoAsync(seriesId, user.Id));
}
[HttpGet("volume")]
public async Task<ActionResult<VolumeDto>> GetVolume(int volumeId)
{
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
return Ok(await _unitOfWork.SeriesRepository.GetVolumeDtoAsync(volumeId, user.Id));
}
[HttpGet("chapter")]
public async Task<ActionResult<VolumeDto>> GetChapter(int chapterId)
{
return Ok(await _unitOfWork.VolumeRepository.GetChapterDtoAsync(chapterId));
}
[HttpPost("update-rating")]
public async Task<ActionResult> UpdateSeriesRating(UpdateSeriesRatingDto updateSeriesRatingDto)
@ -105,13 +105,13 @@ namespace API.Controllers
userRating.Rating = updateSeriesRatingDto.UserRating;
userRating.Review = updateSeriesRatingDto.UserReview;
userRating.SeriesId = updateSeriesRatingDto.SeriesId;
if (userRating.Id == 0)
{
user.Ratings ??= new List<AppUserRating>();
user.Ratings.Add(userRating);
}
_unitOfWork.UserRepository.Update(user);
if (!await _unitOfWork.CommitAsync()) return BadRequest("There was a critical error.");
@ -127,7 +127,7 @@ namespace API.Controllers
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(updateSeries.Id);
if (series == null) return BadRequest("Series does not exist");
if (series.Name != updateSeries.Name && await _unitOfWork.SeriesRepository.DoesSeriesNameExistInLibrary(updateSeries.Name))
{
return BadRequest("A series already exists in this library with this name. Series Names must be unique to a library.");
@ -143,7 +143,7 @@ namespace API.Controllers
{
return Ok();
}
return BadRequest("There was an error with updating the series");
}
@ -180,13 +180,21 @@ namespace API.Controllers
return Ok();
}
[Authorize(Policy = "RequireAdminRole")]
[HttpPost("scan")]
public ActionResult ScanSeries(RefreshSeriesDto refreshSeriesDto)
{
_taskScheduler.ScanSeries(refreshSeriesDto.LibraryId, refreshSeriesDto.SeriesId);
return Ok();
}
[HttpGet("metadata")]
public async Task<ActionResult<SeriesMetadataDto>> GetSeriesMetadata(int seriesId)
{
var metadata = await _unitOfWork.SeriesRepository.GetSeriesMetadata(seriesId);
return Ok(metadata);
}
[HttpPost("metadata")]
public async Task<ActionResult> UpdateSeriesMetadata(UpdateSeriesMetadataDto updateSeriesMetadataDto)
{
@ -221,7 +229,7 @@ namespace API.Controllers
var existingTag = series.Metadata.CollectionTags.SingleOrDefault(t => t.Title == tag.Title);
if (existingTag != null)
{
// Update existingTag
// Update existingTag
existingTag.Promoted = tag.Promoted;
existingTag.Title = tag.Title;
existingTag.NormalizedTitle = Parser.Parser.Normalize(tag.Title).ToUpper();
@ -263,17 +271,17 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByUsernameAsync(User.GetUsername());
var series =
await _unitOfWork.SeriesRepository.GetSeriesDtoForCollectionAsync(collectionId, user.Id, userParams);
// Apply progress/rating information (I can't work out how to do this in initial query)
if (series == null) return BadRequest("Could not get series for collection");
await _unitOfWork.SeriesRepository.AddSeriesModifiers(user.Id, series);
Response.AddPaginationHeader(series.CurrentPage, series.PageSize, series.TotalCount, series.TotalPages);
return Ok(series);
}
}
}
}

View File

@ -1,9 +1,12 @@
using System;
using System.IO;
using System.Threading.Tasks;
using API.DTOs;
using API.Extensions;
using API.Interfaces.Services;
using API.Services;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
@ -30,16 +33,26 @@ namespace API.Controllers
_backupService = backupService;
_archiveService = archiveService;
}
[HttpPost("restart")]
public ActionResult RestartServer()
{
_logger.LogInformation("{UserName} is restarting server from admin dashboard", User.GetUsername());
_applicationLifetime.StopApplication();
return Ok();
}
/// <summary>
/// Returns non-sensitive information about the current system
/// </summary>
/// <returns></returns>
[HttpGet("server-info")]
public ActionResult<ServerInfoDto> GetVersion()
{
return Ok(StatsService.GetServerInfo());
}
[HttpGet("logs")]
public async Task<ActionResult> GetLogs()
{
@ -47,14 +60,14 @@ namespace API.Controllers
try
{
var (fileBytes, zipPath) = await _archiveService.CreateZipForDownload(files, "logs");
return File(fileBytes, "application/zip", Path.GetFileName(zipPath));
return File(fileBytes, "application/zip", Path.GetFileName(zipPath));
}
catch (KavitaException ex)
{
return BadRequest(ex.Message);
}
}
}
}
}

View File

@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
@ -9,6 +10,7 @@ using API.Extensions;
using API.Helpers.Converters;
using API.Interfaces;
using Kavita.Common;
using Kavita.Common.Extensions;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
@ -31,7 +33,7 @@ namespace API.Controllers
_taskScheduler = taskScheduler;
_configuration = configuration;
}
[HttpGet("")]
public async Task<ActionResult<ServerSettingDto>> GetSettings()
{
@ -40,7 +42,7 @@ namespace API.Controllers
settingsDto.LoggingLevel = Configuration.GetLogLevel(Program.GetAppSettingFilename());
return Ok(settingsDto);
}
[HttpPost("")]
public async Task<ActionResult<ServerSettingDto>> UpdateSettings(ServerSettingDto updateSettingsDto)
{
@ -58,7 +60,7 @@ namespace API.Controllers
// We do not allow CacheDirectory changes, so we will ignore.
var currentSettings = await _unitOfWork.SettingsRepository.GetSettingsAsync();
var logLevelOptions = new LogLevelOptions();
_configuration.GetSection("Logging:LogLevel").Bind(logLevelOptions);
@ -75,7 +77,7 @@ namespace API.Controllers
setting.Value = updateSettingsDto.TaskScan;
_unitOfWork.SettingsRepository.Update(setting);
}
if (setting.Key == ServerSettingKey.Port && updateSettingsDto.Port + "" != setting.Value)
{
setting.Value = updateSettingsDto.Port + "";
@ -83,14 +85,14 @@ namespace API.Controllers
Configuration.UpdatePort(Program.GetAppSettingFilename(), updateSettingsDto.Port);
_unitOfWork.SettingsRepository.Update(setting);
}
if (setting.Key == ServerSettingKey.LoggingLevel && updateSettingsDto.LoggingLevel + "" != setting.Value)
{
setting.Value = updateSettingsDto.LoggingLevel + "";
Configuration.UpdateLogLevel(Program.GetAppSettingFilename(), updateSettingsDto.LoggingLevel);
_unitOfWork.SettingsRepository.Update(setting);
}
if (setting.Key == ServerSettingKey.AllowStatCollection && updateSettingsDto.AllowStatCollection + "" != setting.Value)
{
setting.Value = updateSettingsDto.AllowStatCollection + "";
@ -105,7 +107,7 @@ namespace API.Controllers
}
}
}
_configuration.GetSection("Logging:LogLevel:Default").Value = updateSettingsDto.LoggingLevel + "";
if (!_unitOfWork.HasChanges()) return Ok("Nothing was updated");
@ -119,23 +121,23 @@ namespace API.Controllers
_taskScheduler.ScheduleTasks();
return Ok(updateSettingsDto);
}
[HttpGet("task-frequencies")]
public ActionResult<IEnumerable<string>> GetTaskFrequencies()
{
return Ok(CronConverter.Options);
}
[HttpGet("library-types")]
public ActionResult<IEnumerable<string>> GetLibraryTypes()
{
return Ok(Enum.GetNames(typeof(LibraryType)));
return Ok(Enum.GetValues<LibraryType>().Select(t => t.ToDescription()));
}
[HttpGet("log-levels")]
public ActionResult<IEnumerable<string>> GetLogLevels()
{
return Ok(new [] {"Trace", "Debug", "Information", "Warning", "Critical"});
}
}
}
}
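The `GetLibraryTypes` change above swaps raw enum names (`Enum.GetNames`) for each member's `[Description]` attribute text, so the UI can show "Images (Manga)" instead of "MangaImages". A Python sketch of the same idea (the tuple-valued enum is purely illustrative, not the project's API; the descriptions mirror the `LibraryType` hunk later in this diff):

```python
from enum import Enum

class LibraryType(Enum):
    # (ordinal, description) pairs, mirroring the C# [Description] attributes
    Manga = (0, "Manga")
    Comic = (1, "Comic")
    Book = (2, "Book")
    MangaImages = (3, "Images (Manga)")
    ComicImages = (4, "Images (Comic)")

    @property
    def description(self):
        return self.value[1]

def library_type_labels():
    # Rough equivalent of Enum.GetValues<LibraryType>().Select(t => t.ToDescription())
    return [t.description for t in LibraryType]
```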

View File

@ -21,7 +21,7 @@ namespace API.Data
_context = context;
_mapper = mapper;
}
public void Add(Library library)
{
_context.Library.Add(library);
@ -32,6 +32,11 @@ namespace API.Data
_context.Entry(library).State = EntityState.Modified;
}
public void Delete(Library library)
{
_context.Library.Remove(library);
}
public async Task<IEnumerable<LibraryDto>> GetLibraryDtosForUsernameAsync(string userName)
{
return await _context.Library
@ -43,7 +48,7 @@ namespace API.Data
.AsSingleQuery()
.ToListAsync();
}
public async Task<IEnumerable<Library>> GetLibrariesAsync()
{
return await _context.Library
@ -101,7 +106,7 @@ namespace API.Data
/// <returns></returns>
public async Task<Library> GetFullLibraryForIdAsync(int libraryId)
{
return await _context.Library
.Where(x => x.Id == libraryId)
.Include(f => f.Folders)
@ -114,7 +119,29 @@ namespace API.Data
.AsSplitQuery()
.SingleAsync();
}
/// <summary>
/// This is a heavy call that pulls all entities for a Library, but this version only grabs the one series matching seriesId
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
/// <returns></returns>
public async Task<Library> GetFullLibraryForIdAsync(int libraryId, int seriesId)
{
return await _context.Library
.Where(x => x.Id == libraryId)
.Include(f => f.Folders)
.Include(l => l.Series.Where(s => s.Id == seriesId))
.ThenInclude(s => s.Metadata)
.Include(l => l.Series.Where(s => s.Id == seriesId))
.ThenInclude(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
.AsSplitQuery()
.SingleAsync();
}
public async Task<bool> LibraryExists(string libraryName)
{
return await _context.Library
@ -131,7 +158,7 @@ namespace API.Data
.ProjectTo<LibraryDto>(_mapper.ConfigurationProvider)
.ToListAsync();
}
}
}
}

View File

@ -9,6 +9,10 @@ namespace API.Entities.Enums
[Description("Comic")]
Comic = 1,
[Description("Book")]
Book = 2
Book = 2,
[Description("Images (Manga)")]
MangaImages = 3,
[Description("Images (Comic)")]
ComicImages = 4
}
}
}

View File

@ -5,10 +5,13 @@ namespace API.Entities.Enums
public enum ReaderMode
{
[Description("Left and Right")]
// ReSharper disable once InconsistentNaming
MANGA_LR = 0,
[Description("Up and Down")]
// ReSharper disable once InconsistentNaming
MANGA_UP = 1,
[Description("Webtoon")]
// ReSharper disable once InconsistentNaming
WEBTOON = 2
}
}
}

View File

@ -7,7 +7,7 @@ namespace API.Entities
{
public int Id { get; set; }
public string Name { get; set; }
// TODO: MetadataUpdate add ProviderId
// MetadataUpdate add ProviderId
[ConcurrencyCheck]
public uint RowVersion { get; set; }
@ -17,4 +17,4 @@ namespace API.Entities
RowVersion++;
}
}
}
}

View File

@ -9,17 +9,13 @@ namespace API.Entities
public class SeriesMetadata : IHasConcurrencyToken
{
public int Id { get; set; }
/// <summary>
/// Publisher of book or manga/comic
/// </summary>
//public string Publisher { get; set; }
public ICollection<CollectionTag> CollectionTags { get; set; }
// Relationship
public Series Series { get; set; }
public int SeriesId { get; set; }
[ConcurrencyCheck]
public uint RowVersion { get; set; }
@ -28,4 +24,4 @@ namespace API.Entities
RowVersion++;
}
}
}
}

View File

@ -16,7 +16,7 @@ namespace API.Extensions
{
public static class ApplicationServiceExtensions
{
public static IServiceCollection AddApplicationServices(this IServiceCollection services, IConfiguration config, IWebHostEnvironment env)
public static void AddApplicationServices(this IServiceCollection services, IConfiguration config, IWebHostEnvironment env)
{
services.AddAutoMapper(typeof(AutoMapperProfiles).Assembly);
services.AddScoped<IStatsService, StatsService>();
@ -31,19 +31,13 @@ namespace API.Extensions
services.AddScoped<IBackupService, BackupService>();
services.AddScoped<ICleanupService, CleanupService>();
services.AddScoped<IBookService, BookService>();
services.AddScoped<IImageService, ImageService>();
services.AddSqLite(config, env);
services.AddLogging(loggingBuilder =>
{
var loggingSection = config.GetSection("Logging");
loggingBuilder.AddFile(loggingSection);
});
return services;
services.AddLogging(config);
}
private static IServiceCollection AddSqLite(this IServiceCollection services, IConfiguration config,
private static void AddSqLite(this IServiceCollection services, IConfiguration config,
IWebHostEnvironment env)
{
services.AddDbContext<DataContext>(options =>
@ -51,8 +45,15 @@ namespace API.Extensions
options.UseSqlite(config.GetConnectionString("DefaultConnection"));
options.EnableSensitiveDataLogging(env.IsDevelopment() || Configuration.GetLogLevel(Program.GetAppSettingFilename()).Equals("Debug"));
});
}
return services;
private static void AddLogging(this IServiceCollection services, IConfiguration config)
{
services.AddLogging(loggingBuilder =>
{
var loggingSection = config.GetSection("Logging");
loggingBuilder.AddFile(loggingSection);
});
}
}
}
}

View File

@ -9,14 +9,26 @@ namespace API.Extensions
private static readonly NaturalSortComparer Comparer = new NaturalSortComparer();
public static void Empty(this DirectoryInfo directory)
{
foreach(FileInfo file in directory.EnumerateFiles()) file.Delete();
foreach(DirectoryInfo subDirectory in directory.EnumerateDirectories()) subDirectory.Delete(true);
// NOTE: We have this in DirectoryService.Empty(), do we need this here?
foreach(FileInfo file in directory.EnumerateFiles()) file.Delete();
foreach(DirectoryInfo subDirectory in directory.EnumerateDirectories()) subDirectory.Delete(true);
}
public static void RemoveNonImages(this DirectoryInfo directory)
{
foreach (var file in directory.EnumerateFiles())
{
if (!Parser.Parser.IsImage(file.FullName))
{
file.Delete();
}
}
}
/// <summary>
/// Flattens all files in subfolders to the passed directory recursively.
///
///
///
///
/// foo<para />
/// ├── 1.txt<para />
/// ├── 2.txt<para />
@ -26,7 +38,7 @@ namespace API.Extensions
/// ├── 1.txt<para />
/// ├── 2.txt<para />
/// └── 5.txt<para />
///
///
/// becomes:<para />
/// foo<para />
/// ├── 1.txt<para />
@ -49,7 +61,7 @@ namespace API.Extensions
if (!root.FullName.Equals(directory.FullName))
{
var fileIndex = 1;
foreach (var file in directory.EnumerateFiles().OrderBy(file => file.FullName, Comparer))
{
if (file.Directory == null) continue;
@ -63,11 +75,11 @@ namespace API.Extensions
directoryIndex++;
}
foreach (var subDirectory in directory.EnumerateDirectories())
{
FlattenDirectory(root, subDirectory, ref directoryIndex);
}
}
}
}
}
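The `FlattenDirectory` recursion above moves files out of nested subfolders into the root, prefixing them with a running directory index so files with the same name in different folders cannot collide (the foo/morefoo tree in the doc comment). A rough Python sketch of that renaming scheme (the `NNN_` prefix format and dict-based tree are assumptions for illustration; the C# code's exact rename format may differ):

```python
def flatten(tree):
    """Flatten a {"files": [...], "dirs": {name: subtree}} tree into one list.

    Root-level files keep their names; files from each subfolder get a
    zero-padded per-directory index prefix, incremented per directory.
    """
    out = list(tree["files"])
    idx = 1

    def walk(sub):
        nonlocal idx
        for name in sorted(sub["files"]):
            out.append(f"{idx:03d}_{name}")
        idx += 1  # one index per flattened directory
        for child in sub["dirs"].values():
            walk(child)

    for child in tree["dirs"].values():
        walk(child)
    return out
```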

View File

@ -1,6 +1,7 @@
using System.Collections.Generic;
using System.Linq;
using API.Entities;
using API.Parser;
namespace API.Extensions
{
@ -14,7 +15,21 @@ namespace API.Extensions
/// <returns></returns>
public static bool NameInList(this Series series, IEnumerable<string> list)
{
return list.Any(name => Parser.Parser.Normalize(name) == series.NormalizedName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.Name) || name == series.Name || name == series.LocalizedName || name == series.OriginalName);
return list.Any(name => Parser.Parser.Normalize(name) == series.NormalizedName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.Name)
|| name == series.Name || name == series.LocalizedName || name == series.OriginalName || Parser.Parser.Normalize(name) == Parser.Parser.Normalize(series.OriginalName));
}
/// <summary>
/// Checks against all the name variables of the Series if it matches the <see cref="ParserInfo"/>
/// </summary>
/// <param name="series"></param>
/// <param name="info"></param>
/// <returns></returns>
public static bool NameInParserInfo(this Series series, ParserInfo info)
{
if (info == null) return false;
return Parser.Parser.Normalize(info.Series) == series.NormalizedName || Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.Name)
|| info.Series == series.Name || info.Series == series.LocalizedName || info.Series == series.OriginalName || Parser.Parser.Normalize(info.Series) == Parser.Parser.Normalize(series.OriginalName);
}
}
}
}

View File

@ -1,5 +1,4 @@
using System;
using API.Interfaces.Services;
using API.Interfaces.Services;
using API.Services.Clients;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
@ -22,4 +21,4 @@ namespace API.Extensions
return services;
}
}
}
}

View File

@ -10,14 +10,16 @@ namespace API.Interfaces
{
void Add(Library library);
void Update(Library library);
void Delete(Library library);
Task<IEnumerable<LibraryDto>> GetLibraryDtosAsync();
Task<bool> LibraryExists(string libraryName);
Task<Library> GetLibraryForIdAsync(int libraryId);
Task<Library> GetFullLibraryForIdAsync(int libraryId);
Task<Library> GetFullLibraryForIdAsync(int libraryId, int seriesId);
Task<IEnumerable<LibraryDto>> GetLibraryDtosForUsernameAsync(string userName);
Task<IEnumerable<Library>> GetLibrariesAsync();
Task<bool> DeleteLibrary(int libraryId);
Task<IEnumerable<Library>> GetLibrariesForUserIdAsync(int userId);
Task<LibraryType> GetLibraryTypeAsync(int libraryId);
}
}
}

View File

@ -11,7 +11,8 @@
void RefreshMetadata(int libraryId, bool forceUpdate = true);
void CleanupTemp();
void RefreshSeriesMetadata(int libraryId, int seriesId);
void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false);
void ScheduleStatsTasks();
void CancelStatsTasks();
}
}
}

View File

@ -17,4 +17,4 @@ namespace API.Interfaces.Services
bool ArchiveNeedsFlattening(ZipArchive archive);
Task<Tuple<byte[], string>> CreateZipForDownload(IEnumerable<string> files, string tempFolder);
}
}
}

View File

@ -3,7 +3,7 @@ using System.Threading.Tasks;
using API.Parser;
using VersOne.Epub;
namespace API.Interfaces
namespace API.Interfaces.Services
{
public interface IBookService
{
@ -23,4 +23,4 @@ namespace API.Interfaces
string GetSummaryInfo(string filePath);
ParserInfo ParseInfo(string filePath);
}
}
}

View File

@ -25,5 +25,8 @@ namespace API.Interfaces.Services
IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly);
void CopyFileToDirectory(string fullFilePath, string targetDirectory);
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName);
}
}
}

View File

@ -0,0 +1,10 @@
using API.Entities;
namespace API.Interfaces.Services
{
public interface IImageService
{
byte[] GetCoverImage(string path, bool createThumbnail = false);
string GetCoverFile(MangaFile file);
}
}

View File

@ -1,4 +1,7 @@

using System.Threading;
using System.Threading.Tasks;
namespace API.Interfaces.Services
{
public interface IScannerService
@ -11,5 +14,6 @@ namespace API.Interfaces.Services
/// <param name="forceUpdate">Force overwriting for cover images</param>
void ScanLibrary(int libraryId, bool forceUpdate);
void ScanLibraries();
Task ScanSeries(int libraryId, int seriesId, bool forceUpdate, CancellationToken token);
}
}
}

View File

@ -12,7 +12,7 @@ namespace API.Parser
public const string DefaultChapter = "0";
public const string DefaultVolume = "0";
public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|.cb7";
public const string ArchiveFileExtensions = @"\.cbz|\.zip|\.rar|\.cbr|\.tar.gz|\.7zip|\.7z|\.cb7|\.cbt";
public const string BookFileExtensions = @"\.epub";
public const string ImageFileExtensions = @"^(\.png|\.jpeg|\.jpg)";
public static readonly Regex FontSrcUrlRegex = new Regex(@"(src:url\(.{1})" + "([^\"']*)" + @"(.{1}\))", RegexOptions.IgnoreCase | RegexOptions.Compiled);
@ -24,7 +24,7 @@ namespace API.Parser
private static readonly Regex XmlRegex = new Regex(XmlRegexExtensions, RegexOptions.IgnoreCase | RegexOptions.Compiled);
private static readonly Regex BookFileRegex = new Regex(BookFileExtensions, RegexOptions.IgnoreCase | RegexOptions.Compiled);
private static readonly Regex CoverImageRegex = new Regex(@"(?<![[a-z]\d])(?:!?)(cover|folder)(?![\w\d])", RegexOptions.IgnoreCase | RegexOptions.Compiled);
private static readonly Regex[] MangaVolumeRegex = new[]
{
@ -53,34 +53,45 @@ namespace API.Parser
@"(volume )(?<Volume>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Tower Of God S01 014 (CBT) (digital).cbz
new Regex(
new Regex(
@"(?<Series>.*)(\b|_|)(S(?<Volume>\d+))",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
private static readonly Regex[] MangaSeriesRegex = new[]
{
// [SugoiSugoi]_NEEDLESS_Vol.2_-_Disk_The_Informant_5_[ENG].rar
// Grand Blue Dreaming - SP02
new Regex(
@"^(?<Series>.*)( |_)Vol\.?\d+",
@"(?<Series>.*)(\b|_|-|\s)(?:sp)\d",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// [SugoiSugoi]_NEEDLESS_Vol.2_-_Disk_The_Informant_5_[ENG].rar, Yuusha Ga Shinda! - Vol.tbd Chapter 27.001 V2 Infection ①.cbz
new Regex(
@"^(?<Series>.*)( |_)Vol\.?(\d+|tbd)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Ichiban_Ushiro_no_Daimaou_v04_ch34_[VISCANS].zip, VanDread-v01-c01.zip
new Regex(
@"(?<Series>.*)(\b|_)v(?<Volume>\d+-?\d*)( |_|-)",
@"(?<Series>.*)(\b|_)v(?<Volume>\d+-?\d*)(\s|_|-)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Gokukoku no Brynhildr - c001-008 (v01) [TrinityBAKumA], Black Bullet - v4 c17 [batoto]
new Regex(
@"(?<Series>.*)( - )(?:v|vo|c)\d",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// [dmntsf.net] One Piece - Digital Colored Comics Vol. 20 Ch. 177 - 30 Million vs 81 Million.cbz
new Regex(
@"(?<Series>.*) (\b|_|-)(vol)\.?",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Kedouin Makoto - Corpse Party Musume, Chapter 19 [Dametrans].zip
new Regex(
@"(?<Series>.*)(?:, Chapter )(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Mad Chimera World - Volume 005 - Chapter 026.cbz (couldn't figure out how to get Volume negative lookaround working on below regex)
new Regex(
@"(?<Series>.*)(\s|_|-)(?:Volume(\s|_|-)+\d+)(\s|_|-)+(?:Chapter)(\s|_|-)(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Please Go Home, Akutsu-San! - Chapter 038.5 - Volume Announcement.cbz
new Regex(
@"(?<Series>.*)(\s|_|-)(?!Vol)(\s|_|-)(?:Chapter)(\s|_|-)(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// [dmntsf.net] One Piece - Digital Colored Comics Vol. 20 Ch. 177 - 30 Million vs 81 Million.cbz
new Regex(
@"(?<Series>.*) (\b|_|-)(vol)\.?",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
//Knights of Sidonia c000 (S2 LE BD Omake - BLAME!) [Habanero Scans]
new Regex(
@"(?<Series>.*)(\bc\d+\b)",
@ -89,9 +100,9 @@ namespace API.Parser
new Regex(
@"(?<Series>.*)(?: _|-|\[|\()\s?vol(ume)?",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Momo The Blood Taker - Chapter 027 Violent Emotion.cbz
// Momo The Blood Taker - Chapter 027 Violent Emotion.cbz, Grand Blue Dreaming - SP02 Extra (2019) (Digital) (danke-Empire).cbz
new Regex(
@"(?<Series>.*)(\b|_|-|\s)(?:chapter)(\b|_|-|\s)\d",
@"(?<Series>.*)(\b|_|-|\s)(?:(chapter(\b|_|-|\s))|sp)\d",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Historys Strongest Disciple Kenichi_v11_c90-98.zip, Killing Bites Vol. 0001 Ch. 0001 - Galactica Scanlations (gb)
new Regex(
@ -108,7 +119,7 @@ namespace API.Parser
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Hinowa ga CRUSH! 018 (2019) (Digital) (LuCaZ).cbz
new Regex(
@"(?<Series>.*) (?<Chapter>\d+) (?:\(\d{4}\)) ",
@"(?<Series>.*) (?<Chapter>\d+) (?:\(\d{4}\)) ",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Goblin Slayer - Brand New Day 006.5 (2019) (Digital) (danke-Empire)
new Regex(
@ -142,25 +153,25 @@ namespace API.Parser
new Regex(
@"^(?!Vol)(?<Series>.*)( |_)Chapter( |_)(\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Fullmetal Alchemist chapters 101-108.cbz
new Regex(
@"^(?!vol)(?<Series>.*)( |_)(chapters( |_)?)\d+-?\d*",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Umineko no Naku Koro ni - Episode 1 - Legend of the Golden Witch #1
new Regex(
@"^(?!Vol\.?)(?<Series>.*)( |_|-)(?<!-)(episode ?)\d+-?\d*",
@"^(?!Vol\.?)(?<Series>.*)( |_|-)(?<!-)(episode|chapter|(ch\.?) ?)\d+-?\d*",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Baketeriya ch01-05.zip
new Regex(
@"^(?!Vol)(?<Series>.*)ch\d+-?\d?",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Magi - Ch.252-005.cbz
new Regex(
@"(?<Series>.*)( ?- ?)Ch\.\d+-?\d*",
@"(?<Series>.*)( ?- ?)Ch\.\d+-?\d*",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// [BAA]_Darker_than_Black_Omake-1.zip
// [BAA]_Darker_than_Black_Omake-1.zip
new Regex(
@"^(?!Vol)(?<Series>.*)(-)\d+-?\d*", // This catches a lot of stuff ^(?!Vol)(?<Series>.*)( |_)(\d+)
RegexOptions.IgnoreCase | RegexOptions.Compiled),
@ -177,7 +188,7 @@ namespace API.Parser
@"^(?!Vol)(?<Series>.*)( |_|-)(ch?)\d+",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
private static readonly Regex[] ComicSeriesRegex = new[]
{
// Invincible Vol 01 Family matters (2005) (Digital)
@ -229,7 +240,7 @@ namespace API.Parser
@"^(?<Series>.*)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
private static readonly Regex[] ComicVolumeRegex = new[]
{
// 04 - Asterix the Gladiator (1964) (Digital-Empire) (WebP by Doc MaKS)
@ -261,18 +272,10 @@ namespace API.Parser
@"^(?<Series>.*)(?: |_)#(?<Volume>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
private static readonly Regex[] ComicChapterRegex = new[]
{
// // 04 - Asterix the Gladiator (1964) (Digital-Empire) (WebP by Doc MaKS)
// new Regex(
// @"^(?<Volume>\d+) (- |_)?(?<Series>.*(\d{4})?)( |_)(\(|\d+)",
// RegexOptions.IgnoreCase | RegexOptions.Compiled),
// // 01 Spider-Man & Wolverine 01.cbr
// new Regex(
// @"^(?<Volume>\d+) (?:- )?(?<Series>.*) (\d+)?", // NOTE: Why is this here without a capture group
// RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Batman & Wildcat (1 of 3)
// Batman & Wildcat (1 of 3)
new Regex(
@"(?<Series>.*(\d{4})?)( |_)(?:\((?<Chapter>\d+) of \d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
@ -307,7 +310,7 @@ namespace API.Parser
// [TrinityBAKumA Finella&anon], [BAA]_, [SlowManga&OverloadScans], [batoto]
new Regex(@"(?:\[(?<subgroup>(?!\s).+?(?<!\s))\](?:_|-|\s|\.)?)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// (Shadowcat-Empire),
// (Shadowcat-Empire),
// new Regex(@"(?:\[(?<subgroup>(?!\s).+?(?<!\s))\](?:_|-|\s|\.)?)",
// RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
@ -323,24 +326,24 @@ namespace API.Parser
@"v\d+\.(?<Chapter>\d+(?:.\d+|-\d+)?)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Umineko no Naku Koro ni - Episode 3 - Banquet of the Golden Witch #02.cbz (Rare case, if causes issue remove)
new Regex(
new Regex(
@"^(?<Series>.*)(?: |_)#(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Green Worldz - Chapter 027
new Regex(
@"^(?!Vol)(?<Series>.*)\s?(?<!vol\. )\sChapter\s(?<Chapter>\d+(?:.\d+|-\d+)?)",
@"^(?!Vol)(?<Series>.*)\s?(?<!vol\. )\sChapter\s(?<Chapter>\d+(?:\.?[\d-])?)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Hinowa ga CRUSH! 018 (2019) (Digital) (LuCaZ).cbz, Hinowa ga CRUSH! 018.5 (2019) (Digital) (LuCaZ).cbz
// Hinowa ga CRUSH! 018 (2019) (Digital) (LuCaZ).cbz, Hinowa ga CRUSH! 018.5 (2019) (Digital) (LuCaZ).cbz
new Regex(
@"^(?!Vol)(?<Series>.*) (?<!vol\. )(?<Chapter>\d+(?:.\d+|-\d+)?)(?: \(\d{4}\))?(\b|_|-)",
@"^(?!Vol)(?<Series>.*)\s(?<!vol\. )(?<Chapter>\d+(?:.\d+|-\d+)?)(?:\s\(\d{4}\))?(\b|_|-)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Tower Of God S01 014 (CBT) (digital).cbz
new Regex(
@"(?<Series>.*) S(?<Volume>\d+) (?<Chapter>\d+(?:.\d+|-\d+)?)",
@"(?<Series>.*)\sS(?<Volume>\d+)\s(?<Chapter>\d+(?:.\d+|-\d+)?)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Beelzebub_01_[Noodles].zip, Beelzebub_153b_RHS.zip
new Regex(
@"^((?!v|vo|vol|Volume).)*( |_)(?<Chapter>\.?\d+(?:.\d+|-\d+)?)(?<ChapterPart>b)?( |_|\[|\()",
@"^((?!v|vo|vol|Volume).)*(\s|_)(?<Chapter>\.?\d+(?:.\d+|-\d+)?)(?<ChapterPart>b)?(\s|_|\[|\()",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Yumekui-Merry_DKThias_Chapter21.zip
new Regex(
@ -348,14 +351,18 @@ namespace API.Parser
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// [Hidoi]_Amaenaideyo_MS_vol01_chp02.rar
new Regex(
@"(?<Series>.*)( |_)(vol\d+)?( |_)Chp\.? ?(?<Chapter>\d+)",
@"(?<Series>.*)(\s|_)(vol\d+)?(\s|_)Chp\.? ?(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Vol 1 Chapter 2
new Regex(
@"(?<Volume>((vol|volume|v))?(\s|_)?\.?\d+)(\s|_)(Chp|Chapter)\.?(\s|_)?(?<Chapter>\d+)",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
};
private static readonly Regex[] MangaEditionRegex = {
// Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz
new Regex(
@"(?<Edition>({|\(|\[).* Edition(}|\)|\]))",
RegexOptions.IgnoreCase | RegexOptions.Compiled),
// Tenjo Tenge {Full Contact Edition} v01 (2011) (Digital) (ASTC).cbz
new Regex(
@ -441,31 +448,18 @@ namespace API.Parser
};
}
if (ret.Series == string.Empty)
if (type is LibraryType.ComicImages or LibraryType.MangaImages)
{
// Reset Chapters, Volumes, and Series as images are not good to parse information out of. Better to use folders.
ret.Volumes = DefaultVolume;
ret.Chapters = DefaultChapter;
ret.Series = string.Empty;
}
if (ret.Series == string.Empty || (type is LibraryType.ComicImages or LibraryType.MangaImages))
{
// Try to parse information out of each folder all the way to rootPath
var fallbackFolders = DirectoryService.GetFoldersTillRoot(rootPath, Path.GetDirectoryName(filePath)).ToList();
for (var i = 0; i < fallbackFolders.Count; i++)
{
var folder = fallbackFolders[i];
if (!string.IsNullOrEmpty(ParseMangaSpecial(folder))) continue;
if (ParseVolume(folder) != DefaultVolume || ParseChapter(folder) != DefaultChapter) continue;
var series = ParseSeries(folder);
if ((string.IsNullOrEmpty(series) && i == fallbackFolders.Count - 1))
{
ret.Series = CleanTitle(folder);
break;
}
if (!string.IsNullOrEmpty(series))
{
ret.Series = series;
break;
}
}
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
var edition = ParseEdition(fileName);
@ -476,7 +470,7 @@ namespace API.Parser
}
var isSpecial = ParseMangaSpecial(fileName);
// We must ensure that we only parse a special out, as some files will have v20 c171-180+Omake and that
// could cause a problem: Omake is a special term, but there is valid volume/chapter information.
if (ret.Chapters == DefaultChapter && ret.Volumes == DefaultVolume && !string.IsNullOrEmpty(isSpecial))
{
@ -488,7 +482,11 @@ namespace API.Parser
ret.IsSpecial = true;
ret.Chapters = DefaultChapter;
ret.Volumes = DefaultVolume;
ParseFromFallbackFolders(filePath, rootPath, type, ref ret);
}
// If this is a special with a marker, we need to ensure we use the correct series name.
// We can do this by falling back to the folder names.
if (string.IsNullOrEmpty(ret.Series))
{
@ -498,6 +496,54 @@ namespace API.Parser
return ret.Series == string.Empty ? null : ret;
}
/// <summary>
/// Parses Series, Volume, and Chapter information out of the folders between filePath and rootPath.
/// </summary>
/// <param name="filePath"></param>
/// <param name="rootPath"></param>
/// <param name="type"></param>
/// <param name="ret">Expects a non-null ParserInfo which this method will populate</param>
public static void ParseFromFallbackFolders(string filePath, string rootPath, LibraryType type, ref ParserInfo ret)
{
var fallbackFolders = DirectoryService.GetFoldersTillRoot(rootPath, filePath).ToList();
for (var i = 0; i < fallbackFolders.Count; i++)
{
var folder = fallbackFolders[i];
if (!string.IsNullOrEmpty(ParseMangaSpecial(folder))) continue;
var parsedVolume = (type is LibraryType.Manga or LibraryType.MangaImages) ? ParseVolume(folder) : ParseComicVolume(folder);
var parsedChapter = (type is LibraryType.Manga or LibraryType.MangaImages) ? ParseChapter(folder) : ParseComicChapter(folder);
if (!parsedVolume.Equals(DefaultVolume) || !parsedChapter.Equals(DefaultChapter))
{
if ((ret.Volumes.Equals(DefaultVolume) || string.IsNullOrEmpty(ret.Volumes)) && !parsedVolume.Equals(DefaultVolume))
{
ret.Volumes = parsedVolume;
}
if ((ret.Chapters.Equals(DefaultChapter) || string.IsNullOrEmpty(ret.Chapters)) && !parsedChapter.Equals(DefaultChapter))
{
ret.Chapters = parsedChapter;
}
continue;
}
var series = ParseSeries(folder);
if ((string.IsNullOrEmpty(series) && i == fallbackFolders.Count - 1))
{
ret.Series = CleanTitle(folder);
break;
}
if (!string.IsNullOrEmpty(series))
{
ret.Series = series;
break;
}
}
}
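The fallback-folder pass above can be sketched in Python. This is an illustrative reimplementation, not the project's C# code: the regexes and the `parse_series` stand-in are simplified assumptions, and the special-folder (`ParseMangaSpecial`) skip is omitted.

```python
import re

# Sentinel values mirroring the parser's DefaultVolume/DefaultChapter
DEFAULT_VOLUME = "0"
DEFAULT_CHAPTER = "0"

def parse_volume(name):
    m = re.search(r"vol(?:ume)?\.?\s*0*(\d+)", name, re.IGNORECASE)
    return m.group(1) if m else DEFAULT_VOLUME

def parse_chapter(name):
    m = re.search(r"\bch(?:apter)?\.?\s*0*(\d+)", name, re.IGNORECASE)
    return m.group(1) if m else DEFAULT_CHAPTER

def parse_series(name):
    # Simplified stand-in: treat a digit-free folder name as a series title
    return name.strip() if not re.search(r"\d", name) else ""

def parse_from_fallback_folders(folders):
    """folders are ordered nearest-to-file first, library root last."""
    info = {"volumes": DEFAULT_VOLUME, "chapters": DEFAULT_CHAPTER, "series": ""}
    for i, folder in enumerate(folders):
        vol, chp = parse_volume(folder), parse_chapter(folder)
        if vol != DEFAULT_VOLUME or chp != DEFAULT_CHAPTER:
            # Keep the first (innermost) volume/chapter hit, then keep walking outward
            if info["volumes"] == DEFAULT_VOLUME and vol != DEFAULT_VOLUME:
                info["volumes"] = vol
            if info["chapters"] == DEFAULT_CHAPTER and chp != DEFAULT_CHAPTER:
                info["chapters"] = chp
            continue
        series = parse_series(folder)
        if series:
            info["series"] = series
            break
        if i == len(folders) - 1:
            # Last resort: use the outermost folder name as the series title
            info["series"] = folder.strip()
            break
    return info
```

For a path like `Love Hina/Vol. 3/file.cbz`, the inner `Vol. 3` folder supplies the volume and the outer `Love Hina` folder supplies the series.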
public static MangaFormat ParseFormat(string filePath)
{
if (IsArchive(filePath)) return MangaFormat.Archive;
@ -517,17 +563,17 @@ namespace API.Parser
{
var edition = match.Groups["Edition"].Value.Replace("{", "").Replace("}", "")
.Replace("[", "").Replace("]", "").Replace("(", "").Replace(")", "");
return edition;
}
}
}
return string.Empty;
}
/// <summary>
/// Checks if the file has an SP marker.
/// </summary>
/// <param name="filePath"></param>
/// <returns></returns>
@ -541,10 +587,10 @@ namespace API.Parser
return true;
}
}
return false;
}
public static string ParseMangaSpecial(string filePath)
{
foreach (var regex in MangaSpecialRegex)
@ -558,10 +604,10 @@ namespace API.Parser
}
}
}
return string.Empty;
}
public static string ParseSeries(string filename)
{
foreach (var regex in MangaSeriesRegex)
@ -575,7 +621,7 @@ namespace API.Parser
}
}
}
return string.Empty;
}
public static string ParseComicSeries(string filename)
@ -591,7 +637,7 @@ namespace API.Parser
}
}
}
return string.Empty;
}
@ -603,7 +649,7 @@ namespace API.Parser
foreach (Match match in matches)
{
if (!match.Groups["Volume"].Success || match.Groups["Volume"] == Match.Empty) continue;
var value = match.Groups["Volume"].Value;
if (!value.Contains("-")) return RemoveLeadingZeroes(match.Groups["Volume"].Value);
var tokens = value.Split("-");
@ -613,7 +659,7 @@ namespace API.Parser
}
}
return DefaultVolume;
}
@ -625,7 +671,7 @@ namespace API.Parser
foreach (Match match in matches)
{
if (!match.Groups["Volume"].Success || match.Groups["Volume"] == Match.Empty) continue;
var value = match.Groups["Volume"].Value;
if (!value.Contains("-")) return RemoveLeadingZeroes(match.Groups["Volume"].Value);
var tokens = value.Split("-");
@ -635,7 +681,7 @@ namespace API.Parser
}
}
return DefaultVolume;
}
@ -647,7 +693,7 @@ namespace API.Parser
foreach (Match match in matches)
{
if (!match.Groups["Chapter"].Success || match.Groups["Chapter"] == Match.Empty) continue;
var value = match.Groups["Chapter"].Value;
var hasChapterPart = match.Groups["ChapterPart"].Success;
@ -655,7 +701,7 @@ namespace API.Parser
{
return RemoveLeadingZeroes(hasChapterPart ? AddChapterPart(value) : value);
}
var tokens = value.Split("-");
var from = RemoveLeadingZeroes(tokens[0]);
var to = RemoveLeadingZeroes(hasChapterPart ? AddChapterPart(tokens[1]) : tokens[1]);
@ -676,7 +722,7 @@ namespace API.Parser
return $"{value}.5";
}
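The chapter-part handling and leading-zero stripping seen here are small enough to sketch in Python (illustrative only; the `b` marker to `.5` mapping is the behavior suggested by `AddChapterPart` above):

```python
def remove_leading_zeroes(title):
    # "018" -> "18"; an all-zero string still yields "0"
    ret = title.lstrip("0")
    return ret if ret else "0"

def add_chapter_part(value):
    # A trailing part marker, like the "b" in "Beelzebub_153b", maps the
    # chapter to a half step so it sorts between 153 and 154
    return f"{value}.5"
```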
public static string ParseComicChapter(string filename)
{
foreach (var regex in ComicChapterRegex)
@ -718,7 +764,7 @@ namespace API.Parser
}
}
}
foreach (var regex in MangaEditionRegex)
{
var matches = regex.Matches(title);
@ -733,7 +779,7 @@ namespace API.Parser
return title;
}
private static string RemoveSpecialTags(string title)
{
foreach (var regex in MangaSpecialRegex)
@ -750,9 +796,9 @@ namespace API.Parser
return title;
}
/// <summary>
/// Translates _ -> spaces, trims front and back of string, removes release groups
/// </summary>
@ -820,13 +866,13 @@ namespace API.Parser
_ => number
};
}
public static string RemoveLeadingZeroes(string title)
{
var ret = title.TrimStart(new[] { '0' });
return ret == string.Empty ? "0" : ret;
}
public static bool IsArchive(string filePath)
{
return ArchiveFileRegex.IsMatch(Path.GetExtension(filePath));
@ -841,12 +887,12 @@ namespace API.Parser
if (filePath.StartsWith(".") || (!suppressExtraChecks && filePath.StartsWith("!"))) return false;
return ImageRegex.IsMatch(Path.GetExtension(filePath));
}
public static bool IsXml(string filePath)
{
return XmlRegex.IsMatch(Path.GetExtension(filePath));
}
public static float MinimumNumberFromRange(string range)
{
try
@ -891,4 +937,4 @@ namespace API.Parser
return Path.GetExtension(filePath).ToLower() == ".epub";
}
}
}

View File

@ -5,7 +5,6 @@ using System.Threading;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Services.HostedServices;
using Kavita.Common;
using Kavita.Common.EnvironmentInfo;
using Microsoft.AspNetCore.Hosting;
@ -26,14 +25,14 @@ namespace API
protected Program()
{
}
public static string GetAppSettingFilename()
{
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
var isDevelopment = environment == Environments.Development;
return "appsettings" + (isDevelopment ? ".Development" : "") + ".json";
}
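`GetAppSettingFilename` picks the config file from the ASP.NET Core environment variable; the same selection logic in Python, as a sketch:

```python
import os

def get_app_setting_filename():
    # Mirrors the C# method: Development gets its own appsettings file
    environment = os.environ.get("ASPNETCORE_ENVIRONMENT")
    is_development = environment == "Development"
    return "appsettings" + (".Development" if is_development else "") + ".json"
```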
public static async Task Main(string[] args)
{
Console.OutputEncoding = System.Text.Encoding.UTF8;
@ -47,7 +46,7 @@ namespace API
var base64 = Convert.ToBase64String(rBytes).Replace("/", "");
Configuration.UpdateJwtToken(GetAppSettingFilename(), base64);
}
// Get HttpPort from Config
_httpPort = Configuration.GetPort(GetAppSettingFilename());
@ -86,7 +85,7 @@ namespace API
options.Protocols = HttpProtocols.Http1AndHttp2;
});
});
var environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");
if (environment != Environments.Development)
{
@ -125,7 +124,7 @@ namespace API
sentryEvent.ServerName = null; // Never send Server Name to Sentry
return sentryEvent;
};
options.ConfigureScope(scope =>
{
scope.User = new User()
@ -143,7 +142,7 @@ namespace API
});
}
webBuilder.UseStartup<Startup>();
});
}

View File

@ -28,7 +28,6 @@ namespace API.Services
{
private readonly ILogger<ArchiveService> _logger;
private readonly IDirectoryService _directoryService;
private const int ThumbnailWidth = 320; // 153w x 230h
private static readonly RecyclableMemoryStreamManager StreamManager = new();
private readonly NaturalSortComparer _comparer;
@ -38,7 +37,7 @@ namespace API.Services
_directoryService = directoryService;
_comparer = new NaturalSortComparer();
}
/// <summary>
/// Checks if a File can be opened. Requires up to 2 opens of the filestream.
/// </summary>
@ -47,7 +46,7 @@ namespace API.Services
public virtual ArchiveLibrary CanOpen(string archivePath)
{
if (!(File.Exists(archivePath) && Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath))) return ArchiveLibrary.NotSupported;
try
{
using var a2 = ZipFile.OpenRead(archivePath);
@ -90,7 +89,7 @@ namespace API.Services
{
_logger.LogDebug("Using SharpCompress compression handling");
using var archive = ArchiveFactory.Open(archivePath);
return archive.Entries.Count(entry => !entry.IsDirectory &&
!Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Parser.Parser.IsImage(entry.Key));
}
@ -133,10 +132,10 @@ namespace API.Services
var result = entryFullNames.OrderBy(Path.GetFileName, _comparer)
.FirstOrDefault(x => !Parser.Parser.HasBlacklistedFolderInPath(x)
&& Parser.Parser.IsImage(x));
return string.IsNullOrEmpty(result) ? null : result;
}
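The entry ordering above relies on a `NaturalSortComparer` so that `page2` sorts before `page10`. A minimal Python sketch of natural sorting (illustrative; the real comparer lives in `API.Comparators`):

```python
import re

def natural_key(name):
    # Split into digit and non-digit runs so numeric chunks compare as numbers
    return [int(tok) if tok.isdigit() else tok.lower()
            for tok in re.split(r"(\d+)", name)]

entries = ["page10.jpg", "page2.jpg", "cover.jpg"]
ordered = sorted(entries, key=natural_key)
```

Plain lexicographic sorting would put `page10.jpg` before `page2.jpg`, picking the wrong first image for the cover.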
/// <summary>
/// Generates byte array of cover image.
@ -173,14 +172,14 @@ namespace API.Services
_logger.LogDebug("Using SharpCompress compression handling");
using var archive = ArchiveFactory.Open(archivePath);
var entryNames = archive.Entries.Where(archiveEntry => !archiveEntry.IsDirectory).Select(e => e.Key).ToList();
var entryName = FindFolderEntry(entryNames) ?? FirstFileEntry(entryNames);
var entry = archive.Entries.Single(e => e.Key == entryName);
using var ms = StreamManager.GetStream();
entry.WriteTo(ms);
ms.Position = 0;
return createThumbnail ? CreateThumbnail(entry.Key, ms, Path.GetExtension(entry.Key)) : ms.ToArray();
}
case ArchiveLibrary.NotSupported:
@ -195,7 +194,7 @@ namespace API.Services
{
_logger.LogWarning(ex, "[GetCoverImage] There was an exception when reading archive stream: {ArchivePath}. Defaulting to no cover image", archivePath);
}
return Array.Empty<byte>();
}
@ -206,7 +205,7 @@ namespace API.Services
stream.CopyTo(ms);
return ms.ToArray();
}
/// <summary>
/// Given an archive stream, will assess whether directory needs to be flattened so that the extracted archive files are directly
/// under extract path and not nested in subfolders. See <see cref="DirectoryInfoExtensions"/> Flatten method.
@ -225,14 +224,14 @@ namespace API.Services
{
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
var dateString = DateTime.Now.ToShortDateString().Replace("/", "_");
var tempLocation = Path.Join(tempDirectory, $"{tempFolder}_{dateString}");
DirectoryService.ExistOrCreate(tempLocation);
if (!_directoryService.CopyFilesToDirectory(files, tempLocation))
{
throw new KavitaException("Unable to copy files to temp directory for archive download.");
}
var zipPath = Path.Join(tempDirectory, $"kavita_{tempFolder}_{dateString}.zip");
try
{
@ -243,10 +242,10 @@ namespace API.Services
_logger.LogError(ex, "There was an issue creating temp archive");
throw new KavitaException("There was an issue creating temp archive");
}
var fileBytes = await _directoryService.ReadFileAsync(zipPath);
DirectoryService.ClearAndDeleteDirectory(tempLocation);
(new FileInfo(zipPath)).Delete();
@ -261,7 +260,7 @@ namespace API.Services
}
try
{
using var thumbnail = Image.ThumbnailStream(stream, ThumbnailWidth);
using var thumbnail = Image.ThumbnailStream(stream, MetadataService.ThumbnailWidth);
return thumbnail.WriteToBuffer(formatExtension);
}
catch (Exception ex)
@ -286,12 +285,12 @@ namespace API.Services
}
if (Parser.Parser.IsArchive(archivePath) || Parser.Parser.IsEpub(archivePath)) return true;
_logger.LogWarning("Archive {ArchivePath} is not a valid archive", archivePath);
return false;
}
private static ComicInfo FindComicInfoXml(IEnumerable<IArchiveEntry> entries)
{
foreach (var entry in entries)
@ -309,7 +308,7 @@ namespace API.Services
}
}
return null;
}
@ -322,7 +321,7 @@ namespace API.Services
try
{
if (!File.Exists(archivePath)) return summary;
var libraryHandler = CanOpen(archivePath);
switch (libraryHandler)
{
@ -343,7 +342,7 @@ namespace API.Services
{
_logger.LogDebug("Using SharpCompress compression handling");
using var archive = ArchiveFactory.Open(archivePath);
info = FindComicInfoXml(archive.Entries.Where(entry => !entry.IsDirectory
&& !Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Parser.Parser.IsXml(entry.Key)));
break;
@ -365,7 +364,7 @@ namespace API.Services
{
_logger.LogWarning(ex, "[GetSummaryInfo] There was an exception when reading archive stream: {Filepath}", archivePath);
}
return summary;
}
@ -386,7 +385,7 @@ namespace API.Services
{
var needsFlattening = ArchiveNeedsFlattening(archive);
if (!archive.HasFiles() && !needsFlattening) return;
archive.ExtractToDirectory(extractPath, true);
if (needsFlattening)
{
@ -408,7 +407,7 @@ namespace API.Services
if (!IsValidArchive(archivePath)) return;
if (Directory.Exists(extractPath)) return;
var sw = Stopwatch.StartNew();
try
@ -427,7 +426,7 @@ namespace API.Services
{
_logger.LogDebug("Using SharpCompress compression handling");
using var archive = ArchiveFactory.Open(archivePath);
ExtractArchiveEntities(archive.Entries.Where(entry => !entry.IsDirectory
&& !Parser.Parser.HasBlacklistedFolderInPath(Path.GetDirectoryName(entry.Key) ?? string.Empty)
&& Parser.Parser.IsImage(entry.Key)), extractPath);
break;
@ -439,7 +438,7 @@ namespace API.Services
_logger.LogWarning("[ExtractArchive] There was an exception when reading archive stream: {ArchivePath}. Defaulting to 0 pages", archivePath);
return;
}
}
catch (Exception e)
{
@ -449,4 +448,4 @@ namespace API.Services
_logger.LogDebug("Extracted archive to {ExtractPath} in {ElapsedMilliseconds} milliseconds", extractPath, sw.ElapsedMilliseconds);
}
}
}

View File

@ -7,7 +7,7 @@ using System.Text.RegularExpressions;
using System.Threading.Tasks;
using System.Web;
using API.Entities.Enums;
using API.Interfaces;
using API.Interfaces.Services;
using API.Parser;
using ExCSS;
using HtmlAgilityPack;
@ -20,18 +20,16 @@ namespace API.Services
public class BookService : IBookService
{
private readonly ILogger<BookService> _logger;
private const int ThumbnailWidth = 320; // 153w x 230h
private readonly StylesheetParser _cssParser = new ();
public BookService(ILogger<BookService> logger)
{
_logger = logger;
}
private static bool HasClickableHrefPart(HtmlNode anchor)
{
return anchor.GetAttributeValue("href", string.Empty).Contains("#")
&& anchor.GetAttributeValue("tabindex", string.Empty) != "-1"
&& anchor.GetAttributeValue("role", string.Empty) != "presentation";
}
@ -74,7 +72,7 @@ namespace API.Services
.Split("#");
// Some keys get uri encoded when parsed, so replace any of those characters with original
var mappingKey = HttpUtility.UrlDecode(hrefParts[0]);
if (!mappings.ContainsKey(mappingKey))
{
if (HasClickableHrefPart(anchor))
@ -95,7 +93,7 @@ namespace API.Services
return;
}
var mappedPage = mappings[mappingKey];
anchor.Attributes.Add("kavita-page", $"{mappedPage}");
if (hrefParts.Length > 1)
@ -103,7 +101,7 @@ namespace API.Services
anchor.Attributes.Add("kavita-part",
hrefParts[1]);
}
anchor.Attributes.Remove("href");
anchor.Attributes.Add("href", "javascript:void(0)");
}
@ -117,7 +115,7 @@ namespace API.Services
foreach (Match match in Parser.Parser.CssImportUrlRegex.Matches(stylesheetHtml))
{
if (!match.Success) continue;
var importFile = match.Groups["Filename"].Value;
var key = CleanContentKeys(importFile);
if (!key.Contains(prepend))
@ -125,7 +123,7 @@ namespace API.Services
key = prepend + key;
}
if (!book.Content.AllFiles.ContainsKey(key)) continue;
var bookFile = book.Content.AllFiles[key];
var content = await bookFile.ReadContentAsBytesAsync();
importBuilder.Append(Encoding.UTF8.GetString(content));
@ -134,13 +132,13 @@ namespace API.Services
stylesheetHtml = stylesheetHtml.Insert(0, importBuilder.ToString());
stylesheetHtml =
Parser.Parser.CssImportUrlRegex.Replace(stylesheetHtml, "$1" + apiBase + prepend + "$2" + "$3");
var styleContent = RemoveWhiteSpaceFromStylesheets(stylesheetHtml);
styleContent =
Parser.Parser.FontSrcUrlRegex.Replace(styleContent, "$1" + apiBase + "$2" + "$3");
styleContent = styleContent.Replace("body", ".reading-section");
var stylesheet = await _cssParser.ParseAsync(styleContent);
foreach (var styleRule in stylesheet.StyleRules)
{
@ -183,9 +181,9 @@ namespace API.Services
}
if (Parser.Parser.IsBook(filePath)) return true;
_logger.LogWarning("[BookService] Book {EpubFile} is not a valid EPUB", filePath);
return false;
}
public int GetNumberOfPages(string filePath)
@ -227,7 +225,7 @@ namespace API.Services
dict.Add(contentFileRef.FileName, pageCount);
pageCount += 1;
}
return dict;
}
@ -242,7 +240,7 @@ namespace API.Services
try
{
using var epubBook = EpubReader.OpenBook(filePath);
// If the epub has the following tags, we can group the books as Volumes
// <meta content="5.0" name="calibre:series_index"/>
// <meta content="The Dark Tower" name="calibre:series"/>
@ -262,7 +260,7 @@ namespace API.Services
var specialName = string.Empty;
var groupPosition = string.Empty;
foreach (var metadataItem in epubBook.Schema.Package.Metadata.MetaItems)
{
// EPUB 2 and 3
@ -340,12 +338,12 @@ namespace API.Services
return null;
}
public byte[] GetCoverImage(string fileFilePath, bool createThumbnail = true)
{
if (!IsValidFile(fileFilePath)) return Array.Empty<byte>();
using var epubBook = EpubReader.OpenBook(fileFilePath);
@ -355,14 +353,14 @@ namespace API.Services
var coverImageContent = epubBook.Content.Cover
?? epubBook.Content.Images.Values.FirstOrDefault(file => Parser.Parser.IsCoverImage(file.FileName))
?? epubBook.Content.Images.Values.FirstOrDefault();
if (coverImageContent == null) return Array.Empty<byte>();
if (createThumbnail)
{
using var stream = new MemoryStream(coverImageContent.ReadContent());
using var thumbnail = Image.ThumbnailStream(stream, ThumbnailWidth);
using var thumbnail = Image.ThumbnailStream(stream, MetadataService.ThumbnailWidth);
return thumbnail.WriteToBuffer(".jpg");
}
@ -372,10 +370,10 @@ namespace API.Services
{
_logger.LogWarning(ex, "[BookService] There was a critical error and prevented thumbnail generation on {BookFile}. Defaulting to no cover image", fileFilePath);
}
return Array.Empty<byte>();
}
private static string RemoveWhiteSpaceFromStylesheets(string body)
{
body = Regex.Replace(body, @"[a-zA-Z]+#", "#");
@ -391,4 +389,4 @@ namespace API.Services
return body;
}
}
}

View File

@ -21,7 +21,7 @@ namespace API.Services
private readonly NumericComparer _numericComparer;
public static readonly string CacheDirectory = Path.GetFullPath(Path.Join(Directory.GetCurrentDirectory(), "cache/"));
public CacheService(ILogger<CacheService> logger, IUnitOfWork unitOfWork, IArchiveService archiveService,
IDirectoryService directoryService)
{
_logger = logger;
@ -47,58 +47,71 @@ namespace API.Services
var fileCount = files.Count;
var extractPath = GetCachePath(chapterId);
var extraPath = "";
foreach (var file in files)
if (Directory.Exists(extractPath))
{
if (fileCount > 1)
{
extraPath = file.Id + "";
}
if (file.Format == MangaFormat.Archive)
{
_archiveService.ExtractArchive(file.FilePath, Path.Join(extractPath, extraPath));
}
return chapter;
}
new DirectoryInfo(extractPath).Flatten();
var extractDi = new DirectoryInfo(extractPath);
if (files.Count > 0 && files[0].Format == MangaFormat.Image)
{
DirectoryService.ExistOrCreate(extractPath);
_directoryService.CopyDirectoryToDirectory(Path.GetDirectoryName(files[0].FilePath), extractPath);
extractDi.Flatten();
return chapter;
}
foreach (var file in files)
{
if (fileCount > 1)
{
extraPath = file.Id + string.Empty;
}
if (file.Format == MangaFormat.Archive)
{
_archiveService.ExtractArchive(file.FilePath, Path.Join(extractPath, extraPath));
}
}
extractDi.Flatten();
extractDi.RemoveNonImages();
return chapter;
}
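The reworked cache-priming flow above (skip if cached, copy loose images, otherwise extract each archive into its own subfolder) can be sketched like this. All names here are illustrative; the real code keys subfolders by `file.Id` rather than list index, and also flattens and strips non-images afterward.

```python
import os
import shutil

def ensure_cache(extract_path, file_paths, is_image_format, extract_archive):
    """extract_archive(path, dest) stands in for ArchiveService.ExtractArchive."""
    if os.path.isdir(extract_path):
        return  # already cached, nothing to do
    if file_paths and is_image_format:
        # Loose-image chapters: copy the source folder instead of extracting
        shutil.copytree(os.path.dirname(file_paths[0]), extract_path)
        return
    for i, path in enumerate(file_paths):
        # Multiple archives each get a subfolder so their pages cannot collide
        sub = str(i) if len(file_paths) > 1 else ""
        extract_archive(path, os.path.join(extract_path, sub))
```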
public void Cleanup()
{
_logger.LogInformation("Performing cleanup of Cache directory");
EnsureCacheDirectory();
DirectoryInfo di = new DirectoryInfo(CacheDirectory);
try
{
di.Empty();
DirectoryService.ClearDirectory(CacheDirectory);
}
catch (Exception ex)
{
_logger.LogError(ex, "There was an issue deleting one or more folders/files during cleanup");
}
_logger.LogInformation("Cache directory purged");
}
public void CleanupChapters(int[] chapterIds)
{
_logger.LogInformation("Running Cache cleanup on Volumes");
foreach (var chapter in chapterIds)
{
var di = new DirectoryInfo(GetCachePath(chapter));
if (di.Exists)
{
di.Delete(true);
}
}
_logger.LogInformation("Cache directory purged");
}
@ -124,27 +137,33 @@ namespace API.Services
if (page <= (mangaFile.Pages + pagesSoFar))
{
var path = GetCachePath(chapter.Id);
var files = _directoryService.GetFilesWithExtension(path, Parser.Parser.ImageFileExtensions);
Array.Sort(files, _numericComparer);
if (files.Length == 0)
{
return (files.ElementAt(0), mangaFile);
}
// Since the array is 0-based, we need to take that into account (only affects the last image)
if (page == files.Length)
{
return (files.ElementAt(page - 1 - pagesSoFar), mangaFile);
}
if (mangaFile.Format == MangaFormat.Image && mangaFile.Pages == 1)
{
// Each file is one page, meaning we should just get the element at page
return (files.ElementAt(page), mangaFile);
}
return (files.ElementAt(page - pagesSoFar), mangaFile);
}
pagesSoFar += mangaFile.Pages;
}
return ("", null);
return (string.Empty, null);
}
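The core of the page lookup above is mapping a chapter-wide page number onto a specific file via a running `pagesSoFar` total. A simplified Python sketch (illustrative; it omits the last-page and single-page-image edge cases handled in the C#):

```python
def find_page_file(files_with_pages, page):
    """files_with_pages: [(path, page_count), ...]; page is a 0-based index
    across the whole chapter. Returns (path, page_within_file)."""
    pages_so_far = 0
    for path, count in files_with_pages:
        if page < pages_so_far + count:
            # The requested page falls inside this file
            return path, page - pages_so_far
        pages_so_far += count
    return None, None  # page out of range for this chapter
```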
}
}

View File

@ -1,7 +1,6 @@
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading;
using System.Threading.Tasks;
using API.Configurations.CustomOptions;
@ -59,4 +58,4 @@ namespace API.Services.Clients
}
}
}
}

View File

@ -14,7 +14,7 @@ namespace API.Services
{
private readonly ILogger<DirectoryService> _logger;
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
public DirectoryService(ILogger<DirectoryService> logger)
@ -23,13 +23,13 @@ namespace API.Services
}
/// <summary>
/// Given a set of regex search criteria, get files in the given path.
/// </summary>
/// <param name="path">Directory to search</param>
/// <param name="searchPatternExpression">Regex version of search pattern (ie \.mp3|\.mp4). Defaults to * meaning all files.</param>
/// <param name="searchOption">SearchOption to use, defaults to TopDirectoryOnly</param>
/// <returns>List of file paths</returns>
private static IEnumerable<string> GetFilesWithCertainExtensions(string path,
string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
@ -40,8 +40,9 @@ namespace API.Services
reSearchPattern.IsMatch(Path.GetExtension(file)));
}
/// <summary>
/// Returns a list of folders from end of fullPath to rootPath.
/// Returns a list of folders from end of fullPath to rootPath. If a file is passed at the end of the fullPath, it will be ignored.
///
/// Example) (C:/Manga/, C:/Manga/Love Hina/Specials/Omake/) returns [Omake, Specials, Love Hina]
/// </summary>
@ -50,25 +51,33 @@ namespace API.Services
/// <returns></returns>
public static IEnumerable<string> GetFoldersTillRoot(string rootPath, string fullPath)
{
var separator = Path.AltDirectorySeparatorChar;
if (fullPath.Contains(Path.DirectorySeparatorChar))
{
fullPath = fullPath.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
}
if (rootPath.Contains(Path.DirectorySeparatorChar))
{
rootPath = rootPath.Replace(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
}
var path = fullPath.EndsWith(separator) ? fullPath.Substring(0, fullPath.Length - 1) : fullPath;
var root = rootPath.EndsWith(separator) ? rootPath.Substring(0, rootPath.Length - 1) : rootPath;
var paths = new List<string>();
// If a file is at the end of the path, remove it before we start processing folders
if (Path.GetExtension(path) != string.Empty)
{
path = path.Substring(0, path.LastIndexOf(separator));
}
while (Path.GetDirectoryName(path) != Path.GetDirectoryName(root))
{
var folder = new DirectoryInfo(path).Name;
paths.Add(folder);
path = path.Replace(separator + folder, string.Empty);
path = path.Substring(0, path.LastIndexOf(separator));
}
return paths;
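The doc comment's example ((C:/Manga/, C:/Manga/Love Hina/Specials/Omake/) returning [Omake, Specials, Love Hina]) can be reproduced with a Python sketch of the same walk. Illustrative only; like the C# version, it assumes fullPath lies under rootPath and that a trailing component with a dot is a file name.

```python
import posixpath

def get_folders_till_root(root_path, full_path):
    # Normalize both paths to forward slashes and strip trailing separators
    root = root_path.replace("\\", "/").rstrip("/")
    path = full_path.replace("\\", "/").rstrip("/")
    # If a file name was passed at the end, drop it before walking folders
    if posixpath.splitext(path)[1]:
        path = path.rsplit("/", 1)[0]
    folders = []
    while posixpath.dirname(path) != posixpath.dirname(root):
        folders.append(posixpath.basename(path))
        path = path.rsplit("/", 1)[0]
    return folders
```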
@ -80,7 +89,7 @@ namespace API.Services
return di.Exists;
}
public IEnumerable<string> GetFiles(string path, string searchPatternExpression = "",
SearchOption searchOption = SearchOption.TopDirectoryOnly)
{
if (searchPatternExpression != string.Empty)
@ -91,17 +100,68 @@ namespace API.Services
.Where(file =>
reSearchPattern.IsMatch(file));
}
return !Directory.Exists(path) ? Array.Empty<string>() : Directory.GetFiles(path);
}
public void CopyFileToDirectory(string fullFilePath, string targetDirectory)
{
var fileInfo = new FileInfo(fullFilePath);
if (fileInfo.Exists)
{
fileInfo.CopyTo(Path.Join(targetDirectory, fileInfo.Name));
}
}
public bool CopyDirectoryToDirectory(string sourceDirName, string destDirName)
{
if (string.IsNullOrEmpty(sourceDirName)) return false;
var di = new DirectoryInfo(sourceDirName);
if (!di.Exists) return false;
// Get the subdirectories for the specified directory.
var dir = new DirectoryInfo(sourceDirName);
if (!dir.Exists)
{
throw new DirectoryNotFoundException(
"Source directory does not exist or could not be found: "
+ sourceDirName);
}
var dirs = dir.GetDirectories();
// If the destination directory doesn't exist, create it.
Directory.CreateDirectory(destDirName);
// Get the files in the directory and copy them to the new location.
var files = dir.GetFiles();
foreach (var file in files)
{
var tempPath = Path.Combine(destDirName, file.Name);
file.CopyTo(tempPath, false);
}
// If copying subdirectories, copy them and their contents to new location.
foreach (var subDir in dirs)
{
var tempPath = Path.Combine(destDirName, subDir.Name);
CopyDirectoryToDirectory(subDir.FullName, tempPath);
}
return true;
}
public string[] GetFilesWithExtension(string path, string searchPatternExpression = "")
{
if (searchPatternExpression != string.Empty)
{
return GetFilesWithCertainExtensions(path, searchPatternExpression).ToArray();
}
return !Directory.Exists(path) ? Array.Empty<string>() : Directory.GetFiles(path);
}
@ -142,11 +202,11 @@ namespace API.Services
public static void ClearAndDeleteDirectory(string directoryPath)
{
if (!Directory.Exists(directoryPath)) return;
DirectoryInfo di = new DirectoryInfo(directoryPath);
ClearDirectory(directoryPath);
di.Delete(true);
}
@ -162,11 +222,11 @@ namespace API.Services
foreach (var file in di.EnumerateFiles())
{
file.Delete();
}
foreach (var dir in di.EnumerateDirectories())
{
dir.Delete(true);
}
}
@ -181,13 +241,13 @@ namespace API.Services
var fileInfo = new FileInfo(file);
if (fileInfo.Exists)
{
fileInfo.CopyTo(Path.Join(directoryPath, fileInfo.Name));
}
else
{
_logger.LogWarning("Tried to copy {File} but it doesn't exist", file);
}
}
}
catch (Exception ex)
@ -202,12 +262,12 @@ namespace API.Services
public IEnumerable<string> ListDirectory(string rootPath)
{
if (!Directory.Exists(rootPath)) return ImmutableList<string>.Empty;
var di = new DirectoryInfo(rootPath);
var dirs = di.GetDirectories()
.Where(dir => !(dir.Attributes.HasFlag(FileAttributes.Hidden) || dir.Attributes.HasFlag(FileAttributes.System)))
.Select(d => d.Name).ToImmutableList();
return dirs;
}
@ -326,6 +386,6 @@ namespace API.Services
return fileCount;
}
}
}

View File

@ -0,0 +1,71 @@
using System;
using System.IO;
using System.Linq;
using API.Comparators;
using API.Entities;
using API.Interfaces.Services;
using Microsoft.Extensions.Logging;
using NetVips;
namespace API.Services
{
public class ImageService : IImageService
{
private readonly ILogger<ImageService> _logger;
private readonly IDirectoryService _directoryService;
private readonly NaturalSortComparer _naturalSortComparer;
public ImageService(ILogger<ImageService> logger, IDirectoryService directoryService)
{
_logger = logger;
_directoryService = directoryService;
_naturalSortComparer = new NaturalSortComparer();
}
/// <summary>
/// Finds the first image in the directory of the first file. Does not check for "cover/folder".ext files to override.
/// </summary>
/// <param name="file"></param>
/// <returns></returns>
public string GetCoverFile(MangaFile file)
{
var directory = Path.GetDirectoryName(file.FilePath);
if (string.IsNullOrEmpty(directory))
{
_logger.LogError("Could not find Directory for {File}", file.FilePath);
return null;
}
var firstImage = _directoryService.GetFilesWithExtension(directory, Parser.Parser.ImageFileExtensions)
.OrderBy(f => f, _naturalSortComparer).FirstOrDefault();
return firstImage;
}
public byte[] GetCoverImage(string path, bool createThumbnail = false)
{
if (string.IsNullOrEmpty(path)) return Array.Empty<byte>();
try
{
if (createThumbnail)
{
using var thumbnail = Image.Thumbnail(path, MetadataService.ThumbnailWidth);
return thumbnail.WriteToBuffer(".jpg");
}
using var img = Image.NewFromFile(path);
using var stream = new MemoryStream();
img.JpegsaveStream(stream);
return stream.ToArray();
}
catch (Exception ex)
{
_logger.LogWarning(ex, "[GetCoverImage] There was an error and prevented thumbnail generation on {ImageFile}. Defaulting to no cover image", path);
}
return Array.Empty<byte>();
}
}
}
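As an aside, `GetCoverFile` relies on natural ordering so that "page2" sorts before "page10" when picking the first image. A minimal standalone sketch of that ordering idea (this is a hypothetical comparer for illustration, not the project's `NaturalSortComparer`):

```csharp
using System;
using System.Linq;
using System.Text.RegularExpressions;

public static class NaturalOrder
{
    // Split each name into digit and non-digit runs, then compare runs
    // pairwise: numeric runs compare as numbers, text runs ordinally.
    // This makes "page2.jpg" sort before "page10.jpg".
    public static int Compare(string a, string b)
    {
        var ax = Regex.Matches(a, @"\d+|\D+").Select(m => m.Value).ToArray();
        var bx = Regex.Matches(b, @"\d+|\D+").Select(m => m.Value).ToArray();
        for (var i = 0; i < Math.Min(ax.Length, bx.Length); i++)
        {
            int c;
            if (long.TryParse(ax[i], out var an) && long.TryParse(bx[i], out var bn))
                c = an.CompareTo(bn); // numeric comparison for digit runs
            else
                c = string.Compare(ax[i], bx[i], StringComparison.OrdinalIgnoreCase);
            if (c != 0) return c;
        }
        return ax.Length.CompareTo(bx.Length);
    }
}
```

A plain lexicographic sort would put "page10.jpg" before "page2.jpg", which is why image-folder libraries need this kind of comparer for cover selection.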


@ -20,16 +20,22 @@ namespace API.Services
private readonly ILogger<MetadataService> _logger;
private readonly IArchiveService _archiveService;
private readonly IBookService _bookService;
private readonly IDirectoryService _directoryService;
private readonly IImageService _imageService;
private readonly ChapterSortComparer _chapterSortComparer = new ChapterSortComparer();
public static readonly int ThumbnailWidth = 320; // 153w x 230h
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IArchiveService archiveService, IBookService bookService, IDirectoryService directoryService, IImageService imageService)
{
_unitOfWork = unitOfWork;
_logger = logger;
_archiveService = archiveService;
_bookService = bookService;
_directoryService = directoryService;
_imageService = imageService;
}
private static bool ShouldFindCoverImage(byte[] coverImage, bool forceUpdate = false)
{
return forceUpdate || coverImage == null || !coverImage.Any();
@ -37,23 +43,25 @@ namespace API.Services
private byte[] GetCoverImage(MangaFile file, bool createThumbnail = true)
{
switch (file.Format)
{
case MangaFormat.Book:
return _bookService.GetCoverImage(file.FilePath, createThumbnail);
case MangaFormat.Image:
var coverImage = _imageService.GetCoverFile(file);
return _imageService.GetCoverImage(coverImage, createThumbnail);
default:
return _archiveService.GetCoverImage(file.FilePath, createThumbnail);
}
}
public void UpdateMetadata(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (ShouldFindCoverImage(chapter.CoverImage, forceUpdate) && firstFile != null && !new FileInfo(firstFile.FilePath).IsLastWriteLessThan(firstFile.LastModified))
{
chapter.Files ??= new List<MangaFile>();
chapter.CoverImage = GetCoverImage(firstFile);
}
}
@ -88,10 +96,10 @@ namespace API.Services
{
series.Volumes ??= new List<Volume>();
var firstCover = series.Volumes.GetCoverImage(series.Library.Type);
byte[] coverImage = null;
if (firstCover == null && series.Volumes.Any())
{
// If firstCover is null and one volume, the whole series is Chapters under Vol 0.
if (series.Volumes.Count == 1)
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparer)
@ -113,24 +121,21 @@ namespace API.Services
private void UpdateSeriesSummary(Series series, bool forceUpdate)
{
if (!string.IsNullOrEmpty(series.Summary) && !forceUpdate) return;
var isBook = series.Library.Type == LibraryType.Book;
var firstVolume = series.Volumes.FirstWithChapters(isBook);
var firstChapter = firstVolume?.Chapters.GetFirstChapterWithFiles();
// NOTE: This suffers from code changes not taking effect due to stale data
var firstFile = firstChapter?.Files.FirstOrDefault();
if (firstFile == null || (!forceUpdate && !firstFile.HasFileBeenModified())) return;
var summary = isBook ? _bookService.GetSummaryInfo(firstFile.FilePath) : _archiveService.GetSummaryInfo(firstFile.FilePath);
if (string.IsNullOrEmpty(series.Summary))
{
series.Summary = summary;
}
firstFile.LastModified = DateTime.Now;
}
@ -149,7 +154,7 @@ namespace API.Services
{
UpdateMetadata(chapter, forceUpdate);
}
UpdateMetadata(volume, forceUpdate);
}
@ -163,13 +168,13 @@ namespace API.Services
_logger.LogInformation("Updated metadata for {LibraryName} in {ElapsedMilliseconds} milliseconds", library.Name, sw.ElapsedMilliseconds);
}
}
public void RefreshMetadataForSeries(int libraryId, int seriesId)
{
var sw = Stopwatch.StartNew();
var library = Task.Run(() => _unitOfWork.LibraryRepository.GetFullLibraryForIdAsync(libraryId)).GetAwaiter().GetResult();
var series = library.Series.SingleOrDefault(s => s.Id == seriesId);
if (series == null)
{
@ -183,7 +188,7 @@ namespace API.Services
{
UpdateMetadata(chapter, true);
}
UpdateMetadata(volume, true);
}
@ -197,4 +202,4 @@ namespace API.Services
}
}
}
}
}


@ -133,7 +133,7 @@ namespace API.Services
return usageInfo;
}
public static ServerInfoDto GetServerInfo()
{
var serverInfo = new ServerInfoDto
{
@ -183,4 +183,4 @@ namespace API.Services
await File.WriteAllTextAsync(FinalPath, dataJson);
}
}
}
}


@ -1,4 +1,5 @@
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using API.Entities.Enums;
using API.Helpers.Converters;
@ -24,7 +25,7 @@ namespace API.Services
public static BackgroundJobServer Client => new BackgroundJobServer();
public TaskScheduler(ICacheService cacheService, ILogger<TaskScheduler> logger, IScannerService scannerService,
IUnitOfWork unitOfWork, IMetadataService metadataService, IBackupService backupService,
ICleanupService cleanupService, IStatsService statsService)
{
@ -41,20 +42,20 @@ namespace API.Services
public void ScheduleTasks()
{
_logger.LogInformation("Scheduling reoccurring tasks");
var setting = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskScan)).GetAwaiter().GetResult().Value;
if (setting != null)
{
var scanLibrarySetting = setting;
_logger.LogDebug("Scheduling Scan Library Task for {Setting}", scanLibrarySetting);
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(),
() => CronConverter.ConvertToCronNotation(scanLibrarySetting));
}
else
{
RecurringJob.AddOrUpdate("scan-libraries", () => _scannerService.ScanLibraries(), Cron.Daily);
}
setting = Task.Run(() => _unitOfWork.SettingsRepository.GetSettingAsync(ServerSettingKey.TaskBackup)).Result.Value;
if (setting != null)
{
@ -65,7 +66,7 @@ namespace API.Services
{
RecurringJob.AddOrUpdate("backup", () => _backupService.BackupDatabase(), Cron.Weekly);
}
RecurringJob.AddOrUpdate("cleanup", () => _cleanupService.Cleanup(), Cron.Daily);
}
@ -80,7 +81,7 @@ namespace API.Services
_logger.LogDebug("User has opted out of stat collection, not registering tasks");
return;
}
_logger.LogDebug("Scheduling Send data to the Stats server {Setting}", nameof(Cron.Daily));
@ -99,9 +100,9 @@ namespace API.Services
public void ScanLibrary(int libraryId, bool forceUpdate = false)
{
_logger.LogInformation("Enqueuing library scan for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _scannerService.ScanLibrary(libraryId, forceUpdate));
// When we do a scan, force cache to re-unpack in case page numbers change
BackgroundJob.Enqueue(() => _cleanupService.Cleanup());
}
public void CleanupChapters(int[] chapterIds)
@ -112,19 +113,25 @@ namespace API.Services
public void RefreshMetadata(int libraryId, bool forceUpdate = true)
{
_logger.LogInformation("Enqueuing library metadata refresh for: {LibraryId}", libraryId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadata(libraryId, forceUpdate));
}
public void CleanupTemp()
{
var tempDirectory = Path.Join(Directory.GetCurrentDirectory(), "temp");
BackgroundJob.Enqueue(() => DirectoryService.ClearDirectory(tempDirectory));
}
public void RefreshSeriesMetadata(int libraryId, int seriesId)
{
_logger.LogInformation("Enqueuing series metadata refresh for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadataForSeries(libraryId, seriesId));
}
public void ScanSeries(int libraryId, int seriesId, bool forceUpdate = false)
{
_logger.LogInformation("Enqueuing series scan for: {SeriesId}", seriesId);
BackgroundJob.Enqueue(() => _scannerService.ScanSeries(libraryId, seriesId, forceUpdate, CancellationToken.None));
}
public void BackupDatabase()
@ -132,4 +139,4 @@ namespace API.Services
BackgroundJob.Enqueue(() => _backupService.BackupDatabase());
}
}
}
}


@ -4,6 +4,7 @@ using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
@ -28,7 +29,7 @@ namespace API.Services.Tasks
private ConcurrentDictionary<string, List<ParserInfo>> _scannedSeries;
private readonly NaturalSortComparer _naturalSort;
public ScannerService(IUnitOfWork unitOfWork, ILogger<ScannerService> logger, IArchiveService archiveService,
IMetadataService metadataService, IBookService bookService)
{
_unitOfWork = unitOfWork;
@ -39,6 +40,86 @@ namespace API.Services.Tasks
_naturalSort = new NaturalSortComparer();
}
[DisableConcurrentExecution(timeoutInSeconds: 360)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
public async Task ScanSeries(int libraryId, int seriesId, bool forceUpdate, CancellationToken token)
{
var files = await _unitOfWork.SeriesRepository.GetFilesForSeries(seriesId);
var series = await _unitOfWork.SeriesRepository.GetSeriesByIdAsync(seriesId);
var library = await _unitOfWork.LibraryRepository.GetFullLibraryForIdAsync(libraryId, seriesId);
var dirs = FindHighestDirectoriesFromFiles(library, files);
_logger.LogInformation("Beginning file scan on {SeriesName}", series.Name);
// TODO: We can't have a global variable if multiple scans are taking place. Refactor this.
_scannedSeries = new ConcurrentDictionary<string, List<ParserInfo>>();
var parsedSeries = ScanLibrariesForSeries(library.Type, dirs.Keys, out var totalFiles, out var scanElapsedTime);
// If a root level folder scan occurs, then multiple series gets passed in and thus we get a unique constraint issue
// Hence we clear out anything but what we selected for
var firstSeries = library.Series.FirstOrDefault();
var keys = parsedSeries.Keys;
foreach (var key in keys.Where(key => !firstSeries.NameInParserInfo(parsedSeries[key].FirstOrDefault())))
{
parsedSeries.Remove(key);
}
var sw = new Stopwatch();
UpdateLibrary(library, parsedSeries);
_unitOfWork.LibraryRepository.Update(library);
if (await _unitOfWork.CommitAsync())
{
_logger.LogInformation(
"Processed {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {SeriesName}",
totalFiles, parsedSeries.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, series.Name);
CleanupUserProgress();
BackgroundJob.Enqueue(() => _metadataService.RefreshMetadata(libraryId, forceUpdate));
}
else
{
_logger.LogCritical(
"There was a critical error that resulted in a failed scan. Please check logs and rescan");
await _unitOfWork.RollbackAsync();
}
}
/// <summary>
/// Finds the highest directories from a set of MangaFiles
/// </summary>
/// <param name="library"></param>
/// <param name="files"></param>
/// <returns></returns>
private static Dictionary<string, string> FindHighestDirectoriesFromFiles(Library library, IList<MangaFile> files)
{
var stopLookingForDirectories = false;
var dirs = new Dictionary<string, string>();
foreach (var folder in library.Folders)
{
if (stopLookingForDirectories) break;
foreach (var file in files)
{
if (!file.FilePath.Contains(folder.Path)) continue;
var parts = DirectoryService.GetFoldersTillRoot(folder.Path, file.FilePath).ToList();
if (parts.Count == 0)
{
// Break from all loops, we done, just scan folder.Path (library root)
dirs.Add(folder.Path, string.Empty);
stopLookingForDirectories = true;
break;
}
var fullPath = Path.Join(folder.Path, parts.Last());
if (!dirs.ContainsKey(fullPath))
{
dirs.Add(fullPath, string.Empty);
}
}
}
return dirs;
}
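The method above maps each file back to the top-level folder under its library root, so a targeted rescan only walks the relevant series directories. A minimal sketch of that "first segment below root" idea (this is a hypothetical helper for illustration; the project's `DirectoryService.GetFoldersTillRoot` handles separators and edge cases more thoroughly):

```csharp
using System;
using System.IO;
using System.Linq;

public static class FolderWalk
{
    // For a file under root, return the first path segment below root,
    // e.g. root "/library" and file "/library/SeriesA/Vol 1/ch1.cbz" -> "SeriesA".
    // Returns "" when the file sits directly in the root (or outside it),
    // which mirrors the "just scan the library root" branch above.
    public static string FirstSegmentBelowRoot(string root, string filePath)
    {
        var dir = Path.GetDirectoryName(filePath) ?? root;
        var rel = Path.GetRelativePath(root, dir);
        if (rel == "." || rel.StartsWith("..")) return string.Empty;
        return rel.Split(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar).First();
    }
}
```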
[DisableConcurrentExecution(timeoutInSeconds: 360)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
@ -51,51 +132,46 @@ namespace API.Services.Tasks
}
}
private bool ShouldSkipFolderScan(FolderPath folder, ref int skippedFolders)
{
// NOTE: The only way to skip folders is if Directory hasn't been modified, we aren't doing a forcedUpdate and version hasn't changed between scans.
return false;
// if (!_forceUpdate && Directory.GetLastWriteTime(folder.Path) < folder.LastScanned)
// {
// _logger.LogDebug("{FolderPath} hasn't been modified since last scan. Skipping", folder.Path);
// skippedFolders += 1;
// return true;
// }
//return false;
}
[DisableConcurrentExecution(360)]
[AutomaticRetry(Attempts = 0, OnAttemptsExceeded = AttemptsExceededAction.Delete)]
public void ScanLibrary(int libraryId, bool forceUpdate)
{
_scannedSeries = new ConcurrentDictionary<string, List<ParserInfo>>();
Library library;
try
{
library = Task.Run(() => _unitOfWork.LibraryRepository.GetFullLibraryForIdAsync(libraryId)).GetAwaiter()
.GetResult();
}
catch (Exception ex)
{
// This usually only fails if user is not authenticated.
_logger.LogError(ex, "There was an issue fetching Library {LibraryId}", libraryId);
return;
}
_logger.LogInformation("Beginning file scan on {LibraryName}", library.Name);
var series = ScanLibrariesForSeries(library.Type, library.Folders.Select(fp => fp.Path), out var totalFiles, out var scanElapsedTime);
foreach (var folderPath in library.Folders)
{
folderPath.LastScanned = DateTime.Now;
}
var sw = Stopwatch.StartNew();
UpdateLibrary(library, series);
_unitOfWork.LibraryRepository.Update(library);
if (Task.Run(() => _unitOfWork.CommitAsync()).Result)
{
_logger.LogInformation(
"Processed {TotalFiles} files and {ParsedSeriesCount} series in {ElapsedScanTime} milliseconds for {LibraryName}",
totalFiles, series.Keys.Count, sw.ElapsedMilliseconds + scanElapsedTime, library.Name);
}
else
{
_logger.LogCritical(
"There was a critical error that resulted in a failed scan. Please check logs and rescan");
}
CleanupUserProgress();
@ -112,61 +188,59 @@ namespace API.Services.Tasks
_logger.LogInformation("Removed {Count} abandoned progress rows", cleanedUp);
}
/// <summary>
/// Scans the passed folders for files matching the library type and parses them into ParserInfos, grouped by normalized series name.
/// </summary>
/// <param name="libraryType">Type of library. Used for selecting the correct file extensions to search for and parsing files</param>
/// <param name="folders">The folders to scan. By default, this should be library.Folders, however it can be overwritten to restrict folders</param>
/// <param name="totalFiles">Total files scanned</param>
/// <param name="scanElapsedTime">Time it took to scan and parse files</param>
/// <returns></returns>
private Dictionary<string, List<ParserInfo>> ScanLibrariesForSeries(LibraryType libraryType, IEnumerable<string> folders, out int totalFiles,
out long scanElapsedTime)
{
var sw = Stopwatch.StartNew();
totalFiles = 0;
var searchPattern = GetLibrarySearchPattern(libraryType);
foreach (var folderPath in folders)
{
try
{
totalFiles += DirectoryService.TraverseTreeParallelForEach(folderPath, (f) =>
{
try
{
ProcessFile(f, folderPath, libraryType);
}
catch (FileNotFoundException exception)
{
_logger.LogError(exception, "The file {Filename} could not be found", f);
}
}, searchPattern, _logger);
}
catch (ArgumentException ex)
{
_logger.LogError(ex, "The directory '{FolderPath}' does not exist", folderPath);
}
}
scanElapsedTime = sw.ElapsedMilliseconds;
_logger.LogInformation("Folders Scanned {TotalFiles} files in {ElapsedScanTime} milliseconds", totalFiles,
scanElapsedTime);
sw.Restart();
return SeriesWithInfos(_scannedSeries);
}
private static string GetLibrarySearchPattern(LibraryType libraryType)
{
var searchPattern = libraryType switch
{
LibraryType.Book => Parser.Parser.BookFileExtensions,
LibraryType.MangaImages or LibraryType.ComicImages => Parser.Parser.ImageFileExtensions,
_ => Parser.Parser.ArchiveFileExtensions
};
return searchPattern;
}
/// <summary>
@ -181,7 +255,7 @@ namespace API.Services.Tasks
return series;
}
private void UpdateLibrary(Library library, Dictionary<string, List<ParserInfo>> parsedSeries)
{
if (parsedSeries == null) throw new ArgumentNullException(nameof(parsedSeries));
@ -197,8 +271,8 @@ namespace API.Services.Tasks
_logger.LogDebug("Removed {SeriesName}", s.Name);
}
}
// Add new series that have parsedInfos
foreach (var (key, infos) in parsedSeries)
{
@ -215,7 +289,6 @@ namespace API.Services.Tasks
foreach (var series in duplicateSeries)
{
_logger.LogCritical("{Key} maps with {Series}", key, series.OriginalName);
}
continue;
@ -225,7 +298,7 @@ namespace API.Services.Tasks
existingSeries = DbFactory.Series(infos[0].Series);
library.Series.Add(existingSeries);
}
existingSeries.NormalizedName = Parser.Parser.Normalize(existingSeries.Name);
existingSeries.OriginalName ??= infos[0].Series;
existingSeries.Metadata ??= DbFactory.SeriesMetadata(new List<CollectionTag>());
@ -240,7 +313,6 @@ namespace API.Services.Tasks
_logger.LogInformation("Processing series {SeriesName}", series.OriginalName);
UpdateVolumes(series, parsedSeries[Parser.Parser.Normalize(series.OriginalName)].ToArray());
series.Pages = series.Volumes.Sum(v => v.Pages);
}
catch (Exception ex)
{
@ -267,13 +339,13 @@ namespace API.Services.Tasks
{
var existingCount = existingSeries.Count;
var missingList = missingSeries.ToList();
existingSeries = existingSeries.Where(
s => !missingList.Exists(
m => m.NormalizedName.Equals(s.NormalizedName))).ToList();
removeCount = existingCount - existingSeries.Count;
return existingSeries;
}
@ -291,15 +363,15 @@ namespace API.Services.Tasks
volume = DbFactory.Volume(volumeNumber);
series.Volumes.Add(volume);
}
// NOTE: Instead of creating and adding? Why Not Merge a new volume into an existing, so no matter what, new properties,etc get propagated?
_logger.LogDebug("Parsing {SeriesName} - Volume {VolumeNumber}", series.Name, volume.Name);
var infos = parsedInfos.Where(p => p.Volumes == volumeNumber).ToArray();
UpdateChapters(volume, infos);
volume.Pages = volume.Chapters.Sum(c => c.Pages);
}
// Remove existing volumes that aren't in parsedInfos
var nonDeletedVolumes = series.Volumes.Where(v => parsedInfos.Select(p => p.Volumes).Contains(v.Name)).ToList();
if (series.Volumes.Count != nonDeletedVolumes.Count)
@ -320,12 +392,12 @@ namespace API.Services.Tasks
series.Volumes = nonDeletedVolumes;
}
_logger.LogDebug("Updated {SeriesName} volumes from {StartingVolumeCount} to {VolumeCount}",
series.Name, startingVolumeCount, series.Volumes.Count);
}
/// <summary>
/// Adds or updates Chapters on a Volume from the passed ParserInfos.
/// </summary>
/// <param name="volume"></param>
/// <param name="parsedInfos"></param>
@ -346,7 +418,7 @@ namespace API.Services.Tasks
_logger.LogError(ex, "{FileName} mapped as '{Series} - Vol {Volume} Ch {Chapter}' is a duplicate, skipping", info.FullFilePath, info.Series, info.Volumes, info.Chapters);
continue;
}
if (chapter == null)
{
_logger.LogDebug(
@ -357,9 +429,9 @@ namespace API.Services.Tasks
{
chapter.UpdateFrom(info);
}
}
// Add files
foreach (var info in parsedInfos)
{
@ -379,8 +451,8 @@ namespace API.Services.Tasks
chapter.Number = Parser.Parser.MinimumNumberFromRange(info.Chapters) + string.Empty;
chapter.Range = specialTreatment ? info.Filename : info.Chapters;
}
// Remove chapters that aren't in parsedInfos or have no files linked
var existingChapters = volume.Chapters.ToList();
foreach (var existingChapter in existingChapters)
@ -403,15 +475,16 @@ namespace API.Services.Tasks
/// <summary>
/// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
/// </summary>
/// <param name="info"></param>
private void TrackSeries(ParserInfo info)
{
if (info.Series == string.Empty) return;
// Check if normalized info.Series already exists and if so, update info to use that name instead
info.Series = MergeName(_scannedSeries, info);
_scannedSeries.AddOrUpdate(Parser.Parser.Normalize(info.Series), new List<ParserInfo>() {info}, (_, oldValue) =>
{
oldValue ??= new List<ParserInfo>();
@ -424,14 +497,20 @@ namespace API.Services.Tasks
});
}
/// <summary>
/// Using a normalized name from the passed ParserInfo, this checks against all found series so far and if an existing one exists with
/// same normalized name, it merges into the existing one. This is important as some manga may have a slight difference with punctuation or capitalization.
/// </summary>
/// <param name="collectedSeries"></param>
/// <param name="info"></param>
/// <returns></returns>
public string MergeName(ConcurrentDictionary<string,List<ParserInfo>> collectedSeries, ParserInfo info)
{
var normalizedSeries = Parser.Parser.Normalize(info.Series);
_logger.LogDebug("Checking if we can merge {NormalizedSeries}", normalizedSeries);
var existingName = collectedSeries.SingleOrDefault(p => Parser.Parser.Normalize(p.Key) == normalizedSeries)
.Key;
// BUG: We are comparing info.Series against a normalized string. They should never match. (This can cause series to not delete or parse correctly after a rename)
if (!string.IsNullOrEmpty(existingName))
{
_logger.LogDebug("Found duplicate parsed infos, merged {Original} into {Merged}", info.Series, existingName);
return existingName;
@ -450,7 +529,7 @@ namespace API.Services.Tasks
private void ProcessFile(string path, string rootPath, LibraryType type)
{
ParserInfo info;
if (type == LibraryType.Book && Parser.Parser.IsEpub(path))
{
info = _bookService.ParseInfo(path);
@ -465,14 +544,14 @@ namespace API.Services.Tasks
_logger.LogWarning("[Scanner] Could not parse series from {Path}", path);
return;
}
if (type == LibraryType.Book && Parser.Parser.IsEpub(path) && Parser.Parser.ParseVolume(info.Series) != Parser.Parser.DefaultVolume)
{
info = Parser.Parser.Parse(path, rootPath, type);
var info2 = _bookService.ParseInfo(path);
info.Merge(info2);
}
TrackSeries(info);
}
@ -498,6 +577,15 @@ namespace API.Services.Tasks
Pages = _bookService.GetNumberOfPages(info.FullFilePath)
};
}
case MangaFormat.Image:
{
return new MangaFile()
{
FilePath = info.FullFilePath,
Format = info.Format,
Pages = 1
};
}
default:
_logger.LogWarning("[Scanner] Ignoring {Filename}. Non-archives are not supported", info.Filename);
break;
@ -505,7 +593,7 @@ namespace API.Services.Tasks
return null;
}
private void AddOrUpdateFileForChapter(Chapter chapter, ParserInfo info)
{
chapter.Files ??= new List<MangaFile>();
@ -515,8 +603,8 @@ namespace API.Services.Tasks
existingFile.Format = info.Format;
if (!existingFile.HasFileBeenModified() && existingFile.Pages > 0)
{
existingFile.Pages = existingFile.Format == MangaFormat.Book
? _bookService.GetNumberOfPages(info.FullFilePath)
: _archiveService.GetNumberOfPagesFromArchive(info.FullFilePath);
}
}
@ -536,4 +624,4 @@ namespace API.Services.Tasks
}
}
}
}
}


@ -52,18 +52,18 @@ namespace API
{
options.Providers.Add<BrotliCompressionProvider>();
options.Providers.Add<GzipCompressionProvider>();
options.MimeTypes =
ResponseCompressionDefaults.MimeTypes.Concat(
new[] { "image/jpeg", "image/jpg" });
options.EnableForHttps = true;
});
services.Configure<BrotliCompressionProviderOptions>(options =>
{
options.Level = CompressionLevel.Fastest;
});
services.AddResponseCaching();
services.AddStatsClient(_config);
services.AddHangfire(configuration => configuration
@ -80,7 +80,7 @@ namespace API
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IBackgroundJobClient backgroundJobs, IWebHostEnvironment env,
IHostApplicationLifetime applicationLifetime)
{
app.UseMiddleware<ExceptionMiddleware>();
@ -91,19 +91,19 @@ namespace API
app.UseSwaggerUI(c => c.SwaggerEndpoint("/swagger/v1/swagger.json", "API v1"));
app.UseHangfireDashboard();
}
app.UseResponseCompression();
app.UseForwardedHeaders();
app.UseRouting();
// Ordering is important. Cors, authentication, authorization
if (env.IsDevelopment())
{
app.UseCors(policy => policy.AllowAnyHeader().AllowAnyMethod().WithOrigins("http://localhost:4200"));
}
app.UseResponseCaching();
app.UseAuthentication();
@ -116,18 +116,18 @@ namespace API
{
ContentTypeProvider = new FileExtensionContentTypeProvider()
});
app.Use(async (context, next) =>
{
context.Response.GetTypedHeaders().CacheControl =
new Microsoft.Net.Http.Headers.CacheControlHeaderValue()
{
Public = false,
MaxAge = TimeSpan.FromSeconds(10)
};
context.Response.Headers[Microsoft.Net.Http.Headers.HeaderNames.Vary] =
new[] { "Accept-Encoding" };
await next();
});
@ -137,14 +137,14 @@ namespace API
endpoints.MapHangfireDashboard();
endpoints.MapFallbackToController("Index", "Fallback");
});
applicationLifetime.ApplicationStopping.Register(OnShutdown);
applicationLifetime.ApplicationStarted.Register(() =>
{
Console.WriteLine($"Kavita - v{BuildInfo.Version}");
});
}
private void OnShutdown()
{
Console.WriteLine("Server is shutting down. Please allow a few seconds to stop any background jobs...");
@ -152,7 +152,7 @@ namespace API
System.Threading.Thread.Sleep(1000);
Console.WriteLine("You may now close the application window.");
}
}
}


@ -0,0 +1,21 @@
using System.ComponentModel;
namespace Kavita.Common.Extensions
{
public static class EnumExtensions
{
public static string ToDescription<TEnum>(this TEnum value) where TEnum : struct
{
var fi = value.GetType().GetField(value.ToString() ?? string.Empty);
if (fi == null)
{
return value.ToString();
}
var attributes = (DescriptionAttribute[])fi.GetCustomAttributes(typeof(DescriptionAttribute), false);
return attributes is {Length: > 0} ? attributes[0].Description : value.ToString();
}
}
}
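For context, `ToDescription` resolves a `[Description]` attribute via reflection when one is present and falls back to the enum member name otherwise. A small self-contained usage sketch (the `BackupFrequency` enum is hypothetical, and the lookup is restated locally so the snippet compiles on its own):

```csharp
using System.ComponentModel;

public enum BackupFrequency
{
    [Description("Once a week")]
    Weekly,
    Daily // no attribute: falls back to "Daily"
}

public static class EnumDemo
{
    // Same lookup as the extension above: find the enum field,
    // read its DescriptionAttribute, fall back to the member name.
    public static string ToDescription<TEnum>(TEnum value) where TEnum : struct
    {
        var fi = value.GetType().GetField(value.ToString() ?? string.Empty);
        if (fi == null) return value.ToString() ?? string.Empty;
        var attrs = (DescriptionAttribute[])fi.GetCustomAttributes(typeof(DescriptionAttribute), false);
        return attrs.Length > 0 ? attrs[0].Description : value.ToString() ?? string.Empty;
    }
}
```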


@ -0,0 +1,12 @@
using System.IO;
namespace Kavita.Common.Extensions
{
public static class PathExtensions
{
public static string GetParentDirectory(string filePath)
{
return Path.GetDirectoryName(filePath);
}
}
}


@ -5,7 +5,7 @@ namespace Kavita.Common
{
public static class HashUtil
{
private static string CalculateCrc(string input)
{
uint mCrc = 0xffffffff;
byte[] bytes = Encoding.UTF8.GetBytes(input);
@ -28,10 +28,14 @@ namespace Kavita.Common
return $"{mCrc:x8}";
}
/// <summary>
/// Calculates a unique, Anonymous Token that will represent this unique Kavita installation.
/// </summary>
/// <returns></returns>
public static string AnonymousToken()
{
var seed = $"{Environment.ProcessorCount}_{Environment.OSVersion.Platform}_{Environment.MachineName}_{Environment.UserName}";
return CalculateCrc(seed);
}
}
}
}


@ -2,21 +2,16 @@
<PropertyGroup>
<TargetFramework>net5.0</TargetFramework>
<Company>kavitareader.com</Company>
<Product>Kavita</Product>
<AssemblyVersion>0.4.2.1</AssemblyVersion>
<NeutralLanguage>en</NeutralLanguage>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Configuration.Abstractions" Version="5.0.0" />
<PackageReference Include="Sentry" Version="3.7.0" />
</ItemGroup>
<ItemGroup>
<Reference Include="JetBrains.ReSharper.TestRunner.Merged, Version=1.3.1.55, Culture=neutral, PublicKeyToken=5c492ec4f3eccde3">
<HintPath>D:\Program Files\JetBrains\JetBrains Rider 2020.3.2\lib\ReSharperHost\TestRunner\netcoreapp2.0\JetBrains.ReSharper.TestRunner.Merged.dll</HintPath>
</Reference>
</ItemGroup>
</Project>

UI/Web/LICENSE Normal file

@ -0,0 +1,674 @@
 GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.

27
UI/Web/README.md Normal file

@ -0,0 +1,27 @@
# Kavita Webui
This project was generated with [Angular CLI](https://github.com/angular/angular-cli) version 11.0.0.
## Development server
Run `ng serve` for a dev server. Navigate to `http://localhost:4200/`. The app will automatically reload if you change any of the source files.
## Code scaffolding
Run `ng generate component component-name` to generate a new component. You can also use `ng generate directive|pipe|service|class|guard|interface|enum|module`.
## Build
Run `ng build` to build the project. The build artifacts will be stored in the `dist/` directory. Use the `--prod` flag for a production build.
## Running unit tests
Run `ng test` to execute the unit tests via [Karma](https://karma-runner.github.io).
## Running end-to-end tests
Run `ng e2e` to execute the end-to-end tests via [Protractor](http://www.protractortest.org/).
## Further help
To get more help on the Angular CLI use `ng help` or go check out the [Angular CLI Overview and Command Reference](https://angular.io/cli) page.

139
UI/Web/angular.json Normal file

@ -0,0 +1,139 @@
{
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
"version": 1,
"newProjectRoot": "projects",
"projects": {
"kavita-webui": {
"projectType": "application",
"schematics": {
"@schematics/angular:component": {
"style": "scss"
},
"@schematics/angular:application": {
"strict": true
}
},
"root": "",
"sourceRoot": "src",
"prefix": "app",
"architect": {
"build": {
"builder": "@angular-devkit/build-angular:browser",
"options": {
"outputPath": "dist",
"index": "src/index.html",
"main": "src/main.ts",
"polyfills": "src/polyfills.ts",
"tsConfig": "tsconfig.app.json",
"aot": true,
"assets": [
"src/assets",
"src/site.webmanifest"
],
"sourceMap": {
"hidden": false,
"scripts": true,
"styles": true
},
"styles": [
"src/styles.scss",
"node_modules/@fortawesome/fontawesome-free/css/all.min.css"
],
"scripts": []
},
"configurations": {
"production": {
"fileReplacements": [
{
"replace": "src/environments/environment.ts",
"with": "src/environments/environment.prod.ts"
}
],
"optimization": true,
"outputHashing": "all",
"namedChunks": false,
"extractLicenses": true,
"vendorChunk": true,
"buildOptimizer": true,
"budgets": [
{
"type": "initial",
"maximumWarning": "1mb",
"maximumError": "2mb"
},
{
"type": "anyComponentStyle",
"maximumWarning": "2kb",
"maximumError": "4kb"
}
]
}
}
},
"serve": {
"builder": "@angular-devkit/build-angular:dev-server",
"options": {
"sslKey": "./ssl/server.key",
"sslCert": "./ssl/server.crt",
"ssl": false,
"browserTarget": "kavita-webui:build"
},
"configurations": {
"production": {
"browserTarget": "kavita-webui:build:production"
}
}
},
"extract-i18n": {
"builder": "@angular-devkit/build-angular:extract-i18n",
"options": {
"browserTarget": "kavita-webui:build"
}
},
"test": {
"builder": "@angular-devkit/build-angular:karma",
"options": {
"main": "src/test.ts",
"polyfills": "src/polyfills.ts",
"tsConfig": "tsconfig.spec.json",
"karmaConfig": "karma.conf.js",
"assets": [
"src/assets",
"src/site.webmanifest"
],
"styles": [
"src/styles.scss"
],
"scripts": []
}
},
"lint": {
"builder": "@angular-devkit/build-angular:tslint",
"options": {
"tsConfig": [
"tsconfig.app.json",
"tsconfig.spec.json",
"e2e/tsconfig.json"
],
"exclude": [
"**/node_modules/**"
]
}
},
"e2e": {
"builder": "@angular-devkit/build-angular:protractor",
"options": {
"protractorConfig": "e2e/protractor.conf.js",
"devServerTarget": "kavita-webui:serve"
},
"configurations": {
"production": {
"devServerTarget": "kavita-webui:serve:production"
}
}
}
}
}
},
"defaultProject": "kavita-webui"
}

View File

@ -0,0 +1,37 @@
// @ts-check
// Protractor configuration file, see link for more information
// https://github.com/angular/protractor/blob/master/lib/config.ts
const { SpecReporter, StacktraceOption } = require('jasmine-spec-reporter');
/**
* @type { import("protractor").Config }
*/
exports.config = {
allScriptsTimeout: 11000,
specs: [
'./src/**/*.e2e-spec.ts'
],
capabilities: {
browserName: 'chrome'
},
directConnect: true,
SELENIUM_PROMISE_MANAGER: false,
baseUrl: 'http://localhost:4200/',
framework: 'jasmine',
jasmineNodeOpts: {
showColors: true,
defaultTimeoutInterval: 30000,
print: function() {}
},
onPrepare() {
require('ts-node').register({
project: require('path').join(__dirname, './tsconfig.json')
});
jasmine.getEnv().addReporter(new SpecReporter({
spec: {
displayStacktrace: StacktraceOption.PRETTY
}
}));
}
};

View File

@ -0,0 +1,23 @@
import { AppPage } from './app.po';
import { browser, logging } from 'protractor';
describe('workspace-project App', () => {
let page: AppPage;
beforeEach(() => {
page = new AppPage();
});
it('should display welcome message', async () => {
await page.navigateTo();
expect(await page.getTitleText()).toEqual('kavita-webui app is running!');
});
afterEach(async () => {
// Assert that there are no errors emitted from the browser
const logs = await browser.manage().logs().get(logging.Type.BROWSER);
expect(logs).not.toContain(jasmine.objectContaining({
level: logging.Level.SEVERE,
} as logging.Entry));
});
});

11
UI/Web/e2e/src/app.po.ts Normal file

@ -0,0 +1,11 @@
import { browser, by, element } from 'protractor';
export class AppPage {
async navigateTo(): Promise<unknown> {
return browser.get(browser.baseUrl);
}
async getTitleText(): Promise<string> {
return element(by.css('app-root .content span')).getText();
}
}

13
UI/Web/e2e/tsconfig.json Normal file

@ -0,0 +1,13 @@
/* To learn more about this file see: https://angular.io/config/tsconfig. */
{
"extends": "../tsconfig.json",
"compilerOptions": {
"outDir": "../out-tsc/e2e",
"module": "commonjs",
"target": "es2018",
"types": [
"jasmine",
"node"
]
}
}

37201
UI/Web/package-lock.json generated Normal file

File diff suppressed because it is too large Load Diff

78
UI/Web/package.json Normal file

@ -0,0 +1,78 @@
{
"name": "kavita-webui",
"version": "0.4.2",
"scripts": {
"ng": "ng",
"start": "ng serve",
"build": "ng build",
"prod": "ng build --prod",
"explore": "ng build --stats-json && webpack-bundle-analyzer ../kavita/API/wwwroot/stats.json",
"test": "jest",
"test:watch": "jest --watch",
"test:coverage": "jest --coverage",
"lint": "ng lint",
"e2e": "ng e2e"
},
"private": true,
"dependencies": {
"@angular-slider/ngx-slider": "^2.0.3",
"@angular/animations": "~11.0.0",
"@angular/common": "~11.0.0",
"@angular/compiler": "~11.0.0",
"@angular/core": "~11.0.0",
"@angular/forms": "~11.0.0",
"@angular/localize": "~11.0.0",
"@angular/platform-browser": "~11.0.0",
"@angular/platform-browser-dynamic": "~11.0.0",
"@angular/router": "~11.0.0",
"@fortawesome/fontawesome-free": "^5.15.1",
"@ng-bootstrap/ng-bootstrap": "^9.1.0",
"@ngx-lite/nav-drawer": "^0.4.6",
"@ngx-lite/util": "0.0.0",
"@sentry/angular": "^6.4.1",
"@sentry/integrations": "^6.4.1",
"@types/file-saver": "^2.0.1",
"angular-ng-autocomplete": "^2.0.5",
"bootstrap": "^4.5.0",
"bowser": "^2.11.0",
"file-saver": "^2.0.5",
"ng-lazyload-image": "^9.1.0",
"ng-sidebar": "^9.4.2",
"ngx-toastr": "^13.2.1",
"rxjs": "~6.6.0",
"swiper": "^6.5.8",
"tslib": "^2.0.0",
"zone.js": "~0.10.2"
},
"devDependencies": {
"@angular-devkit/build-angular": "~0.1100.0",
"@angular/cli": "^11.2.11",
"@angular/compiler-cli": "~11.0.0",
"@types/jest": "^26.0.20",
"@types/node": "^12.11.1",
"codelyzer": "^6.0.0",
"jest": "^26.6.3",
"jest-preset-angular": "^8.3.2",
"karma-coverage": "~2.0.3",
"protractor": "~7.0.0",
"ts-node": "~8.3.0",
"tslint": "^6.1.3",
"typescript": "~4.0.2"
},
"jest": {
"preset": "jest-preset-angular",
"setupFilesAfterEnv": [
"<rootDir>/setupJest.ts"
],
"testPathIgnorePatterns": [
"<rootDir>/node_modules/",
"<rootDir>/dist/"
],
"globals": {
"ts-jest": {
"tsConfig": "<rootDir>/tsconfig.spec.json",
"stringifyContentPathRegex": "\\.html$"
}
}
}
}

19
UI/Web/setupJest.ts Normal file

@ -0,0 +1,19 @@
import 'jest-preset-angular';
/* global mocks for jsdom */
const mock = () => {
let storage: { [key: string]: string } = {};
return {
getItem: (key: string) => (key in storage ? storage[key] : null),
setItem: (key: string, value: string) => (storage[key] = value || ''),
removeItem: (key: string) => delete storage[key],
clear: () => (storage = {})
};
};
Object.defineProperty(window, 'localStorage', { value: mock() });
Object.defineProperty(window, 'sessionStorage', { value: mock() });
Object.defineProperty(window, 'getComputedStyle', {
value: () => ['-webkit-appearance'],
});
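The jsdom mock above behaves like the real Web Storage API for the cases tests exercise: `getItem` returns `null` for missing keys, `setItem` coerces falsy values to `''`, and `clear` resets the backing object. A framework-free sketch of the same pattern (the factory name is illustrative, not part of the project):

```typescript
// Standalone version of the storage mock used in setupJest.ts.
const createStorageMock = () => {
  let storage: { [key: string]: string } = {};
  return {
    // Missing keys return null, matching the Web Storage API contract.
    getItem: (key: string) => (key in storage ? storage[key] : null),
    // Falsy values are coerced to the empty string.
    setItem: (key: string, value: string) => (storage[key] = value || ''),
    removeItem: (key: string) => delete storage[key],
    // clear() swaps in a fresh backing object.
    clear: () => (storage = {})
  };
};
```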

View File

@ -0,0 +1,28 @@
import { Injectable } from '@angular/core';
import { CanActivate } from '@angular/router';
import { ToastrService } from 'ngx-toastr';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';
import { User } from '../_models/user';
import { AccountService } from '../_services/account.service';
@Injectable({
providedIn: 'root'
})
export class AdminGuard implements CanActivate {
constructor(private accountService: AccountService, private toastr: ToastrService) {}
canActivate(): Observable<boolean> {
// this automatically subscribes because it runs as a router guard
return this.accountService.currentUser$.pipe(
map((user: User) => {
if (this.accountService.hasAdminRole(user)) {
return true;
}
this.toastr.error('You are not authorized to view this page.');
return false;
})
);
}
}

View File

@ -0,0 +1,27 @@
import { Injectable } from '@angular/core';
import { CanActivate, Router } from '@angular/router';
import { ToastrService } from 'ngx-toastr';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';
import { User } from '../_models/user';
import { AccountService } from '../_services/account.service';
@Injectable({
providedIn: 'root'
})
export class AuthGuard implements CanActivate {
constructor(private accountService: AccountService, private router: Router, private toastr: ToastrService) {}
canActivate(): Observable<boolean> {
return this.accountService.currentUser$.pipe(
map((user: User) => {
if (user) {
return true;
}
this.toastr.error('You are not authorized to view this page.');
this.router.navigateByUrl('/home');
return false;
})
);
}
}

View File

@ -0,0 +1,17 @@
import { Injectable } from '@angular/core';
import { CanActivate, ActivatedRouteSnapshot, RouterStateSnapshot } from '@angular/router';
import { Observable } from 'rxjs';
import { MemberService } from '../_services/member.service';
@Injectable({
providedIn: 'root'
})
export class LibraryAccessGuard implements CanActivate {
constructor(private memberService: MemberService) {}
canActivate(next: ActivatedRouteSnapshot, state: RouterStateSnapshot): Observable<boolean> {
const libraryId = parseInt(state.url.split('library/')[1], 10);
return this.memberService.hasLibraryAccess(libraryId);
}
}
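The guard derives the library id purely from the route URL. A standalone sketch of that parsing step (the helper name is hypothetical):

```typescript
// Mirrors `parseInt(state.url.split('library/')[1], 10)` from the guard.
// parseInt stops at the first non-digit, so trailing path segments after
// the id are ignored; URLs without 'library/' yield NaN.
function parseLibraryId(url: string): number {
  return parseInt(url.split('library/')[1], 10);
}
```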

View File

@ -0,0 +1,121 @@
import { Injectable, OnDestroy } from '@angular/core';
import {
HttpRequest,
HttpHandler,
HttpEvent,
HttpInterceptor
} from '@angular/common/http';
import { Observable, throwError } from 'rxjs';
import { Router } from '@angular/router';
import { ToastrService } from 'ngx-toastr';
import { catchError, take } from 'rxjs/operators';
import { AccountService } from '../_services/account.service';
import { environment } from 'src/environments/environment';
@Injectable()
export class ErrorInterceptor implements HttpInterceptor {
constructor(private router: Router, private toastr: ToastrService, private accountService: AccountService) {}
intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
return next.handle(request).pipe(
catchError(error => {
if (error === undefined || error === null) {
return throwError(error);
}
if (!environment.production) {
console.error('error:', error);
}
switch (error.status) {
case 400:
this.handleValidationError(error);
break;
case 401:
this.handleAuthError(error);
break;
case 404:
this.handleNotFound(error);
break;
case 500:
this.handleServerException(error);
break;
default:
// Don't show multiple 'Something unexpected went wrong.' toasts
if (this.toastr.previousToastMessage !== 'Something unexpected went wrong.') {
this.toastr.error('Something unexpected went wrong.');
}
// If we are not on no-connection, redirect there and save the current url so that after a refresh we can redirect back
if (this.router.url !== '/no-connection') {
localStorage.setItem('kavita--no-connection-url', this.router.url);
this.router.navigateByUrl('/no-connection');
}
break;
}
return throwError(error);
})
);
}
private handleValidationError(error: any) {
// A 400 is not always a validation error; it can also be a plain bad request
if (Array.isArray(error.error)) {
const modalStateErrors: any[] = [];
if (error.error.length > 0 && error.error[0].hasOwnProperty('message')) {
error.error.forEach((issue: {status: string, details: string, message: string}) => {
modalStateErrors.push(issue.details);
});
} else {
error.error.forEach((issue: {code: string, description: string}) => {
modalStateErrors.push(issue.description);
});
}
throw modalStateErrors.flat();
} else if (error.error.errors) {
// Validation error
const modalStateErrors = [];
for (const key in error.error.errors) {
if (error.error.errors[key]) {
modalStateErrors.push(error.error.errors[key]);
}
}
throw modalStateErrors.flat();
} else {
console.error('error:', error);
if (error.statusText === 'Bad Request') {
this.toastr.error(error.error, error.status);
} else {
this.toastr.error(error.statusText === 'OK' ? error.error : error.statusText, error.status);
}
}
}
private handleNotFound(error: any) {
this.toastr.error('That URL does not exist.');
}
private handleServerException(error: any) {
console.error('500 error:', error);
const err = error.error;
if (err.hasOwnProperty('message') && err.message.trim() !== '') {
this.toastr.error(err.message);
} else {
this.toastr.error('There was an unknown critical error.');
}
}
private handleAuthError(error: any) {
// NOTE: Signin has error.error or error.statusText available.
// if statement is due to http/2 spec issue: https://github.com/angular/angular/issues/23334
this.accountService.currentUser$.pipe(take(1)).subscribe(user => {
if (user) {
this.toastr.error(error.statusText === 'OK' ? 'Unauthorized' : error.statusText, error.status);
}
this.accountService.logout();
});
}
}
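The `error.error.errors` branch flattens an ASP.NET Core ModelState dictionary (field name mapped to an array of messages) into a single list before throwing it. A pure sketch of that step, with a hypothetical function name:

```typescript
// Flattens a ModelState-style error map into one list of messages,
// skipping falsy entries, as the interceptor's validation branch does.
function flattenValidationErrors(errors: { [key: string]: string[] }): string[] {
  const modalStateErrors: string[][] = [];
  for (const key in errors) {
    if (errors[key]) {
      modalStateErrors.push(errors[key]);
    }
  }
  return modalStateErrors.flat();
}
```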

View File

@ -0,0 +1,36 @@
import { Injectable } from '@angular/core';
import {
HttpRequest,
HttpHandler,
HttpEvent,
HttpInterceptor
} from '@angular/common/http';
import { Observable } from 'rxjs';
import { AccountService } from '../_services/account.service';
import { User } from '../_models/user';
import { take } from 'rxjs/operators';
@Injectable()
export class JwtInterceptor implements HttpInterceptor {
constructor(private accountService: AccountService) {}
intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
let currentUser: User;
// take(1) completes after one emission, so we don't have to unsubscribe
this.accountService.currentUser$.pipe(take(1)).subscribe(user => {
currentUser = user;
if (currentUser) {
request = request.clone({
setHeaders: {
Authorization: `Bearer ${currentUser.token}`
}
});
}
});
return next.handle(request);
}
}
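The clone above changes only one header. A minimal sketch of the header it attaches (the helper name and token value are illustrative):

```typescript
// Builds the Authorization header the interceptor sets on each request.
function authHeader(token: string): { Authorization: string } {
  return { Authorization: `Bearer ${token}` };
}
```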

View File

@ -0,0 +1,53 @@
<div class="modal-header">
<h4 class="modal-title" id="modal-basic-title">Edit {{tag?.title}} Collection</h4>
<button type="button" class="close" aria-label="Close" (click)="close()">
<span aria-hidden="true">&times;</span>
</button>
</div>
<div class="modal-body">
<p>
This tag is currently {{tag?.promoted ? 'promoted' : 'not promoted'}} (<i class="fa fa-angle-double-up" aria-hidden="true"></i>).
Promotion means the tag is visible server-wide, not just to admin users. All series that have this tag will still have user-access restrictions placed on them.
</p>
<form [formGroup]="collectionTagForm">
<div class="form-group">
<label for="summary">Summary</label>
<textarea id="summary" class="form-control" formControlName="summary" rows="3"></textarea>
</div>
</form>
<div class="list-group" *ngIf="!isLoading">
<h6>Applies to Series</h6>
<div class="form-check">
<input id="selectall" type="checkbox" class="form-check-input"
[ngModel]="selectAll" (change)="toggleAll()" [indeterminate]="someSelected">
<label for="selectall" class="form-check-label">{{selectAll ? 'Deselect' : 'Select'}} All</label>
</div>
<ul>
<li class="list-group-item" *ngFor="let item of series; let i = index">
<div class="form-check">
<input id="series-{{i}}" type="checkbox" class="form-check-input"
[ngModel]="selections.isSelected(item)" (change)="handleSelection(item)">
<label attr.for="series-{{i}}" class="form-check-label">{{item.name}} ({{libraryName(item.libraryId)}})</label>
</div>
</li>
</ul>
</div>
<div class="d-flex justify-content-center" *ngIf="pagination && series.length !== 0">
<ngb-pagination
*ngIf="pagination.totalPages > 1"
[(page)]="pagination.currentPage"
[pageSize]="pagination.itemsPerPage"
(pageChange)="onPageChange($event)"
[rotate]="false" [ellipses]="false" [boundaryLinks]="true"
[collectionSize]="pagination.totalItems"></ngb-pagination>
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" (click)="close()">Cancel</button>
<button type="button" class="btn btn-info" (click)="togglePromotion()">{{tag?.promoted ? 'Demote' : 'Promote'}}</button>
<button type="button" class="btn btn-primary" (click)="save()">Save</button>
</div>

View File

@ -0,0 +1,121 @@
import { Component, Input, OnInit } from '@angular/core';
import { FormControl, FormGroup } from '@angular/forms';
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
import { ToastrService } from 'ngx-toastr';
import { forkJoin } from 'rxjs';
import { ConfirmService } from 'src/app/shared/confirm.service';
import { SelectionModel } from 'src/app/typeahead/typeahead.component';
import { CollectionTag } from 'src/app/_models/collection-tag';
import { Pagination } from 'src/app/_models/pagination';
import { Series } from 'src/app/_models/series';
import { CollectionTagService } from 'src/app/_services/collection-tag.service';
import { LibraryService } from 'src/app/_services/library.service';
import { SeriesService } from 'src/app/_services/series.service';
@Component({
selector: 'app-edit-collection-tags',
templateUrl: './edit-collection-tags.component.html',
styleUrls: ['./edit-collection-tags.component.scss']
})
export class EditCollectionTagsComponent implements OnInit {
@Input() tag!: CollectionTag;
series: Array<Series> = [];
selections!: SelectionModel<Series>;
isLoading: boolean = true;
pagination!: Pagination;
selectAll: boolean = true;
libraryNames!: any;
collectionTagForm!: FormGroup;
constructor(public modal: NgbActiveModal, private seriesService: SeriesService,
private collectionService: CollectionTagService, private toastr: ToastrService,
private confirmSerivce: ConfirmService, private libraryService: LibraryService) { }
ngOnInit(): void {
if (this.pagination == undefined) {
this.pagination = {totalPages: 1, totalItems: 200, itemsPerPage: 200, currentPage: 0};
}
this.collectionTagForm = new FormGroup({
summary: new FormControl(this.tag.summary, []),
});
this.loadSeries();
}
onPageChange(pageNum: number) {
this.pagination.currentPage = pageNum;
this.loadSeries();
}
toggleAll() {
this.selectAll = !this.selectAll;
this.series.forEach(s => this.selections.toggle(s, this.selectAll));
}
loadSeries() {
forkJoin([
this.seriesService.getSeriesForTag(this.tag.id, this.pagination.currentPage, this.pagination.itemsPerPage),
this.libraryService.getLibraryNames()
]).subscribe(results => {
const series = results[0];
this.pagination = series.pagination;
this.series = series.result;
this.selections = new SelectionModel<Series>(true, this.series);
this.isLoading = false;
this.libraryNames = results[1];
});
}
handleSelection(item: Series) {
this.selections.toggle(item);
const numberOfSelected = this.selections.selected().length;
if (numberOfSelected == 0) {
this.selectAll = false;
} else if (numberOfSelected == this.series.length) {
this.selectAll = true;
}
}
togglePromotion() {
const originalPromotion = this.tag.promoted;
this.tag.promoted = !this.tag.promoted;
this.collectionService.updateTag(this.tag).subscribe(res => {
this.toastr.success('Tag updated successfully');
}, err => {
this.tag.promoted = originalPromotion;
});
}
libraryName(libraryId: number) {
return this.libraryNames[libraryId];
}
close() {
this.modal.close(false);
}
async save() {
const unselectedIds = this.selections.unselected().map(s => s.id);
const tag: CollectionTag = {...this.tag};
tag.summary = this.collectionTagForm.get('summary')?.value;
if (unselectedIds.length == this.series.length && !await this.confirmSerivce.confirm('Warning! No series are selected, saving will delete the tag. Are you sure you want to continue?')) {
return;
}
this.collectionService.updateSeriesForTag(tag, this.selections.unselected().map(s => s.id)).subscribe(() => {
this.toastr.success('Tag updated');
this.modal.close(true);
});
}
get someSelected() {
const selected = this.selections.selected();
return (selected.length != this.series.length && selected.length != 0);
}
}
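The select-all checkbox above has three states: checked, unchecked, and indeterminate. A pure sketch of the state computation (a simplification of the component's `handleSelection`/`someSelected` logic, not the exact implementation):

```typescript
// selectAll is true only when every item is selected; someSelected drives
// the checkbox's [indeterminate] binding for partial selections.
function selectionState(selectedCount: number, total: number) {
  return {
    selectAll: total > 0 && selectedCount === total,
    someSelected: selectedCount !== 0 && selectedCount !== total
  };
}
```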

View File

@ -0,0 +1,169 @@
<div *ngIf="series !== undefined">
<div class="modal-header">
<h4 class="modal-title">
{{this.series.name}} Details</h4>
<button type="button" class="close" aria-label="Close" (click)="close()">
<span aria-hidden="true">&times;</span>
</button>
</div>
<div class="modal-body scrollable-modal">
<form [formGroup]="editSeriesForm">
<ul ngbNav #nav="ngbNav" [(activeId)]="active" class="nav-tabs">
<li [ngbNavItem]="tabs[0]">
<a ngbNavLink>{{tabs[0]}}</a>
<ng-template ngbNavContent>
<div class="row no-gutters">
<div class="form-group" style="width: 100%">
<label for="name">Name</label>
<input id="name" class="form-control" formControlName="name" type="text">
</div>
</div>
<div class="row no-gutters">
<div class="form-group" style="width: 100%">
<label for="sort-name">Sort Name</label>
<input id="sort-name" class="form-control" formControlName="sortName" type="text">
</div>
</div>
<div class="row no-gutters">
<div class="form-group" style="width: 100%">
<label for="localized-name">Localized Name</label>
<input id="localized-name" class="form-control" formControlName="localizedName" type="text">
</div>
</div>
<div class="row no-gutters" *ngIf="metadata">
<div class="col-md-6">
<div class="form-group">
<label for="author">Author</label>
<input id="author" class="form-control" placeholder="Not Implemented" readonly="true" formControlName="author" type="text">
</div>
</div>
<div class="col-md-6">
<div class="form-group">
<label for="artist">Artist</label>
<input id="artist" class="form-control" placeholder="Not Implemented" readonly="true" formControlName="artist" type="text">
</div>
</div>
</div>
<div class="row no-gutters" *ngIf="metadata">
<div class="col-md-6">
<div class="form-group">
<label for="genres">Genres</label>
<input id="genres" class="form-control" placeholder="Not Implemented" readonly="true" formControlName="genres" type="text">
</div>
</div>
<div class="col-md-6">
<div class="form-group">
<label for="collections">Collections</label>
<app-typeahead (selectedData)="updateCollections($event)" [settings]="settings">
<ng-template #badgeItem let-item let-position="idx">
{{item.title}}
</ng-template>
<ng-template #optionItem let-item let-position="idx">
{{item.title}}
</ng-template>
</app-typeahead>
</div>
</div>
</div>
<div class="row no-gutters">
<div class="form-group" style="width: 100%">
<label for="summary">Summary</label>
<textarea id="summary" class="form-control" formControlName="summary" rows="4"></textarea>
</div>
</div>
</ng-template>
</li>
<li [ngbNavItem]="tabs[1]">
<a ngbNavLink>{{tabs[1]}}</a>
<ng-template ngbNavContent>
<p>Not yet implemented</p>
</ng-template>
</li>
<li [ngbNavItem]="tabs[2]">
<a ngbNavLink>{{tabs[2]}}</a>
<ng-template ngbNavContent>
<p>Not yet implemented</p>
<img src="{{imageService.getSeriesCoverImage(series.id)}}">
</ng-template>
</li>
<li [ngbNavItem]="tabs[3]">
<a ngbNavLink>{{tabs[3]}}</a>
<ng-template ngbNavContent>
<h4>Information</h4>
<div class="row no-gutters mb-2">
<div class="col-md-6" *ngIf="libraryName">Library: {{libraryName | titlecase}}</div>
</div>
<h4>Volumes</h4>
<div class="spinner-border text-secondary" role="status" *ngIf="isLoadingVolumes">
<span class="invisible">Loading...</span>
</div>
<ul class="list-unstyled" *ngIf="!isLoadingVolumes">
<li class="media my-4" *ngFor="let volume of seriesVolumes">
<img class="mr-3" style="width: 74px;" src="{{imageService.getVolumeCoverImage(volume.id)}}" >
<div class="media-body">
<h5 class="mt-0 mb-1">Volume {{volume.name}}</h5>
<div>
<div class="row no-gutters">
<div class="col">
Created: {{volume.created | date: 'MM/dd/yyyy'}}
</div>
<div class="col">
Last Modified: {{volume.lastModified | date: 'MM/dd/yyyy'}}
</div>
</div>
<div class="row no-gutters">
<div class="col">
<!-- Is Special: {{volume.isSpecial}} -->
<button type="button" class="btn btn-outline-primary" (click)="collapse.toggle()" [attr.aria-expanded]="!volumeCollapsed[volume.name]">
View Files
</button>
</div>
<div class="col">
Pages: {{volume.pages}}
</div>
</div>
<div #collapse="ngbCollapse" [(ngbCollapse)]="volumeCollapsed[volume.name]">
<ul class="list-group mt-2">
<li *ngFor="let file of volume.volumeFiles.sort()" class="list-group-item">
<span>{{file.filePath}}</span>
<div class="row no-gutters">
<div class="col">
Chapter: {{file.chapter}}
</div>
<div class="col">
Pages: {{file.pages}}
</div>
<div class="col">
Format: <span class="badge badge-secondary">{{utilityService.mangaFormatToText(file.format)}}</span>
</div>
</div>
</li>
</ul>
</div>
</div>
</div>
</li>
</ul>
</ng-template>
</li>
</ul>
</form>
<div [ngbNavOutlet]="nav" class="mt-3"></div>
</div>
<div class="modal-footer">
<!-- TODO: Replace secondary buttons in modals with btn-light -->
<button type="button" class="btn btn-secondary" (click)="close()">Close</button>
<button type="submit" class="btn btn-primary" (click)="save()">Save</button>
</div>
</div>

View File

@ -0,0 +1,3 @@
.scrollable-modal {
height: 600px;
}

View File

@ -0,0 +1,148 @@
import { Component, Input, OnDestroy, OnInit } from '@angular/core';
import { FormBuilder, FormControl, FormGroup } from '@angular/forms';
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
import { forkJoin, Subject } from 'rxjs';
import { takeUntil } from 'rxjs/operators';
import { UtilityService } from 'src/app/shared/_services/utility.service';
import { TypeaheadSettings } from 'src/app/typeahead/typeahead-settings';
import { Chapter } from 'src/app/_models/chapter';
import { CollectionTag } from 'src/app/_models/collection-tag';
import { Series } from 'src/app/_models/series';
import { SeriesMetadata } from 'src/app/_models/series-metadata';
import { CollectionTagService } from 'src/app/_services/collection-tag.service';
import { ImageService } from 'src/app/_services/image.service';
import { LibraryService } from 'src/app/_services/library.service';
import { SeriesService } from 'src/app/_services/series.service';
@Component({
selector: 'app-edit-series-modal',
templateUrl: './edit-series-modal.component.html',
styleUrls: ['./edit-series-modal.component.scss']
})
export class EditSeriesModalComponent implements OnInit, OnDestroy {
@Input() series!: Series;
seriesVolumes: any[] = [];
isLoadingVolumes = false;
isCollapsed = true;
volumeCollapsed: any = {};
tabs = ['General', 'Fix Match', 'Cover Image', 'Info'];
active = this.tabs[0];
editSeriesForm!: FormGroup;
libraryName: string | undefined = undefined;
private readonly onDestroy = new Subject<void>();
settings: TypeaheadSettings<CollectionTag> = new TypeaheadSettings();
tags: CollectionTag[] = [];
metadata!: SeriesMetadata;
constructor(public modal: NgbActiveModal,
private seriesService: SeriesService,
public utilityService: UtilityService,
private fb: FormBuilder,
public imageService: ImageService,
    private libraryService: LibraryService,
    private collectionService: CollectionTagService) { }

  ngOnInit(): void {
    this.libraryService.getLibraryNames().pipe(takeUntil(this.onDestroy)).subscribe(names => {
      this.libraryName = names[this.series.libraryId];
    });

    this.setupTypeaheadSettings();

    this.editSeriesForm = this.fb.group({
      id: new FormControl(this.series.id, []),
      summary: new FormControl(this.series.summary, []),
      name: new FormControl(this.series.name, []),
      localizedName: new FormControl(this.series.localizedName, []),
      sortName: new FormControl(this.series.sortName, []),
      rating: new FormControl(this.series.userRating, []),
      genres: new FormControl('', []),
      author: new FormControl('', []),
      artist: new FormControl('', []),
      coverImageIndex: new FormControl(0, [])
    });

    this.seriesService.getMetadata(this.series.id).subscribe(metadata => {
      if (metadata) {
        this.metadata = metadata;
        this.settings.savedData = metadata.tags;
      }
    });

    this.isLoadingVolumes = true;
    this.seriesService.getVolumes(this.series.id).subscribe(volumes => {
      this.seriesVolumes = volumes;
      this.isLoadingVolumes = false;
      volumes.forEach(v => {
        this.volumeCollapsed[v.name] = true;
      });
      this.seriesVolumes.forEach(vol => {
        vol.volumeFiles = vol.chapters?.sort(this.utilityService.sortChapters).map((c: Chapter) => c.files.map((f: any) => {
          f.chapter = c.number;
          return f;
        })).flat();
      });
    });
  }

  ngOnDestroy() {
    this.onDestroy.next();
    this.onDestroy.complete();
  }

  setupTypeaheadSettings() {
    this.settings.minCharacters = 0;
    this.settings.multiple = true;
    this.settings.id = 'collections';
    this.settings.unique = true;
    this.settings.addIfNonExisting = true;
    this.settings.fetchFn = (filter: string) => this.fetchCollectionTags(filter);
    this.settings.addTransformFn = ((title: string) => {
      return {id: 0, title: title, promoted: false, coverImage: '', summary: '' };
    });
    this.settings.compareFn = (options: CollectionTag[], filter: string) => {
      const f = filter.toLowerCase();
      return options.filter(m => m.title.toLowerCase() === f);
    }
  }

  close() {
    this.modal.close({success: false, series: undefined});
  }

  fetchCollectionTags(filter: string = '') {
    return this.collectionService.search(filter);
  }

  formatChapterNumber(chapter: Chapter) {
    if (chapter.number === '0') {
      return '1';
    }
    return chapter.number;
  }

  save() {
    // TODO: In future (once locking or metadata implemented), do a conversion to updateSeriesDto
    forkJoin([
      this.seriesService.updateSeries(this.editSeriesForm.value),
      this.seriesService.updateMetadata(this.metadata, this.tags)
    ]).subscribe(results => {
      this.modal.close({success: true, series: this.editSeriesForm.value});
    });
  }

  updateCollections(tags: CollectionTag[]) {
    this.tags = tags;
  }
}
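The `compareFn` wired into the typeahead settings above matches collection titles exactly (case-insensitive) rather than by prefix, which is what lets `addIfNonExisting` offer tag creation for near-misses. A standalone sketch of that logic; the `CollectionTag` shape is repeated here so the snippet is self-contained:

```typescript
// Same shape as the CollectionTag model elsewhere in this diff.
interface CollectionTag {
  id: number;
  title: string;
  promoted: boolean;
  coverImage: string;
  summary: string;
}

// Standalone version of the component's compareFn: only titles that match
// the filter exactly (case-insensitive) survive the comparison.
function compareTags(options: CollectionTag[], filter: string): CollectionTag[] {
  const f = filter.toLowerCase();
  return options.filter(m => m.title.toLowerCase() === f);
}
```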

@@ -0,0 +1,32 @@
<div *ngIf="series !== undefined">
  <div class="modal-header">
    <h4 class="modal-title" id="modal-basic-title">{{series.name}} Review</h4>
    <button type="button" class="close" aria-label="Close" (click)="close()">
      <span aria-hidden="true">&times;</span>
    </button>
  </div>
  <div class="modal-body">
    <form [formGroup]="reviewGroup">
      <div class="form-group">
        <label for="rating">Rating</label>
        <div>
          <ngb-rating style="margin-top: 2px; font-size: 1.5rem;" formControlName="rating"></ngb-rating>
          <button class="btn btn-information ml-2" (click)="clearRating()"><i aria-hidden="true" class="fa fa-ban"></i><span class="phone-hidden">&nbsp;Clear</span></button>
        </div>
      </div>
      <div class="form-group">
        <label for="review">Review</label>
        <textarea id="review" class="form-control" formControlName="review" rows="3"></textarea>
      </div>
    </form>
  </div>
  <div class="modal-footer">
    <button type="button" class="btn btn-secondary" (click)="close()">Close</button>
    <button type="submit" class="btn btn-primary" (click)="save()">Save</button>
  </div>
</div>

@@ -0,0 +1,41 @@
import { Component, Input, OnInit } from '@angular/core';
import { FormControl, FormGroup } from '@angular/forms';
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap';
import { Series } from 'src/app/_models/series';
import { SeriesService } from 'src/app/_services/series.service';

@Component({
  selector: 'app-review-series-modal',
  templateUrl: './review-series-modal.component.html',
  styleUrls: ['./review-series-modal.component.scss']
})
export class ReviewSeriesModalComponent implements OnInit {

  @Input() series!: Series;
  reviewGroup!: FormGroup;

  constructor(public modal: NgbActiveModal, private seriesService: SeriesService) {}

  ngOnInit(): void {
    this.reviewGroup = new FormGroup({
      review: new FormControl(this.series.userReview, []),
      rating: new FormControl(this.series.userRating, [])
    });
  }

  close() {
    this.modal.close({success: false, review: null});
  }

  clearRating() {
    this.reviewGroup.get('rating')?.setValue(0);
  }

  save() {
    const model = this.reviewGroup.value;
    this.seriesService.updateRating(this.series?.id, model.rating, model.review).subscribe(() => {
      this.modal.close({success: true, review: model.review, rating: model.rating});
    });
  }
}

@@ -0,0 +1,5 @@
export interface Bookmark {
  pageNum: number;
  chapterId: number;
  bookScrollId?: string;
}

@@ -0,0 +1,14 @@
import { MangaFile } from './manga-file';

export interface Chapter {
  id: number;
  range: string;
  number: string;
  files: Array<MangaFile>;
  coverImage: string;
  pages: number;
  volumeId: number;
  pagesRead: number; // Attached for the given user when requesting from API
  isSpecial: boolean;
  title: string;
}
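Since `pagesRead` is attached per user, consumers can derive read state directly from a chapter. A hypothetical helper (not part of this diff), typed against a minimal slice of `Chapter` so the snippet is self-contained:

```typescript
// Minimal slice of Chapter needed for progress checks.
interface ChapterProgress {
  pages: number;
  pagesRead: number;
}

// Hypothetical helper, not part of the diff: a chapter counts as read
// once every page has been seen; empty chapters never count as read.
function isChapterRead(chapter: ChapterProgress): boolean {
  return chapter.pages > 0 && chapter.pagesRead >= chapter.pages;
}
```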

@@ -0,0 +1,12 @@
import { DetailsVersion } from "./details-version";

export interface ClientInfo {
  os: DetailsVersion;
  browser: DetailsVersion;
  platformType: string;
  kavitaUiVersion: string;
  screenResolution: string;
  collectedAt?: Date;
}

@@ -0,0 +1,7 @@
export interface CollectionTag {
  id: number;
  title: string;
  promoted: boolean;
  coverImage: string;
  summary: string;
}

@@ -0,0 +1,4 @@
export interface DetailsVersion {
  name: string;
  version: string;
}

@@ -0,0 +1,14 @@
// TODO: Refactor this name to something better
export interface InProgressChapter {
  id: number;
  range: string;
  number: string;
  pages: number;
  volumeId: number;
  pagesRead: number;
  seriesId: number;
  seriesName: string;
  coverImage: string;
  libraryId: number;
  libraryName: string;
}

@@ -0,0 +1,15 @@
export enum LibraryType {
  Manga = 0,
  Comic = 1,
  Book = 2,
  MangaImages = 3,
  ComicImages = 4
}

export interface Library {
  id: number;
  name: string;
  coverImage: string;
  type: LibraryType;
  folders: string[];
}
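UI code typically needs a display string for a library's type. A hypothetical helper (the label strings are illustrative, not from the diff); the enum is repeated so the snippet is self-contained:

```typescript
// The LibraryType enum from the diff, repeated for self-containment.
enum LibraryType {
  Manga = 0,
  Comic = 1,
  Book = 2,
  MangaImages = 3,
  ComicImages = 4
}

// Hypothetical display helper (not in the diff): maps a library's type
// to human-readable text for cards and headings.
function libraryTypeLabel(type: LibraryType): string {
  switch (type) {
    case LibraryType.Manga: return 'Manga';
    case LibraryType.Comic: return 'Comic';
    case LibraryType.Book: return 'Book';
    case LibraryType.MangaImages: return 'Manga (Images)';
    case LibraryType.ComicImages: return 'Comic (Images)';
    default: return 'Unknown';
  }
}
```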

@@ -0,0 +1,7 @@
import { MangaFormat } from './manga-format';

export interface MangaFile {
  filePath: string;
  pages: number;
  format: MangaFormat;
}

@@ -0,0 +1,6 @@
export enum MangaFormat {
  IMAGE = 0,
  ARCHIVE = 1,
  UNKNOWN = 2,
  BOOK = 3
}

@@ -0,0 +1,10 @@
import { Library } from './library';

export interface Member {
  username: string;
  lastActive: string; // datetime
  created: string; // datetime
  isAdmin: boolean;
  roles: string[];
  libraries: Library[];
}

@@ -0,0 +1,11 @@
export interface Pagination {
  currentPage: number;
  itemsPerPage: number;
  totalItems: number;
  totalPages: number;
}

export class PaginatedResult<T> {
  result!: T;
  pagination!: Pagination;
}
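A common way to populate `PaginatedResult` is to carry the page metadata in a JSON response header and fold it in client-side. A minimal sketch under that assumption; the `Pagination` header convention is a guess (typical of ASP.NET Core backends) and is not confirmed by this diff. The shapes are repeated so the snippet is self-contained:

```typescript
// Shapes from the diff, repeated for self-containment.
interface Pagination {
  currentPage: number;
  itemsPerPage: number;
  totalItems: number;
  totalPages: number;
}

class PaginatedResult<T> {
  result!: T;
  pagination!: Pagination;
}

// Sketch, not the project's actual code: combine a response body with the
// JSON text of a pagination header into a PaginatedResult.
function toPaginatedResult<T>(body: T, paginationHeader: string | null): PaginatedResult<T> {
  const paginated = new PaginatedResult<T>();
  paginated.result = body;
  if (paginationHeader !== null) {
    paginated.pagination = JSON.parse(paginationHeader) as Pagination;
  }
  return paginated;
}
```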

@@ -0,0 +1,10 @@
export enum PersonRole {
  Other = 0,
  Author = 1,
  Artist = 2
}

export interface Person {
  name: string;
  role: PersonRole;
}

@@ -0,0 +1,5 @@
export enum PageSplitOption {
  SplitLeftToRight = 0,
  SplitRightToLeft = 1,
  NoSplit = 2
}

@@ -0,0 +1,30 @@
import { PageSplitOption } from './page-split-option';
import { READER_MODE } from './reader-mode';
import { ReadingDirection } from './reading-direction';
import { ScalingOption } from './scaling-option';

export interface Preferences {
  // Manga Reader
  readingDirection: ReadingDirection;
  scalingOption: ScalingOption;
  pageSplitOption: PageSplitOption;
  readerMode: READER_MODE;
  autoCloseMenu: boolean;

  // Book Reader
  bookReaderDarkMode: boolean;
  bookReaderMargin: number;
  bookReaderLineSpacing: number;
  bookReaderFontSize: number;
  bookReaderFontFamily: string;
  bookReaderTapToPaginate: boolean;
  bookReaderReadingDirection: ReadingDirection;

  // Global
  siteDarkMode: boolean;
}

export const readingDirections = [
  {text: 'Left to Right', value: ReadingDirection.LeftToRight},
  {text: 'Right to Left', value: ReadingDirection.RightToLeft}
];

export const scalingOptions = [
  {text: 'Automatic', value: ScalingOption.Automatic},
  {text: 'Fit to Height', value: ScalingOption.FitToHeight},
  {text: 'Fit to Width', value: ScalingOption.FitToWidth},
  {text: 'Original', value: ScalingOption.Original}
];

export const pageSplitOptions = [
  {text: 'Right to Left', value: PageSplitOption.SplitRightToLeft},
  {text: 'Left to Right', value: PageSplitOption.SplitLeftToRight},
  {text: 'No Split', value: PageSplitOption.NoSplit}
];

export const readingModes = [
  {text: 'Left to Right', value: READER_MODE.MANGA_LR},
  {text: 'Up to Down', value: READER_MODE.MANGA_UD}
  /*, {text: 'Webtoon', value: READER_MODE.WEBTOON}*/
];
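These `{text, value}` option lists drive preference dropdowns, and the same arrays can resolve a stored value back to its label. A hypothetical lookup helper (not in the diff), with the enum and one options list repeated so the snippet is self-contained:

```typescript
// Enum and options list from the diff, repeated for self-containment.
enum ReadingDirection {
  LeftToRight = 0,
  RightToLeft = 1
}

const readingDirections = [
  {text: 'Left to Right', value: ReadingDirection.LeftToRight},
  {text: 'Right to Left', value: ReadingDirection.RightToLeft}
];

// Hypothetical helper (not in the diff): resolves a stored preference
// value back to its display label, e.g. for the reader menu.
function directionText(value: ReadingDirection): string {
  return readingDirections.find(d => d.value === value)?.text ?? '';
}
```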

@@ -0,0 +1,14 @@
export enum READER_MODE {
  /**
   * Manga default left/right to page
   */
  MANGA_LR = 0,
  /**
   * Manga up and down to page
   */
  MANGA_UD = 1,
  /**
   * Webtoon reading (scroll) with optional areas to tap
   */
  WEBTOON = 2
}

@@ -0,0 +1,4 @@
export enum ReadingDirection {
  LeftToRight = 0,
  RightToLeft = 1
}

@@ -0,0 +1,6 @@
export enum ScalingOption {
  FitToHeight = 0,
  FitToWidth = 1,
  Original = 2,
  Automatic = 3
}

@@ -0,0 +1,9 @@
export interface SearchResult {
  seriesId: number;
  libraryId: number;
  libraryName: string;
  name: string;
  originalName: string;
  sortName: string;
  coverImage: string; // base64 encoded
}

@@ -0,0 +1,10 @@
import { CollectionTag } from "./collection-tag";
import { Person } from "./person";

export interface SeriesMetadata {
  publisher: string;
  genres: Array<string>;
  tags: Array<CollectionTag>;
  persons: Array<Person>;
  seriesId: number;
}

@@ -0,0 +1,18 @@
import { Volume } from './volume';

export interface Series {
  id: number;
  name: string;
  originalName: string; // This is not shown to the user
  localizedName: string;
  sortName: string;
  summary: string;
  coverImage: string;
  volumes: Volume[];
  pages: number; // Total pages in series
  pagesRead: number; // Total pages the logged in user has read
  userRating: number; // User rating
  userReview: string; // User review
  libraryId: number;
  created: string; // DateTime when entity was created
}
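The series-level progress tracking mentioned in the PR description falls out of `pages` and `pagesRead` on this model. A hypothetical percentage helper (not part of the diff), typed against a minimal slice of `Series`:

```typescript
// Minimal slice of Series needed for progress math.
interface SeriesProgress {
  pages: number;
  pagesRead: number;
}

// Hypothetical helper (not in the diff): percentage of the series the
// logged-in user has read, clamped to 100 and safe for empty series.
function percentRead(series: SeriesProgress): number {
  if (series.pages === 0) {
    return 0;
  }
  return Math.min(100, Math.round((series.pagesRead / series.pages) * 100));
}
```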

Some files were not shown because too many files have changed in this diff.