Local Metadata Integration Part 1 (#817)

* Started with some basic plumbing for ComicInfo parsing that updates Series/Volume.

* We can now get the chapter title from ComicInfo.xml.

* Hooked in the ability to store people into the chapter metadata.

* Removed no-longer-used imports; fixed up some foreign key constraints when deleting a series with a person linked.

* Refactored Summary out of the UI for Series into SeriesMetadata. Updated the application to .NET 6. There is a bug in the metadata update code.

* Replaced Parallel.ForEach with a normal foreach, which lets us use async. For I/O-heavy code, this shouldn't change performance much.
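
  The change above can be sketched as follows; `ProcessFileAsync` and the surrounding names are hypothetical stand-ins for the scanner's actual per-file I/O work, not the real Kavita code:

  ```csharp
  using System.Collections.Generic;
  using System.Threading.Tasks;

  public static class ScanSketch
  {
      // Hypothetical stand-in for the scanner's per-file I/O work.
      private static async Task<int> ProcessFileAsync(string path)
      {
          await Task.Delay(1); // simulate I/O latency
          return path.Length;
      }

      public static async Task<int> ScanAsync(IEnumerable<string> files)
      {
          var total = 0;
          // Before: Parallel.ForEach(files, file => Process(file)); -- no await possible inside.
          // After: a plain foreach lets each iteration await the I/O call,
          // so threads aren't blocked while files are read.
          foreach (var file in files)
          {
              total += await ProcessFileAsync(file);
          }
          return total;
      }
  }
  ```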

* Refactored scan code to only check extensions that can contain ComicInfo, fixed a bug where scan events didn't use the correct method name, and removed the summary field (still buggy).

* Fixed a bug where cancelling a metadata request in the modal left the underlying button stuck in a disabled state.

* Changed how metadata selects the first volume to read summary info from. It will now select the first non-special volume rather than Volume 1.
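
  The selection rule above can be sketched as a simple LINQ query; `VolumeInfo` and its fields are simplified stand-ins for the real Volume entity:

  ```csharp
  using System.Collections.Generic;
  using System.Linq;

  // Simplified stand-in for the real Volume entity; field names are assumptions.
  public record VolumeInfo(int Number, bool IsSpecial, string Summary);

  public static class MetadataSelection
  {
      // Pick the lowest-numbered non-special volume to read summary info
      // from, rather than requiring a volume literally numbered 1.
      public static VolumeInfo FirstNonSpecial(IEnumerable<VolumeInfo> volumes)
      {
          return volumes
              .Where(v => !v.IsSpecial)
              .OrderBy(v => v.Number)
              .FirstOrDefault();
      }
  }
  ```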

* More debugging and found more bugs to fix

* Consolidated all the migrations into a single one. Fixed a bug with GetChapterInfo returning null when ChapterMetadata didn't exist for that Chapter.

* Fixed an issue with the mapper failing on GetChapterMetadata. Started work on adding people and a design for people.

* The file-modified check now takes into account whether the file has been processed at least once. Introduced a bug in saving people to a series.

* Just made code compilable again

* Fixed up code. Now people for series and chapters add correctly without any db issues.

* Things are working, but I'm not happy with how Person management works. I need to take into account that one person needs to map to an image, and role is arbitrary.

* Started adding UI code to showcase chapter metadata

* Updated workflow to be .NET 6

* WIP of updating card detail to show the information more clearly and without so many if statements

* Removed ChapterMetadata and now store the metadata on the Chapter itself. Much easier to use and fewer joins.

* Implemented Genre on SeriesMetadata level

* Genres and People are now removed from the Series level if they are no longer in ComicInfo.
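
  One plausible sketch of this removal logic, consistent with the GenreHelper tests later in this commit (the `Genre` record and helper signature here are simplifications, not the real implementation):

  ```csharp
  using System;
  using System.Collections.Generic;
  using System.Linq;

  public record Genre(string Title);

  public static class GenreHelperSketch
  {
      // Remove genres from existingGenres that no longer appear in the
      // freshly parsed list (e.g. ComicInfo no longer mentions them),
      // invoking a callback for each removal so the caller can untrack it.
      public static void KeepOnlySameGenreBetweenLists(
          IList<Genre> existingGenres, IList<Genre> parsedGenres, Action<Genre> onRemoved = null)
      {
          var keep = new HashSet<string>(
              parsedGenres.Select(g => g.Title), StringComparer.OrdinalIgnoreCase);
          foreach (var genre in existingGenres.Where(g => !keep.Contains(g.Title)).ToList())
          {
              existingGenres.Remove(genre);
              onRemoved?.Invoke(genre);
          }
      }
  }
  ```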

* PersonHelper is done with unit tests. Everything is working.
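
  The dedupe rule the helper enforces can be sketched from the PersonHelperTests in this commit: a person counts as a duplicate only when both the normalized name and the role match, so the same name may appear once per role. Types and the normalization are simplified assumptions:

  ```csharp
  using System.Collections.Generic;
  using System.Linq;

  public enum PersonRole { Writer, CoverArtist }
  public record Person(string Name, PersonRole Role);

  public static class PersonHelperSketch
  {
      // Add only when no existing person has the same role and
      // (case-insensitively) the same name.
      public static void AddPersonIfNotExists(IList<Person> people, Person person)
      {
          static string Normalize(string name) => name.Trim().ToLowerInvariant();
          if (!people.Any(p => p.Role == person.Role &&
                               Normalize(p.Name) == Normalize(person.Name)))
          {
              people.Add(person);
          }
      }
  }
  ```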

* Unit tests in place for Genre Helper

* Starting on CacheHelper

* Finished tests for ShouldUpdateCoverImage. Fixed and added tests in ArchiveService/ScannerService.

* CacheHelper is fully tested

* Some DI cleanup

* ScannerService now calls GetComicInfo for books. Added the ability to update a Series' sort name from metadata files (mainly epub, as ComicInfo doesn't have a field for it).

* Forgot to move a line of code

* SortName now populates from metadata (epub only; ComicInfo has no tag for it).

* Cards now show the chapter title on hover if it's set; otherwise they default back to the regular title.

* Fixed a major issue with how MangaFiles were being updated with LastModified, which messed up our logic for avoiding refreshes.

* Woohoo, more tests and some refactors to be able to test more services with a mock filesystem. Fixed an issue where SortName was getting set from the first chapter even though the Series was in a group.

* Refactored the MangaFile creation code into the DbFactory where we also setup the first LastModified update.

* The has-file-changed bug is now finally fixed.
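
  One plausible reading of the fixed check, inferred from the CacheHelper tests added in this commit (type and field names are simplified; the real signature lives on ICacheHelper): a file needs no re-processing only when no force update was requested and its mtime hasn't moved past either the chapter's creation or its last scan.

  ```csharp
  using System;

  public class ChapterRecord { public DateTime Created; public DateTime LastModified; }
  public class FileRecord { public DateTime LastModified; }

  public static class CacheSketch
  {
      // True => skip re-processing: the file was processed at least once
      // (chapter timestamps exist) and hasn't been written since.
      public static bool HasFileNotChangedSinceCreationOrLastScan(
          ChapterRecord chapter, bool forceUpdate, FileRecord file)
      {
          if (forceUpdate) return false;
          return file.LastModified <= chapter.LastModified
              && file.LastModified <= chapter.Created;
      }
  }
  ```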

* Removed dead genres; refactored Genre to use Title instead of Name.

* Refactored out a directory from ShouldUpdateCoverImage() to keep the code clean

* Unit tests for ComicInfo on BookService.

* Refactored series detail into its own component.

* Series-detail now receives refresh-metadata events to refresh what's on screen.

* Removed references to Artist on PersonRole as it has no metadata mapping

* Security audit

* Fixed a benchmark

* Updated JWT Token generator to use new methods in .NET 6

* Updated all the docker and build commands to use net6.0

* Commented out the Sonar scan since it's not set up for net6.0 yet.
Joseph Milazzo 2021-12-02 11:02:34 -06:00 committed by GitHub
parent 10a6a3a544
commit e7619e6b0a
140 changed files with 9315 additions and 1545 deletions


@@ -35,63 +35,63 @@ jobs:
name: csproj
path: Kavita.Common/Kavita.Common.csproj
test:
name: Install Sonar & Test
needs: build
runs-on: windows-latest
steps:
- name: Checkout Repo
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 6.0.100
- name: Install dependencies
run: dotnet restore
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 1.11
- name: Cache SonarCloud packages
uses: actions/cache@v1
with:
path: ~\sonar\cache
key: ${{ runner.os }}-sonar
restore-keys: ${{ runner.os }}-sonar
- name: Cache SonarCloud scanner
id: cache-sonar-scanner
uses: actions/cache@v1
with:
path: .\.sonar\scanner
key: ${{ runner.os }}-sonar-scanner
restore-keys: ${{ runner.os }}-sonar-scanner
- name: Install SonarCloud scanner
if: steps.cache-sonar-scanner.outputs.cache-hit != 'true'
shell: powershell
run: |
New-Item -Path .\.sonar\scanner -ItemType Directory
dotnet tool update dotnet-sonarscanner --tool-path .\.sonar\scanner
- name: Sonar Scan
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
shell: powershell
run: |
.\.sonar\scanner\dotnet-sonarscanner begin /k:"Kareadita_Kavita" /o:"kareadita" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="https://sonarcloud.io"
dotnet build --configuration Release
.\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"
- name: Test
run: dotnet test --no-restore --verbosity normal
# test:
# name: Install Sonar & Test
# needs: build
# runs-on: windows-latest
# steps:
# - name: Checkout Repo
# uses: actions/checkout@v2
# with:
# fetch-depth: 0
#
# - name: Setup .NET Core
# uses: actions/setup-dotnet@v1
# with:
# dotnet-version: 6.0.100
#
# - name: Install dependencies
# run: dotnet restore
#
# - name: Set up JDK 11
# uses: actions/setup-java@v1
# with:
# java-version: 1.11
#
# - name: Cache SonarCloud packages
# uses: actions/cache@v1
# with:
# path: ~\sonar\cache
# key: ${{ runner.os }}-sonar
# restore-keys: ${{ runner.os }}-sonar
#
# - name: Cache SonarCloud scanner
# id: cache-sonar-scanner
# uses: actions/cache@v1
# with:
# path: .\.sonar\scanner
# key: ${{ runner.os }}-sonar-scanner
# restore-keys: ${{ runner.os }}-sonar-scanner
#
# - name: Install SonarCloud scanner
# if: steps.cache-sonar-scanner.outputs.cache-hit != 'true'
# shell: powershell
# run: |
# New-Item -Path .\.sonar\scanner -ItemType Directory
# dotnet tool update dotnet-sonarscanner --tool-path .\.sonar\scanner
#
# - name: Sonar Scan
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
# SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
# shell: powershell
# run: |
# .\.sonar\scanner\dotnet-sonarscanner begin /k:"Kareadita_Kavita" /o:"kareadita" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="https://sonarcloud.io"
# dotnet build --configuration Release
# .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"
#
# - name: Test
# run: dotnet test --no-restore --verbosity normal
version:
name: Bump version on Develop push


@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net5.0</TargetFramework>
<TargetFramework>net6.0</TargetFramework>
<OutputType>Exe</OutputType>
</PropertyGroup>


@@ -1,4 +1,5 @@
using System.IO;
using System.IO.Abstractions;
using API.Entities.Enums;
using API.Interfaces.Services;
using API.Parser;
@@ -20,11 +21,13 @@ namespace API.Benchmark
private readonly ParseScannedFiles _parseScannedFiles;
private readonly ILogger<ParseScannedFiles> _logger = Substitute.For<ILogger<ParseScannedFiles>>();
private readonly ILogger<BookService> _bookLogger = Substitute.For<ILogger<BookService>>();
private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
public ParseScannedFilesBenchmarks()
{
IBookService bookService = new BookService(_bookLogger);
_parseScannedFiles = new ParseScannedFiles(bookService, _logger);
_parseScannedFiles = new ParseScannedFiles(bookService, _logger, _archiveService,
new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new FileSystem()));
}
// [Benchmark]


@@ -1,15 +1,16 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net5.0</TargetFramework>
<TargetFramework>net6.0</TargetFramework>
<IsPackable>false</IsPackable>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="5.0.10" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.11.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="6.0.0" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.0.0" />
<PackageReference Include="NSubstitute" Version="4.2.2" />
<PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="14.0.3" />
<PackageReference Include="xunit" Version="2.4.1" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.4.3">
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>


@@ -1,4 +1,5 @@
using API.Entities;
using API.Entities.Metadata;
using API.Extensions;
using API.Parser;
using Xunit;


@@ -0,0 +1,277 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using API.Entities;
using API.Helpers;
using API.Services;
using Xunit;
namespace API.Tests.Helpers;
public class CacheHelperTests
{
private const string TestCoverImageDirectory = @"c:\";
private const string TestCoverImageFile = "thumbnail.jpg";
private readonly string _testCoverPath = Path.Join(TestCoverImageDirectory, TestCoverImageFile);
private const string TestCoverArchive = @"file in folder.zip";
private readonly ICacheHelper _cacheHelper;
public CacheHelperTests()
{
var file = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), file },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), file }
});
var fileService = new FileService(fileSystem);
_cacheHelper = new CacheHelper(fileService);
}
[Theory]
[InlineData("", false)]
[InlineData("C:/", false)]
[InlineData(null, false)]
public void CoverImageExists_DoesFileExist(string coverImage, bool exists)
{
Assert.Equal(exists, _cacheHelper.CoverImageExists(coverImage));
}
[Fact]
public void CoverImageExists_FileExists()
{
Assert.True(_cacheHelper.CoverImageExists(TestCoverArchive));
}
[Fact]
public void ShouldUpdateCoverImage_OnFirstRun()
{
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.True(_cacheHelper.ShouldUpdateCoverImage(null, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, false));
}
[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetNotLocked()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, false));
}
[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetLocked()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, true));
}
[Fact]
public void ShouldUpdateCoverImage_ShouldNotUpdateOnSecondRunWithCoverImageSetLocked_Modified()
{
// Represents first run
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now
};
Assert.False(_cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, DateTime.Now.Subtract(TimeSpan.FromMinutes(1)),
false, true));
}
[Fact]
public void ShouldUpdateCoverImage_CoverImageSetAndReplaced_Modified()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var created = DateTime.Now.Subtract(TimeSpan.FromHours(1));
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = DateTime.Now.Subtract(TimeSpan.FromMinutes(1))
};
Assert.True(cacheHelper.ShouldUpdateCoverImage(_testCoverPath, file, created,
false, false));
}
[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceCreated()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.True(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}
[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceLastModified()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.True(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}
[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_NotChangedSinceLastModified_ForceUpdate()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime,
LastModified = filesystemFile.LastWriteTime.DateTime
};
var file = new MangaFile()
{
FilePath = TestCoverArchive,
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, true, file));
}
[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_ModifiedSinceLastScan()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10)),
LastModified = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10))
};
var file = new MangaFile()
{
FilePath = Path.Join(TestCoverImageDirectory, TestCoverArchive),
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}
[Fact]
public void HasFileNotChangedSinceCreationOrLastScan_ModifiedSinceLastScan_ButLastModifiedSame()
{
var filesystemFile = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ Path.Join(TestCoverImageDirectory, TestCoverArchive), filesystemFile },
{ Path.Join(TestCoverImageDirectory, TestCoverImageFile), filesystemFile }
});
var fileService = new FileService(fileSystem);
var cacheHelper = new CacheHelper(fileService);
var chapter = new Chapter()
{
Created = filesystemFile.LastWriteTime.DateTime.Subtract(TimeSpan.FromMinutes(10)),
LastModified = filesystemFile.LastWriteTime.DateTime
};
var file = new MangaFile()
{
FilePath = Path.Join(TestCoverImageDirectory, TestCoverArchive),
LastModified = filesystemFile.LastWriteTime.DateTime
};
Assert.False(cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, false, file));
}
}


@@ -1,6 +1,7 @@
using System.Collections.Generic;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
namespace API.Tests.Helpers
{


@@ -0,0 +1,110 @@
using System.Collections.Generic;
using API.Data;
using API.Entities;
using API.Helpers;
using Xunit;
namespace API.Tests.Helpers;
public class GenreHelperTests
{
[Fact]
public void UpdateGenre_ShouldAddNewGenre()
{
var allGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
var genreAdded = new List<Genre>();
GenreHelper.UpdateGenre(allGenres, new[] {"Action", "Adventure"}, false, genre =>
{
genreAdded.Add(genre);
});
Assert.Equal(2, genreAdded.Count);
Assert.Equal(4, allGenres.Count);
}
[Fact]
public void UpdateGenre_ShouldNotAddDuplicateGenre()
{
var allGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
var genreAdded = new List<Genre>();
GenreHelper.UpdateGenre(allGenres, new[] {"Action", "Scifi"}, false, genre =>
{
genreAdded.Add(genre);
});
Assert.Equal(3, allGenres.Count);
}
[Fact]
public void AddGenre_ShouldAddOnlyNonExistingGenre()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Action", false));
Assert.Equal(3, existingGenres.Count);
GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("action", false));
Assert.Equal(3, existingGenres.Count);
GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Shonen", false));
Assert.Equal(4, existingGenres.Count);
}
[Fact]
public void AddGenre_ShouldNotAddSameNameAndExternal()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("action", false),
DbFactory.Genre("Sci-fi", false),
};
GenreHelper.AddGenreIfNotExists(existingGenres, DbFactory.Genre("Action", true));
Assert.Equal(3, existingGenres.Count);
}
[Fact]
public void KeepOnlySameGenreBetweenLists()
{
var existingGenres = new List<Genre>
{
DbFactory.Genre("Action", false),
DbFactory.Genre("Sci-fi", false),
};
var genresFromChapters = new List<Genre>
{
DbFactory.Genre("Action", false),
};
var genreRemoved = new List<Genre>();
GenreHelper.KeepOnlySameGenreBetweenLists(existingGenres,
genresFromChapters, genre =>
{
genreRemoved.Add(genre);
});
Assert.Equal(1, genreRemoved.Count);
}
}


@@ -0,0 +1,140 @@
using System.Collections.Generic;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using Xunit;
namespace API.Tests.Helpers;
public class PersonHelperTests
{
[Fact]
public void UpdatePeople_ShouldAddNewPeople()
{
var allPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleAdded = new List<Person>();
PersonHelper.UpdatePeople(allPeople, new[] {"Joseph Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleAdded.Add(person);
});
Assert.Equal(2, peopleAdded.Count);
Assert.Equal(4, allPeople.Count);
}
[Fact]
public void UpdatePeople_ShouldNotAddDuplicatePeople()
{
var allPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally Ann", PersonRole.CoverArtist),
};
var peopleAdded = new List<Person>();
PersonHelper.UpdatePeople(allPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.CoverArtist, person =>
{
peopleAdded.Add(person);
});
Assert.Equal(3, allPeople.Count);
}
[Fact]
public void RemovePeople_ShouldRemovePeopleOfSameRole()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleRemoved = new List<Person>();
PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleRemoved.Add(person);
});
Assert.NotEqual(existingPeople, peopleRemoved);
Assert.Equal(1, peopleRemoved.Count);
}
[Fact]
public void RemovePeople_ShouldRemovePeopleFromBothRoles()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer)
};
var peopleRemoved = new List<Person>();
PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo", "Sally Ann"}, PersonRole.Writer, person =>
{
peopleRemoved.Add(person);
});
Assert.NotEqual(existingPeople, peopleRemoved);
Assert.Equal(1, peopleRemoved.Count);
PersonHelper.RemovePeople(existingPeople, new[] {"Joe Shmo"}, PersonRole.CoverArtist, person =>
{
peopleRemoved.Add(person);
});
Assert.Equal(0, existingPeople.Count);
Assert.Equal(2, peopleRemoved.Count);
}
[Fact]
public void KeepOnlySamePeopleBetweenLists()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally", PersonRole.Writer),
};
var peopleFromChapters = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
};
var peopleRemoved = new List<Person>();
PersonHelper.KeepOnlySamePeopleBetweenLists(existingPeople,
peopleFromChapters, person =>
{
peopleRemoved.Add(person);
});
Assert.Equal(2, peopleRemoved.Count);
}
[Fact]
public void AddPeople_ShouldAddOnlyNonExistingPeople()
{
var existingPeople = new List<Person>
{
DbFactory.Person("Joe Shmo", PersonRole.CoverArtist),
DbFactory.Person("Joe Shmo", PersonRole.Writer),
DbFactory.Person("Sally", PersonRole.Writer),
};
PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo", PersonRole.CoverArtist));
Assert.Equal(3, existingPeople.Count);
PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo", PersonRole.Writer));
Assert.Equal(3, existingPeople.Count);
PersonHelper.AddPersonIfNotExists(existingPeople, DbFactory.Person("Joe Shmo Two", PersonRole.CoverArtist));
Assert.Equal(4, existingPeople.Count);
}
}


@@ -0,0 +1,125 @@
using System.Collections.Generic;
using System.Linq;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Helpers;
using API.Services.Tasks.Scanner;
using Xunit;
namespace API.Tests.Helpers;
public class SeriesHelperTests
{
#region FindSeries
[Fact]
public void FindSeries_ShouldFind_SameFormat()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Archive;
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Archive,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
}
[Fact]
public void FindSeries_ShouldNotFind_WrongFormat()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Archive;
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black",
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
Assert.False(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Darker than Black".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Darker than Black")
}));
}
[Fact]
public void FindSeries_ShouldFind_UsingOriginalName()
{
var series = DbFactory.Series("Darker than Black");
series.OriginalName = "Something Random";
series.Format = MangaFormat.Image;
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random",
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToLower(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "Something Random".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("Something Random")
}));
Assert.True(SeriesHelper.FindSeries(series, new ParsedSeries()
{
Format = MangaFormat.Image,
Name = "SomethingRandom".ToUpper(),
NormalizedName = API.Parser.Parser.Normalize("SomethingRandom")
}));
}
#endregion
[Fact]
public void RemoveMissingSeries_Should_RemoveSeries()
{
var existingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
EntityFactory.CreateSeries("Darker than Black"),
EntityFactory.CreateSeries("Beastars"),
};
var missingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
};
existingSeries = SeriesHelper.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();
Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
Assert.Equal(missingSeries.Count, removeCount);
}
}


@@ -139,6 +139,14 @@ namespace API.Tests.Parser
Assert.Equal(expected, IsImage(filename));
}
[Theory]
[InlineData("Joe Smo", "Joe Smo")]
[InlineData("Smo, Joe", "Joe Smo")]
public void CleanAuthorTest(string author, string expected)
{
Assert.Equal(expected, CleanAuthor(author));
}
[Theory]
[InlineData("C:/", "C:/Love Hina/Love Hina - Special.cbz", "Love Hina")]
[InlineData("C:/", "C:/Love Hina/Specials/Ani-Hina Art Collection.cbz", "Love Hina")]


@@ -1,5 +1,6 @@
using System.Diagnostics;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.IO.Compression;
using API.Archive;
using API.Data.Metadata;
@@ -19,7 +20,7 @@ namespace API.Tests.Services
private readonly ArchiveService _archiveService;
private readonly ILogger<ArchiveService> _logger = Substitute.For<ILogger<ArchiveService>>();
private readonly ILogger<DirectoryService> _directoryServiceLogger = Substitute.For<ILogger<DirectoryService>>();
private readonly IDirectoryService _directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>());
private readonly IDirectoryService _directoryService = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), new MockFileSystem());
public ArchiveServiceTests(ITestOutputHelper testOutputHelper)
{
@@ -159,7 +160,7 @@ namespace API.Tests.Services
[InlineData("sorting.zip", "sorting.expected.jpg")]
public void GetCoverImage_Default_Test(string inputFile, string expectedOutputFile)
{
var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger));
var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger, new MockFileSystem()));
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages");
var expectedBytes = File.ReadAllBytes(Path.Join(testDirectory, expectedOutputFile));
archiveService.Configure().CanOpen(Path.Join(testDirectory, inputFile)).Returns(ArchiveLibrary.Default);
@@ -191,7 +192,7 @@ namespace API.Tests.Services
[InlineData("sorting.zip", "sorting.expected.jpg")]
public void GetCoverImage_SharpCompress_Test(string inputFile, string expectedOutputFile)
{
var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger));
var archiveService = Substitute.For<ArchiveService>(_logger, new DirectoryService(_directoryServiceLogger, new MockFileSystem()));
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/CoverImages");
var expectedBytes = File.ReadAllBytes(Path.Join(testDirectory, expectedOutputFile));
@@ -215,10 +216,23 @@ namespace API.Tests.Services
public void ShouldHaveComicInfo()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/ComicInfos");
var archive = Path.Join(testDirectory, "file in folder.zip");
var summaryInfo = "By all counts, Ryouta Sakamoto is a loser when he's not holed up in his room, bombing things into oblivion in his favorite online action RPG. But his very own uneventful life is blown to pieces when he's abducted and taken to an uninhabited island, where he soon learns the hard way that he's being pitted against others just like him in a explosives-riddled death match! How could this be happening? Who's putting them up to this? And why!? The name, not to mention the objective, of this very real survival game is eerily familiar to Ryouta, who has mastered its virtual counterpart-BTOOOM! Can Ryouta still come out on top when he's playing for his life!?";
var archive = Path.Join(testDirectory, "ComicInfo.zip");
const string summaryInfo = "By all counts, Ryouta Sakamoto is a loser when he's not holed up in his room, bombing things into oblivion in his favorite online action RPG. But his very own uneventful life is blown to pieces when he's abducted and taken to an uninhabited island, where he soon learns the hard way that he's being pitted against others just like him in a explosives-riddled death match! How could this be happening? Who's putting them up to this? And why!? The name, not to mention the objective, of this very real survival game is eerily familiar to Ryouta, who has mastered its virtual counterpart-BTOOOM! Can Ryouta still come out on top when he's playing for his life!?";
Assert.Equal(summaryInfo, _archiveService.GetComicInfo(archive).Summary);
var comicInfo = _archiveService.GetComicInfo(archive);
Assert.NotNull(comicInfo);
Assert.Equal(summaryInfo, comicInfo.Summary);
}
[Fact]
public void ShouldHaveComicInfo_WithAuthors()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/ComicInfos");
var archive = Path.Join(testDirectory, "ComicInfo_authors.zip");
var comicInfo = _archiveService.GetComicInfo(archive);
Assert.NotNull(comicInfo);
Assert.Equal("Junya Inoue", comicInfo.Writer);
}
[Fact]


@@ -27,5 +27,29 @@ namespace API.Tests.Services
Assert.Equal(expectedPages, _bookService.GetNumberOfPages(Path.Join(testDirectory, filePath)));
}
[Fact]
public void ShouldHaveComicInfo()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/BookService/EPUB");
var archive = Path.Join(testDirectory, "The Golden Harpoon; Or, Lost Among the Floes A Story of the Whaling Grounds.epub");
const string summaryInfo = "Book Description";
var comicInfo = _bookService.GetComicInfo(archive);
Assert.NotNull(comicInfo);
Assert.Equal(summaryInfo, comicInfo.Summary);
Assert.Equal("genre1, genre2", comicInfo.Genre);
}
[Fact]
public void ShouldHaveComicInfo_WithAuthors()
{
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/BookService/EPUB");
var archive = Path.Join(testDirectory, "The Golden Harpoon; Or, Lost Among the Floes A Story of the Whaling Grounds.epub");
var comicInfo = _bookService.GetComicInfo(archive);
Assert.NotNull(comicInfo);
Assert.Equal("Roger Starbuck,Junya Inoue", comicInfo.Writer);
}
}
}


@@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using API.Services;
using Microsoft.Extensions.Logging;
@@ -17,7 +18,7 @@ namespace API.Tests.Services
public DirectoryServiceTests()
{
_directoryService = new DirectoryService(_logger);
_directoryService = new DirectoryService(_logger, new MockFileSystem());
}
[Fact]
@@ -26,7 +27,7 @@ namespace API.Tests.Services
var testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ScannerService/Manga");
// ReSharper disable once CollectionNeverQueried.Local
var files = new List<string>();
var fileCount = DirectoryService.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
var fileCount = _directoryService.TraverseTreeParallelForEach(testDirectory, s => files.Add(s),
API.Parser.Parser.ArchiveFileExtensions, _logger);
Assert.Equal(28, fileCount);


@@ -0,0 +1,44 @@
using System;
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;
using API.Services;
using Xunit;
namespace API.Tests.Services;
public class FileSystemTests
{
[Fact]
public void FileHasNotBeenModifiedSinceCreation()
{
var file = new MockFileData("Testing is meh.")
{
LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ @"c:\myfile.txt", file }
});
var fileService = new FileService(fileSystem);
Assert.False(fileService.HasFileBeenModifiedSince(@"c:\myfile.txt", DateTime.Now));
}
[Fact]
public void FileHasBeenModifiedSinceCreation()
{
var file = new MockFileData("Testing is meh.")
{
LastWriteTime = DateTimeOffset.Now
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ @"c:\myfile.txt", file }
});
var fileService = new FileService(fileSystem);
Assert.True(fileService.HasFileBeenModifiedSince(@"c:\myfile.txt", DateTime.Now.Subtract(TimeSpan.FromMinutes(1))));
}
}
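
The two tests above exercise `FileService.HasFileBeenModifiedSince` against a `MockFileSystem`. A minimal sketch of such a service, assuming it takes an `IFileSystem` from System.IO.Abstractions and simply compares last-write times (the actual Kavita implementation may differ):

```csharp
using System;
using System.IO.Abstractions;

public class FileService
{
    private readonly IFileSystem _fileSystem;

    public FileService(IFileSystem fileSystem)
    {
        _fileSystem = fileSystem;
    }

    /// <summary>
    /// True when the file's last write time is strictly newer than the given timestamp.
    /// </summary>
    public bool HasFileBeenModifiedSince(string filePath, DateTime time)
    {
        return _fileSystem.File.GetLastWriteTime(filePath) > time;
    }
}
```

Because the constructor depends only on `IFileSystem`, the tests can swap in `new MockFileSystem(...)` without touching the real disk.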


@@ -1,6 +1,9 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using API.Entities;
using API.Helpers;
using API.Services;
using Xunit;
@@ -10,6 +13,7 @@ namespace API.Tests.Services
{
private readonly string _testDirectory = Path.Join(Directory.GetCurrentDirectory(), "../../../Services/Test Data/ArchiveService/Archives");
private const string TestCoverImageFile = "thumbnail.jpg";
private const string TestCoverArchive = @"c:\file in folder.zip";
private readonly string _testCoverImageDirectory = Path.Join(Directory.GetCurrentDirectory(), @"../../../Services/Test Data/ArchiveService/CoverImages");
//private readonly MetadataService _metadataService;
// private readonly IUnitOfWork _unitOfWork = Substitute.For<IUnitOfWork>();
@@ -18,116 +22,23 @@ namespace API.Tests.Services
// private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
// private readonly ILogger<MetadataService> _logger = Substitute.For<ILogger<MetadataService>>();
// private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
private readonly ICacheHelper _cacheHelper;
public MetadataServiceTests()
{
//_metadataService = new MetadataService(_unitOfWork, _logger, _archiveService, _bookService, _imageService, _messageHub);
}
[Fact]
public void ShouldUpdateCoverImage_OnFirstRun()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
var file = new MockFileData("")
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = DateTime.Now
}, false, false));
}
[Fact]
public void ShouldUpdateCoverImage_OnFirstRunSeries()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null,null, false, false));
}
[Fact]
public void ShouldUpdateCoverImage_OnFirstRun_FileModified()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime.Subtract(TimeSpan.FromDays(1))
}, false, false));
}
{ TestCoverArchive, file }
});
[Fact]
public void ShouldUpdateCoverImage_OnFirstRun_CoverImageLocked()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime
}, false, true));
}
[Fact]
public void ShouldUpdateCoverImage_OnSecondRun_ForceUpdate()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime
}, true, false));
}
[Fact]
public void ShouldUpdateCoverImage_OnSecondRun_NoFileChangeButNoCoverImage()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime
}, false, false));
}
[Fact]
public void ShouldUpdateCoverImage_OnSecondRun_FileChangeButNoCoverImage()
{
// Represents first run
Assert.True(MetadataService.ShouldUpdateCoverImage(null, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime + TimeSpan.FromDays(1)
}, false, false));
}
[Fact]
public void ShouldNotUpdateCoverImage_OnSecondRun_CoverImageSet()
{
// Represents first run
Assert.False(MetadataService.ShouldUpdateCoverImage(TestCoverImageFile, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = new FileInfo(Path.Join(_testDirectory, "file in folder.zip")).LastWriteTime
}, false, false, _testCoverImageDirectory));
}
[Fact]
public void ShouldNotUpdateCoverImage_OnSecondRun_HasCoverImage_NoForceUpdate_NoLock()
{
Assert.False(MetadataService.ShouldUpdateCoverImage(TestCoverImageFile, new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = DateTime.Now
}, false, false, _testCoverImageDirectory));
}
[Fact]
public void ShouldUpdateCoverImage_OnSecondRun_HasCoverImage_NoForceUpdate_HasLock_CoverImageDoesntExist()
{
Assert.True(MetadataService.ShouldUpdateCoverImage(@"doesn't_exist.jpg", new MangaFile()
{
FilePath = Path.Join(_testDirectory, "file in folder.zip"),
LastModified = DateTime.Now
}, false, true, _testCoverImageDirectory));
var fileService = new FileService(fileSystem);
_cacheHelper = new CacheHelper(fileService);
}
}
}


@@ -3,11 +3,14 @@ using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Data.Common;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Helpers;
using API.Interfaces;
using API.Interfaces.Services;
using API.Parser;
@@ -34,8 +37,9 @@ namespace API.Tests.Services
private readonly IArchiveService _archiveService = Substitute.For<IArchiveService>();
private readonly IBookService _bookService = Substitute.For<IBookService>();
private readonly IImageService _imageService = Substitute.For<IImageService>();
private readonly IDirectoryService _directoryService = Substitute.For<IDirectoryService>();
private readonly ILogger<MetadataService> _metadataLogger = Substitute.For<ILogger<MetadataService>>();
private readonly ICacheService _cacheService = Substitute.For<ICacheService>();
private readonly ICacheService _cacheService;
private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();
private readonly DbConnection _connection;
@@ -54,9 +58,26 @@ namespace API.Tests.Services
IUnitOfWork unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
var file = new MockFileData("")
{
LastWriteTime = DateTimeOffset.Now.Subtract(TimeSpan.FromMinutes(1))
};
var fileSystem = new MockFileSystem(new Dictionary<string, MockFileData>
{
{ "/data/Darker than Black.zip", file },
{ "/data/Cage of Eden - v10.cbz", file },
{ "/data/Cage of Eden - v1.cbz", file },
});
IMetadataService metadataService = Substitute.For<MetadataService>(unitOfWork, _metadataLogger, _archiveService, _bookService, _imageService, _messageHub);
_scannerService = new ScannerService(unitOfWork, _logger, _archiveService, metadataService, _bookService, _cacheService, _messageHub);
var fileService = new FileService(fileSystem);
ICacheHelper cacheHelper = new CacheHelper(fileService);
IMetadataService metadataService =
Substitute.For<MetadataService>(unitOfWork, _metadataLogger, _archiveService,
_bookService, _imageService, _messageHub, cacheHelper);
_scannerService = new ScannerService(unitOfWork, _logger, _archiveService, metadataService, _bookService,
_cacheService, _messageHub, fileService, _directoryService);
}
private async Task<bool> SeedDb()
@@ -78,6 +99,13 @@ namespace API.Tests.Services
return await _context.SaveChangesAsync() > 0;
}
[Fact]
public void AddOrUpdateFileForChapter()
{
// TODO: This can be tested, it has _filesystem mocked
}
[Fact]
public void FindSeriesNotOnDisk_Should_RemoveNothing_Test()
{
@@ -138,24 +166,24 @@ namespace API.Tests.Services
// Assert.Equal(expected, actualName);
// }
[Fact]
public void RemoveMissingSeries_Should_RemoveSeries()
{
var existingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
EntityFactory.CreateSeries("Darker than Black"),
EntityFactory.CreateSeries("Beastars"),
};
var missingSeries = new List<Series>()
{
EntityFactory.CreateSeries("Darker than Black Vol 1"),
};
existingSeries = ScannerService.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();
Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
Assert.Equal(missingSeries.Count, removeCount);
}
// [Fact]
// public void RemoveMissingSeries_Should_RemoveSeries()
// {
// var existingSeries = new List<Series>()
// {
// EntityFactory.CreateSeries("Darker than Black Vol 1"),
// EntityFactory.CreateSeries("Darker than Black"),
// EntityFactory.CreateSeries("Beastars"),
// };
// var missingSeries = new List<Series>()
// {
// EntityFactory.CreateSeries("Darker than Black Vol 1"),
// };
// existingSeries = ScannerService.RemoveMissingSeries(existingSeries, missingSeries, out var removeCount).ToList();
//
// Assert.DoesNotContain(missingSeries[0].Name, existingSeries.Select(s => s.Name));
// Assert.Equal(missingSeries.Count, removeCount);
// }
private void AddToParsedInfo(IDictionary<ParsedSeries, List<ParserInfo>> collectedSeries, ParserInfo info)
{


@@ -0,0 +1,81 @@
<?xml version='1.0' encoding='UTF-8'?>
<package xmlns:opf="http://www.idpf.org/2007/opf" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="id">
<metadata>
<dc:rights>Public domain in the USA.</dc:rights>
<dc:identifier opf:scheme="URI" id="id">http://www.gutenberg.org/64999</dc:identifier>
<dc:creator opf:file-as="Starbuck, Roger">Roger Starbuck</dc:creator>
<dc:title>The Golden Harpoon / Lost Among the Floes</dc:title>
<dc:language xsi:type="dcterms:RFC4646">en</dc:language>
<dc:date opf:event="publication">2021-04-05</dc:date>
<dc:date opf:event="conversion">2021-04-05T23:00:07.039989+00:00</dc:date>
<dc:source>https://www.gutenberg.org/files/64999/64999-h/64999-h.htm</dc:source>
<dc:description>Book Description</dc:description>
<dc:subject>Genre1, Genre2</dc:subject>
<dc:creator id="creator">Junya Inoue</dc:creator>
<meta refines="#creator" property="file-as">Inoue, Junya</meta>
<meta refines="#creator" property="role" scheme="marc:relators">aut</meta>
<meta name="cover" content="item1"/>
</metadata>
<manifest>
<!--Image: 1000 x 1572 size=145584 -->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@images@cover.jpg" id="item1" media-type="image/jpeg"/>
<item href="pgepub.css" id="item2" media-type="text/css"/>
<item href="0.css" id="item3" media-type="text/css"/>
<item href="1.css" id="item4" media-type="text/css"/>
<!--Chunk: size=3477 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-0.htm.html" id="cover" media-type="application/xhtml+xml"/>
<!--Chunk: size=3242 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-1.htm.html" id="item5" media-type="application/xhtml+xml"/>
<!--Chunk: size=27802 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-2.htm.html" id="item6" media-type="application/xhtml+xml"/>
<!--Chunk: size=13387 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-3.htm.html" id="item7" media-type="application/xhtml+xml"/>
<!--Chunk: size=25774 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-4.htm.html" id="item8" media-type="application/xhtml+xml"/>
<!--Chunk: size=17053 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-5.htm.html" id="item9" media-type="application/xhtml+xml"/>
<!--Chunk: size=19590 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-6.htm.html" id="item10" media-type="application/xhtml+xml"/>
<!--Chunk: size=16645 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-7.htm.html" id="item11" media-type="application/xhtml+xml"/>
<!--Chunk: size=19768 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-8.htm.html" id="item12" media-type="application/xhtml+xml"/>
<!--Chunk: size=30397 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-9.htm.html" id="item13" media-type="application/xhtml+xml"/>
<!--Chunk: size=41064 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-10.htm.html" id="item14" media-type="application/xhtml+xml"/>
<!--Chunk: size=35745 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-11.htm.html" id="item15" media-type="application/xhtml+xml"/>
<!--Chunk: size=3277 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-12.htm.html" id="item16" media-type="application/xhtml+xml"/>
<!--Chunk: size=5983 Split on div.chapter-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-13.htm.html" id="item17" media-type="application/xhtml+xml"/>
<!--Chunk: size=22066-->
<item href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-14.htm.html" id="item18" media-type="application/xhtml+xml"/>
<item href="toc.ncx" id="ncx" media-type="application/x-dtbncx+xml"/>
<item href="wrap0000.html" id="coverpage-wrapper" media-type="application/xhtml+xml"/>
</manifest>
<spine toc="ncx">
<itemref idref="coverpage-wrapper" linear="yes"/>
<itemref idref="cover" linear="yes"/>
<itemref idref="item5" linear="yes"/>
<itemref idref="item6" linear="yes"/>
<itemref idref="item7" linear="yes"/>
<itemref idref="item8" linear="yes"/>
<itemref idref="item9" linear="yes"/>
<itemref idref="item10" linear="yes"/>
<itemref idref="item11" linear="yes"/>
<itemref idref="item12" linear="yes"/>
<itemref idref="item13" linear="yes"/>
<itemref idref="item14" linear="yes"/>
<itemref idref="item15" linear="yes"/>
<itemref idref="item16" linear="yes"/>
<itemref idref="item17" linear="yes"/>
<itemref idref="item18" linear="yes"/>
</spine>
<guide>
<reference type="toc" title="CONTENTS" href="@public@vhost@g@gutenberg@html@files@64999@64999-h@64999-h-1.htm.html#pgepubid00001"/>
<reference type="cover" title="Cover" href="wrap0000.html"/>
</guide>
</package>


@@ -2,7 +2,7 @@
<PropertyGroup>
<AnalysisMode>Default</AnalysisMode>
<TargetFramework>net5.0</TargetFramework>
<TargetFramework>net6.0</TargetFramework>
<EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
@@ -41,34 +41,35 @@
<PackageReference Include="ExCSS" Version="4.1.0" />
<PackageReference Include="Flurl" Version="3.0.2" />
<PackageReference Include="Flurl.Http" Version="3.2.0" />
<PackageReference Include="Hangfire" Version="1.7.25" />
<PackageReference Include="Hangfire.AspNetCore" Version="1.7.25" />
<PackageReference Include="Hangfire" Version="1.7.27" />
<PackageReference Include="Hangfire.AspNetCore" Version="1.7.27" />
<PackageReference Include="Hangfire.MaximumConcurrentExecutions" Version="1.1.0" />
<PackageReference Include="Hangfire.MemoryStorage.Core" Version="1.4.0" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.37" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.38" />
<PackageReference Include="MarkdownDeep.NET.Core" Version="1.5.0.4" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.OpenIdConnect" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="5.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="6.0.0" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.OpenIdConnect" Version="6.0.0" />
<PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="6.0.0" />
<PackageReference Include="Microsoft.AspNetCore.SignalR" Version="1.1.0" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="5.0.10">
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="6.0.0">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="5.0.10" />
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="5.0.2" />
<PackageReference Include="Microsoft.IO.RecyclableMemoryStream" Version="2.1.3" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="6.0.0" />
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="6.0.0" />
<PackageReference Include="Microsoft.IO.RecyclableMemoryStream" Version="2.2.0" />
<PackageReference Include="NetVips" Version="2.0.1" />
<PackageReference Include="NetVips.Native" Version="8.11.4" />
<PackageReference Include="NReco.Logging.File" Version="1.1.2" />
<PackageReference Include="SharpCompress" Version="0.30.0" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.29.0.36737">
<PackageReference Include="SharpCompress" Version="0.30.1" />
<PackageReference Include="SonarAnalyzer.CSharp" Version="8.32.0.39516">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.2.2" />
<PackageReference Include="System.Drawing.Common" Version="5.0.2" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="6.12.2" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.2.3" />
<PackageReference Include="System.Drawing.Common" Version="6.0.0" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="6.14.1" />
<PackageReference Include="System.IO.Abstractions" Version="14.0.3" />
<PackageReference Include="VersOne.Epub" Version="3.0.3.1" />
</ItemGroup>


@@ -8,7 +8,7 @@ namespace API.Comparators
public class ChapterSortComparer : IComparer<double>
{
/// <summary>
/// Normal sort for 2 doubles. 0 always comes before anything else
/// Normal sort for 2 doubles. 0 always comes last
/// </summary>
/// <param name="x"></param>
/// <param name="y"></param>

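The doc-comment change above flips the special case: chapter number 0 now sorts after everything else rather than first. A hypothetical sketch of a `Compare` consistent with the updated comment (the real method body is not shown in this hunk):

```csharp
using System.Collections.Generic;

public class ChapterSortComparer : IComparer<double>
{
    // Normal sort for two doubles, except 0 always comes last.
    public int Compare(double x, double y)
    {
        if (x == 0.0 && y == 0.0) return 0;
        if (x == 0.0) return 1;   // 0 sorts after any non-zero chapter
        if (y == 0.0) return -1;
        return x.CompareTo(y);
    }
}
```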

@@ -3,9 +3,9 @@ using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Entities.Metadata;
using API.Extensions;
using API.Interfaces;
using Microsoft.AspNetCore.Authorization;


@@ -17,7 +17,6 @@ using API.Interfaces.Services;
using API.Services;
using Kavita.Common;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
namespace API.Controllers
{
@@ -168,15 +167,8 @@ namespace API.Controllers
var user = await _unitOfWork.UserRepository.GetUserByIdAsync(userId);
var isAdmin = await _unitOfWork.UserRepository.IsUserAdmin(user);
IList<CollectionTagDto> tags;
if (isAdmin)
{
tags = (await _unitOfWork.CollectionTagRepository.GetAllTagDtosAsync()).ToList();
}
else
{
tags = (await _unitOfWork.CollectionTagRepository.GetAllPromotedTagDtosAsync()).ToList();
}
IList<CollectionTagDto> tags = isAdmin ? (await _unitOfWork.CollectionTagRepository.GetAllTagDtosAsync()).ToList()
: (await _unitOfWork.CollectionTagRepository.GetAllPromotedTagDtosAsync()).ToList();
var feed = CreateFeed("All Collections", $"{apiKey}/collections", apiKey);
@@ -653,7 +645,7 @@ namespace API.Controllers
DirectoryService.GetHumanReadableBytes(DirectoryService.GetTotalSize(new List<string>()
{mangaFile.FilePath}));
var fileType = _downloadService.GetContentTypeFromFile(mangaFile.FilePath);
var filename = Uri.EscapeUriString(Path.GetFileName(mangaFile.FilePath) ?? string.Empty);
var filename = Uri.EscapeDataString(Path.GetFileName(mangaFile.FilePath) ?? string.Empty);
return new FeedEntry()
{
Id = mangaFile.Id.ToString(),

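The `Uri.EscapeUriString` call above was swapped for `Uri.EscapeDataString`: `EscapeUriString` is obsolete as of .NET 6 and leaves URI-reserved characters such as `&` and `+` unescaped, whereas `EscapeDataString` percent-encodes everything that is not unreserved, which is what a filename embedded in a URL component needs:

```csharp
using System;

class EscapeDemo
{
    static void Main()
    {
        // Reserved characters in the filename get percent-encoded.
        Console.WriteLine(Uri.EscapeDataString("Vol 1 & 2.cbz"));
        // Vol%201%20%26%202.cbz
    }
}
```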

@@ -75,6 +75,7 @@ namespace API.Controllers
if (chapter == null) return BadRequest("Could not find Chapter");
var dto = await _unitOfWork.ChapterRepository.GetChapterInfoDtoAsync(chapterId);
if (dto == null) return BadRequest("Please perform a scan on this series or library and try again");
var mangaFile = (await _unitOfWork.ChapterRepository.GetFilesForChapterAsync(chapterId)).First();
return Ok(new ChapterInfoDto()
@@ -89,6 +90,7 @@ namespace API.Controllers
LibraryId = dto.LibraryId,
IsSpecial = dto.IsSpecial,
Pages = dto.Pages,
ChapterTitle = dto.ChapterTitle
});
}


@@ -1,5 +1,4 @@
using System;
using System.Collections.Generic;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;


@@ -6,6 +6,7 @@ using API.Data;
using API.Data.Repositories;
using API.DTOs;
using API.DTOs.Filtering;
using API.DTOs.Metadata;
using API.Entities;
using API.Extensions;
using API.Helpers;
@@ -187,7 +188,7 @@ namespace API.Controllers
series.Name = updateSeries.Name.Trim();
series.LocalizedName = updateSeries.LocalizedName.Trim();
series.SortName = updateSeries.SortName?.Trim();
series.Summary = updateSeries.Summary?.Trim();
series.Metadata.Summary = updateSeries.Summary?.Trim();
var needsRefreshMetadata = false;
// This is when you hit Reset
@@ -294,6 +295,7 @@ namespace API.Controllers
else
{
series.Metadata.CollectionTags ??= new List<CollectionTag>();
// TODO: Move this merging logic into a reusable code as it can be used for any Tag
var newTags = new List<CollectionTag>();
// I want a union of these 2 lists. Return only elements that are in both lists, but the list types are different
@@ -391,7 +393,5 @@ namespace API.Controllers
var userId = await _unitOfWork.UserRepository.GetUserIdByUsernameAsync(User.GetUsername());
return Ok(await _unitOfWork.SeriesRepository.GetSeriesDtoForIdsAsync(dto.SeriesIds, userId));
}
}
}


@@ -6,7 +6,6 @@ using API.DTOs.Stats;
using API.DTOs.Update;
using API.Extensions;
using API.Interfaces.Services;
using API.Services.Tasks;
using Kavita.Common;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;


@@ -50,5 +50,17 @@ namespace API.DTOs
/// When chapter was created
/// </summary>
public DateTime Created { get; init; }
/// <summary>
/// Title of the Chapter/Issue
/// </summary>
public string TitleName { get; set; }
public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Penciller { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Inker { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Colorist { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Letterer { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> CoverArtist { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Editor { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Publisher { get; set; } = new List<PersonDto>();
}
}


@@ -0,0 +1,19 @@
using System.Collections.Generic;
namespace API.DTOs.Metadata
{
public class ChapterMetadataDto
{
public int Id { get; set; }
public string Title { get; set; }
public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Penciller { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Inker { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Colorist { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Letterer { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> CoverArtist { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Editor { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Publisher { get; set; } = new List<PersonDto>();
public int ChapterId { get; set; }
}
}


@@ -0,0 +1,9 @@
namespace API.DTOs.Metadata
{
public class GenreTagDto
{
public int Id { get; set; }
public string Title { get; set; }
}
}


@@ -14,5 +14,6 @@ namespace API.DTOs.Reader
public int LibraryId { get; set; }
public int Pages { get; set; }
public bool IsSpecial { get; set; }
public string ChapterTitle { get; set; }
}
}


@@ -13,6 +13,7 @@ namespace API.DTOs.Reader
public int LibraryId { get; set; }
public int Pages { get; set; }
public bool IsSpecial { get; set; }
public string ChapterTitle { get; set; }
}
}


@@ -1,16 +1,25 @@
using System.Collections.Generic;
using API.DTOs.CollectionTags;
using API.Entities;
using API.DTOs.Metadata;
namespace API.DTOs
{
public class SeriesMetadataDto
{
public int Id { get; set; }
public ICollection<string> Genres { get; set; }
public string Summary { get; set; }
public ICollection<CollectionTagDto> Tags { get; set; }
public ICollection<Person> Persons { get; set; }
public string Publisher { get; set; }
public ICollection<GenreTagDto> Genres { get; set; }
public ICollection<PersonDto> Writers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Artists { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Publishers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Characters { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Pencillers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Inkers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Colorists { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Letterers { get; set; } = new List<PersonDto>();
public ICollection<PersonDto> Editors { get; set; } = new List<PersonDto>();
public int SeriesId { get; set; }
}
}
}


@@ -1,6 +1,4 @@
using System;
namespace API.DTOs.Update
namespace API.DTOs.Update
{
/// <summary>
/// Update Notification denoting a new release available for user to update to


@@ -4,6 +4,7 @@ using System.Threading;
using System.Threading.Tasks;
using API.Entities;
using API.Entities.Interfaces;
using API.Entities.Metadata;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;
@@ -23,7 +24,6 @@ namespace API.Data
public DbSet<Library> Library { get; set; }
public DbSet<Series> Series { get; set; }
public DbSet<Chapter> Chapter { get; set; }
public DbSet<Volume> Volume { get; set; }
public DbSet<AppUser> AppUser { get; set; }
@@ -37,6 +37,8 @@ namespace API.Data
public DbSet<AppUserBookmark> AppUserBookmark { get; set; }
public DbSet<ReadingList> ReadingList { get; set; }
public DbSet<ReadingListItem> ReadingListItem { get; set; }
public DbSet<Person> Person { get; set; }
public DbSet<Genre> Genre { get; set; }
protected override void OnModelCreating(ModelBuilder builder)


@@ -1,7 +1,11 @@
using System;
using System.Collections.Generic;
using System.IO;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Extensions;
using API.Parser;
using API.Services.Tasks;
@@ -21,12 +25,16 @@ namespace API.Data
LocalizedName = name,
NormalizedName = Parser.Parser.Normalize(name),
SortName = name,
Summary = string.Empty,
Volumes = new List<Volume>(),
Metadata = SeriesMetadata(Array.Empty<CollectionTag>())
};
}
public static SeriesMetadata SeriesMetadata(ComicInfo info)
{
return SeriesMetadata(Array.Empty<CollectionTag>());
}
public static Volume Volume(string volumeNumber)
{
return new Volume()
@@ -57,7 +65,8 @@ namespace API.Data
{
return new SeriesMetadata()
{
CollectionTags = collectionTags
CollectionTags = collectionTags,
Summary = string.Empty
};
}
@@ -72,5 +81,37 @@ namespace API.Data
Promoted = promoted
};
}
public static Genre Genre(string name, bool external)
{
return new Genre()
{
Title = name.Trim().SentenceCase(),
NormalizedTitle = Parser.Parser.Normalize(name),
ExternalTag = external
};
}
public static Person Person(string name, PersonRole role)
{
return new Person()
{
Name = name.Trim(),
NormalizedName = Parser.Parser.Normalize(name),
Role = role
};
}
public static MangaFile MangaFile(string filePath, MangaFormat format, int pages)
{
return new MangaFile()
{
FilePath = filePath,
Format = format,
Pages = pages,
LastModified = DateTime.Now //File.GetLastWriteTime(filePath)
};
}
}
}
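
The new `Genre`, `Person`, and `MangaFile` factory methods above centralize normalization (trimming, `Parser.Normalize`, sentence-casing). A quick usage sketch, assuming `API.Data` and `API.Entities.Enums` are in scope:

```csharp
// Hypothetical usage of the factory methods shown above.
var genre = EntityFactory.Genre("shounen", external: false);
var writer = EntityFactory.Person(" Junya Inoue ", PersonRole.Writer);
// Person() trims the name and stores a normalized form for matching,
// so writer.Name is "Junya Inoue".
```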


@@ -34,11 +34,19 @@
public string AlternativeSeries { get; set; }
public string AlternativeNumber { get; set; }
/// <summary>
/// This is Epub only: calibre:title_sort
/// Represents the sort order for the title
/// </summary>
public string TitleSort { get; set; }
/// <summary>
/// This is the Author. For Books, we map creator tag in OPF to this field. Comma separated if multiple.
/// </summary>
public string Writer { get; set; } // TODO: Validate if we should make this a list of writers
public string Writer { get; set; }
public string Penciller { get; set; }
public string Inker { get; set; }
public string Colorist { get; set; }

File diff suppressed because it is too large


@@ -0,0 +1,203 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class MetadataFoundation : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "Summary",
table: "Series");
migrationBuilder.AddColumn<string>(
name: "Summary",
table: "SeriesMetadata",
type: "TEXT",
nullable: true);
migrationBuilder.CreateTable(
name: "ChapterMetadata",
columns: table => new
{
Id = table.Column<int>(type: "INTEGER", nullable: false)
.Annotation("Sqlite:Autoincrement", true),
Title = table.Column<string>(type: "TEXT", nullable: true),
Year = table.Column<string>(type: "TEXT", nullable: true),
StoryArc = table.Column<string>(type: "TEXT", nullable: true),
ChapterId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_ChapterMetadata", x => x.Id);
table.ForeignKey(
name: "FK_ChapterMetadata_Chapter_ChapterId",
column: x => x.ChapterId,
principalTable: "Chapter",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateTable(
name: "Genre",
columns: table => new
{
Id = table.Column<int>(type: "INTEGER", nullable: false)
.Annotation("Sqlite:Autoincrement", true),
Name = table.Column<string>(type: "TEXT", nullable: true),
NormalizedName = table.Column<string>(type: "TEXT", nullable: true)
},
constraints: table =>
{
table.PrimaryKey("PK_Genre", x => x.Id);
});
migrationBuilder.CreateTable(
name: "Person",
columns: table => new
{
Id = table.Column<int>(type: "INTEGER", nullable: false)
.Annotation("Sqlite:Autoincrement", true),
Name = table.Column<string>(type: "TEXT", nullable: true),
NormalizedName = table.Column<string>(type: "TEXT", nullable: true),
Role = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_Person", x => x.Id);
});
migrationBuilder.CreateTable(
name: "GenreSeriesMetadata",
columns: table => new
{
GenresId = table.Column<int>(type: "INTEGER", nullable: false),
SeriesMetadatasId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_GenreSeriesMetadata", x => new { x.GenresId, x.SeriesMetadatasId });
table.ForeignKey(
name: "FK_GenreSeriesMetadata_Genre_GenresId",
column: x => x.GenresId,
principalTable: "Genre",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_GenreSeriesMetadata_SeriesMetadata_SeriesMetadatasId",
column: x => x.SeriesMetadatasId,
principalTable: "SeriesMetadata",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateTable(
name: "ChapterMetadataPerson",
columns: table => new
{
ChapterMetadatasId = table.Column<int>(type: "INTEGER", nullable: false),
PeopleId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_ChapterMetadataPerson", x => new { x.ChapterMetadatasId, x.PeopleId });
table.ForeignKey(
name: "FK_ChapterMetadataPerson_ChapterMetadata_ChapterMetadatasId",
column: x => x.ChapterMetadatasId,
principalTable: "ChapterMetadata",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_ChapterMetadataPerson_Person_PeopleId",
column: x => x.PeopleId,
principalTable: "Person",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateTable(
name: "PersonSeriesMetadata",
columns: table => new
{
PeopleId = table.Column<int>(type: "INTEGER", nullable: false),
SeriesMetadatasId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_PersonSeriesMetadata", x => new { x.PeopleId, x.SeriesMetadatasId });
table.ForeignKey(
name: "FK_PersonSeriesMetadata_Person_PeopleId",
column: x => x.PeopleId,
principalTable: "Person",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_PersonSeriesMetadata_SeriesMetadata_SeriesMetadatasId",
column: x => x.SeriesMetadatasId,
principalTable: "SeriesMetadata",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata",
column: "ChapterId",
unique: true);
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadataPerson_PeopleId",
table: "ChapterMetadataPerson",
column: "PeopleId");
migrationBuilder.CreateIndex(
name: "IX_Genre_NormalizedName",
table: "Genre",
column: "NormalizedName",
unique: true);
migrationBuilder.CreateIndex(
name: "IX_GenreSeriesMetadata_SeriesMetadatasId",
table: "GenreSeriesMetadata",
column: "SeriesMetadatasId");
migrationBuilder.CreateIndex(
name: "IX_PersonSeriesMetadata_SeriesMetadatasId",
table: "PersonSeriesMetadata",
column: "SeriesMetadatasId");
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "ChapterMetadataPerson");
migrationBuilder.DropTable(
name: "GenreSeriesMetadata");
migrationBuilder.DropTable(
name: "PersonSeriesMetadata");
migrationBuilder.DropTable(
name: "ChapterMetadata");
migrationBuilder.DropTable(
name: "Genre");
migrationBuilder.DropTable(
name: "Person");
migrationBuilder.DropColumn(
name: "Summary",
table: "SeriesMetadata");
migrationBuilder.AddColumn<string>(
name: "Summary",
table: "Series",
type: "TEXT",
nullable: true);
}
}
}

File diff suppressed because it is too large


@@ -0,0 +1,138 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class RemoveChapterMetadata : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable(
name: "ChapterMetadataPerson");
migrationBuilder.DropIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata");
migrationBuilder.AddColumn<int>(
name: "ChapterMetadataId",
table: "Person",
type: "INTEGER",
nullable: true);
migrationBuilder.AddColumn<string>(
name: "TitleName",
table: "Chapter",
type: "TEXT",
nullable: true);
migrationBuilder.CreateTable(
name: "ChapterPerson",
columns: table => new
{
ChapterMetadatasId = table.Column<int>(type: "INTEGER", nullable: false),
PeopleId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_ChapterPerson", x => new { x.ChapterMetadatasId, x.PeopleId });
table.ForeignKey(
name: "FK_ChapterPerson_Chapter_ChapterMetadatasId",
column: x => x.ChapterMetadatasId,
principalTable: "Chapter",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_ChapterPerson_Person_PeopleId",
column: x => x.PeopleId,
principalTable: "Person",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_Person_ChapterMetadataId",
table: "Person",
column: "ChapterMetadataId");
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata",
column: "ChapterId");
migrationBuilder.CreateIndex(
name: "IX_ChapterPerson_PeopleId",
table: "ChapterPerson",
column: "PeopleId");
migrationBuilder.AddForeignKey(
name: "FK_Person_ChapterMetadata_ChapterMetadataId",
table: "Person",
column: "ChapterMetadataId",
principalTable: "ChapterMetadata",
principalColumn: "Id");
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropForeignKey(
name: "FK_Person_ChapterMetadata_ChapterMetadataId",
table: "Person");
migrationBuilder.DropTable(
name: "ChapterPerson");
migrationBuilder.DropIndex(
name: "IX_Person_ChapterMetadataId",
table: "Person");
migrationBuilder.DropIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata");
migrationBuilder.DropColumn(
name: "ChapterMetadataId",
table: "Person");
migrationBuilder.DropColumn(
name: "TitleName",
table: "Chapter");
migrationBuilder.CreateTable(
name: "ChapterMetadataPerson",
columns: table => new
{
ChapterMetadatasId = table.Column<int>(type: "INTEGER", nullable: false),
PeopleId = table.Column<int>(type: "INTEGER", nullable: false)
},
constraints: table =>
{
table.PrimaryKey("PK_ChapterMetadataPerson", x => new { x.ChapterMetadatasId, x.PeopleId });
table.ForeignKey(
name: "FK_ChapterMetadataPerson_ChapterMetadata_ChapterMetadatasId",
column: x => x.ChapterMetadatasId,
principalTable: "ChapterMetadata",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
table.ForeignKey(
name: "FK_ChapterMetadataPerson_Person_PeopleId",
column: x => x.PeopleId,
principalTable: "Person",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata",
column: "ChapterId",
unique: true);
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadataPerson_PeopleId",
table: "ChapterMetadataPerson",
column: "PeopleId");
}
}
}

File diff suppressed because it is too large


@@ -0,0 +1,86 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class GenreProvider : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropForeignKey(
name: "FK_Person_ChapterMetadata_ChapterMetadataId",
table: "Person");
migrationBuilder.DropTable(
name: "ChapterMetadata");
migrationBuilder.DropIndex(
name: "IX_Person_ChapterMetadataId",
table: "Person");
migrationBuilder.DropColumn(
name: "ChapterMetadataId",
table: "Person");
migrationBuilder.AddColumn<bool>(
name: "ExternalTag",
table: "Genre",
type: "INTEGER",
nullable: false,
defaultValue: false);
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropColumn(
name: "ExternalTag",
table: "Genre");
migrationBuilder.AddColumn<int>(
name: "ChapterMetadataId",
table: "Person",
type: "INTEGER",
nullable: true);
migrationBuilder.CreateTable(
name: "ChapterMetadata",
columns: table => new
{
Id = table.Column<int>(type: "INTEGER", nullable: false)
.Annotation("Sqlite:Autoincrement", true),
ChapterId = table.Column<int>(type: "INTEGER", nullable: false),
StoryArc = table.Column<string>(type: "TEXT", nullable: true),
Title = table.Column<string>(type: "TEXT", nullable: true),
Year = table.Column<string>(type: "TEXT", nullable: true)
},
constraints: table =>
{
table.PrimaryKey("PK_ChapterMetadata", x => x.Id);
table.ForeignKey(
name: "FK_ChapterMetadata_Chapter_ChapterId",
column: x => x.ChapterId,
principalTable: "Chapter",
principalColumn: "Id",
onDelete: ReferentialAction.Cascade);
});
migrationBuilder.CreateIndex(
name: "IX_Person_ChapterMetadataId",
table: "Person",
column: "ChapterMetadataId");
migrationBuilder.CreateIndex(
name: "IX_ChapterMetadata_ChapterId",
table: "ChapterMetadata",
column: "ChapterId");
migrationBuilder.AddForeignKey(
name: "FK_Person_ChapterMetadata_ChapterMetadataId",
table: "Person",
column: "ChapterMetadataId",
principalTable: "ChapterMetadata",
principalColumn: "Id");
}
}
}

File diff suppressed because it is too large


@@ -0,0 +1,85 @@
using Microsoft.EntityFrameworkCore.Migrations;
#nullable disable
namespace API.Data.Migrations
{
public partial class GenreTitle : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropIndex(
name: "IX_Genre_NormalizedName",
table: "Genre");
migrationBuilder.RenameColumn(
name: "NormalizedName",
table: "Genre",
newName: "Title");
migrationBuilder.RenameColumn(
name: "Name",
table: "Genre",
newName: "NormalizedTitle");
migrationBuilder.AddColumn<int>(
name: "GenreId",
table: "Chapter",
type: "INTEGER",
nullable: true);
migrationBuilder.CreateIndex(
name: "IX_Genre_NormalizedTitle_ExternalTag",
table: "Genre",
columns: new[] { "NormalizedTitle", "ExternalTag" },
unique: true);
migrationBuilder.CreateIndex(
name: "IX_Chapter_GenreId",
table: "Chapter",
column: "GenreId");
migrationBuilder.AddForeignKey(
name: "FK_Chapter_Genre_GenreId",
table: "Chapter",
column: "GenreId",
principalTable: "Genre",
principalColumn: "Id");
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropForeignKey(
name: "FK_Chapter_Genre_GenreId",
table: "Chapter");
migrationBuilder.DropIndex(
name: "IX_Genre_NormalizedTitle_ExternalTag",
table: "Genre");
migrationBuilder.DropIndex(
name: "IX_Chapter_GenreId",
table: "Chapter");
migrationBuilder.DropColumn(
name: "GenreId",
table: "Chapter");
migrationBuilder.RenameColumn(
name: "Title",
table: "Genre",
newName: "NormalizedName");
migrationBuilder.RenameColumn(
name: "NormalizedTitle",
table: "Genre",
newName: "Name");
migrationBuilder.CreateIndex(
name: "IX_Genre_NormalizedName",
table: "Genre",
column: "NormalizedName",
unique: true);
}
}
}


@@ -5,6 +5,8 @@ using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
#nullable disable
namespace API.Data.Migrations
{
[DbContext(typeof(DataContext))]
@@ -13,8 +15,7 @@ namespace API.Data.Migrations
protected override void BuildModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("ProductVersion", "5.0.8");
modelBuilder.HasAnnotation("ProductVersion", "6.0.0");
modelBuilder.Entity("API.Entities.AppRole", b =>
{
@@ -40,7 +41,7 @@ namespace API.Data.Migrations
.IsUnique()
.HasDatabaseName("RoleNameIndex");
b.ToTable("AspNetRoles");
b.ToTable("AspNetRoles", (string)null);
});
modelBuilder.Entity("API.Entities.AppUser", b =>
@@ -118,7 +119,7 @@ namespace API.Data.Migrations
.IsUnique()
.HasDatabaseName("UserNameIndex");
b.ToTable("AspNetUsers");
b.ToTable("AspNetUsers", (string)null);
});
modelBuilder.Entity("API.Entities.AppUserBookmark", b =>
@@ -279,7 +280,7 @@ namespace API.Data.Migrations
b.HasIndex("RoleId");
b.ToTable("AspNetUserRoles");
b.ToTable("AspNetUserRoles", (string)null);
});
modelBuilder.Entity("API.Entities.Chapter", b =>
@@ -297,6 +298,9 @@ namespace API.Data.Migrations
b.Property<DateTime>("Created")
.HasColumnType("TEXT");
b.Property<int?>("GenreId")
.HasColumnType("INTEGER");
b.Property<bool>("IsSpecial")
.HasColumnType("INTEGER");
@@ -315,11 +319,16 @@ namespace API.Data.Migrations
b.Property<string>("Title")
.HasColumnType("TEXT");
b.Property<string>("TitleName")
.HasColumnType("TEXT");
b.Property<int>("VolumeId")
.HasColumnType("INTEGER");
b.HasKey("Id");
b.HasIndex("GenreId");
b.HasIndex("VolumeId");
b.ToTable("Chapter");
@@ -382,6 +391,29 @@ namespace API.Data.Migrations
b.ToTable("FolderPath");
});
modelBuilder.Entity("API.Entities.Genre", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("INTEGER");
b.Property<bool>("ExternalTag")
.HasColumnType("INTEGER");
b.Property<string>("NormalizedTitle")
.HasColumnType("TEXT");
b.Property<string>("Title")
.HasColumnType("TEXT");
b.HasKey("Id");
b.HasIndex("NormalizedTitle", "ExternalTag")
.IsUnique();
b.ToTable("Genre");
});
modelBuilder.Entity("API.Entities.Library", b =>
{
b.Property<int>("Id")
@@ -439,6 +471,53 @@ namespace API.Data.Migrations
b.ToTable("MangaFile");
});
modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("INTEGER");
b.Property<uint>("RowVersion")
.IsConcurrencyToken()
.HasColumnType("INTEGER");
b.Property<int>("SeriesId")
.HasColumnType("INTEGER");
b.Property<string>("Summary")
.HasColumnType("TEXT");
b.HasKey("Id");
b.HasIndex("SeriesId")
.IsUnique();
b.HasIndex("Id", "SeriesId")
.IsUnique();
b.ToTable("SeriesMetadata");
});
modelBuilder.Entity("API.Entities.Person", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("INTEGER");
b.Property<string>("Name")
.HasColumnType("TEXT");
b.Property<string>("NormalizedName")
.HasColumnType("TEXT");
b.Property<int>("Role")
.HasColumnType("INTEGER");
b.HasKey("Id");
b.ToTable("Person");
});
modelBuilder.Entity("API.Entities.ReadingList", b =>
{
b.Property<int>("Id")
@@ -546,9 +625,6 @@ namespace API.Data.Migrations
b.Property<string>("SortName")
.HasColumnType("TEXT");
b.Property<string>("Summary")
.HasColumnType("TEXT");
b.HasKey("Id");
b.HasIndex("LibraryId");
@@ -559,30 +635,6 @@ namespace API.Data.Migrations
b.ToTable("Series");
});
modelBuilder.Entity("API.Entities.SeriesMetadata", b =>
{
b.Property<int>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("INTEGER");
b.Property<uint>("RowVersion")
.IsConcurrencyToken()
.HasColumnType("INTEGER");
b.Property<int>("SeriesId")
.HasColumnType("INTEGER");
b.HasKey("Id");
b.HasIndex("SeriesId")
.IsUnique();
b.HasIndex("Id", "SeriesId")
.IsUnique();
b.ToTable("SeriesMetadata");
});
modelBuilder.Entity("API.Entities.ServerSetting", b =>
{
b.Property<int>("Key")
@@ -649,6 +701,21 @@ namespace API.Data.Migrations
b.ToTable("AppUserLibrary");
});
modelBuilder.Entity("ChapterPerson", b =>
{
b.Property<int>("ChapterMetadatasId")
.HasColumnType("INTEGER");
b.Property<int>("PeopleId")
.HasColumnType("INTEGER");
b.HasKey("ChapterMetadatasId", "PeopleId");
b.HasIndex("PeopleId");
b.ToTable("ChapterPerson");
});
modelBuilder.Entity("CollectionTagSeriesMetadata", b =>
{
b.Property<int>("CollectionTagsId")
@@ -664,6 +731,21 @@ namespace API.Data.Migrations
b.ToTable("CollectionTagSeriesMetadata");
});
modelBuilder.Entity("GenreSeriesMetadata", b =>
{
b.Property<int>("GenresId")
.HasColumnType("INTEGER");
b.Property<int>("SeriesMetadatasId")
.HasColumnType("INTEGER");
b.HasKey("GenresId", "SeriesMetadatasId");
b.HasIndex("SeriesMetadatasId");
b.ToTable("GenreSeriesMetadata");
});
modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityRoleClaim<int>", b =>
{
b.Property<int>("Id")
@@ -683,7 +765,7 @@ namespace API.Data.Migrations
b.HasIndex("RoleId");
b.ToTable("AspNetRoleClaims");
b.ToTable("AspNetRoleClaims", (string)null);
});
modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserClaim<int>", b =>
@@ -705,7 +787,7 @@ namespace API.Data.Migrations
b.HasIndex("UserId");
b.ToTable("AspNetUserClaims");
b.ToTable("AspNetUserClaims", (string)null);
});
modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserLogin<int>", b =>
@@ -726,7 +808,7 @@ namespace API.Data.Migrations
b.HasIndex("UserId");
b.ToTable("AspNetUserLogins");
b.ToTable("AspNetUserLogins", (string)null);
});
modelBuilder.Entity("Microsoft.AspNetCore.Identity.IdentityUserToken<int>", b =>
@@ -745,7 +827,22 @@ namespace API.Data.Migrations
b.HasKey("UserId", "LoginProvider", "Name");
b.ToTable("AspNetUserTokens");
b.ToTable("AspNetUserTokens", (string)null);
});
modelBuilder.Entity("PersonSeriesMetadata", b =>
{
b.Property<int>("PeopleId")
.HasColumnType("INTEGER");
b.Property<int>("SeriesMetadatasId")
.HasColumnType("INTEGER");
b.HasKey("PeopleId", "SeriesMetadatasId");
b.HasIndex("SeriesMetadatasId");
b.ToTable("PersonSeriesMetadata");
});
modelBuilder.Entity("API.Entities.AppUserBookmark", b =>
@@ -813,6 +910,10 @@ namespace API.Data.Migrations
modelBuilder.Entity("API.Entities.Chapter", b =>
{
b.HasOne("API.Entities.Genre", null)
.WithMany("Chapters")
.HasForeignKey("GenreId");
b.HasOne("API.Entities.Volume", "Volume")
.WithMany("Chapters")
.HasForeignKey("VolumeId")
@@ -844,6 +945,17 @@ namespace API.Data.Migrations
b.Navigation("Chapter");
});
modelBuilder.Entity("API.Entities.Metadata.SeriesMetadata", b =>
{
b.HasOne("API.Entities.Series", "Series")
.WithOne("Metadata")
.HasForeignKey("API.Entities.Metadata.SeriesMetadata", "SeriesId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Series");
});
modelBuilder.Entity("API.Entities.ReadingList", b =>
{
b.HasOne("API.Entities.AppUser", "AppUser")
@@ -901,17 +1013,6 @@ namespace API.Data.Migrations
b.Navigation("Library");
});
modelBuilder.Entity("API.Entities.SeriesMetadata", b =>
{
b.HasOne("API.Entities.Series", "Series")
.WithOne("Metadata")
.HasForeignKey("API.Entities.SeriesMetadata", "SeriesId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.Navigation("Series");
});
modelBuilder.Entity("API.Entities.Volume", b =>
{
b.HasOne("API.Entities.Series", "Series")
@@ -938,6 +1039,21 @@ namespace API.Data.Migrations
.IsRequired();
});
modelBuilder.Entity("ChapterPerson", b =>
{
b.HasOne("API.Entities.Chapter", null)
.WithMany()
.HasForeignKey("ChapterMetadatasId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("API.Entities.Person", null)
.WithMany()
.HasForeignKey("PeopleId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("CollectionTagSeriesMetadata", b =>
{
b.HasOne("API.Entities.CollectionTag", null)
@@ -946,7 +1062,22 @@ namespace API.Data.Migrations
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("API.Entities.SeriesMetadata", null)
b.HasOne("API.Entities.Metadata.SeriesMetadata", null)
.WithMany()
.HasForeignKey("SeriesMetadatasId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("GenreSeriesMetadata", b =>
{
b.HasOne("API.Entities.Genre", null)
.WithMany()
.HasForeignKey("GenresId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("API.Entities.Metadata.SeriesMetadata", null)
.WithMany()
.HasForeignKey("SeriesMetadatasId")
.OnDelete(DeleteBehavior.Cascade)
@@ -989,6 +1120,21 @@ namespace API.Data.Migrations
.IsRequired();
});
modelBuilder.Entity("PersonSeriesMetadata", b =>
{
b.HasOne("API.Entities.Person", null)
.WithMany()
.HasForeignKey("PeopleId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("API.Entities.Metadata.SeriesMetadata", null)
.WithMany()
.HasForeignKey("SeriesMetadatasId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("API.Entities.AppRole", b =>
{
b.Navigation("UserRoles");
@@ -1014,6 +1160,11 @@ namespace API.Data.Migrations
b.Navigation("Files");
});
modelBuilder.Entity("API.Entities.Genre", b =>
{
b.Navigation("Chapters");
});
modelBuilder.Entity("API.Entities.Library", b =>
{
b.Navigation("Folders");


@@ -1,4 +1,5 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.DTOs;
@@ -41,7 +42,7 @@ namespace API.Data.Repositories
/// <returns></returns>
public async Task<IChapterInfoDto> GetChapterInfoDtoAsync(int chapterId)
{
return await _context.Chapter
var chapterInfo = await _context.Chapter
.Where(c => c.Id == chapterId)
.Join(_context.Volume, c => c.VolumeId, v => v.Id, (chapter, volume) => new
{
@@ -49,8 +50,9 @@ namespace API.Data.Repositories
VolumeNumber = volume.Number,
VolumeId = volume.Id,
chapter.IsSpecial,
chapter.TitleName,
volume.SeriesId,
chapter.Pages
chapter.Pages,
})
.Join(_context.Series, data => data.SeriesId, series => series.Id, (data, series) => new
{
@@ -60,11 +62,12 @@ namespace API.Data.Repositories
data.IsSpecial,
data.SeriesId,
data.Pages,
data.TitleName,
SeriesFormat = series.Format,
SeriesName = series.Name,
series.LibraryId
})
.Select(data => new BookInfoDto()
.Select(data => new ChapterInfoDto()
{
ChapterNumber = data.ChapterNumber,
VolumeNumber = data.VolumeNumber + string.Empty,
@@ -74,10 +77,13 @@ namespace API.Data.Repositories
SeriesFormat = data.SeriesFormat,
SeriesName = data.SeriesName,
LibraryId = data.LibraryId,
Pages = data.Pages
Pages = data.Pages,
ChapterTitle = data.TitleName
})
.AsNoTracking()
.SingleAsync();
.SingleOrDefaultAsync();
return chapterInfo;
}
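
Because the query now ends in SingleOrDefaultAsync, an unknown chapterId yields null rather than an InvalidOperationException, so callers must check for it. An illustrative controller-side guard (the endpoint shape is an assumption, not code from this PR):

```csharp
// Illustrative only: the null check is what the SingleOrDefaultAsync change
// makes necessary (and possible) on the calling side.
var info = await _unitOfWork.ChapterRepository.GetChapterInfoDtoAsync(chapterId);
if (info == null) return NotFound("Chapter does not exist");
return Ok(info);
```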
public Task<int> GetChapterTotalPagesAsync(int chapterId)


@@ -1,7 +1,6 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;
using API.Interfaces.Repositories;


@@ -1,35 +0,0 @@
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Interfaces.Repositories;
using Microsoft.EntityFrameworkCore;
namespace API.Data.Repositories
{
public class FileRepository : IFileRepository
{
private readonly DataContext _dbContext;
public FileRepository(DataContext context)
{
_dbContext = context;
}
public async Task<IEnumerable<string>> GetFileExtensions()
{
var fileExtensions = await _dbContext.MangaFile
.AsNoTracking()
.Select(x => x.FilePath.ToLower())
.Distinct()
.ToArrayAsync();
var uniqueFileTypes = fileExtensions
.Select(Path.GetExtension)
.Where(x => x is not null)
.Distinct();
return uniqueFileTypes;
}
}
}


@@ -0,0 +1,56 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Entities;
using API.Interfaces.Repositories;
using AutoMapper;
using Microsoft.EntityFrameworkCore;
namespace API.Data.Repositories;
public class GenreRepository : IGenreRepository
{
private readonly DataContext _context;
private readonly IMapper _mapper;
public GenreRepository(DataContext context, IMapper mapper)
{
_context = context;
_mapper = mapper;
}
public void Attach(Genre genre)
{
_context.Genre.Attach(genre);
}
public void Remove(Genre genre)
{
_context.Genre.Remove(genre);
}
public async Task<Genre> FindByNameAsync(string genreName)
{
var normalizedName = Parser.Parser.Normalize(genreName);
return await _context.Genre
.FirstOrDefaultAsync(g => g.NormalizedTitle.Equals(normalizedName));
}
public async Task RemoveAllGenreNoLongerAssociated(bool removeExternal = false)
{
var genresWithNoConnections = await _context.Genre
.Include(p => p.SeriesMetadatas)
.Include(p => p.Chapters)
.Where(p => p.SeriesMetadatas.Count == 0 && p.Chapters.Count == 0 && p.ExternalTag == removeExternal)
.ToListAsync();
_context.Genre.RemoveRange(genresWithNoConnections);
await _context.SaveChangesAsync();
}
public async Task<IList<Genre>> GetAllGenres()
{
return await _context.Genre.ToListAsync();
}
}
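
FindByNameAsync compares only normalized titles, so lookups are tolerant of casing and punctuation. A sketch of the find-or-create pattern a scanner might use (the exact behavior of Parser.Normalize — assumed here to lower-case and strip non-alphanumerics — is not shown in this diff):

```csharp
// Sketch: reuse an existing Genre row when one matches after normalization,
// otherwise create it via the factory. Under the assumed Normalize behavior,
// "Sci-Fi", "sci fi" and "SciFi" all resolve to the same key.
var genre = await _unitOfWork.GenreRepository.FindByNameAsync("Sci-Fi")
            ?? DbFactory.Genre("Sci-Fi", external: false);
```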


@@ -0,0 +1,60 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using API.Entities;
using API.Interfaces.Repositories;
using AutoMapper;
using Microsoft.EntityFrameworkCore;
namespace API.Data.Repositories
{
public class PersonRepository : IPersonRepository
{
private readonly DataContext _context;
private readonly IMapper _mapper;
public PersonRepository(DataContext context, IMapper mapper)
{
_context = context;
_mapper = mapper;
}
public void Attach(Person person)
{
_context.Person.Attach(person);
}
public void Remove(Person person)
{
_context.Person.Remove(person);
}
public async Task<Person> FindByNameAsync(string name)
{
var normalizedName = Parser.Parser.Normalize(name);
return await _context.Person
.Where(p => normalizedName.Equals(p.NormalizedName))
.SingleOrDefaultAsync();
}
public async Task RemoveAllPeopleNoLongerAssociated(bool removeExternal = false)
{
var peopleWithNoConnections = await _context.Person
.Include(p => p.SeriesMetadatas)
.Include(p => p.ChapterMetadatas)
.Where(p => p.SeriesMetadatas.Count == 0 && p.ChapterMetadatas.Count == 0)
.ToListAsync();
_context.Person.RemoveRange(peopleWithNoConnections);
await _context.SaveChangesAsync();
}
public async Task<IList<Person>> GetAllPeople()
{
return await _context.Person
.ToListAsync();
}
}
}


@@ -1,4 +1,5 @@
using API.Entities;
using API.Entities.Metadata;
using API.Interfaces.Repositories;
namespace API.Data.Repositories


@@ -8,6 +8,7 @@ using API.DTOs.CollectionTags;
using API.DTOs.Filtering;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Extensions;
using API.Helpers;
using API.Interfaces.Repositories;
@@ -85,6 +86,12 @@ namespace API.Data.Repositories
var query = _context.Series
.Where(s => s.LibraryId == libraryId)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(cm => cm.People)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
@@ -104,9 +111,15 @@ namespace API.Data.Repositories
return await _context.Series
.Where(s => s.Id == seriesId)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Library)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(cm => cm.People)
.Include(s => s.Volumes)
.ThenInclude(v => v.Chapters)
.ThenInclude(c => c.Files)
.AsSplitQuery()
.SingleOrDefaultAsync();
@@ -180,6 +193,10 @@ namespace API.Data.Repositories
.Include(s => s.Volumes)
.Include(s => s.Metadata)
.ThenInclude(m => m.CollectionTags)
.Include(s => s.Metadata)
.ThenInclude(m => m.Genres)
.Include(s => s.Metadata)
.ThenInclude(m => m.People)
.Where(s => s.Id == seriesId)
.SingleOrDefaultAsync();
}
@@ -374,6 +391,7 @@ namespace API.Data.Repositories
{
var metadataDto = await _context.SeriesMetadata
.Where(metadata => metadata.SeriesId == seriesId)
.Include(m => m.Genres)
.AsNoTracking()
.ProjectTo<SeriesMetadataDto>(_mapper.ConfigurationProvider)
.SingleOrDefaultAsync();
@@ -481,17 +499,7 @@ namespace API.Data.Repositories
/// <returns></returns>
private async Task<Tuple<int, int>> GetChunkSize(int libraryId = 0)
{
// TODO: Think about making this bigger depending on number of files a user has in said library
// and number of cores and amount of memory. We can then make an optimal choice
var totalSeries = await GetSeriesCount(libraryId);
// var procCount = Math.Max(Environment.ProcessorCount - 1, 1);
//
// if (totalSeries < procCount * 2 || totalSeries < 50)
// {
// return new Tuple<int, int>(totalSeries, totalSeries);
// }
//
// return new Tuple<int, int>(totalSeries, Math.Max(totalSeries / procCount, 50));
return new Tuple<int, int>(totalSeries, 50);
}


@@ -157,6 +157,7 @@ namespace API.Data.Repositories
var volumes = await _context.Volume
.Where(vol => vol.SeriesId == seriesId)
.Include(vol => vol.Chapters)
.ThenInclude(c => c.People) // TODO: Measure cost of this
.OrderBy(volume => volume.Number)
.ProjectTo<VolumeDto>(_mapper.ConfigurationProvider)
.AsNoTracking()


@@ -31,10 +31,11 @@ namespace API.Data
public IAppUserProgressRepository AppUserProgressRepository => new AppUserProgressRepository(_context);
public ICollectionTagRepository CollectionTagRepository => new CollectionTagRepository(_context, _mapper);
public IFileRepository FileRepository => new FileRepository(_context);
public IChapterRepository ChapterRepository => new ChapterRepository(_context, _mapper);
public IReadingListRepository ReadingListRepository => new ReadingListRepository(_context, _mapper);
public ISeriesMetadataRepository SeriesMetadataRepository => new SeriesMetadataRepository(_context);
public IPersonRepository PersonRepository => new PersonRepository(_context, _mapper);
public IGenreRepository GenreRepository => new GenreRepository(_context, _mapper);
/// <summary>
/// Commits changes to the DB. Completes the open transaction.


@@ -1,5 +1,5 @@
#This Dockerfile pulls the latest git commit and builds Kavita from source
FROM mcr.microsoft.com/dotnet/sdk:5.0-focal AS builder
FROM mcr.microsoft.com/dotnet/sdk:6.0-focal AS builder
ENV DEBIAN_FRONTEND=noninteractive
ARG TARGETPLATFORM
@@ -37,4 +37,4 @@ EXPOSE 5000
WORKDIR /kavita
ENTRYPOINT ["/bin/bash"]
CMD ["/entrypoint.sh"]
CMD ["/entrypoint.sh"]


@@ -2,6 +2,7 @@
using System.Collections.Generic;
using API.Entities.Enums;
using API.Entities.Interfaces;
using API.Entities.Metadata;
using API.Parser;
namespace API.Entities
@@ -42,10 +43,30 @@ namespace API.Entities
/// </summary>
public string Title { get; set; }
/// <summary>
/// Chapter title
/// </summary>
/// <remarks>This should not be confused with Title, which is used for special filenames.</remarks>
public string TitleName { get; set; } = string.Empty;
// public string Year { get; set; } // Only time I can think this will be more than 1 year is for a volume which will be a spread
/// <summary>
/// All people attached at the Chapter level. Comics will usually have different people per issue.
/// </summary>
public ICollection<Person> People { get; set; } = new List<Person>();
// Relationships
public Volume Volume { get; set; }
public int VolumeId { get; set; }
//public ChapterMetadata ChapterMetadata { get; set; }
//public int ChapterMetadataId { get; set; }
public void UpdateFrom(ParserInfo info)
{
Files ??= new List<MangaFile>();


@@ -1,4 +1,5 @@
using System.Collections.Generic;
using API.Entities.Metadata;
using Microsoft.EntityFrameworkCore;
namespace API.Entities


@@ -5,15 +5,27 @@
/// <summary>
/// Another role, not covered by other types
/// </summary>
Other = 0,
/// <summary>
/// Author
/// </summary>
Author = 1,
Other = 1,
/// <summary>
/// Artist
/// </summary>
Artist = 2,
//Artist = 2,
/// <summary>
/// Author or Writer
/// </summary>
Writer = 3,
Penciller = 4,
Inker = 5,
Colorist = 6,
Letterer = 7,
CoverArtist = 8,
Editor = 9,
Publisher = 10,
/// <summary>
/// Represents a character/person within the story
/// </summary>
Character = 11
}
}

View File

@ -1,22 +1,18 @@
-using System.ComponentModel.DataAnnotations;
-using API.Entities.Interfaces;
using System.Collections.Generic;
using API.Entities.Metadata;
using Microsoft.EntityFrameworkCore;
namespace API.Entities
{
-public class Genre : IHasConcurrencyToken
[Index(nameof(NormalizedTitle), nameof(ExternalTag), IsUnique = true)]
public class Genre
{
public int Id { get; set; }
-public string Name { get; set; }
// MetadataUpdate add ProviderId
public string Title { get; set; } // TODO: Rename this to Title
public string NormalizedTitle { get; set; }
public bool ExternalTag { get; set; }
-/// <inheritdoc />
-[ConcurrencyCheck]
-public uint RowVersion { get; private set; }
-/// <inheritdoc />
-public void OnSavingChanges()
-{
-RowVersion++;
-}
public ICollection<SeriesMetadata> SeriesMetadatas { get; set; }
public ICollection<Chapter> Chapters { get; set; }
}
}

View File

@ -25,25 +25,18 @@ namespace API.Entities
/// </summary>
public DateTime LastModified { get; set; }
// Relationship Mapping
public Chapter Chapter { get; set; }
public int ChapterId { get; set; }
// Methods
/// <summary>
/// If the File on disk's last modified time is after what is stored in MangaFile
/// </summary>
/// <returns></returns>
public bool HasFileBeenModified()
{
return File.GetLastWriteTime(FilePath) > LastModified;
}
/// <summary>
/// Updates the Last Modified time of the underlying file
/// </summary>
public void UpdateLastModified()
{
// Should this be DateTime.Now ?
LastModified = File.GetLastWriteTime(FilePath);
}
}

View File

@ -0,0 +1,34 @@
using System.Collections.Generic;
namespace API.Entities.Metadata
{
/// <summary>
/// Has a 1-to-1 relationship with a Chapter. Represents metadata about a chapter.
/// </summary>
public class ChapterMetadata
{
public int Id { get; set; }
/// <summary>
/// Chapter title
/// </summary>
/// <remarks>This should not be confused with Chapter.Title which is used for special filenames.</remarks>
public string Title { get; set; } = string.Empty;
public string Year { get; set; } // Only time I can think this will be more than 1 year is for a volume which will be a spread
public string StoryArc { get; set; } // This might be a list
/// <summary>
/// All people attached at a Chapter level. Usually Comics will have different people per issue.
/// </summary>
public ICollection<Person> People { get; set; } = new List<Person>();
// Relationships
public Chapter Chapter { get; set; }
public int ChapterId { get; set; }
}
}

View File

@ -3,15 +3,25 @@ using System.ComponentModel.DataAnnotations;
using API.Entities.Interfaces;
using Microsoft.EntityFrameworkCore;
-namespace API.Entities
namespace API.Entities.Metadata
{
[Index(nameof(Id), nameof(SeriesId), IsUnique = true)]
public class SeriesMetadata : IHasConcurrencyToken
{
public int Id { get; set; }
public string Summary { get; set; }
public ICollection<CollectionTag> CollectionTags { get; set; }
public ICollection<Genre> Genres { get; set; } = new List<Genre>();
/// <summary>
/// All people attached at a Series level.
/// </summary>
public ICollection<Person> People { get; set; } = new List<Person>();
// Relationship
public Series Series { get; set; }
public int SeriesId { get; set; }

View File

@ -1,23 +1,24 @@
-using System.ComponentModel.DataAnnotations;
using System.Collections.Generic;
using API.Entities.Enums;
-using API.Entities.Interfaces;
using API.Entities.Metadata;
namespace API.Entities
{
-public class Person : IHasConcurrencyToken
public enum ProviderSource
{
Local = 1,
External = 2
}
public class Person
{
public int Id { get; set; }
public string Name { get; set; }
public string NormalizedName { get; set; }
public PersonRole Role { get; set; }
//public ProviderSource Source { get; set; }
-/// <inheritdoc />
-[ConcurrencyCheck]
-public uint RowVersion { get; private set; }
-/// <inheritdoc />
-public void OnSavingChanges()
-{
-RowVersion++;
-}
// Relationships
public ICollection<SeriesMetadata> SeriesMetadatas { get; set; }
public ICollection<Chapter> ChapterMetadatas { get; set; }
}
}

View File

@ -2,6 +2,7 @@
using System.Collections.Generic;
using API.Entities.Enums;
using API.Entities.Interfaces;
using API.Entities.Metadata;
using Microsoft.EntityFrameworkCore;
namespace API.Entities
@ -30,10 +31,6 @@ namespace API.Entities
/// Original Name on disk. Not exposed to UI.
/// </summary>
public string OriginalName { get; set; }
-/// <summary>
-/// Summary information related to the Series
-/// </summary>
-public string Summary { get; set; } // NOTE: Migrate into SeriesMetdata (with Metadata update)
public DateTime Created { get; set; }
public DateTime LastModified { get; set; }
/// <summary>

View File

@ -1,4 +1,5 @@
-using API.Data;
using System.IO.Abstractions;
using API.Data;
using API.Helpers;
using API.Interfaces;
using API.Interfaces.Services;
@ -38,6 +39,11 @@ namespace API.Extensions
services.AddScoped<IReaderService, ReaderService>();
services.AddScoped<IAccountService, AccountService>();
services.AddScoped<IFileSystem, FileSystem>();
services.AddScoped<IFileService, FileService>();
services.AddScoped<ICacheHelper, CacheHelper>();
services.AddScoped<IPresenceTracker, PresenceTracker>();
services.AddSqLite(config, env);

View File

@ -0,0 +1,18 @@
using System;
namespace API.Extensions;
public static class DateTimeExtensions
{
/// <summary>
/// <para>Truncates a DateTime to a specified resolution.</para>
/// <para>A convenient source for resolution is TimeSpan.TicksPerXXXX constants.</para>
/// </summary>
/// <param name="date">The DateTime object to truncate</param>
/// <param name="resolution">e.g. to round to nearest second, TimeSpan.TicksPerSecond</param>
/// <returns>Truncated DateTime</returns>
public static DateTime Truncate(this DateTime date, long resolution)
{
return new DateTime(date.Ticks - (date.Ticks % resolution), date.Kind);
}
}
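As a usage sketch (illustrative, not part of the PR): pairing Truncate with the TimeSpan tick constants makes file-timestamp comparisons stable at a chosen resolution, since raw DateTime values differ at tick precision.

```csharp
using System;

public static class DateTimeExtensions
{
    // Same implementation as above: zero out all ticks finer than the resolution.
    public static DateTime Truncate(this DateTime date, long resolution)
    {
        return new DateTime(date.Ticks - (date.Ticks % resolution), date.Kind);
    }
}

public static class Demo
{
    public static void Main()
    {
        var modified = new DateTime(2021, 11, 14, 9, 30, 45, 123, DateTimeKind.Utc);

        // Truncating both sides of a comparison to seconds discards the
        // sub-second noise while preserving the DateTimeKind.
        var truncated = modified.Truncate(TimeSpan.TicksPerSecond);
        Console.WriteLine(truncated.Millisecond); // 0
        Console.WriteLine(truncated.Second);      // 45
    }
}
```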

View File

@ -1,21 +1,14 @@
using System;
-using System.Collections.Concurrent;
using System.Collections.Generic;
-using System.Linq;
-using System.Threading.Tasks;
namespace API.Extensions
{
public static class EnumerableExtensions
{
public static IEnumerable<TSource> DistinctBy<TSource, TKey>
(this IEnumerable<TSource> source, Func<TSource, TKey> keySelector)
{
var seenKeys = new HashSet<TKey>();
foreach (var element in source)
{
if (seenKeys.Add(keySelector(element)))
{
yield return element;
}
}
}
}
}
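For reference, a self-contained sketch of the first-wins dedupe above. The demo invokes it statically to sidestep an ambiguity with the `Enumerable.DistinctBy` that ships in .NET 6 (which this PR upgrades to, and which may make this extension redundant):

```csharp
using System;
using System.Collections.Generic;

public static class EnumerableExtensions
{
    // First-wins dedupe: yields an element only if its key hasn't been seen yet.
    public static IEnumerable<TSource> DistinctBy<TSource, TKey>(
        this IEnumerable<TSource> source, Func<TSource, TKey> keySelector)
    {
        var seenKeys = new HashSet<TKey>();
        foreach (var element in source)
        {
            if (seenKeys.Add(keySelector(element)))
            {
                yield return element;
            }
        }
    }
}

public static class Demo
{
    public static void Main()
    {
        var people = new[] { "Joe Shuster", "JOE SHUSTER", "Jerry Siegel" };
        // Called statically to avoid clashing with .NET 6's built-in DistinctBy.
        foreach (var name in EnumerableExtensions.DistinctBy(people, p => p.ToLowerInvariant()))
        {
            Console.WriteLine(name);
        }
    }
}
```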

View File

@ -0,0 +1,13 @@
using System.Text.RegularExpressions;
namespace API.Extensions;
public static class StringExtensions
{
private static readonly Regex SentenceCaseRegex = new Regex(@"(^[a-z])|\.\s+(.)", RegexOptions.ExplicitCapture | RegexOptions.Compiled);
public static string SentenceCase(this string value)
{
return SentenceCaseRegex.Replace(value.ToLower(), s => s.Value.ToUpper());
}
}
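A usage sketch of SentenceCase (illustrative, not part of the PR): because of RegexOptions.ExplicitCapture the unnamed groups are non-capturing, so the replacement uppercases the whole match — either the first letter of the string or a period-plus-whitespace run and the character after it.

```csharp
using System;
using System.Text.RegularExpressions;

public static class StringExtensions
{
    // Matches the first lowercase letter of the string, or any character
    // following a period and whitespace; the entire match is uppercased
    // (uppercasing the period and spaces is a no-op).
    private static readonly Regex SentenceCaseRegex =
        new Regex(@"(^[a-z])|\.\s+(.)", RegexOptions.ExplicitCapture | RegexOptions.Compiled);

    public static string SentenceCase(this string value)
    {
        return SentenceCaseRegex.Replace(value.ToLower(), s => s.Value.ToUpper());
    }
}

public static class Demo
{
    public static void Main()
    {
        Console.WriteLine("first issue. REPRINT edition. final notes".SentenceCase());
        // First issue. Reprint edition. Final notes
    }
}
```

Note the method lowercases the entire input first, so acronyms mid-sentence are flattened; a null input would throw, which callers presumably guard against.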

View File

@ -1,5 +1,6 @@
using System.Collections.Generic;
using System.Linq;
using API.Comparators;
using API.Entities;
using API.Entities.Enums;
@ -7,11 +8,11 @@ namespace API.Extensions
{
public static class VolumeListExtensions
{
-public static Volume FirstWithChapters(this IList<Volume> volumes, bool inBookSeries)
public static Volume FirstWithChapters(this IEnumerable<Volume> volumes, bool inBookSeries)
{
return inBookSeries
? volumes.FirstOrDefault(v => v.Chapters.Any())
-: volumes.FirstOrDefault(v => v.Chapters.Any() && (v.Number == 1));
: volumes.OrderBy(v => v.Number, new ChapterSortComparer()).FirstOrDefault(v => v.Chapters.Any());
}
/// <summary>

View File

@ -2,10 +2,13 @@
using System.Linq;
using API.DTOs;
using API.DTOs.CollectionTags;
using API.DTOs.Metadata;
using API.DTOs.Reader;
using API.DTOs.ReadingLists;
using API.DTOs.Settings;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Helpers.Converters;
using AutoMapper;
@ -21,16 +24,98 @@ namespace API.Helpers
CreateMap<MangaFile, MangaFileDto>();
-CreateMap<Chapter, ChapterDto>();
CreateMap<Chapter, ChapterDto>()
.ForMember(dest => dest.Writers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Writer)))
.ForMember(dest => dest.CoverArtist,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.CoverArtist)))
.ForMember(dest => dest.Colorist,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Colorist)))
.ForMember(dest => dest.Inker,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Inker)))
.ForMember(dest => dest.Letterer,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Letterer)))
.ForMember(dest => dest.Penciller,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Penciller)))
.ForMember(dest => dest.Publisher,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Publisher)))
.ForMember(dest => dest.Editor,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Editor)));
CreateMap<Series, SeriesDto>();
CreateMap<CollectionTag, CollectionTagDto>();
-CreateMap<SeriesMetadata, SeriesMetadataDto>();
CreateMap<Person, PersonDto>();
CreateMap<Genre, GenreTagDto>();
CreateMap<SeriesMetadata, SeriesMetadataDto>()
.ForMember(dest => dest.Writers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Writer)))
.ForMember(dest => dest.Artists,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.CoverArtist)))
.ForMember(dest => dest.Characters,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Character)))
.ForMember(dest => dest.Publishers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Publisher)))
.ForMember(dest => dest.Colorists,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Colorist)))
.ForMember(dest => dest.Inkers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Inker)))
.ForMember(dest => dest.Letterers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Letterer)))
.ForMember(dest => dest.Pencillers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Penciller)))
.ForMember(dest => dest.Editors,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Editor)));
CreateMap<ChapterMetadata, ChapterMetadataDto>()
.ForMember(dest => dest.Writers,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Writer)))
.ForMember(dest => dest.CoverArtist,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.CoverArtist)))
.ForMember(dest => dest.Colorist,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Colorist)))
.ForMember(dest => dest.Inker,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Inker)))
.ForMember(dest => dest.Letterer,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Letterer)))
.ForMember(dest => dest.Penciller,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Penciller)))
.ForMember(dest => dest.Publisher,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Publisher)))
.ForMember(dest => dest.Editor,
opt =>
opt.MapFrom(src => src.People.Where(p => p.Role == PersonRole.Editor)));
CreateMap<AppUserPreferences, UserPreferencesDto>();
CreateMap<AppUserBookmark, BookmarkDto>();
@ -55,8 +140,10 @@ namespace API.Helpers
CreateMap<RegisterDto, AppUser>();
CreateMap<IEnumerable<ServerSetting>, ServerSettingDto>()
.ConvertUsing<ServerSettingConverter>();
}
}
}

View File

@ -0,0 +1,74 @@
using System;
using System.IO;
using API.Entities;
using API.Entities.Interfaces;
using API.Services;
namespace API.Helpers;
public interface ICacheHelper
{
bool ShouldUpdateCoverImage(string coverPath, MangaFile firstFile, DateTime chapterCreated,
bool forceUpdate = false,
bool isCoverLocked = false);
bool CoverImageExists(string path);
bool HasFileNotChangedSinceCreationOrLastScan(IEntityDate chapter, bool forceUpdate, MangaFile firstFile);
}
public class CacheHelper : ICacheHelper
{
private readonly IFileService _fileService;
public CacheHelper(IFileService fileService)
{
_fileService = fileService;
}
/// <summary>
/// Determines whether an entity should regenerate its cover image.
/// </summary>
/// <remarks>If a cover image is locked but the underlying file has been deleted, this will allow regenerating. </remarks>
/// <param name="coverPath">This should just be the filename, no path information</param>
/// <param name="firstFile"></param>
/// <param name="forceUpdate">If the user has told us to force the refresh</param>
/// <param name="isCoverLocked">If the cover has been locked by a user. When locked and the cover file exists, this forces a false return</param>
/// <returns></returns>
public bool ShouldUpdateCoverImage(string coverPath, MangaFile firstFile, DateTime chapterCreated, bool forceUpdate = false,
bool isCoverLocked = false)
{
if (firstFile == null) return true;
var fileExists = !string.IsNullOrEmpty(coverPath) && _fileService.Exists(coverPath);
if (isCoverLocked && fileExists) return false;
if (forceUpdate) return true;
return (_fileService.HasFileBeenModifiedSince(coverPath, chapterCreated)) || !fileExists;
}
/// <summary>
/// Determines if the file has not been modified since creation or last scan and the user is not forcing an update
/// </summary>
/// <param name="chapter"></param>
/// <param name="forceUpdate"></param>
/// <param name="firstFile"></param>
/// <returns></returns>
public bool HasFileNotChangedSinceCreationOrLastScan(IEntityDate chapter, bool forceUpdate, MangaFile firstFile)
{
return firstFile != null &&
(!forceUpdate &&
!(_fileService.HasFileBeenModifiedSince(firstFile.FilePath, chapter.Created)
|| _fileService.HasFileBeenModifiedSince(firstFile.FilePath, firstFile.LastModified)));
}
/// <summary>
/// Determines if a given coverImage path exists
/// </summary>
/// <param name="path"></param>
/// <returns></returns>
public bool CoverImageExists(string path)
{
return !string.IsNullOrEmpty(path) && _fileService.Exists(path);
}
}
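The precedence of the checks in ShouldUpdateCoverImage is easier to see with the IFileService calls replaced by plain booleans. A condensed restatement (parameter names here are illustrative, not Kavita's):

```csharp
using System;

public static class CoverCacheSketch
{
    // Rule order, mirroring the method above:
    // 1. no source file -> regenerate
    // 2. locked cover whose image still exists -> never regenerate (even if forced)
    // 3. forced refresh -> regenerate
    // 4. otherwise regenerate when the cover is stale or missing
    public static bool ShouldUpdateCoverImage(
        bool firstFileExists,           // firstFile != null
        bool coverFileExists,           // coverPath non-empty and present on disk
        bool isCoverLocked,
        bool forceUpdate,
        bool coverModifiedSinceCreated)
    {
        if (!firstFileExists) return true;
        if (isCoverLocked && coverFileExists) return false;
        if (forceUpdate) return true;
        return coverModifiedSinceCreated || !coverFileExists;
    }

    public static void Main()
    {
        // A locked cover whose image file was deleted still regenerates,
        // matching the remark in the method's documentation.
        Console.WriteLine(ShouldUpdateCoverImage(true, false, true, false, false)); // True
        Console.WriteLine(ShouldUpdateCoverImage(true, true, true, true, true));    // False
    }
}
```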

View File

@ -0,0 +1,76 @@
using System;
using System.Collections.Generic;
using System.Linq;
using API.Data;
using API.Entities;
namespace API.Helpers;
public static class GenreHelper
{
/// <summary>
/// Given a list of all existing genres, this will check the new names and if a genre doesn't exist in allGenres,
/// will create and add an entry. For each genre in names, the callback will be executed.
/// </summary>
/// <param name="allGenres"></param>
/// <param name="names"></param>
/// <param name="isExternal"></param>
/// <param name="action"></param>
public static void UpdateGenre(ICollection<Genre> allGenres, IEnumerable<string> names, bool isExternal, Action<Genre> action)
{
foreach (var name in names)
{
if (string.IsNullOrWhiteSpace(name)) continue;
var normalizedName = Parser.Parser.Normalize(name);
var genre = allGenres.FirstOrDefault(g =>
g.NormalizedTitle.Equals(normalizedName) && g.ExternalTag == isExternal);
if (genre == null)
{
genre = DbFactory.Genre(name, false);
allGenres.Add(genre);
}
action(genre);
}
}
public static void KeepOnlySameGenreBetweenLists(ICollection<Genre> existingGenres, ICollection<Genre> removeAllExcept, Action<Genre> action = null)
{
var existing = existingGenres.ToList();
foreach (var genre in existing)
{
var existingGenre = removeAllExcept.FirstOrDefault(g => g.ExternalTag == genre.ExternalTag && genre.NormalizedTitle.Equals(g.NormalizedTitle));
if (existingGenre == null)
{
existingGenres.Remove(genre);
action?.Invoke(genre);
}
}
}
/// <summary>
/// Adds the genre to the list if it's not already in there. This will ignore the ExternalTag.
/// </summary>
/// <param name="metadataGenres"></param>
/// <param name="genre"></param>
public static void AddGenreIfNotExists(ICollection<Genre> metadataGenres, Genre genre)
{
var existingGenre = metadataGenres.FirstOrDefault(p =>
p.NormalizedTitle == Parser.Parser.Normalize(genre.Title));
if (existingGenre == null)
{
metadataGenres.Add(genre);
}
}
}

View File

@ -0,0 +1,96 @@
using System;
using System.Collections.Generic;
using System.Linq;
using API.Data;
using API.Entities;
using API.Entities.Enums;
namespace API.Helpers;
public static class PersonHelper
{
/// <summary>
/// Given a list of all existing people, this will check the new names and roles and if it doesn't exist in allPeople, will create and
add an entry. For each person in names, the callback will be executed.
/// </summary>
/// <remarks>This is used to add new people to a list without worrying about duplicating rows in the DB</remarks>
/// <param name="allPeople"></param>
/// <param name="names"></param>
/// <param name="role"></param>
/// <param name="action"></param>
public static void UpdatePeople(ICollection<Person> allPeople, IEnumerable<string> names, PersonRole role, Action<Person> action)
{
var allPeopleTypeRole = allPeople.Where(p => p.Role == role).ToList();
foreach (var name in names)
{
var normalizedName = Parser.Parser.Normalize(name);
var person = allPeopleTypeRole.FirstOrDefault(p =>
p.NormalizedName.Equals(normalizedName));
if (person == null)
{
person = DbFactory.Person(name, role);
allPeople.Add(person);
allPeopleTypeRole.Add(person); // keep the role snapshot in sync so a repeated new name in one call doesn't create duplicates
}
action(person);
}
}
/// <summary>
/// Remove people on a list for a given role
/// </summary>
/// <remarks>Used to remove before we update/add new people</remarks>
/// <param name="existingPeople">Existing people on Entity</param>
/// <param name="people">People from metadata</param>
/// <param name="role">Role to filter on</param>
/// <param name="action">Callback which will be executed for each person removed</param>
public static void RemovePeople(ICollection<Person> existingPeople, IEnumerable<string> people, PersonRole role, Action<Person> action = null)
{
var normalizedPeople = people.Select(Parser.Parser.Normalize).ToList();
foreach (var person in normalizedPeople)
{
var existingPerson = existingPeople.FirstOrDefault(p => p.Role == role && person.Equals(p.NormalizedName));
if (existingPerson == null) continue;
existingPeople.Remove(existingPerson);
action?.Invoke(existingPerson);
}
}
/// <summary>
/// Removes all people that are not present in the removeAllExcept list.
/// </summary>
/// <param name="existingPeople"></param>
/// <param name="removeAllExcept"></param>
/// <param name="action">Callback for all entities that were removed</param>
public static void KeepOnlySamePeopleBetweenLists(ICollection<Person> existingPeople, ICollection<Person> removeAllExcept, Action<Person> action = null)
{
var existing = existingPeople.ToList();
foreach (var person in existing)
{
var existingPerson = removeAllExcept.FirstOrDefault(p => p.Role == person.Role && person.NormalizedName.Equals(p.NormalizedName));
if (existingPerson == null)
{
existingPeople.Remove(person);
action?.Invoke(person);
}
}
}
/// <summary>
/// Adds the person to the list if it's not already in there
/// </summary>
/// <param name="metadataPeople"></param>
/// <param name="person"></param>
public static void AddPersonIfNotExists(ICollection<Person> metadataPeople, Person person)
{
var existingPerson = metadataPeople.SingleOrDefault(p =>
p.NormalizedName == Parser.Parser.Normalize(person.Name) && p.Role == person.Role);
if (existingPerson == null)
{
metadataPeople.Add(person);
}
}
}
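A self-contained sketch of the UpdatePeople flow. The Person class, role enum, and Normalize here are simplified stand-ins for the Kavita types, not the real entities; the sketch also keeps the per-role snapshot list in sync so that a repeated new name within one call resolves to a single Person.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal stand-ins for the Kavita entities, for illustration only.
public enum PersonRole { Writer = 3 }

public class Person
{
    public string Name { get; set; }
    public string NormalizedName { get; set; }
    public PersonRole Role { get; set; }
}

public static class PersonHelperSketch
{
    // Simplified stand-in for Parser.Parser.Normalize.
    private static string Normalize(string name) =>
        name.ToLowerInvariant().Replace(" ", string.Empty);

    public static void UpdatePeople(ICollection<Person> allPeople, IEnumerable<string> names,
        PersonRole role, Action<Person> action)
    {
        var allPeopleTypeRole = allPeople.Where(p => p.Role == role).ToList();
        foreach (var name in names)
        {
            var normalizedName = Normalize(name);
            var person = allPeopleTypeRole.FirstOrDefault(p => p.NormalizedName.Equals(normalizedName));
            if (person == null)
            {
                person = new Person { Name = name, NormalizedName = normalizedName, Role = role };
                allPeople.Add(person);
                allPeopleTypeRole.Add(person); // track new people so later duplicates in this call match
            }
            action(person);
        }
    }

    public static void Main()
    {
        var all = new List<Person>();
        var invocations = 0;
        UpdatePeople(all, new[] { "Joe Shuster", "JOE  SHUSTER" }, PersonRole.Writer, _ => invocations++);
        Console.WriteLine(all.Count);   // 1 person created
        Console.WriteLine(invocations); // 2 callback invocations
    }
}
```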

View File

@ -0,0 +1,44 @@
using System.Collections.Generic;
using System.Linq;
using API.Entities;
using API.Entities.Enums;
using API.Services.Tasks.Scanner;
namespace API.Helpers;
public static class SeriesHelper
{
/// <summary>
/// Given a ParsedSeries, checks if any of its names match the given Series and the format matches
/// </summary>
/// <param name="series"></param>
/// <param name="parsedInfoKey"></param>
/// <returns></returns>
public static bool FindSeries(Series series, ParsedSeries parsedInfoKey)
{
return (series.NormalizedName.Equals(parsedInfoKey.NormalizedName) || Parser.Parser.Normalize(series.OriginalName).Equals(parsedInfoKey.NormalizedName))
&& (series.Format == parsedInfoKey.Format || series.Format == MangaFormat.Unknown);
}
/// <summary>
/// Returns existingSeries with every series that appears in missingSeries removed. The number of removed
/// elements is returned via removeCount.
/// </summary>
/// <param name="existingSeries">Existing Series in DB</param>
/// <param name="missingSeries">Series not found on disk or that can't be parsed</param>
/// <param name="removeCount">Number of series removed</param>
/// <returns>the updated existingSeries</returns>
public static IEnumerable<Series> RemoveMissingSeries(IList<Series> existingSeries, IEnumerable<Series> missingSeries, out int removeCount)
{
var existingCount = existingSeries.Count;
var missingList = missingSeries.ToList();
existingSeries = existingSeries.Where(
s => !missingList.Exists(
m => m.NormalizedName.Equals(s.NormalizedName) && m.Format == s.Format)).ToList();
removeCount = existingCount - existingSeries.Count;
return existingSeries;
}
}

View File

@ -12,10 +12,11 @@ namespace API.Interfaces
ISettingsRepository SettingsRepository { get; }
IAppUserProgressRepository AppUserProgressRepository { get; }
ICollectionTagRepository CollectionTagRepository { get; }
-IFileRepository FileRepository { get; }
IChapterRepository ChapterRepository { get; }
IReadingListRepository ReadingListRepository { get; }
ISeriesMetadataRepository SeriesMetadataRepository { get; }
IPersonRepository PersonRepository { get; }
IGenreRepository GenreRepository { get; }
bool Commit();
Task<bool> CommitAsync();
bool HasChanges();

View File

@ -1,6 +1,5 @@
using System.Collections.Generic;
using System.Threading.Tasks;
-using API.DTOs;
using API.DTOs.CollectionTags;
using API.Entities;

View File

@ -1,10 +0,0 @@
using System.Collections.Generic;
using System.Threading.Tasks;
namespace API.Interfaces.Repositories
{
public interface IFileRepository
{
Task<IEnumerable<string>> GetFileExtensions();
}
}

View File

@ -0,0 +1,15 @@
using System.Collections.Generic;
using System.Threading.Tasks;
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface IGenreRepository
{
void Attach(Genre genre);
void Remove(Genre genre);
Task<Genre> FindByNameAsync(string genreName);
Task<IList<Genre>> GetAllGenres();
Task RemoveAllGenreNoLongerAssociated(bool removeExternal = false);
}
}

View File

@ -0,0 +1,14 @@
using System.Collections.Generic;
using System.Threading.Tasks;
using API.Entities;
namespace API.Interfaces.Repositories
{
public interface IPersonRepository
{
void Attach(Person person);
void Remove(Person person);
Task<IList<Person>> GetAllPeople();
Task RemoveAllPeopleNoLongerAssociated(bool removeExternal = false);
}
}

View File

@ -1,4 +1,5 @@
using API.Entities;
using API.Entities.Metadata;
namespace API.Interfaces.Repositories
{

View File

@ -5,6 +5,7 @@ using API.DTOs;
using API.DTOs.Filtering;
using API.Entities;
using API.Entities.Enums;
using API.Entities.Metadata;
using API.Helpers;
namespace API.Interfaces.Repositories

View File

@ -1,6 +1,7 @@
-using System.Collections.Generic;
-using System.IO;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
namespace API.Interfaces.Services
{
@ -16,5 +17,6 @@ namespace API.Interfaces.Services
bool CopyFilesToDirectory(IEnumerable<string> filePaths, string directoryPath, string prepend = "");
bool Exists(string directory);
void CopyFileToDirectory(string fullFilePath, string targetDirectory);
int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger);
}
}

View File

@ -1,5 +1,4 @@
using System.Threading.Tasks;
-using API.Entities;
namespace API.Interfaces.Services
{
@ -11,10 +10,6 @@ namespace API.Interfaces.Services
/// <param name="libraryId"></param>
/// <param name="forceUpdate"></param>
Task RefreshMetadata(int libraryId, bool forceUpdate = false);
-public bool UpdateMetadata(Chapter chapter, bool forceUpdate);
-public bool UpdateMetadata(Volume volume, bool forceUpdate);
-public bool UpdateMetadata(Series series, bool forceUpdate);
/// <summary>
/// Performs a forced refresh of metadata just for a series and its nested entities
/// </summary>

View File

@ -48,6 +48,8 @@ namespace API.Parser
MatchOptions, RegexTimeout);
private static readonly Regex ArchiveFileRegex = new Regex(ArchiveFileExtensions,
MatchOptions, RegexTimeout);
private static readonly Regex ComicInfoArchiveRegex = new Regex(@"\.cbz|\.cbr|\.cb7|\.cbt",
MatchOptions, RegexTimeout);
private static readonly Regex XmlRegex = new Regex(XmlRegexExtensions,
MatchOptions, RegexTimeout);
private static readonly Regex BookFileRegex = new Regex(BookFileExtensions,
@ -862,7 +864,6 @@ namespace API.Parser
}
}
-// TODO: Since we have loops like this, think about using a method
foreach (var regex in MangaEditionRegex)
{
var matches = regex.Matches(title);
@ -997,6 +998,10 @@ namespace API.Parser
{
return ArchiveFileRegex.IsMatch(Path.GetExtension(filePath));
}
public static bool IsComicInfoExtension(string filePath)
{
return ComicInfoArchiveRegex.IsMatch(Path.GetExtension(filePath));
}
public static bool IsBook(string filePath)
{
return BookFileRegex.IsMatch(Path.GetExtension(filePath));
@ -1062,5 +1067,17 @@ namespace API.Parser
{
return Path.GetExtension(filePath).ToLower() == ".pdf";
}
/// <summary>
/// Cleans an author's name
/// </summary>
/// <remarks>If the author is written as "Last, First", this will reverse it to "First Last"</remarks>
/// <param name="author"></param>
/// <returns></returns>
public static string CleanAuthor(string author)
{
if (string.IsNullOrEmpty(author)) return string.Empty;
return string.Join(" ", author.Split(",").Reverse().Select(s => s.Trim()));
}
}
}
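A standalone sketch of CleanAuthor's behavior (the class name here is illustrative): split on commas, reverse the pieces, trim, and rejoin with spaces.

```csharp
using System;
using System.Linq;

public static class AuthorCleanerSketch
{
    // Mirrors Parser.CleanAuthor above.
    public static string CleanAuthor(string author)
    {
        if (string.IsNullOrEmpty(author)) return string.Empty;
        return string.Join(" ", author.Split(",").Reverse().Select(s => s.Trim()));
    }

    public static void Main()
    {
        Console.WriteLine(CleanAuthor("Shuster, Joe")); // Joe Shuster
        Console.WriteLine(CleanAuthor("Joe Shuster"));  // Joe Shuster (no comma: unchanged)
        Console.WriteLine(CleanAuthor("a, b, c"));      // c b a
    }
}
```

One edge case worth noting: a name with more than one comma is fully reversed ("a, b, c" becomes "c b a"), which may not be desirable for suffixes like "Jr.".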

View File

@ -1,4 +1,5 @@
-using API.Entities.Enums;
using API.Data.Metadata;
using API.Entities.Enums;
namespace API.Parser
{
@ -15,7 +16,11 @@ namespace API.Parser
/// <summary>
/// Represents the parsed series from the file or folder
/// </summary>
-public string Series { get; set; } = "";
public string Series { get; set; } = string.Empty;
/// <summary>
/// This can be filled in from ComicInfo.xml/Epub during scanning. Will update the SortName field on <see cref="Entities.Series"/>
/// </summary>
public string SeriesSort { get; set; } = string.Empty;
/// <summary>
/// Represents the parsed volumes from a file. By default, will be 0 which means that nothing could be parsed.
/// If Volumes is 0 and Chapters is 0, the file is a special. If Chapters is non-zero, then no volume could be parsed.
@ -55,16 +60,26 @@ namespace API.Parser
/// <remarks>Manga does not use this field</remarks>
/// </summary>
public string Title { get; set; } = string.Empty;
/// <summary>
/// If the ParserInfo has the IsSpecial tag or both volumes and chapters are default aka 0
/// </summary>
/// <returns></returns>
public bool IsSpecialInfo()
{
return (IsSpecial || (Volumes == "0" && Chapters == "0"));
}
// (TODO: Make this a ValueType). Has at least 1 year, maybe 2 representing a range
// public string YearRange { get; set; }
// public IList<string> Genres { get; set; } = new List<string>();
/// <summary>
/// This will contain any EXTRA comicInfo information parsed from the epub or archive. If there is an archive with comicInfo.xml AND it contains
/// series, volume information, that will override what we parsed.
/// </summary>
public ComicInfo ComicInfo { get; set; }
/// <summary>
/// Merges non empty/null properties from info2 into this entity.
/// </summary>
@ -80,4 +95,4 @@ namespace API.Parser
IsSpecial = IsSpecial || info2.IsSpecial;
}
}
}

View File

@ -1,7 +1,5 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using API.Data;
@ -41,7 +39,7 @@ namespace API
{
Console.WriteLine("Generating JWT TokenKey for encrypting user sessions...");
var rBytes = new byte[128];
-using (var crypto = new RNGCryptoServiceProvider()) crypto.GetBytes(rBytes);
RandomNumberGenerator.Create().GetBytes(rBytes);
Configuration.JwtToken = Convert.ToBase64String(rBytes).Replace("/", string.Empty);
}
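RNGCryptoServiceProvider is obsolete as of .NET 6, which is why the token-key generation switches to RandomNumberGenerator. A self-contained sketch of the same idea (names here are illustrative, not Kavita's); it uses the static RandomNumberGenerator.Fill, which leaves no instance to dispose — wrapping RandomNumberGenerator.Create() in a using statement would work as well:

```csharp
using System;
using System.Security.Cryptography;

public static class TokenKeyDemo
{
    public static string GenerateTokenKey()
    {
        // 128 cryptographically random bytes, Base64-encoded; '/' is stripped
        // so the key stays safe in path/URL-like contexts.
        var rBytes = new byte[128];
        RandomNumberGenerator.Fill(rBytes);
        return Convert.ToBase64String(rBytes).Replace("/", string.Empty);
    }

    public static void Main()
    {
        var key = GenerateTokenKey();
        Console.WriteLine(key.Length); // 172 chars minus any stripped '/'
    }
}
```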

View File

@ -322,7 +322,12 @@ namespace API.Services
return null;
}
-public ComicInfo GetComicInfo(string archivePath)
/// <summary>
/// This can be null if nothing is found or any errors occur during access
/// </summary>
/// <param name="archivePath"></param>
/// <returns></returns>
public ComicInfo? GetComicInfo(string archivePath)
{
if (!IsValidArchive(archivePath)) return null;
@ -336,7 +341,7 @@ namespace API.Services
case ArchiveLibrary.Default:
{
using var archive = ZipFile.OpenRead(archivePath);
-var entry = archive.Entries.SingleOrDefault(x =>
var entry = archive.Entries.FirstOrDefault(x =>
!Parser.Parser.HasBlacklistedFolderInPath(x.FullName)
&& Path.GetFileNameWithoutExtension(x.Name)?.ToLower() == ComicInfoFilename
&& !Path.GetFileNameWithoutExtension(x.Name)
@ -346,7 +351,18 @@ namespace API.Services
{
using var stream = entry.Open();
var serializer = new XmlSerializer(typeof(ComicInfo));
-return (ComicInfo) serializer.Deserialize(stream);
var info = (ComicInfo) serializer.Deserialize(stream);
if (info != null)
{
info.Writer = Parser.Parser.CleanAuthor(info.Writer);
info.Colorist = Parser.Parser.CleanAuthor(info.Colorist);
info.Editor = Parser.Parser.CleanAuthor(info.Editor);
info.Inker = Parser.Parser.CleanAuthor(info.Inker);
info.Letterer = Parser.Parser.CleanAuthor(info.Letterer);
info.Penciller = Parser.Parser.CleanAuthor(info.Penciller);
info.Publisher = Parser.Parser.CleanAuthor(info.Publisher);
}
return info;
}
break;
@ -354,7 +370,7 @@ namespace API.Services
case ArchiveLibrary.SharpCompress:
{
using var archive = ArchiveFactory.Open(archivePath);
-return FindComicInfoXml(archive.Entries.Where(entry => !entry.IsDirectory
var info = FindComicInfoXml(archive.Entries.Where(entry => !entry.IsDirectory
&& !Parser.Parser
.HasBlacklistedFolderInPath(
Path.GetDirectoryName(
@ -365,6 +381,18 @@ namespace API.Services
.Parser
.MacOsMetadataFileStartsWith)
&& Parser.Parser.IsXml(entry.Key)));
if (info != null)
{
info.Writer = Parser.Parser.CleanAuthor(info.Writer);
info.Colorist = Parser.Parser.CleanAuthor(info.Colorist);
info.Editor = Parser.Parser.CleanAuthor(info.Editor);
info.Inker = Parser.Parser.CleanAuthor(info.Inker);
info.Letterer = Parser.Parser.CleanAuthor(info.Letterer);
info.Penciller = Parser.Parser.CleanAuthor(info.Penciller);
info.Publisher = Parser.Parser.CleanAuthor(info.Publisher);
}
return info;
}
case ArchiveLibrary.NotSupported:
_logger.LogWarning("[GetComicInfo] This archive cannot be read: {ArchivePath}", archivePath);

View File

@ -201,11 +201,15 @@ namespace API.Services
var info = new ComicInfo()
{
// TODO: Summary is in html, we need to turn it into string
Summary = epubBook.Schema.Package.Metadata.Description,
-Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators),
Writer = string.Join(",", epubBook.Schema.Package.Metadata.Creators.Select(c => Parser.Parser.CleanAuthor(c.Creator))),
Publisher = string.Join(",", epubBook.Schema.Package.Metadata.Publishers),
Month = !string.IsNullOrEmpty(publicationDate) ? DateTime.Parse(publicationDate).Month : 0,
Year = !string.IsNullOrEmpty(publicationDate) ? DateTime.Parse(publicationDate).Year : 0,
Title = epubBook.Title,
Genre = string.Join(",", epubBook.Schema.Package.Metadata.Subjects.Select(s => s.ToLower().Trim())),
};
// Parse tags not exposed via Library
foreach (var metadataItem in epubBook.Schema.Package.Metadata.MetaItems)
@ -215,6 +219,9 @@ namespace API.Services
case "calibre:rating":
info.UserRating = float.Parse(metadataItem.Content);
break;
case "calibre:title_sort":
info.TitleSort = metadataItem.Content;
break;
}
}
@ -305,8 +312,6 @@ namespace API.Services
{
using var epubBook = EpubReader.OpenBook(filePath);
// If the epub has the following tags, we can group the books as Volumes
// <meta content="5.0" name="calibre:series_index"/>
// <meta content="The Dark Tower" name="calibre:series"/>
// <meta content="Wolves of the Calla" name="calibre:title_sort"/>
// If all three are present, we can take that over dc:title and format as:
@@ -323,6 +328,7 @@ namespace API.Services
var series = string.Empty;
var specialName = string.Empty;
var groupPosition = string.Empty;
var titleSort = string.Empty;
foreach (var metadataItem in epubBook.Schema.Package.Metadata.MetaItems)
@@ -338,6 +344,7 @@ namespace API.Services
break;
case "calibre:title_sort":
specialName = metadataItem.Content;
titleSort = metadataItem.Content;
break;
}
@@ -363,18 +370,26 @@ namespace API.Services
{
specialName = epubBook.Title;
}
return new ParserInfo()
var info = new ParserInfo()
{
Chapters = Parser.Parser.DefaultChapter,
Edition = string.Empty,
Format = MangaFormat.Epub,
Filename = Path.GetFileName(filePath),
Title = specialName.Trim(),
Title = specialName?.Trim(),
FullFilePath = filePath,
IsSpecial = false,
Series = series.Trim(),
Volumes = seriesIndex
};
// Don't set titleSort if the book belongs to a group
if (!string.IsNullOrEmpty(titleSort) && string.IsNullOrEmpty(seriesIndex))
{
info.SeriesSort = titleSort;
}
return info;
}
}
catch (Exception)
@@ -392,7 +407,7 @@ namespace API.Services
FullFilePath = filePath,
IsSpecial = false,
Series = epubBook.Title.Trim(),
Volumes = Parser.Parser.DefaultVolume
Volumes = Parser.Parser.DefaultVolume,
};
}
catch (Exception ex)
@@ -494,6 +509,7 @@ namespace API.Services
private static void GetPdfPage(IDocReader docReader, int pageNumber, Stream stream)
{
// TODO: BUG: Most of this Bitmap code is only supported on Windows. Refactor.
using var pageReader = docReader.GetPageReader(pageNumber);
var rawBytes = pageReader.GetImage(new NaiveTransparencyRemover());
var width = pageReader.GetPageWidth();

View File

@@ -2,6 +2,7 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.IO.Abstractions;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
@@ -13,6 +14,8 @@ namespace API.Services
public class DirectoryService : IDirectoryService
{
private readonly ILogger<DirectoryService> _logger;
private readonly IFileSystem _fileSystem;
private static readonly Regex ExcludeDirectories = new Regex(
@"@eaDir|\.DS_Store",
RegexOptions.Compiled | RegexOptions.IgnoreCase);
@@ -23,9 +26,10 @@ namespace API.Services
public static readonly string BackupDirectory = Path.Join(Directory.GetCurrentDirectory(), "config", "backups");
public static readonly string ConfigDirectory = Path.Join(Directory.GetCurrentDirectory(), "config");
public DirectoryService(ILogger<DirectoryService> logger)
public DirectoryService(ILogger<DirectoryService> logger, IFileSystem fileSystem)
{
_logger = logger;
_logger = logger;
_fileSystem = fileSystem;
}
/// <summary>
@@ -91,6 +95,11 @@ namespace API.Services
return paths;
}
/// <summary>
/// Checks whether the directory exists
/// </summary>
/// <param name="directory"></param>
/// <returns></returns>
public bool Exists(string directory)
{
var di = new DirectoryInfo(directory);
@@ -365,7 +374,7 @@ namespace API.Services
/// <param name="searchPattern">Regex pattern to search against</param>
/// <param name="logger"></param>
/// <exception cref="ArgumentException"></exception>
public static int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger)
public int TraverseTreeParallelForEach(string root, Action<string> action, string searchPattern, ILogger logger)
{
//Count of files traversed and timer for diagnostic output
var fileCount = 0;

View File

@@ -0,0 +1,46 @@
using System;
using System.IO.Abstractions;
using API.Extensions;
namespace API.Services;
public interface IFileService
{
IFileSystem GetFileSystem();
bool HasFileBeenModifiedSince(string filePath, DateTime time);
bool Exists(string filePath);
}
public class FileService : IFileService
{
private readonly IFileSystem _fileSystem;
public FileService(IFileSystem fileSystem)
{
_fileSystem = fileSystem;
}
public FileService() : this(fileSystem: new FileSystem()) { }
public IFileSystem GetFileSystem()
{
return _fileSystem;
}
/// <summary>
/// Returns true if the file on disk's last modified time is after the passed time
/// </summary>
/// <remarks>This has a resolution to the minute. Will ignore seconds and milliseconds</remarks>
/// <param name="filePath">Full qualified path of file</param>
/// <param name="time"></param>
/// <returns></returns>
public bool HasFileBeenModifiedSince(string filePath, DateTime time)
{
return !string.IsNullOrEmpty(filePath) && _fileSystem.File.GetLastWriteTime(filePath).Truncate(TimeSpan.TicksPerMinute) > time.Truncate(TimeSpan.TicksPerMinute);
}
public bool Exists(string filePath)
{
return _fileSystem.File.Exists(filePath);
}
}
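Because FileService takes an IFileSystem, the minute-resolution modified check can be exercised entirely in memory with the System.IO.Abstractions.TestingHelpers package. A sketch (assuming the `Truncate` extension in API.Extensions rounds ticks down to the given resolution):

```csharp
using System;
using System.Collections.Generic;
using System.IO.Abstractions.TestingHelpers;

// Build an in-memory file system containing one fake archive
var fs = new MockFileSystem(new Dictionary<string, MockFileData>
{
    { @"C:\manga\vol1.cbz", new MockFileData("fake archive bytes") }
});
fs.File.SetLastWriteTime(@"C:\manga\vol1.cbz", new DateTime(2021, 12, 1, 10, 30, 45));

var service = new FileService(fs);

// Both times truncate to 10:30, so the file does not count as modified since 10:30:00
Console.WriteLine(service.HasFileBeenModifiedSince(@"C:\manga\vol1.cbz", new DateTime(2021, 12, 1, 10, 30, 0)));
// A whole minute earlier is detected as a modification
Console.WriteLine(service.HasFileBeenModifiedSince(@"C:\manga\vol1.cbz", new DateTime(2021, 12, 1, 10, 29, 0)));
```

The key design choice is that seconds and milliseconds never flip the result, which avoids spurious rescans when file systems report slightly different sub-minute timestamps.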

View File

@@ -5,8 +5,10 @@ using System.IO;
using System.Linq;
using System.Threading.Tasks;
using API.Comparators;
using API.Data;
using API.Data.Metadata;
using API.Data.Repositories;
using API.Data.Scanner;
using API.Entities;
using API.Entities.Enums;
using API.Extensions;
@@ -17,317 +19,499 @@ using API.SignalR;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.Logging;
namespace API.Services
namespace API.Services;
public class MetadataService : IMetadataService
{
public class MetadataService : IMetadataService
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<MetadataService> _logger;
private readonly IArchiveService _archiveService;
private readonly IBookService _bookService;
private readonly IImageService _imageService;
private readonly IHubContext<MessageHub> _messageHub;
private readonly ICacheHelper _cacheHelper;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IArchiveService archiveService, IBookService bookService, IImageService imageService,
IHubContext<MessageHub> messageHub, ICacheHelper cacheHelper)
{
private readonly IUnitOfWork _unitOfWork;
private readonly ILogger<MetadataService> _logger;
private readonly IArchiveService _archiveService;
private readonly IBookService _bookService;
private readonly IImageService _imageService;
private readonly IHubContext<MessageHub> _messageHub;
private readonly ChapterSortComparerZeroFirst _chapterSortComparerForInChapterSorting = new ChapterSortComparerZeroFirst();
_unitOfWork = unitOfWork;
_logger = logger;
_archiveService = archiveService;
_bookService = bookService;
_imageService = imageService;
_messageHub = messageHub;
_cacheHelper = cacheHelper;
}
public MetadataService(IUnitOfWork unitOfWork, ILogger<MetadataService> logger,
IArchiveService archiveService, IBookService bookService, IImageService imageService, IHubContext<MessageHub> messageHub)
/// <summary>
/// Gets the cover image for the file
/// </summary>
/// <remarks>Has side effect of marking the file as updated</remarks>
/// <param name="file"></param>
/// <param name="volumeId"></param>
/// <param name="chapterId"></param>
/// <returns></returns>
private string GetCoverImage(MangaFile file, int volumeId, int chapterId)
{
//file.UpdateLastModified();
switch (file.Format)
{
_unitOfWork = unitOfWork;
_logger = logger;
_archiveService = archiveService;
_bookService = bookService;
_imageService = imageService;
_messageHub = messageHub;
case MangaFormat.Pdf:
case MangaFormat.Epub:
return _bookService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Image:
var coverImage = _imageService.GetCoverFile(file);
return _imageService.GetCoverImage(coverImage, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Archive:
return _archiveService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Unknown:
default:
return string.Empty;
}
/// <summary>
/// Determines whether an entity should regenerate cover image.
/// </summary>
/// <remarks>If a cover image is locked but the underlying file has been deleted, this will allow regenerating. </remarks>
/// <param name="coverImage"></param>
/// <param name="firstFile"></param>
/// <param name="forceUpdate"></param>
/// <param name="isCoverLocked"></param>
/// <param name="coverImageDirectory">Directory where cover images are. Defaults to <see cref="DirectoryService.CoverImageDirectory"/></param>
/// <returns></returns>
public static bool ShouldUpdateCoverImage(string coverImage, MangaFile firstFile, bool forceUpdate = false,
bool isCoverLocked = false, string coverImageDirectory = null)
{
if (string.IsNullOrEmpty(coverImageDirectory))
{
coverImageDirectory = DirectoryService.CoverImageDirectory;
}
}
var fileExists = File.Exists(Path.Join(coverImageDirectory, coverImage));
if (isCoverLocked && fileExists) return false;
if (forceUpdate) return true;
return (firstFile != null && firstFile.HasFileBeenModified()) || !HasCoverImage(coverImage, fileExists);
}
private static bool HasCoverImage(string coverImage)
{
return HasCoverImage(coverImage, File.Exists(coverImage));
}
private static bool HasCoverImage(string coverImage, bool fileExists)
{
return !string.IsNullOrEmpty(coverImage) && fileExists;
}
private string GetCoverImage(MangaFile file, int volumeId, int chapterId)
{
file.UpdateLastModified();
switch (file.Format)
{
case MangaFormat.Pdf:
case MangaFormat.Epub:
return _bookService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Image:
var coverImage = _imageService.GetCoverFile(file);
return _imageService.GetCoverImage(coverImage, ImageService.GetChapterFormat(chapterId, volumeId));
case MangaFormat.Archive:
return _archiveService.GetCoverImage(file.FilePath, ImageService.GetChapterFormat(chapterId, volumeId));
default:
return string.Empty;
}
}
/// <summary>
/// Updates the metadata for a Chapter
/// </summary>
/// <param name="chapter"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public bool UpdateMetadata(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (ShouldUpdateCoverImage(chapter.CoverImage, firstFile, forceUpdate, chapter.CoverImageLocked))
{
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile?.FilePath);
chapter.CoverImage = GetCoverImage(firstFile, chapter.VolumeId, chapter.Id);
return true;
}
/// <summary>
/// Updates the metadata for a Chapter
/// </summary>
/// <param name="chapter"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private bool UpdateChapterCoverImage(Chapter chapter, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (!_cacheHelper.ShouldUpdateCoverImage(Path.Join(DirectoryService.CoverImageDirectory, chapter.CoverImage), firstFile, chapter.Created, forceUpdate, chapter.CoverImageLocked))
return false;
_logger.LogDebug("[MetadataService] Generating cover image for {File}", firstFile?.FilePath);
chapter.CoverImage = GetCoverImage(firstFile, chapter.VolumeId, chapter.Id);
return true;
}
private void UpdateChapterMetadata(Chapter chapter, ICollection<Person> allPeople, bool forceUpdate)
{
var firstFile = chapter.Files.OrderBy(x => x.Chapter).FirstOrDefault();
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(chapter, forceUpdate, firstFile)) return;
UpdateChapterFromComicInfo(chapter, allPeople, firstFile);
firstFile.UpdateLastModified();
}
private void UpdateChapterFromComicInfo(Chapter chapter, ICollection<Person> allPeople, MangaFile firstFile)
{
var comicInfo = GetComicInfo(firstFile); // TODO: Think about letting the higher-level loop have access for the series to avoid duplicate I/O operations
if (comicInfo == null) return;
if (!string.IsNullOrEmpty(comicInfo.Title))
{
chapter.TitleName = comicInfo.Title.Trim();
}
/// <summary>
/// Updates the metadata for a Volume
/// </summary>
/// <param name="volume"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public bool UpdateMetadata(Volume volume, bool forceUpdate)
if (!string.IsNullOrEmpty(comicInfo.Colorist))
{
// We need to check if Volume coverImage matches first chapters if forceUpdate is false
if (volume == null || !ShouldUpdateCoverImage(volume.CoverImage, null, forceUpdate)) return false;
volume.Chapters ??= new List<Chapter>();
var firstChapter = volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).FirstOrDefault();
if (firstChapter == null) return false;
volume.CoverImage = firstChapter.CoverImage;
return true;
var people = comicInfo.Colorist.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Colorist);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Colorist,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
/// <summary>
/// Updates metadata for Series
/// </summary>
/// <param name="series"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public bool UpdateMetadata(Series series, bool forceUpdate)
if (!string.IsNullOrEmpty(comicInfo.Writer))
{
var madeUpdate = false;
if (series == null) return false;
var people = comicInfo.Writer.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Writer);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Writer,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
// NOTE: This will fail if we replace the cover of the first volume on a first scan, because the series will already have a cover image
if (ShouldUpdateCoverImage(series.CoverImage, null, forceUpdate, series.CoverImageLocked))
if (!string.IsNullOrEmpty(comicInfo.Editor))
{
var people = comicInfo.Editor.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Editor);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Editor,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
if (!string.IsNullOrEmpty(comicInfo.Inker))
{
var people = comicInfo.Inker.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Inker);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Inker,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
if (!string.IsNullOrEmpty(comicInfo.Letterer))
{
var people = comicInfo.Letterer.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Letterer);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Letterer,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
if (!string.IsNullOrEmpty(comicInfo.Penciller))
{
var people = comicInfo.Penciller.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Penciller);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Penciller,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
if (!string.IsNullOrEmpty(comicInfo.CoverArtist))
{
var people = comicInfo.CoverArtist.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.CoverArtist);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.CoverArtist,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
if (!string.IsNullOrEmpty(comicInfo.Publisher))
{
var people = comicInfo.Publisher.Split(",");
PersonHelper.RemovePeople(chapter.People, people, PersonRole.Publisher);
PersonHelper.UpdatePeople(allPeople, people, PersonRole.Publisher,
person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}
}
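Each role block above repeats the same split/remove/update pattern; a hypothetical helper (not part of this PR, shown only as a sketch against the PR's `PersonHelper` and `PersonRole` types) could collapse the duplication:

```csharp
// Hypothetical helper; PersonHelper and PersonRole come from this PR's code.
private static void UpdateRole(Chapter chapter, ICollection<Person> allPeople, string names, PersonRole role)
{
    if (string.IsNullOrEmpty(names)) return;
    var people = names.Split(",");
    PersonHelper.RemovePeople(chapter.People, people, role);
    PersonHelper.UpdatePeople(allPeople, people, role,
        person => PersonHelper.AddPersonIfNotExists(chapter.People, person));
}

// Usage inside UpdateChapterFromComicInfo:
// UpdateRole(chapter, allPeople, comicInfo.Writer, PersonRole.Writer);
// UpdateRole(chapter, allPeople, comicInfo.Colorist, PersonRole.Colorist);
// ... one call per ComicInfo author field
```

This keeps the remove-then-add semantics identical for every role while reducing the method to a handful of one-line calls.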
/// <summary>
/// Updates the cover image for a Volume
/// </summary>
/// <param name="volume"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private bool UpdateVolumeCoverImage(Volume volume, bool forceUpdate)
{
// We need to check if Volume coverImage matches first chapters if forceUpdate is false
if (volume == null || !_cacheHelper.ShouldUpdateCoverImage(Path.Join(DirectoryService.CoverImageDirectory, volume.CoverImage), null, volume.Created, forceUpdate)) return false;
volume.Chapters ??= new List<Chapter>();
var firstChapter = volume.Chapters.OrderBy(x => double.Parse(x.Number), _chapterSortComparerForInChapterSorting).FirstOrDefault();
if (firstChapter == null) return false;
volume.CoverImage = firstChapter.CoverImage;
return true;
}
/// <summary>
/// Updates metadata for Series
/// </summary>
/// <param name="series"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
private void UpdateSeriesCoverImage(Series series, bool forceUpdate)
{
if (series == null) return;
// NOTE: This will fail if we replace the cover of the first volume on a first scan, because the series will already have a cover image
if (!_cacheHelper.ShouldUpdateCoverImage(Path.Join(DirectoryService.CoverImageDirectory, series.CoverImage), null, series.Created, forceUpdate, series.CoverImageLocked))
return;
series.Volumes ??= new List<Volume>();
var firstCover = series.Volumes.GetCoverImage(series.Format);
string coverImage = null;
if (firstCover == null && series.Volumes.Any())
{
// If firstCover is null and there is only one volume, the whole series is Chapters under Vol 0.
if (series.Volumes.Count == 1)
{
series.Volumes ??= new List<Volume>();
var firstCover = series.Volumes.GetCoverImage(series.Format);
string coverImage = null;
if (firstCover == null && series.Volumes.Any())
{
// If firstCover is null and there is only one volume, the whole series is Chapters under Vol 0.
if (series.Volumes.Count == 1)
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault(c => !c.IsSpecial)?.CoverImage;
madeUpdate = true;
}
if (!HasCoverImage(coverImage))
{
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault()?.CoverImage;
madeUpdate = true;
}
}
series.CoverImage = firstCover?.CoverImage ?? coverImage;
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault(c => !c.IsSpecial)?.CoverImage;
}
return UpdateSeriesSummary(series, forceUpdate) || madeUpdate;
}
private bool UpdateSeriesSummary(Series series, bool forceUpdate)
{
// NOTE: This can be problematic when the file changes and a summary already exists, but it is likely
// better to let the user kick off a metadata refresh on an individual Series than to incur the overhead of
// checking the file's last write time.
if (!string.IsNullOrEmpty(series.Summary) && !forceUpdate) return false;
var isBook = series.Library.Type == LibraryType.Book;
var firstVolume = series.Volumes.FirstWithChapters(isBook);
var firstChapter = firstVolume?.Chapters.GetFirstChapterWithFiles();
var firstFile = firstChapter?.Files.FirstOrDefault();
if (firstFile == null || (!forceUpdate && !firstFile.HasFileBeenModified())) return false;
if (Parser.Parser.IsPdf(firstFile.FilePath)) return false;
var comicInfo = GetComicInfo(series.Format, firstFile);
if (string.IsNullOrEmpty(comicInfo?.Summary)) return false;
series.Summary = comicInfo.Summary;
return true;
}
private ComicInfo GetComicInfo(MangaFormat format, MangaFile firstFile)
{
if (format is MangaFormat.Archive or MangaFormat.Epub)
if (!_cacheHelper.CoverImageExists(coverImage))
{
return Parser.Parser.IsEpub(firstFile.FilePath) ? _bookService.GetComicInfo(firstFile.FilePath) : _archiveService.GetComicInfo(firstFile.FilePath);
coverImage = series.Volumes[0].Chapters.OrderBy(c => double.Parse(c.Number), _chapterSortComparerForInChapterSorting)
.FirstOrDefault()?.CoverImage;
}
}
series.CoverImage = firstCover?.CoverImage ?? coverImage;
}
return null;
private void UpdateSeriesMetadata(Series series, ICollection<Person> allPeople, ICollection<Genre> allGenres, bool forceUpdate)
{
var isBook = series.Library.Type == LibraryType.Book;
var firstVolume = series.Volumes.OrderBy(c => c.Number, new ChapterSortComparer()).FirstWithChapters(isBook);
var firstChapter = firstVolume?.Chapters.GetFirstChapterWithFiles();
var firstFile = firstChapter?.Files.FirstOrDefault();
if (firstFile == null || _cacheHelper.HasFileNotChangedSinceCreationOrLastScan(firstChapter, forceUpdate, firstFile)) return;
if (Parser.Parser.IsPdf(firstFile.FilePath)) return;
var comicInfo = GetComicInfo(firstFile);
if (comicInfo == null) return;
// Summary Info
if (!string.IsNullOrEmpty(comicInfo.Summary))
{
series.Metadata.Summary = comicInfo.Summary; // NOTE: I can move this to the bottom as I have a comicInfo selection, save me an extra read
}
/// <summary>
/// Refreshes Metadata for a whole library
/// </summary>
/// <remarks>This can be heavy on memory first run</remarks>
/// <param name="libraryId"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public async Task RefreshMetadata(int libraryId, bool forceUpdate = false)
foreach (var chapter in series.Volumes.SelectMany(volume => volume.Chapters))
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name);
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Writer).Select(p => p.Name), PersonRole.Writer,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.CoverArtist).Select(p => p.Name), PersonRole.CoverArtist,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
var i = 0;
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++, i++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd})",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Publisher).Select(p => p.Name), PersonRole.Publisher,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Character).Select(p => p.Name), PersonRole.Character,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
Parallel.ForEach(nonLibrarySeries, series =>
{
try
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
{
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
}
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Colorist).Select(p => p.Name), PersonRole.Colorist,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
}
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Editor).Select(p => p.Name), PersonRole.Editor,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
UpdateMetadata(series, volumeUpdated || forceUpdate);
}
catch (Exception)
{
/* Swallow exception */
}
});
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Inker).Select(p => p.Name), PersonRole.Inker,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Letterer).Select(p => p.Name), PersonRole.Letterer,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
foreach (var series in nonLibrarySeries)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(library.Id, series.Id));
}
}
else
{
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
chunk * chunkInfo.ChunkSize, (chunk * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
}
var progress = Math.Max(0F, Math.Min(1F, i * 1F / chunkInfo.TotalChunks));
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, progress));
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 1F));
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
PersonHelper.UpdatePeople(allPeople, chapter.People.Where(p => p.Role == PersonRole.Penciller).Select(p => p.Name), PersonRole.Penciller,
person => PersonHelper.AddPersonIfNotExists(series.Metadata.People, person));
}
var comicInfos = series.Volumes
.SelectMany(volume => volume.Chapters)
.SelectMany(c => c.Files)
.Select(GetComicInfo)
.Where(ci => ci != null)
.ToList();
/// <summary>
/// Refreshes Metadata for a Series. Will always force updates.
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = false)
var genres = comicInfos.SelectMany(i => i.Genre.Split(",")).Distinct().ToList();
var people = series.Volumes.SelectMany(volume => volume.Chapters).SelectMany(c => c.People).ToList();
PersonHelper.KeepOnlySamePeopleBetweenLists(series.Metadata.People,
people, person => series.Metadata.People.Remove(person));
GenreHelper.UpdateGenre(allGenres, genres, false, genre => GenreHelper.AddGenreIfNotExists(series.Metadata.Genres, genre));
GenreHelper.KeepOnlySameGenreBetweenLists(series.Metadata.Genres, genres.Select(g => DbFactory.Genre(g, false)).ToList(),
genre => series.Metadata.Genres.Remove(genre));
}
private ComicInfo GetComicInfo(MangaFile firstFile)
{
if (firstFile?.Format is MangaFormat.Archive or MangaFormat.Epub)
{
return Parser.Parser.IsEpub(firstFile.FilePath) ? _bookService.GetComicInfo(firstFile.FilePath) : _archiveService.GetComicInfo(firstFile.FilePath);
}
return null;
}
/// <summary>
/// Updates cover images and metadata for a Series and all of its Volumes and Chapters
/// </summary>
/// <remarks>This cannot have any Async code within. It is used within Parallel.ForEach</remarks>
/// <param name="series"></param>
/// <param name="forceUpdate"></param>
private void ProcessSeriesMetadataUpdate(Series series, IDictionary<int, IList<int>> chapterIds, ICollection<Person> allPeople, ICollection<Genre> allGenres, bool forceUpdate)
{
_logger.LogDebug("[MetadataService] Processing series {SeriesName}", series.OriginalName);
try
{
var sw = Stopwatch.StartNew();
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
if (series == null)
{
_logger.LogError("[MetadataService] Series {SeriesId} was not found on Library {LibraryId}", seriesId, libraryId);
return;
}
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {SeriesName}", series.Name);
var volumeUpdated = false;
foreach (var volume in series.Volumes)
{
var chapterUpdated = false;
foreach (var chapter in volume.Chapters)
{
chapterUpdated = UpdateMetadata(chapter, forceUpdate);
chapterUpdated = UpdateChapterCoverImage(chapter, forceUpdate);
UpdateChapterMetadata(chapter, allPeople, forceUpdate || chapterUpdated);
}
volumeUpdated = UpdateMetadata(volume, chapterUpdated || forceUpdate);
volumeUpdated = UpdateVolumeCoverImage(volume, chapterUpdated || forceUpdate);
}
UpdateMetadata(series, volumeUpdated || forceUpdate);
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(series.LibraryId, series.Id));
}
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
UpdateSeriesCoverImage(series, volumeUpdated || forceUpdate);
UpdateSeriesMetadata(series, allPeople, allGenres, forceUpdate);
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during updating metadata for {SeriesName} ", series.Name);
}
}
/// <summary>
/// Refreshes Metadata for a whole library
/// </summary>
/// <remarks>This can be heavy on memory first run</remarks>
/// <param name="libraryId"></param>
/// <param name="forceUpdate">Force updating cover image even if underlying file has not been modified or chapter already has a cover image</param>
public async Task RefreshMetadata(int libraryId, bool forceUpdate = false)
{
var library = await _unitOfWork.LibraryRepository.GetLibraryForIdAsync(libraryId, LibraryIncludes.None);
_logger.LogInformation("[MetadataService] Beginning metadata refresh of {LibraryName}", library.Name);
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
_logger.LogInformation("[MetadataService] Processing chunk {ChunkNumber} / {TotalChunks} with size {ChunkSize}. Series ({SeriesStart} - {SeriesEnd})",
chunk, chunkInfo.TotalChunks, chunkInfo.ChunkSize, chunk * chunkInfo.ChunkSize, (chunk + 1) * chunkInfo.ChunkSize);
var nonLibrarySeries = await _unitOfWork.SeriesRepository.GetFullSeriesForLibraryIdAsync(library.Id,
new UserParams()
{
PageNumber = chunk,
PageSize = chunkInfo.ChunkSize
});
_logger.LogDebug("[MetadataService] Fetched {SeriesCount} series for refresh", nonLibrarySeries.Count);
var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(nonLibrarySeries.Select(s => s.Id).ToArray());
var allPeople = await _unitOfWork.PersonRepository.GetAllPeople();
var allGenres = await _unitOfWork.GenreRepository.GetAllGenres();
var seriesIndex = 0;
foreach (var series in nonLibrarySeries)
{
try
{
ProcessSeriesMetadataUpdate(series, chapterIds, allPeople, allGenres, forceUpdate);
}
catch (Exception ex)
{
_logger.LogError(ex, "[MetadataService] There was an exception during metadata refresh for {SeriesName}", series.Name);
}
// Absolute position of this series across all chunks, clamped to [0, 1]
var index = (chunk - 1) * chunkInfo.ChunkSize + seriesIndex;
var progress = Math.Max(0F, Math.Min(1F, index * 1F / chunkInfo.TotalSize));
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, progress));
seriesIndex++;
}
await _unitOfWork.CommitAsync();
foreach (var series in nonLibrarySeries)
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(library.Id, series.Id));
}
_logger.LogInformation(
"[MetadataService] Processed {SeriesStart} - {SeriesEnd} out of {TotalSeries} series in {ElapsedScanTime} milliseconds for {LibraryName}",
(chunk - 1) * chunkInfo.ChunkSize, ((chunk - 1) * chunkInfo.ChunkSize) + nonLibrarySeries.Count, chunkInfo.TotalSize, stopwatch.ElapsedMilliseconds, library.Name);
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 1F));
await _unitOfWork.PersonRepository.RemoveAllPeopleNoLongerAssociated();
await _unitOfWork.GenreRepository.RemoveAllGenreNoLongerAssociated();
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesNumber} series in library {LibraryName} in {ElapsedMilliseconds} milliseconds total", chunkInfo.TotalSize, library.Name, totalTime);
}
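The progress fraction sent over SignalR in the loop above is the absolute series index across all chunks, divided by the library total and clamped to [0, 1]. A minimal sketch of that computation (TypeScript, hypothetical function name; parameter names mirror the C# `ChunkInfo` fields):

```typescript
// Sketch of the chunked progress fraction pushed out via RefreshMetadataProgress.
// The clamping mirrors the Math.Max/Math.Min pair in RefreshMetadata.
function refreshProgress(
  chunk: number,       // 1-based chunk number
  chunkSize: number,   // series per chunk
  seriesIndex: number, // 0-based index within the current chunk
  totalSize: number    // total series across all chunks
): number {
  const index = (chunk - 1) * chunkSize + seriesIndex; // absolute series index
  return Math.max(0, Math.min(1, index / totalSize));
}
```

For example, the first series of chunk 2 with a chunk size of 50 out of 100 total series reports 0.5; indices past the end clamp to 1.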
// TODO: I can probably refactor RefreshMetadata and RefreshMetadataForSeries to be the same by utilizing chunk size of 1, so most of the code can be the same.
private async Task PerformScan(Library library, bool forceUpdate, Action<int, Chunk> action)
{
var chunkInfo = await _unitOfWork.SeriesRepository.GetChunkInfo(library.Id);
var stopwatch = Stopwatch.StartNew();
var totalTime = 0L;
_logger.LogInformation("[MetadataService] Refreshing Library {LibraryName}. Total Items: {TotalSize}. Total Chunks: {TotalChunks} with {ChunkSize} size", library.Name, chunkInfo.TotalSize, chunkInfo.TotalChunks, chunkInfo.ChunkSize);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(library.Id, 0F));
for (var chunk = 1; chunk <= chunkInfo.TotalChunks; chunk++)
{
if (chunkInfo.TotalChunks == 0) continue;
totalTime += stopwatch.ElapsedMilliseconds;
stopwatch.Restart();
action(chunk, chunkInfo);
await _unitOfWork.CommitAsync();
}
}
/// <summary>
/// Refreshes Metadata for a Series. Will always force updates.
/// </summary>
/// <param name="libraryId"></param>
/// <param name="seriesId"></param>
public async Task RefreshMetadataForSeries(int libraryId, int seriesId, bool forceUpdate = true)
{
var sw = Stopwatch.StartNew();
var series = await _unitOfWork.SeriesRepository.GetFullSeriesForSeriesIdAsync(seriesId);
if (series == null)
{
_logger.LogError("[MetadataService] Series {SeriesId} was not found on Library {LibraryId}", seriesId, libraryId);
return;
}
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(libraryId, 0F));
var chapterIds = await _unitOfWork.SeriesRepository.GetChapterIdWithSeriesIdForSeriesAsync(new [] { seriesId });
var allPeople = await _unitOfWork.PersonRepository.GetAllPeople();
var allGenres = await _unitOfWork.GenreRepository.GetAllGenres();
ProcessSeriesMetadataUpdate(series, chapterIds, allPeople, allGenres, forceUpdate);
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadataProgress,
MessageFactory.RefreshMetadataProgressEvent(libraryId, 1F));
if (_unitOfWork.HasChanges() && await _unitOfWork.CommitAsync())
{
await _messageHub.Clients.All.SendAsync(SignalREvents.RefreshMetadata, MessageFactory.RefreshMetadataEvent(series.LibraryId, series.Id));
}
_logger.LogInformation("[MetadataService] Updated metadata for {SeriesName} in {ElapsedMilliseconds} milliseconds", series.Name, sw.ElapsedMilliseconds);
}
}
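The TODO above suggests unifying RefreshMetadata and RefreshMetadataForSeries by treating a single-series refresh as a chunk of size 1. A sketch of that shape (TypeScript, hypothetical names; the real code is async C# against the repositories):

```typescript
interface ChunkInfo { totalChunks: number; chunkSize: number; totalSize: number; }

// Generic chunk walker: the per-chunk work is injected, so a whole-library
// refresh passes the normal chunking while a single-series refresh passes
// { totalChunks: 1, chunkSize: 1, totalSize: 1 } and the same code path runs.
async function forEachChunk(
  chunkInfo: ChunkInfo,
  action: (chunk: number, info: ChunkInfo) => Promise<void>
): Promise<void> {
  for (let chunk = 1; chunk <= chunkInfo.totalChunks; chunk++) {
    await action(chunk, chunkInfo);
  }
}
```

This is the same shape PerformScan is reaching for with its `Action<int, Chunk>` parameter, made awaitable so the per-chunk body can do async I/O.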


@@ -1,5 +1,4 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using API.Entities.Enums;


@@ -4,6 +4,7 @@ using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Interfaces.Services;
@@ -25,6 +26,8 @@ namespace API.Services.Tasks.Scanner
private readonly ConcurrentDictionary<ParsedSeries, List<ParserInfo>> _scannedSeries;
private readonly IBookService _bookService;
private readonly ILogger _logger;
private readonly IArchiveService _archiveService;
private readonly IDirectoryService _directoryService;
/// <summary>
/// An instance of a pipeline for processing files and returning a Map of Series -> ParserInfos.
@@ -32,10 +35,13 @@ namespace API.Services.Tasks.Scanner
/// </summary>
/// <param name="bookService"></param>
/// <param name="logger"></param>
public ParseScannedFiles(IBookService bookService, ILogger logger)
public ParseScannedFiles(IBookService bookService, ILogger logger, IArchiveService archiveService,
IDirectoryService directoryService)
{
_bookService = bookService;
_logger = logger;
_archiveService = archiveService;
_directoryService = directoryService;
_scannedSeries = new ConcurrentDictionary<ParsedSeries, List<ParserInfo>>();
}
@@ -53,6 +59,20 @@ namespace API.Services.Tasks.Scanner
return existingKey != null ? parsedSeries[existingKey] : new List<ParserInfo>();
}
private ComicInfo GetComicInfo(string path)
{
if (Parser.Parser.IsEpub(path))
{
return _bookService.GetComicInfo(path);
}
if (Parser.Parser.IsComicInfoExtension(path))
{
return _archiveService.GetComicInfo(path);
}
return null;
}
/// <summary>
/// Processes files found during a library scan.
/// Populates a collection of <see cref="ParserInfo"/> for DB updates later.
@@ -90,9 +110,32 @@ namespace API.Services.Tasks.Scanner
info.Merge(info2);
}
// TODO: Think about doing this before the Fallback code to speed up
info.ComicInfo = GetComicInfo(path);
if (info.ComicInfo != null)
{
var sw = Stopwatch.StartNew();
if (!string.IsNullOrEmpty(info.ComicInfo.Volume))
{
info.Volumes = info.ComicInfo.Volume;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Series))
{
info.Series = info.ComicInfo.Series;
}
if (!string.IsNullOrEmpty(info.ComicInfo.Number))
{
info.Chapters = info.ComicInfo.Number;
}
_logger.LogDebug("ComicInfo read added {Time} ms to processing", sw.ElapsedMilliseconds);
}
TrackSeries(info);
}
/// <summary>
/// Attempts to either add a new instance of a show mapping to the _scannedSeries bag or adds to an existing.
/// This will check if the name matches an existing series name (multiple fields) <see cref="MergeName"/>
@@ -161,12 +204,12 @@
{
var sw = Stopwatch.StartNew();
totalFiles = 0;
var searchPattern = GetLibrarySearchPattern();
var searchPattern = Parser.Parser.SupportedExtensions;
foreach (var folderPath in folders)
{
try
{
totalFiles += DirectoryService.TraverseTreeParallelForEach(folderPath, (f) =>
totalFiles += _directoryService.TraverseTreeParallelForEach(folderPath, (f) =>
{
try
{
@@ -191,11 +234,6 @@
return SeriesWithInfos();
}
private static string GetLibrarySearchPattern()
{
return Parser.Parser.SupportedExtensions;
}
/// <summary>
/// Returns any series where there were parsed infos
/// </summary>

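The ComicInfo handling added to ProcessFile above overrides the filename-parsed Series, Volume, and Chapter fields only when the corresponding ComicInfo value is non-empty, so an incomplete ComicInfo.xml never blanks out what the filename parser found. A minimal sketch of that precedence rule (TypeScript, hypothetical types standing in for `ParserInfo` and `ComicInfo`):

```typescript
// Hypothetical shapes standing in for the C# ParserInfo and ComicInfo classes.
interface ParserInfo { series: string; volumes: string; chapters: string; }
interface ComicInfo { Series?: string; Volume?: string; Number?: string; }

// ComicInfo values win over filename parsing, but only when non-empty,
// mirroring the string.IsNullOrEmpty checks in ProcessFile.
function applyComicInfo(info: ParserInfo, ci: ComicInfo | null): ParserInfo {
  if (ci === null) return info; // not an epub/archive, or no ComicInfo.xml
  return {
    series: ci.Series ? ci.Series : info.series,
    volumes: ci.Volume ? ci.Volume : info.volumes,
    chapters: ci.Number ? ci.Number : info.chapters,
  };
}
```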
(File diff suppressed because it is too large.)


@@ -50,7 +50,7 @@ namespace API.SignalR
{
return new SignalRMessage()
{
Name = SignalREvents.ScanLibrary,
Name = SignalREvents.ScanLibraryProgress,
Body = new
{
LibraryId = libraryId,


@@ -12,12 +12,29 @@
/// Event sent out during Refresh Metadata for progress tracking
/// </summary>
public const string RefreshMetadataProgress = "RefreshMetadataProgress";
public const string ScanLibrary = "ScanLibrary";
/// <summary>
/// Series is added to server
/// </summary>
public const string SeriesAdded = "SeriesAdded";
/// <summary>
/// Series is removed from server
/// </summary>
public const string SeriesRemoved = "SeriesRemoved";
/// <summary>
/// Progress event for Scan library
/// </summary>
public const string ScanLibraryProgress = "ScanLibraryProgress";
/// <summary>
/// When a user is connects/disconnects from server
/// </summary>
public const string OnlineUsers = "OnlineUsers";
/// <summary>
/// When a series is added to a collection
/// </summary>
public const string SeriesAddedToCollection = "SeriesAddedToCollection";
/// <summary>
/// When an error occurs during a scan library task
/// </summary>
public const string ScanLibraryError = "ScanLibraryError";
/// <summary>
/// Event sent out during backing up the database

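The constants above are consumed by the web client as SignalR event names, and this PR renames the scan progress event from `ScanLibrary` to `ScanLibraryProgress` (see the MessageFactory change). A sketch of how a frontend might mirror the names (illustrative only; the actual Kavita Angular client wires these through the `@microsoft/signalr` hub connection):

```typescript
// String values must match the C# SignalREvents constants exactly,
// since SignalR dispatches handlers by event-name string.
const SignalREvents = {
  RefreshMetadataProgress: "RefreshMetadataProgress",
  SeriesAdded: "SeriesAdded",
  SeriesRemoved: "SeriesRemoved",
  ScanLibraryProgress: "ScanLibraryProgress",
  OnlineUsers: "OnlineUsers",
  SeriesAddedToCollection: "SeriesAddedToCollection",
  ScanLibraryError: "ScanLibraryError",
} as const;

type SignalREvent = keyof typeof SignalREvents;

// A real client would register handlers like:
//   hubConnection.on(SignalREvents.ScanLibraryProgress, handler);
function isKnownEvent(name: string): name is SignalREvent {
  return Object.prototype.hasOwnProperty.call(SignalREvents, name);
}
```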

@@ -1,5 +1,8 @@
namespace API.SignalR
{
/// <summary>
/// Payload for SignalR messages to Frontend
/// </summary>
public class SignalRMessage
{
public object Body { get; set; }


@@ -5,8 +5,6 @@ using System.Linq;
using System.Net;
using System.Net.Sockets;
using API.Extensions;
using API.Interfaces;
using API.Interfaces.Repositories;
using API.Middleware;
using API.Services;
using API.Services.HostedServices;

(Some files were not shown because too many files have changed in this diff.)