Mirror of https://github.com/Kareadita/Kavita.git (synced 2025-06-22 06:50:32 -04:00)
* Staging the code for the new scan loop.
* Implemented a basic idea of changes on drives triggering the scan loop. Issues: 1. Scan by folder does not work, 2. the queuing system is very hacky and needs a separate thread, 3. performance degradation could be very real.
* Started writing unit tests for the new loop code.
* Implemented a basic method to scan a folder path with ignore support (not implemented, code in place).
* Added some code to the parser to build out the idea of processing series in batches based on some top-level folder.
* Scan Series now uses the new code (folder-based parsing) and now handles the LocalizedSeries issue.
* Got library scan working with the new folder-based scan loop. Updated code to set FolderPath (for improved scan times and partial scan support).
* Wrote some notes on the updated library scan loop.
* Removed migration for merge.
* Reapplied the SeriesFolder migration after merge.
* Refactored a check that used multiple DB calls into one.
* Made lots of progress on ignore support, but some confusion on the underlying library. Ticket created. On hold till then.
* Updated Scan Library and Scan Series to exit early if there are no changes on the underlying folders that need to be scanned.
* Implemented the ability to have .kavitaignore files within your directories; Kavita will parse them and ignore files and directories based on the rules within them (a rough sketch of the idea follows this list).
* Fixed an issue where nested ignore files wouldn't stack with higher-level ignores.
* Wrote out some basic code that showcases how we can scan a series or library based on file events on the underlying system. Very buggy; needs lots of edge-case testing, logging, and duplication checking.
* Things are kinda working. I'm getting lost in my own code and complexity. I'm not sure it's worth it.
* Refactored ScanFiles out to DirectoryService.
* Refactored more code out to keep the code clean.
* More unit tests.
* Refactored the signature of ParsedSeries to use IList. Started writing unit tests and reworked UpdateLibrary to work how it used to with the new scan loop code (note: using async update library/series does not work).
* Fixed the bug where processSeriesInfos was being invoked twice per series and made the code work very similarly to the old code (except loose-leaf files don't work) but with folder-based scanning.
* Prep for unit tests (updating broken ones with new implementations).
* Just some notes. Not sure I want to finish this work.
* Refactored the LibraryWatcher with some comments and state variables.
* Undid the migrations in case I don't move forward with this branch.
* Started to clean the code and prepare for finishing this work.
* Fixed a bad merge.
* Updated signatures to clean up the code and commit to the new strategy for scanning.
* Swapped out the code with async processing of series on a small library.
* The new scan loop is working in both sync and async methods. The code is slow and not optimized. This represents a good point to start profiling and applying optimizations.
* Refactored UpdateSeries out of Scanner and into a dedicated file.
* Refactored how ProcessTasks are awaited to allow more async.
* Fixed an issue where the side nav item wouldn't show the correct highlight and migrated to OnPush.
* Moved where we start the stopwatch so it encapsulates the full scan.
* Cleaned up SignalR events to report correctly (still needs a redesign).
* Removed the "remove" code until I figure it out.
* Put in extremely expensive series deletion code for library scan.
* Have Genre and Tag update the DB immediately to avoid duplicate issues.
* Taking a break.
* Moving to a lock with People was successful. Need to apply to others.
* Refactored code for the series level and Tag and Genre with the new locking strategy.
* New scan loop works. Next up: optimization.
* Swapped out the Kavita logo with an SVG for faster load.
* Refactored metadata updates to occur when the series are being updated.
* Code cleanup.
* Added a new type of generic message (Info) to inform the user.
* Code cleanup.
* Implemented an optimization which prevents any I/O (other than an attribute lookup) for Library/Series Scan. This can bring a recently updated library on network storage (650 series) to fully process in 2 seconds. Fixed a bug where File Analysis was running every time for each non-EPUB file. (A rough sketch of this skip-scan idea also follows this list.)
* Fixed ARM x64 builds not being able to view PDF cover images due to a bad update in DocNet.
* Some code cleanup.
* Added experimental SignalR update code to have a more natural refresh of the library-detail page.
* Hooked in the ability to send new series events to the UI.
* Moved all scan (file scan only) tasks into the Scan Queue. Scheduled ScanLibraries will now check if an existing scan task is running and reschedule for 3 hours later (10 minutes for Scan Series).
* Implemented the info event in the events widget and added a Clear All button to dismiss all infos and errors. Added --event-widget-info-bg-color.
* Removed --drawer-background-color since it's not used.
* When a new series is added, inject it directly into the view.
* Some debug code cleanup.
* Fixed up the unit tests.
* Ensure all config directories exist on startup.
* Disabled Library Watching (that will go in the next build).
* Ensure update for series is admin only.
* Lots of code changes; Scan Series kinda works, specials are splitting, optimizations are failing. Demotivated on this work again.
* Removed the SeriesFolder migration.
* Added the SeriesFolder migration.
* Added a new pipe for dates so we can provide some nicer defaults. Added the folder path to the series detail.
* The scan optimizations now work on NTFS systems.
* Removed a TODO.
* Migrated all the times to use DateTime.Now and not UTC.
* Refactored some repo calls to use the includes flag pattern.
* Implemented a check for the library scan optimization that validates whether the library was updated (type change, library rename, folder change, or series deleted) and lets the optimization be bypassed.
* Added another optimization which will use just the folder's last write time attribute if the drive is not NTFS.
* Fixed a unit test.
* Some code cleanup.
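A minimal, hypothetical sketch of the .kavitaignore idea described above: read glob-like rules from a .kavitaignore file, stack rules loaded from parent directories on top of nested ones, and ask whether a relative path should be skipped. The names here (KavitaIgnoreSketch, AddIgnoreFile, IsIgnored) are illustrative only; Kavita's actual implementation lives in its scanner/DirectoryService code and may delegate matching to a glob library.

// Hypothetical sketch only, not Kavita's real implementation.
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

public class KavitaIgnoreSketch
{
    private readonly List<Regex> _rules = new();

    // Load rules from a .kavitaignore file. Calling this once per directory level,
    // parent first, makes nested ignore files stack on top of higher-level ones.
    public void AddIgnoreFile(string ignoreFilePath)
    {
        if (!File.Exists(ignoreFilePath)) return;
        foreach (var line in File.ReadAllLines(ignoreFilePath))
        {
            var pattern = line.Trim();
            if (pattern.Length == 0 || pattern.StartsWith("#")) continue; // skip blanks and comments
            _rules.Add(GlobToRegex(pattern));
        }
    }

    // True if any accumulated rule matches the (forward-slash) relative path.
    public bool IsIgnored(string relativePath) =>
        _rules.Any(r => r.IsMatch(relativePath.Replace('\\', '/')));

    // Tiny glob translation: '**' matches across path segments, '*' within one, '?' a single char.
    private static Regex GlobToRegex(string glob)
    {
        var escaped = Regex.Escape(glob.Replace('\\', '/'))
            .Replace(@"\*\*", ".*")
            .Replace(@"\*", "[^/]*")
            .Replace(@"\?", ".");
        return new Regex($"^{escaped}$", RegexOptions.IgnoreCase | RegexOptions.Compiled);
    }
}

A scanner walking the library would call AddIgnoreFile at each directory it enters and skip any entry for which IsIgnored returns true; accumulating the rules rather than replacing them is what makes nested ignore files combine with higher-level ones.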
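And a small sketch of the skip-scan optimization mentioned above, under the stated assumptions: a library or series scan can be skipped when none of its registered folders has a last write time newer than the last successful scan, so the only I/O is an attribute lookup per folder. The names (HasFolderChangedSince, folderPaths, lastScanned) are made up for illustration; they are not Kavita's API.

// Hypothetical sketch only, not Kavita's real implementation.
using System;
using System.IO;
using System.Linq;

public static class ScanOptimizationSketch
{
    // folderPaths: the library root folders (or the single series folder).
    // lastScanned: when the library/series was last successfully scanned.
    public static bool HasFolderChangedSince(string[] folderPaths, DateTime lastScanned)
    {
        return folderPaths.Any(path =>
            Directory.Exists(path) &&
            Directory.GetLastWriteTime(path) > lastScanned);
    }
}

A directory's last write time generally only reflects changes to its direct children, which is presumably why the commit notes also describe a bypass when the library itself changed (type change, rename, folder change, or series deleted) so the optimization doesn't mask those updates.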
521 lines
19 KiB
C#
using System.Collections.Generic;
using System.Data.Common;
using System.IO;
using System.IO.Abstractions.TestingHelpers;
using System.Linq;
using System.Threading.Tasks;
using API.Data;
using API.Data.Metadata;
using API.Entities;
using API.Entities.Enums;
using API.Parser;
using API.Services;
using API.SignalR;
using AutoMapper;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.Extensions.Logging;
using NSubstitute;
using Xunit;

namespace API.Tests.Services
{
    internal class MockReadingItemServiceForCacheService : IReadingItemService
    {
        private readonly DirectoryService _directoryService;

        public MockReadingItemServiceForCacheService(DirectoryService directoryService)
        {
            _directoryService = directoryService;
        }

        public ComicInfo GetComicInfo(string filePath)
        {
            return null;
        }

        public int GetNumberOfPages(string filePath, MangaFormat format)
        {
            return 1;
        }

        public string GetCoverImage(string fileFilePath, string fileName, MangaFormat format)
        {
            return string.Empty;
        }

        public void Extract(string fileFilePath, string targetDirectory, MangaFormat format, int imageCount = 1)
        {
            throw new System.NotImplementedException();
        }

        public ParserInfo Parse(string path, string rootPath, LibraryType type)
        {
            throw new System.NotImplementedException();
        }

        public ParserInfo ParseFile(string path, string rootPath, LibraryType type)
        {
            throw new System.NotImplementedException();
        }
    }
    public class CacheServiceTests
    {
        private readonly ILogger<CacheService> _logger = Substitute.For<ILogger<CacheService>>();
        private readonly IUnitOfWork _unitOfWork;
        private readonly IHubContext<MessageHub> _messageHub = Substitute.For<IHubContext<MessageHub>>();

        private readonly DbConnection _connection;
        private readonly DataContext _context;

        private const string CacheDirectory = "C:/kavita/config/cache/";
        private const string CoverImageDirectory = "C:/kavita/config/covers/";
        private const string BackupDirectory = "C:/kavita/config/backups/";
        private const string DataDirectory = "C:/data/";

        public CacheServiceTests()
        {
            var contextOptions = new DbContextOptionsBuilder()
                .UseSqlite(CreateInMemoryDatabase())
                .Options;
            _connection = RelationalOptionsExtension.Extract(contextOptions).Connection;

            _context = new DataContext(contextOptions);
            Task.Run(SeedDb).GetAwaiter().GetResult();

            _unitOfWork = new UnitOfWork(_context, Substitute.For<IMapper>(), null);
        }

        #region Setup

        private static DbConnection CreateInMemoryDatabase()
        {
            var connection = new SqliteConnection("Filename=:memory:");

            connection.Open();

            return connection;
        }

        public void Dispose() => _connection.Dispose();

        private async Task<bool> SeedDb()
        {
            await _context.Database.MigrateAsync();
            var filesystem = CreateFileSystem();

            await Seed.SeedSettings(_context, new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem));

            var setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.CacheDirectory).SingleAsync();
            setting.Value = CacheDirectory;

            setting = await _context.ServerSetting.Where(s => s.Key == ServerSettingKey.BackupDirectory).SingleAsync();
            setting.Value = BackupDirectory;

            _context.ServerSetting.Update(setting);

            _context.Library.Add(new Library()
            {
                Name = "Manga",
                Folders = new List<FolderPath>()
                {
                    new FolderPath()
                    {
                        Path = "C:/data/"
                    }
                }
            });
            return await _context.SaveChangesAsync() > 0;
        }

        private async Task ResetDB()
        {
            _context.Series.RemoveRange(_context.Series.ToList());

            await _context.SaveChangesAsync();
        }

        private static MockFileSystem CreateFileSystem()
        {
            var fileSystem = new MockFileSystem();
            fileSystem.Directory.SetCurrentDirectory("C:/kavita/");
            fileSystem.AddDirectory("C:/kavita/config/");
            fileSystem.AddDirectory(CacheDirectory);
            fileSystem.AddDirectory(CoverImageDirectory);
            fileSystem.AddDirectory(BackupDirectory);
            fileSystem.AddDirectory(DataDirectory);

            return fileSystem;
        }

        #endregion

        #region Ensure

        [Fact]
        public async Task Ensure_DirectoryAlreadyExists_DontExtractAnything()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddFile($"{DataDirectory}Test v1.zip", new MockFileData(""));
            filesystem.AddDirectory($"{CacheDirectory}1/");
            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cleanupService = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            await ResetDB();
            var s = DbFactory.Series("Test");
            var v = DbFactory.Volume("1");
            var c = new Chapter()
            {
                Number = "1",
                Files = new List<MangaFile>()
                {
                    new MangaFile()
                    {
                        Format = MangaFormat.Archive,
                        FilePath = $"{DataDirectory}Test v1.zip",
                    }
                }
            };
            v.Chapters.Add(c);
            s.Volumes.Add(v);
            s.LibraryId = 1;
            _context.Series.Add(s);

            await _context.SaveChangesAsync();

            await cleanupService.Ensure(1);
            Assert.Empty(ds.GetFiles(filesystem.Path.Join(CacheDirectory, "1"), searchOption:SearchOption.AllDirectories));
        }

        // [Fact]
        // public async Task Ensure_DirectoryAlreadyExists_ExtractsImages()
        // {
        //     // TODO: Figure out a way to test this
        //     var filesystem = CreateFileSystem();
        //     filesystem.AddFile($"{DataDirectory}Test v1.zip", new MockFileData(""));
        //     filesystem.AddDirectory($"{CacheDirectory}1/");
        //     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
        //     var archiveService = Substitute.For<IArchiveService>();
        //     archiveService.ExtractArchive($"{DataDirectory}Test v1.zip",
        //         filesystem.Path.Join(CacheDirectory, "1"));
        //     var cleanupService = new CacheService(_logger, _unitOfWork, ds,
        //         new ReadingItemService(archiveService, Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds));
        //
        //     await ResetDB();
        //     var s = DbFactory.Series("Test");
        //     var v = DbFactory.Volume("1");
        //     var c = new Chapter()
        //     {
        //         Number = "1",
        //         Files = new List<MangaFile>()
        //         {
        //             new MangaFile()
        //             {
        //                 Format = MangaFormat.Archive,
        //                 FilePath = $"{DataDirectory}Test v1.zip",
        //             }
        //         }
        //     };
        //     v.Chapters.Add(c);
        //     s.Volumes.Add(v);
        //     s.LibraryId = 1;
        //     _context.Series.Add(s);
        //
        //     await _context.SaveChangesAsync();
        //
        //     await cleanupService.Ensure(1);
        //     Assert.Empty(ds.GetFiles(filesystem.Path.Join(CacheDirectory, "1"), searchOption:SearchOption.AllDirectories));
        // }

        #endregion

        #region CleanupChapters

        [Fact]
        public void CleanupChapters_AllFilesShouldBeDeleted()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{CacheDirectory}1/001.jpg", new MockFileData(""));
            filesystem.AddFile($"{CacheDirectory}1/002.jpg", new MockFileData(""));
            filesystem.AddFile($"{CacheDirectory}3/003.jpg", new MockFileData(""));
            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cleanupService = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            cleanupService.CleanupChapters(new []{1, 3});
            Assert.Empty(ds.GetFiles(CacheDirectory, searchOption:SearchOption.AllDirectories));
        }

        #endregion

        #region GetCachedEpubFile

        [Fact]
        public void GetCachedEpubFile_ShouldReturnFirstEpub()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{DataDirectory}1.epub", new MockFileData(""));
            filesystem.AddFile($"{DataDirectory}2.epub", new MockFileData(""));
            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cs = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            var c = new Chapter()
            {
                Files = new List<MangaFile>()
                {
                    new MangaFile()
                    {
                        FilePath = $"{DataDirectory}1.epub"
                    },
                    new MangaFile()
                    {
                        FilePath = $"{DataDirectory}2.epub"
                    }
                }
            };
            cs.GetCachedFile(c);
            Assert.Same($"{DataDirectory}1.epub", cs.GetCachedFile(c));
        }

        #endregion

        #region GetCachedPagePath

        [Fact]
        public void GetCachedPagePath_ReturnNullIfNoFiles()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
            filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));

            var c = new Chapter()
            {
                Id = 1,
                Files = new List<MangaFile>()
            };

            var fileIndex = 0;
            foreach (var file in c.Files)
            {
                for (var i = 0; i < file.Pages - 1; i++)
                {
                    filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
                }

                fileIndex++;
            }

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cs = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            // Flatten to prepare for how GetFullPath expects
            ds.Flatten($"{CacheDirectory}1/");

            var path = cs.GetCachedPagePath(c, 11);
            Assert.Equal(string.Empty, path);
        }

        [Fact]
        public void GetCachedPagePath_GetFileFromFirstFile()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
            filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));

            var c = new Chapter()
            {
                Id = 1,
                Files = new List<MangaFile>()
                {
                    new MangaFile()
                    {
                        Id = 1,
                        FilePath = $"{DataDirectory}1.zip",
                        Pages = 10
                    },
                    new MangaFile()
                    {
                        Id = 2,
                        FilePath = $"{DataDirectory}2.zip",
                        Pages = 5
                    }
                }
            };

            var fileIndex = 0;
            foreach (var file in c.Files)
            {
                for (var i = 0; i < file.Pages; i++)
                {
                    filesystem.AddFile($"{CacheDirectory}1/00{fileIndex}_00{i+1}.jpg", new MockFileData(""));
                }

                fileIndex++;
            }

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cs = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            // Flatten to prepare for how GetFullPath expects
            ds.Flatten($"{CacheDirectory}1/");

            Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/000_001.jpg"), ds.FileSystem.Path.GetFullPath(cs.GetCachedPagePath(c, 0)));
        }

        [Fact]
        public void GetCachedPagePath_GetLastPageFromSingleFile()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));

            var c = new Chapter()
            {
                Id = 1,
                Files = new List<MangaFile>()
                {
                    new MangaFile()
                    {
                        Id = 1,
                        FilePath = $"{DataDirectory}1.zip",
                        Pages = 10
                    }
                }
            };
            c.Pages = c.Files.Sum(f => f.Pages);

            var fileIndex = 0;
            foreach (var file in c.Files)
            {
                for (var i = 0; i < file.Pages; i++)
                {
                    filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
                }

                fileIndex++;
            }

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cs = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            // Flatten to prepare for how GetFullPath expects
            ds.Flatten($"{CacheDirectory}1/");

            // Remember that we start at 0, so this is the 10th file
            var path = cs.GetCachedPagePath(c, c.Pages);
            Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/000_0{c.Pages}.jpg"), ds.FileSystem.Path.GetFullPath(path));
        }

        [Fact]
        public void GetCachedPagePath_GetFileFromSecondFile()
        {
            var filesystem = CreateFileSystem();
            filesystem.AddDirectory($"{CacheDirectory}1/");
            filesystem.AddFile($"{DataDirectory}1.zip", new MockFileData(""));
            filesystem.AddFile($"{DataDirectory}2.zip", new MockFileData(""));

            var c = new Chapter()
            {
                Id = 1,
                Files = new List<MangaFile>()
                {
                    new MangaFile()
                    {
                        Id = 1,
                        FilePath = $"{DataDirectory}1.zip",
                        Pages = 10
                    },
                    new MangaFile()
                    {
                        Id = 2,
                        FilePath = $"{DataDirectory}2.zip",
                        Pages = 5
                    }
                }
            };

            var fileIndex = 0;
            foreach (var file in c.Files)
            {
                for (var i = 0; i < file.Pages; i++)
                {
                    filesystem.AddFile($"{CacheDirectory}1/{fileIndex}/{i+1}.jpg", new MockFileData(""));
                }

                fileIndex++;
            }

            var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), filesystem);
            var cs = new CacheService(_logger, _unitOfWork, ds,
                new ReadingItemService(Substitute.For<IArchiveService>(),
                    Substitute.For<IBookService>(), Substitute.For<IImageService>(), ds), Substitute.For<IBookmarkService>());

            // Flatten to prepare for how GetFullPath expects
            ds.Flatten($"{CacheDirectory}1/");

            // Remember that we start at 0, so this is the page + 1 file
            var path = cs.GetCachedPagePath(c, 10);
            Assert.Equal(ds.FileSystem.Path.GetFullPath($"{CacheDirectory}/1/001_001.jpg"), ds.FileSystem.Path.GetFullPath(path));
        }

        #endregion

        #region ExtractChapterFiles

        // [Fact]
        // public void ExtractChapterFiles_ShouldExtractOnlyImages()
        // {
        //     const string testDirectory = "/manga/";
        //     var fileSystem = new MockFileSystem();
        //     for (var i = 0; i < 10; i++)
        //     {
        //         fileSystem.AddFile($"{testDirectory}file_{i}.zip", new MockFileData(""));
        //     }
        //
        //     fileSystem.AddDirectory(CacheDirectory);
        //
        //     var ds = new DirectoryService(Substitute.For<ILogger<DirectoryService>>(), fileSystem);
        //     var cs = new CacheService(_logger, _unitOfWork, ds,
        //         new MockReadingItemServiceForCacheService(ds));
        //
        //
        //     cs.ExtractChapterFiles(CacheDirectory, new List<MangaFile>()
        //     {
        //         new MangaFile()
        //         {
        //             ChapterId = 1,
        //             Format = MangaFormat.Archive,
        //             Pages = 2,
        //             FilePath =
        //         }
        //     })
        // }

        #endregion
    }
}