diff --git a/CHANGELOG.md b/CHANGELOG.md new file mode 100644 index 0000000000000000000000000000000000000000..b34f8d857f2a70f4d4188362879bb3e973c508fc --- /dev/null +++ b/CHANGELOG.md @@ -0,0 +1,48 @@ +# [1.29.0](https://github.com/ghoshRitesh12/aniwatch-api/compare/v1.28.0...v1.29.0) (2024-03-25) + + +### Features + +* **advancedSearch:** add advanced related constants ([68e4c70](https://github.com/ghoshRitesh12/aniwatch-api/commit/68e4c70dd887805bc2784bcbfabf5328a1ad752a)) +* **advancedSearch:** add advanced search feature ([1c02c9c](https://github.com/ghoshRitesh12/aniwatch-api/commit/1c02c9cf4f9c364c57a2f30471e676b5a5e5b5ca)) +* **advancedSearch:** add helper types ([71f0905](https://github.com/ghoshRitesh12/aniwatch-api/commit/71f0905115e88a96f59aa4a52d1ce69a793ebe0c)) +* **advancedSearch:** add utility methods related to advanced search ([79d0bdf](https://github.com/ghoshRitesh12/aniwatch-api/commit/79d0bdf05f86c5d5411f9473889442000786322f)) +* **advancedSearch:** add utility props to search response ([d6f9f0f](https://github.com/ghoshRitesh12/aniwatch-api/commit/d6f9f0f665c9d03b38b88baa8156892b9a32b0af)) +* **advancedSearch:** feat: add search filter parsing ([fef106d](https://github.com/ghoshRitesh12/aniwatch-api/commit/fef106da27270dcb86031e511a3cc428e40f41ff)) + + + +# [1.28.0](https://github.com/ghoshRitesh12/aniwatch-api/compare/v1.27.1...v1.28.0) (2024-03-12) + + +### Features + +* add json rate limit response; replace `max` with `limit` ([870fae7](https://github.com/ghoshRitesh12/aniwatch-api/commit/870fae700b56cc20010296387e3d9cda8330560c)) +* disabled `ratelimit` & `dont_sleep` for personal deployments ([8565619](https://github.com/ghoshRitesh12/aniwatch-api/commit/8565619f3ab2616b7fbeca7681f063730693a82e)) +* update api home page ([112e532](https://github.com/ghoshRitesh12/aniwatch-api/commit/112e532331fa3001d263457bff001c201b89d136)) + + + +## [1.27.1](https://github.com/ghoshRitesh12/aniwatch-api/compare/v1.27.0...v1.27.1) (2024-03-03) + + + +# [1.27.0](https://github.com/ghoshRitesh12/aniwatch-api/compare/v1.26.0...v1.27.0) (2024-03-03) + + +### Features + +* add explicit interval time for convenience ([a4b08c4](https://github.com/ghoshRitesh12/aniwatch-api/commit/a4b08c435c0ed62c57a1a6a985e3eed25bb82c92)) + + + +# [1.26.0](https://github.com/ghoshRitesh12/aniwatch-api/compare/v1.25.0...v1.26.0) (2024-03-03) + + +### Features + +* add axios config for future code reusability ([4782a8d](https://github.com/ghoshRitesh12/aniwatch-api/commit/4782a8dd708ec1f68bf469907024c082d606dc79)) +* update rebranded domain name ([a6f99bf](https://github.com/ghoshRitesh12/aniwatch-api/commit/a6f99bf681d27483d6f214c48673b875d3cbf6ab)) + + + diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 0000000000000000000000000000000000000000..d470d246b3462ae4d88263e5c29168365b928494 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,105 @@ +# Contributing to aniwatch-api + +Thank you for your interest in contributing to aniwatch-api. We appreciate whatever form of contribution you are willing to make. 
Every contribution counts ✨ + +## Table of Contents + +- [Types of contributions we are looking for](#types-of-contributions-we-are-looking-for) +- [Ground Rules & Expectations](#ground-rules--expectations) +- [How To Contribute](#how-to-contribute) +- [Prerequisites](#prerequisites) + - [Clone the repository](#clone-the-repository) + - [Project Structure](#project-structure) +- [Commit Messages](#commit-messages) + +## Types of contributions we are looking for + +In short, we welcome any sort of contribution you are willing to make as each and every contribution counts. We gladly accept contributions such as: + +- Documentation improvements: from minor typos to major document overhauls +- Helping others by answering questions in pull requests. +- Fixing known [bugs](https://github.com/ghoshRitesh12/aniwatch-api/issues?q=is%3Aopen). + +## Ground Rules & Expectations + +Before we begin, here are a few things we anticipate from you and that you should expect from others: + +- Be respectful and thoughtful in your conversations around this project. Each person may have their own views and opinions about the project. Try to listen to each other and reach an agreement or compromise. + +## How To Contribute + +If you'd like to contribute, start by searching through the [issues](https://github.com/ghoshRitesh12/aniwatch-api/issues) and [pull requests](https://github.com/ghoshRitesh12/aniwatch-api/pulls) to see whether someone else has raised a similar idea or question. + +If you don't see your idea listed, and you think it fits into the goals of this guide, you may do one of the following: + +- **If your contribution is minor,** such as a typo fix or new provider, consider opening a pull request. +- **If your contribution is major,** such as a major refactor, start by opening an issue first. That way, other people can weigh in on the discussion before you do any work. + +## Prerequisites + +To contribute to this project, you must know the following: + +- [NodeJS](https://nodejs.org/) +- [TypeScript](https://www.typescriptlang.org/) +- Web Scraping + - [Cheerio](https://cheerio.js.org/) + - [Axios](https://axios-http.com/docs/intro) + - [CSS Selectors](https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Selectors) + - [Browser Dev Tools](https://developer.mozilla.org/en-US/docs/Learn/Common_questions/Tools_and_setup/What_are_browser_developer_tools) + +### Clone the repository + +1. [Fork the repository](https://github.com/ghoshRitesh12/aniwatch-api/fork) +2. Clone your fork to your local machine using the following command (replace with your actual GitHub username) + +```bash +git clone https://github.com//aniwatch-api +``` + +3. Creating a new branch
+ Replace `<branch_name>` with any of the following naming conventions:
+ - `feature/` - for adding new features + - `bug/` - for fixing known bugs + - `misc/` - for anything other than bug or features + +```bash +git checkout -b +``` + +### Project Structure + +- `src` directory contains all the source code required for this project + + - `controllers` directory contains all the controller logic + - `types` directory contains all types & interfaces used for this project + - `parsers` directory contains all the parsing aka scraping logic + - `routes` directory contains all the routers + - `utils` directory contains handy utility methods and properties + - `config` directory contains api configuration related files + - `extractors` directory contains anime streaming url extractor files +

+ +- `test` directory contains all the tests that needs to be evaluated + +## Commit Messages + +When you've made changes to one or more files, you have to commit that file. You also need a message for that commit. + +We follow [Conventional Commit Messages](https://www.conventionalcommits.org/en/v1.0.0/#summary). + +A brief overview: + +- `feat`: A feature, possibly improving something already existing +- `fix`: A fix, for example of a bug +- `perf`: Performance related change +- `refactor`: Refactoring a specific section of the codebase +- `style`: Everything related to styling code like whitespaces, tabs, indenting, etc. +- `test`: Everything related to testing +- `docs`: Everything related to documentation +- `chore`: Code maintenance + +Examples: + +- `docs: fixed typo in readme` +- `feat: added a new category parser` +- `fix: fixed search results bug` diff --git a/Dockerfile b/Dockerfile new file mode 100644 index 0000000000000000000000000000000000000000..755ab992e0dc9282d7a7761c6a8760a6186966a5 --- /dev/null +++ b/Dockerfile @@ -0,0 +1,58 @@ +# build stage for building .ts files +FROM node:20-alpine as build + +RUN mkdir /home/app + +WORKDIR /home/app + +COPY package.json . + +RUN npm install --ignore-scripts + +COPY . . + +RUN npm run build + +# prod stage for including only necessary files +FROM node:20-alpine as prod + +LABEL org.opencontainers.image.source=https://github.com/ghoshRitesh12/aniwatch-api +LABEL org.opencontainers.image.description="Node.js API for obtaining anime information from aniwatch.to (formerly zoro.to) written in TypeScript, made with Cheerio & Axios" +LABEL org.opencontainers.image.description "Node.js API for obtaining anime information from aniwatch.to (formerly zoro.to) written in TypeScript, made with Cheerio & Axios" +LABEL org.opencontainers.image.licenses=MIT + +# create a non-privileged user +RUN addgroup -S aniwatch && adduser -S zoro -G aniwatch + +# set secure folder permissions +RUN mkdir -p /app/public /app/dist && chown -R zoro:aniwatch /app + +# set non-privileged user +USER zoro + +# set working directory +WORKDIR /app + +# copy config file for better use of layers +COPY --chown=zoro:aniwatch package.json . + +# install dependencies +RUN npm install --omit=dev --ignore-scripts + +# copy public folder from build stage to prod +COPY --from=build --chown=zoro:aniwatch /home/app/public /app/public + +# copy dist folder from build stage to prod +COPY --from=build --chown=zoro:aniwatch /home/app/dist /app/dist + +HEALTHCHECK --interval=30s --timeout=3s --start-period=5s CMD [ "npm", "run", "healthcheck" ] + +ENV NODE_ENV=production +ENV PORT=4000 + +# exposed port +EXPOSE 4000 + +CMD [ "node", "dist/src/server.js" ] + +# exit \ No newline at end of file diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..0909ca65bdd1e9b1eece9c14aa4a981e2e242fe5 --- /dev/null +++ b/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2023 Ritesh Ghosh + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. \ No newline at end of file diff --git a/README.md b/README.md index f475576d6543453e87cc378a95e81e11caf01c90..81d43fc152668cf6da54c13ad817c697fb80b48c 100644 --- a/README.md +++ b/README.md @@ -7,4 +7,4 @@ sdk: docker pinned: false --- -Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference +Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference \ No newline at end of file diff --git a/api/index.ts b/api/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..be3392bb4e0938977bbb32fcf76d4ea0013b9c40 --- /dev/null +++ b/api/index.ts @@ -0,0 +1,3 @@ +import app from "../src/server.js"; + +export default app; diff --git a/package.json b/package.json new file mode 100644 index 0000000000000000000000000000000000000000..9a43f096976a9344e8af407c411eaefbf1631ca4 --- /dev/null +++ b/package.json @@ -0,0 +1,55 @@ +{ + "name": "aniwatch-api", + "version": "1.29.0", + "description": "Node.js API for obtaining anime information from hianime.to (formerly aniwatch.to) written in TypeScript, made with Cheerio & Axios", + "main": "src/server.ts", + "type": "module", + "scripts": { + "start": "tsx src/server.ts", + "dev": "tsx watch src/server.ts", + "build": "tsc -p tsconfig.json", + "vercel-build": "echo \"Hello\"", + "prepare": "husky install", + "test": "vitest run --config vitest.config.ts", + "healthcheck": "curl -f http://localhost:4000/health" + }, + "repository": { + "type": "git", + "url": "git+https://github.com/ghoshRitesh12/aniwatch-api.git" + }, + "bugs": { + "url": "https://github.com/ghoshRitesh12/aniwatch-api/issues" + }, + "homepage": "https://github.com/ghoshRitesh12/aniwatch-api#readme", + "keywords": [ + "anime", + "weeb", + "hianime", + "scraper" + ], + "author": "https://github.com/ghoshRitesh12", + "license": "MIT", + "dependencies": { + "axios": "^1.6.5", + "cheerio": "1.0.0-rc.12", + "cors": "^2.8.5", + "crypto-js": "^4.2.0", + "dotenv": "^16.3.1", + "express": "^4.18.2", + "express-rate-limit": "^7.1.5", + "http-errors": "^2.0.0", + "morgan": "^1.10.0" + }, + "devDependencies": { + "@types/cors": "^2.8.17", + "@types/crypto-js": "^4.2.1", + "@types/express": "^4.17.21", + "@types/http-errors": "^2.0.4", + "@types/morgan": "^1.9.9", + "@types/node": "^20.11.5", + "husky": "^8.0.3", + "tsx": "^4.7.0", + "typescript": "^5.3.3", + "vitest": "^1.2.1" + } +} \ No newline at end of file diff --git a/public/img/img1.gif b/public/img/img1.gif new file mode 100644 index 0000000000000000000000000000000000000000..18a74497d348baede08ea254f0b240d885ab47cd Binary files /dev/null and b/public/img/img1.gif differ diff --git a/public/index.html b/public/index.html new file mode 100644 index 0000000000000000000000000000000000000000..5f8593a1790e07d2dbea925acf03d9ab75e34d05 --- /dev/null +++ b/public/index.html @@ -0,0 +1,103 @@ + + + + + + Aniwatch API + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

+ Welcome to the unofficial + hianime.to (formerly aniwatch.to) + API + ⚔️ +

+ + + + + \ No newline at end of file diff --git a/public/robots.txt b/public/robots.txt new file mode 100644 index 0000000000000000000000000000000000000000..fdbb711278e4932d722896624a66b2a9e55dd604 --- /dev/null +++ b/public/robots.txt @@ -0,0 +1,5 @@ +# START aniwatch-api +Disallow: + +User-agent: * +# END aniwatch-api \ No newline at end of file diff --git a/render.yaml b/render.yaml new file mode 100644 index 0000000000000000000000000000000000000000..b23f039b33b5d793d661b73e4a052daa0f797c34 --- /dev/null +++ b/render.yaml @@ -0,0 +1,10 @@ +services: + - type: web + name: aniwatch-api + runtime: docker + repo: https://github.com/ghoshRitesh12/aniwatch-api.git + plan: free + branch: main + envVars: + - key: PORT + value: 4000 diff --git a/src/config/axios.ts b/src/config/axios.ts new file mode 100644 index 0000000000000000000000000000000000000000..6292e9f5001e0bd5f68de09db106fe28eace3cb4 --- /dev/null +++ b/src/config/axios.ts @@ -0,0 +1,21 @@ +import axios, { AxiosError, type AxiosRequestConfig } from "axios"; +import { + SRC_BASE_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, +} from "../utils/constants.js"; + +const clientConfig: AxiosRequestConfig = { + timeout: 10000, + baseURL: SRC_BASE_URL, + headers: { + Accept: ACCEPT_HEADER, + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + }, +}; + +const client = axios.create(clientConfig); + +export { client, AxiosError }; diff --git a/src/config/cors.ts b/src/config/cors.ts new file mode 100644 index 0000000000000000000000000000000000000000..b82f40a35ca0b1c2d385f5029a4c5cf8720967cd --- /dev/null +++ b/src/config/cors.ts @@ -0,0 +1,10 @@ +import cors from "cors"; + +const corsConfig = cors({ + origin: "*", + methods: "GET", + credentials: true, + optionsSuccessStatus: 200, +}); + +export default corsConfig; diff --git a/src/config/errorHandler.ts b/src/config/errorHandler.ts new file mode 100644 index 0000000000000000000000000000000000000000..560f141b3d799549fba0bb987d4cbd11004378cd --- /dev/null +++ b/src/config/errorHandler.ts @@ -0,0 +1,11 @@ +import type { ErrorRequestHandler } from "express"; + +const errorHandler: ErrorRequestHandler = (error, req, res, next) => { + const status = error?.status || 500; + res.status(status).json({ + status, + message: error?.message || "Something Went Wrong", + }); +}; + +export default errorHandler; diff --git a/src/config/notFoundHandler.ts b/src/config/notFoundHandler.ts new file mode 100644 index 0000000000000000000000000000000000000000..a372e8e34a0d36f9e75133367c6cf9731f9af490 --- /dev/null +++ b/src/config/notFoundHandler.ts @@ -0,0 +1,8 @@ +import type { RequestHandler } from "express"; +import createHttpError from "http-errors"; + +const notFoundHandler: RequestHandler = (req, res, next) => { + return next(createHttpError.NotFound()); +}; + +export default notFoundHandler; diff --git a/src/config/ratelimit.ts b/src/config/ratelimit.ts new file mode 100644 index 0000000000000000000000000000000000000000..67edfd83a24ea038d4ce2b332b2257e2cd894108 --- /dev/null +++ b/src/config/ratelimit.ts @@ -0,0 +1,17 @@ +import { config } from "dotenv"; +import createHttpError from "http-errors"; +import { rateLimit } from "express-rate-limit"; + +config(); + +export const ratelimit = rateLimit({ + windowMs: Number(process.env.WINDOWMS) || 30 * 60 * 1000, + limit: Number(process.env.MAX) || 50, + legacyHeaders: true, + standardHeaders: "draft-7", + handler: function (_, __, next) { + next( + createHttpError.TooManyRequests("Too many API requests, try again 
later") + ); + }, +}); diff --git a/src/controllers/animeAboutInfo.controller.ts b/src/controllers/animeAboutInfo.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..6d50b50a716d3da6363a6a1780b977ecbf43fb03 --- /dev/null +++ b/src/controllers/animeAboutInfo.controller.ts @@ -0,0 +1,31 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeAnimeAboutInfo } from "../parsers/index.js"; +import { type AnimeAboutInfoQueryParams } from "../types/controllers/index.js"; + +// /anime/info?id=${anime-id} +const getAnimeAboutInfo: RequestHandler< + unknown, + Awaited>, + unknown, + AnimeAboutInfoQueryParams +> = async (req, res, next) => { + try { + const animeId = req.query.id + ? decodeURIComponent(req.query.id as string) + : null; + + if (animeId === null) { + throw createHttpError.BadRequest("Anime unique id required"); + } + + const data = await scrapeAnimeAboutInfo(animeId); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeAboutInfo; diff --git a/src/controllers/animeCategory.controller.ts b/src/controllers/animeCategory.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..373b8e38841e925aba108c92c36d0f9641edb252 --- /dev/null +++ b/src/controllers/animeCategory.controller.ts @@ -0,0 +1,39 @@ +import createHttpError from "http-errors"; +import type { RequestHandler } from "express"; +import type { AnimeCategories } from "../types/anime.js"; +import { scrapeAnimeCategory } from "../parsers/index.js"; +import type { + CategoryAnimePathParams, + CategoryAnimeQueryParams, +} from "../types/controllers/index.js"; + +// /anime/:category?page=${page} +const getAnimeCategory: RequestHandler< + CategoryAnimePathParams, + Awaited>, + unknown, + CategoryAnimeQueryParams +> = async (req, res, next) => { + try { + const category = req.params.category + ? decodeURIComponent(req.params.category) + : null; + + const page: number = req.query.page + ? Number(decodeURIComponent(req.query?.page as string)) + : 1; + + if (category === null) { + throw createHttpError.BadRequest("category required"); + } + + const data = await scrapeAnimeCategory(category as AnimeCategories, page); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeCategory; diff --git a/src/controllers/animeEpisodeSrcs.controller.ts b/src/controllers/animeEpisodeSrcs.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..6190ce5071bc1d8d0285d3a3a20452bacaf93082 --- /dev/null +++ b/src/controllers/animeEpisodeSrcs.controller.ts @@ -0,0 +1,75 @@ +import axios from "axios"; +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { type CheerioAPI, load } from "cheerio"; +import { scrapeAnimeEpisodeSources } from "../parsers/index.js"; +import { USER_AGENT_HEADER, SRC_BASE_URL } from "../utils/constants.js"; +import { type AnimeServers, Servers } from "../types/anime.js"; +import { type AnimeEpisodeSrcsQueryParams } from "../types/controllers/index.js"; + +type AnilistID = number | null; +type MalID = number | null; + +// /anime/episode-srcs?id=${episodeId}?server=${server}&category=${category (dub or sub)} +const getAnimeEpisodeSources: RequestHandler< + unknown, + Awaited>, + unknown, + AnimeEpisodeSrcsQueryParams +> = async (req, res, next) => { + try { + const episodeId = req.query.id ? 
decodeURIComponent(req.query.id) : null; + + const server = ( + req.query.server + ? decodeURIComponent(req.query.server) + : Servers.VidStreaming + ) as AnimeServers; + + const category = ( + req.query.category ? decodeURIComponent(req.query.category) : "sub" + ) as "sub" | "dub"; + + if (episodeId === null) { + throw createHttpError.BadRequest("Anime episode id required"); + } + + let malID: MalID; + let anilistID: AnilistID; + const animeURL = new URL(episodeId?.split("?ep=")[0], SRC_BASE_URL)?.href; + + const [episodeSrcData, animeSrc] = await Promise.all([ + scrapeAnimeEpisodeSources(episodeId, server, category), + axios.get(animeURL, { + headers: { + Referer: SRC_BASE_URL, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + }, + }), + ]); + + const $: CheerioAPI = load(animeSrc?.data); + + try { + anilistID = Number( + JSON.parse($("body")?.find("#syncData")?.text())?.anilist_id + ); + malID = Number(JSON.parse($("body")?.find("#syncData")?.text())?.mal_id); + } catch (err) { + anilistID = null; + malID = null; + } + + res.status(200).json({ + ...episodeSrcData, + anilistID, + malID, + }); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeEpisodeSources; diff --git a/src/controllers/animeEpisodes.controller.ts b/src/controllers/animeEpisodes.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..cf815f59ef59e99f3b4d5a69403cc09f0b8c7344 --- /dev/null +++ b/src/controllers/animeEpisodes.controller.ts @@ -0,0 +1,31 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeAnimeEpisodes } from "../parsers/index.js"; +import { type AnimeEpisodePathParams } from "../types/controllers/index.js"; + +// /anime/episodes/${anime-id} +const getAnimeEpisodes: RequestHandler< + AnimeEpisodePathParams, + Awaited>, + unknown, + unknown +> = async (req, res, next) => { + try { + const animeId = req.params.animeId + ? decodeURIComponent(req.params.animeId) + : null; + + if (animeId === null) { + throw createHttpError.BadRequest("Anime Id required"); + } + + const data = await scrapeAnimeEpisodes(animeId); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeEpisodes; diff --git a/src/controllers/animeGenre.controller.ts b/src/controllers/animeGenre.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..486b4c77d5a9bd313856717af1c7538c986219d0 --- /dev/null +++ b/src/controllers/animeGenre.controller.ts @@ -0,0 +1,37 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeGenreAnime } from "../parsers/index.js"; +import type { + GenreAnimePathParams, + GenreAnimeQueryParams, +} from "../types/controllers/index.js"; + +// /anime/genre/${name}?page=${page} +const getGenreAnime: RequestHandler< + GenreAnimePathParams, + Awaited>, + unknown, + GenreAnimeQueryParams +> = async (req, res, next) => { + try { + const name: string | null = req.params.name + ? decodeURIComponent(req.params.name as string) + : null; + + const page: number = req.query.page + ? 
Number(decodeURIComponent(req.query?.page as string)) + : 1; + + if (name === null) { + throw createHttpError.BadRequest("Anime genre required"); + } + + const data = await scrapeGenreAnime(name, page); + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getGenreAnime; diff --git a/src/controllers/animeProducer.controller.ts b/src/controllers/animeProducer.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..3ebcd8a27e9b8fbce15c9c522f9b9110e4c87da8 --- /dev/null +++ b/src/controllers/animeProducer.controller.ts @@ -0,0 +1,37 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeProducerAnimes } from "../parsers/index.js"; +import type { + AnimeProducerPathParams, + AnimeProducerQueryParams, +} from "../types/controllers/index.js"; + +// /anime/producer/${name}?page=${page} +const getProducerAnimes: RequestHandler< + AnimeProducerPathParams, + Awaited>, + unknown, + AnimeProducerQueryParams +> = async (req, res, next) => { + try { + const name: string | null = req.params.name + ? decodeURIComponent(req.params.name as string) + : null; + + const page: number = req.query.page + ? Number(decodeURIComponent(req.query?.page as string)) + : 1; + + if (name === null) { + throw createHttpError.BadRequest("Anime producer name required"); + } + + const data = await scrapeProducerAnimes(name, page); + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getProducerAnimes; diff --git a/src/controllers/animeSearch.controller.ts b/src/controllers/animeSearch.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..8937b0aeeae6a3736fec2ed234613d2176eb25ea --- /dev/null +++ b/src/controllers/animeSearch.controller.ts @@ -0,0 +1,57 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeAnimeSearch } from "../parsers/index.js"; +import type { + SearchFilters, + AnimeSearchQueryParams, +} from "../types/controllers/index.js"; + +const searchFilters: Record = { + filter: true, + type: true, + status: true, + rated: true, + score: true, + season: true, + language: true, + start_date: true, + end_date: true, + sort: true, + genres: true, +} as const; + +// /anime/search?q=${query}&page=${page} +const getAnimeSearch: RequestHandler< + unknown, + Awaited>, + unknown, + AnimeSearchQueryParams +> = async (req, res, next) => { + try { + let { q: query, page, ...filters } = req.query; + + query = query ? decodeURIComponent(query) : undefined; + const pageNo = page ? 
Number(decodeURIComponent(page as string)) : 1; + + if (query === undefined) { + throw createHttpError.BadRequest("Search keyword required"); + } + + const parsedFilters: SearchFilters = {}; + for (const key in filters) { + if (searchFilters[key]) { + parsedFilters[key as keyof SearchFilters] = + filters[key as keyof SearchFilters]; + } + } + + const data = await scrapeAnimeSearch(query, pageNo, parsedFilters); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeSearch; diff --git a/src/controllers/animeSearchSuggestion.controller.ts b/src/controllers/animeSearchSuggestion.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..ed8784f3770308e6bbcd2be32bacc0c258092771 --- /dev/null +++ b/src/controllers/animeSearchSuggestion.controller.ts @@ -0,0 +1,31 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeAnimeSearchSuggestion } from "../parsers/index.js"; +import { type AnimeSearchSuggestQueryParams } from "../types/controllers/index.js"; + +// /anime/search/suggest?q=${query} +const getAnimeSearchSuggestion: RequestHandler< + unknown, + Awaited>, + unknown, + AnimeSearchSuggestQueryParams +> = async (req, res, next) => { + try { + const query: string | null = req.query.q + ? decodeURIComponent(req.query.q as string) + : null; + + if (query === null) { + throw createHttpError.BadRequest("Search keyword required"); + } + + const data = await scrapeAnimeSearchSuggestion(query); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getAnimeSearchSuggestion; diff --git a/src/controllers/episodeServers.controller.ts b/src/controllers/episodeServers.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..16e1ca585cbae3d601c37df06c69132b5ab193c2 --- /dev/null +++ b/src/controllers/episodeServers.controller.ts @@ -0,0 +1,30 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeEpisodeServers } from "../parsers/index.js"; +import { type EpisodeServersQueryParams } from "../types/controllers/index.js"; + +// /anime/servers?episodeId=${id} +const getEpisodeServers: RequestHandler< + unknown, + Awaited>, + unknown, + EpisodeServersQueryParams +> = async (req, res, next) => { + try { + const episodeId = req.query.episodeId + ? 
decodeURIComponent(req.query?.episodeId as string) + : null; + + if (episodeId === null) { + throw createHttpError.BadRequest("Episode id required"); + } + + const data = await scrapeEpisodeServers(episodeId); + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getEpisodeServers; diff --git a/src/controllers/estimatedSchedule.controller.ts b/src/controllers/estimatedSchedule.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..fef25161ca5209b1d72239f27c5952a8c2e1efa2 --- /dev/null +++ b/src/controllers/estimatedSchedule.controller.ts @@ -0,0 +1,36 @@ +import createHttpError from "http-errors"; +import { type RequestHandler } from "express"; +import { scrapeEstimatedSchedule } from "../parsers/index.js"; +import { type EstimatedScheduleQueryParams } from "../types/controllers/index.js"; + +// /anime/schedule?date=${date} +const getEstimatedSchedule: RequestHandler< + unknown, + Awaited>, + unknown, + EstimatedScheduleQueryParams +> = async (req, res, next) => { + try { + const dateQuery = req.query.date + ? decodeURIComponent(req.query.date as string) + : null; + + if (dateQuery === null) { + throw createHttpError.BadRequest("Date payload required"); + } + if (!/^\d{4}-\d{2}-\d{2}$/.test(dateQuery)) { + throw createHttpError.BadRequest( + "Invalid date payload format. Months and days must have 2 digits" + ); + } + + const data = await scrapeEstimatedSchedule(dateQuery); + + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getEstimatedSchedule; diff --git a/src/controllers/homePage.controller.ts b/src/controllers/homePage.controller.ts new file mode 100644 index 0000000000000000000000000000000000000000..36e1fe9c31a7151f8cc67802b47df025bf5b3349 --- /dev/null +++ b/src/controllers/homePage.controller.ts @@ -0,0 +1,18 @@ +import { type RequestHandler } from "express"; +import { scrapeHomePage } from "../parsers/index.js"; + +// /anime/home +const getHomePageInfo: RequestHandler< + unknown, + Awaited> +> = async (req, res, next) => { + try { + const data = await scrapeHomePage(); + res.status(200).json(data); + } catch (err: any) { + console.error(err); + next(err); + } +}; + +export default getHomePageInfo; diff --git a/src/controllers/index.ts b/src/controllers/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..6e904407e63d398343836dffb4b329ad6b1a5142 --- /dev/null +++ b/src/controllers/index.ts @@ -0,0 +1,25 @@ +import getGenreAnime from "./animeGenre.controller.js"; +import getHomePageInfo from "./homePage.controller.js"; +import getAnimeSearch from "./animeSearch.controller.js"; +import getAnimeEpisodes from "./animeEpisodes.controller.js"; +import getAnimeCategory from "./animeCategory.controller.js"; +import getProducerAnimes from "./animeProducer.controller.js"; +import getEpisodeServers from "./episodeServers.controller.js"; +import getAnimeAboutInfo from "./animeAboutInfo.controller.js"; +import getEstimatedSchedule from "./estimatedSchedule.controller.js"; +import getAnimeEpisodeSources from "./animeEpisodeSrcs.controller.js"; +import getAnimeSearchSuggestion from "./animeSearchSuggestion.controller.js"; + +export { + getGenreAnime, + getAnimeSearch, + getHomePageInfo, + getAnimeEpisodes, + getAnimeCategory, + getEpisodeServers, + getProducerAnimes, + getAnimeAboutInfo, + getEstimatedSchedule, + getAnimeEpisodeSources, + getAnimeSearchSuggestion, +}; diff --git a/src/extractors/index.ts 
b/src/extractors/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..788dc8c90e5640940c4ebe32891c6d4d7a1fd4bf --- /dev/null +++ b/src/extractors/index.ts @@ -0,0 +1,6 @@ +import StreamSB from "./streamsb.js"; +import StreamTape from "./streamtape.js"; +import RapidCloud from "./rapidcloud.js"; +import MegaCloud from "./megacloud.js"; + +export { StreamSB, StreamTape, RapidCloud, MegaCloud }; diff --git a/src/extractors/megacloud.ts b/src/extractors/megacloud.ts new file mode 100644 index 0000000000000000000000000000000000000000..88b6d9480757e5d898fc8162c9248d99a78117cf --- /dev/null +++ b/src/extractors/megacloud.ts @@ -0,0 +1,245 @@ +import axios from "axios"; +import crypto from "crypto"; +import createHttpError from "http-errors"; + +// https://megacloud.tv/embed-2/e-1/dBqCr5BcOhnD?k=1 + +const megacloud = { + script: "https://megacloud.tv/js/player/a/prod/e1-player.min.js?v=", + sources: "https://megacloud.tv/embed-2/ajax/e-1/getSources?id=", +} as const; + +type track = { + file: string; + kind: string; + label?: string; + default?: boolean; +}; + +type intro_outro = { + start: number; + end: number; +}; + +type unencryptedSrc = { + file: string; + type: string; +}; + +type extractedSrc = { + sources: string | unencryptedSrc[]; + tracks: track[]; + encrypted: boolean; + intro: intro_outro; + outro: intro_outro; + server: number; +}; + +interface ExtractedData + extends Pick { + sources: { url: string; type: string }[]; +} + +class MegaCloud { + private serverName = "megacloud"; + + async extract(videoUrl: URL) { + try { + const extractedData: ExtractedData = { + tracks: [], + intro: { + start: 0, + end: 0, + }, + outro: { + start: 0, + end: 0, + }, + sources: [], + }; + + const videoId = videoUrl?.href?.split("/")?.pop()?.split("?")[0]; + const { data: srcsData } = await axios.get( + megacloud.sources.concat(videoId || ""), + { + headers: { + Accept: "*/*", + "X-Requested-With": "XMLHttpRequest", + "User-Agent": + "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36", + Referer: videoUrl.href, + }, + } + ); + if (!srcsData) { + throw createHttpError.NotFound("Url may have an invalid video id"); + } + + // console.log(JSON.stringify(srcsData, null, 2)); + + const encryptedString = srcsData.sources; + if (srcsData.encrypted && Array.isArray(encryptedString)) { + extractedData.intro = srcsData.intro; + extractedData.outro = srcsData.outro; + extractedData.tracks = srcsData.tracks; + extractedData.sources = encryptedString.map((s) => ({ + url: s.file, + type: s.type, + })); + + return extractedData; + } + + let text: string; + const { data } = await axios.get( + megacloud.script.concat(Date.now().toString()) + ); + + text = data; + if (!text) { + throw createHttpError.InternalServerError( + "Couldn't fetch script to decrypt resource" + ); + } + + const vars = this.extractVariables(text, "MEGACLOUD"); + const { secret, encryptedSource } = this.getSecret( + encryptedString as string, + vars + ); + const decrypted = this.decrypt(encryptedSource, secret); + try { + const sources = JSON.parse(decrypted); + extractedData.intro = srcsData.intro; + extractedData.outro = srcsData.outro; + extractedData.tracks = srcsData.tracks; + extractedData.sources = sources.map((s: any) => ({ + url: s.file, + type: s.type, + })); + + return extractedData; + } catch (error) { + throw createHttpError.InternalServerError("Failed to decrypt resource"); + } + } catch (err) { + // console.log(err); + throw err; + } + } + + 
extractVariables(text: string, sourceName: string) { + // extract needed variables + let allvars; + if (sourceName !== "MEGACLOUD") { + allvars = + text + .match( + /const (?:\w{1,2}=(?:'.{0,50}?'|\w{1,2}\(.{0,20}?\)).{0,20}?,){7}.+?;/gm + ) + ?.at(-1) ?? ""; + } else { + allvars = + text + .match(/const \w{1,2}=new URLSearchParams.+?;(?=function)/gm) + ?.at(-1) ?? ""; + } + // and convert their values into an array of numbers + const vars = allvars + .slice(0, -1) + .split("=") + .slice(1) + .map((pair) => Number(pair.split(",").at(0))) + .filter((num) => num === 0 || num); + + return vars; + } + + getSecret(encryptedString: string, values: number[]) { + let secret = "", + encryptedSource = encryptedString, + totalInc = 0; + + for (let i = 0; i < values[0]!; i++) { + let start, inc; + switch (i) { + case 0: + (start = values[2]), (inc = values[1]); + break; + case 1: + (start = values[4]), (inc = values[3]); + break; + case 2: + (start = values[6]), (inc = values[5]); + break; + case 3: + (start = values[8]), (inc = values[7]); + break; + case 4: + (start = values[10]), (inc = values[9]); + break; + case 5: + (start = values[12]), (inc = values[11]); + break; + case 6: + (start = values[14]), (inc = values[13]); + break; + case 7: + (start = values[16]), (inc = values[15]); + break; + case 8: + (start = values[18]), (inc = values[17]); + } + const from = start! + totalInc, + to = from + inc!; + (secret += encryptedString.slice(from, to)), + (encryptedSource = encryptedSource.replace( + encryptedString.substring(from, to), + "" + )), + (totalInc += inc!); + } + + return { secret, encryptedSource }; + } + + decrypt(encrypted: string, keyOrSecret: string, maybe_iv?: string) { + let key; + let iv; + let contents; + if (maybe_iv) { + key = keyOrSecret; + iv = maybe_iv; + contents = encrypted; + } else { + // copied from 'https://github.com/brix/crypto-js/issues/468' + const cypher = Buffer.from(encrypted, "base64"); + const salt = cypher.subarray(8, 16); + const password = Buffer.concat([ + Buffer.from(keyOrSecret, "binary"), + salt, + ]); + const md5Hashes = []; + let digest = password; + for (let i = 0; i < 3; i++) { + md5Hashes[i] = crypto.createHash("md5").update(digest).digest(); + digest = Buffer.concat([md5Hashes[i], password]); + } + key = Buffer.concat([md5Hashes[0], md5Hashes[1]]); + iv = md5Hashes[2]; + contents = cypher.subarray(16); + } + + const decipher = crypto.createDecipheriv("aes-256-cbc", key, iv); + const decrypted = + decipher.update( + contents as any, + typeof contents === "string" ? 
"base64" : undefined, + "utf8" + ) + decipher.final(); + + return decrypted; + } +} + +export default MegaCloud; diff --git a/src/extractors/rapidcloud.ts b/src/extractors/rapidcloud.ts new file mode 100644 index 0000000000000000000000000000000000000000..8073c8b9e8a06d4bdcc34d9247ffd2bab5b6c1e2 --- /dev/null +++ b/src/extractors/rapidcloud.ts @@ -0,0 +1,166 @@ +import axios from "axios"; +import CryptoJS from "crypto-js"; +import { substringAfter, substringBefore } from "../utils/index.js"; +import type { Video, Subtitle, Intro } from "../types/extractor.js"; + +type extractReturn = { + sources: Video[]; + subtitles: Subtitle[]; +}; + +// https://megacloud.tv/embed-2/e-1/IxJ7GjGVCyml?k=1 +class RapidCloud { + private serverName = "RapidCloud"; + private sources: Video[] = []; + + // https://rapid-cloud.co/embed-6/eVZPDXwVfrY3?vast=1 + private readonly fallbackKey = "c1d17096f2ca11b7"; + private readonly host = "https://rapid-cloud.co"; + + async extract(videoUrl: URL): Promise { + const result: extractReturn & { intro?: Intro; outro?: Intro } = { + sources: [], + subtitles: [], + }; + + try { + const id = videoUrl.href.split("/").pop()?.split("?")[0]; + const options = { + headers: { + "X-Requested-With": "XMLHttpRequest", + }, + }; + + let res = null; + + res = await axios.get( + `https://${videoUrl.hostname}/embed-2/ajax/e-1/getSources?id=${id}`, + options + ); + + let { + data: { sources, tracks, intro, outro, encrypted }, + } = res; + + let decryptKey = await ( + await axios.get( + "https://raw.githubusercontent.com/cinemaxhq/keys/e1/key" + ) + ).data; + + decryptKey = substringBefore( + substringAfter(decryptKey, '"blob-code blob-code-inner js-file-line">'), + "" + ); + + if (!decryptKey) { + decryptKey = await ( + await axios.get( + "https://raw.githubusercontent.com/cinemaxhq/keys/e1/key" + ) + ).data; + } + + if (!decryptKey) decryptKey = this.fallbackKey; + + try { + if (encrypted) { + const sourcesArray = sources.split(""); + let extractedKey = ""; + let currentIndex = 0; + + for (const index of decryptKey) { + const start = index[0] + currentIndex; + const end = start + index[1]; + + for (let i = start; i < end; i++) { + extractedKey += res.data.sources[i]; + sourcesArray[i] = ""; + } + currentIndex += index[1]; + } + + decryptKey = extractedKey; + sources = sourcesArray.join(""); + + const decrypt = CryptoJS.AES.decrypt(sources, decryptKey); + sources = JSON.parse(decrypt.toString(CryptoJS.enc.Utf8)); + } + } catch (err: any) { + console.log(err.message); + throw new Error("Cannot decrypt sources. 
Perhaps the key is invalid."); + } + + this.sources = sources?.map((s: any) => ({ + url: s.file, + isM3U8: s.file.includes(".m3u8"), + })); + + result.sources.push(...this.sources); + + if (videoUrl.href.includes(new URL(this.host).host)) { + result.sources = []; + this.sources = []; + + for (const source of sources) { + const { data } = await axios.get(source.file, options); + const m3u8data = data + .split("\n") + .filter( + (line: string) => + line.includes(".m3u8") && line.includes("RESOLUTION=") + ); + + const secondHalf = m3u8data.map((line: string) => + line.match(/RESOLUTION=.*,(C)|URI=.*/g)?.map((s) => s.split("=")[1]) + ); + + const TdArray = secondHalf.map((s: string[]) => { + const f1 = s[0].split(",C")[0]; + const f2 = s[1].replace(/"/g, ""); + + return [f1, f2]; + }); + + for (const [f1, f2] of TdArray) { + this.sources.push({ + url: `${source.file?.split("master.m3u8")[0]}${f2.replace( + "iframes", + "index" + )}`, + quality: f1.split("x")[1] + "p", + isM3U8: f2.includes(".m3u8"), + }); + } + result.sources.push(...this.sources); + } + } + + result.intro = + intro?.end > 1 ? { start: intro.start, end: intro.end } : undefined; + result.outro = + outro?.end > 1 ? { start: outro.start, end: outro.end } : undefined; + + result.sources.push({ + url: sources[0].file, + isM3U8: sources[0].file.includes(".m3u8"), + quality: "auto", + }); + + result.subtitles = tracks + .map((s: any) => + s.file + ? { url: s.file, lang: s.label ? s.label : "Thumbnails" } + : null + ) + .filter((s: any) => s); + + return result; + } catch (err: any) { + console.log(err.message); + throw err; + } + } +} + +export default RapidCloud; diff --git a/src/extractors/streamsb.ts b/src/extractors/streamsb.ts new file mode 100644 index 0000000000000000000000000000000000000000..3eeaabe0e2af753d5da2ea9f9a1eb37475646bfa --- /dev/null +++ b/src/extractors/streamsb.ts @@ -0,0 +1,83 @@ +import axios from "axios"; +import type { Video } from "../types/extractor.js"; +import { USER_AGENT_HEADER } from "../utils/index.js"; + +class StreamSB { + private serverName = "streamSB"; + private sources: Video[] = []; + + private readonly host = "https://watchsb.com/sources50"; + private readonly host2 = "https://streamsss.net/sources16"; + + private PAYLOAD(hex: string): string { + // `5363587530696d33443675687c7c${hex}7c7c433569475830474c497a65767c7c73747265616d7362`; + return `566d337678566f743674494a7c7c${hex}7c7c346b6767586d6934774855537c7c73747265616d7362/6565417268755339773461447c7c346133383438333436313335376136323337373433383634376337633465366534393338373136643732373736343735373237613763376334363733353737303533366236333463353333363534366137633763373337343732363536313664373336327c7c6b586c3163614468645a47617c7c73747265616d7362`; + } + + async extract(videoUrl: URL, isAlt: boolean = false): Promise { + let headers: Record = { + watchsb: "sbstream", + Referer: videoUrl.href, + "User-Agent": USER_AGENT_HEADER, + }; + let id = videoUrl.href.split("/e/").pop(); + if (id?.includes("html")) { + id = id.split(".html")[0]; + } + const bytes = new TextEncoder().encode(id); + + const res = await axios + .get( + `${isAlt ? this.host2 : this.host}/${this.PAYLOAD( + Buffer.from(bytes).toString("hex") + )}`, + { headers } + ) + .catch(() => null); + + if (!res?.data.stream_data) { + throw new Error("No source found. 
Try a different server"); + } + + headers = { + "User-Agent": USER_AGENT_HEADER, + Referer: videoUrl.href.split("e/")[0], + }; + + const m3u8_urls = await axios.get(res.data.stream_data.file, { + headers, + }); + + const videoList = m3u8_urls?.data?.split("#EXT-X-STREAM-INF:") ?? []; + + for (const video of videoList) { + if (!video.includes("m3u8")) continue; + + const url = video.split("\n")[1]; + const quality = video.split("RESOLUTION=")[1].split(",")[0].split("x")[1]; + + this.sources.push({ + url: url, + quality: `${quality}p`, + isM3U8: true, + }); + } + + this.sources.push({ + url: res.data.stream_data.file, + quality: "auto", + isM3U8: res.data.stream_data.file.includes(".m3u8"), + }); + + return this.sources; + } + + private addSources(source: any): void { + this.sources.push({ + url: source.file, + isM3U8: source.file.includes(".m3u8"), + }); + } +} + +export default StreamSB; diff --git a/src/extractors/streamtape.ts b/src/extractors/streamtape.ts new file mode 100644 index 0000000000000000000000000000000000000000..69910ceb3ebbb4da977e1ca02e9c29b3c2c26e28 --- /dev/null +++ b/src/extractors/streamtape.ts @@ -0,0 +1,37 @@ +import axios from "axios"; +import { load, type CheerioAPI } from "cheerio"; +import type { Video } from "../types/extractor.js"; + +class StreamTape { + private serverName = "StreamTape"; + private sources: Video[] = []; + + async extract(videoUrl: URL): Promise { + try { + const { data } = await axios.get(videoUrl.href).catch(() => { + throw new Error("Video not found"); + }); + + const $: CheerioAPI = load(data); + + let [fh, sh] = $.html() + ?.match(/robotlink'\).innerHTML = (.*)'/)![1] + .split("+ ('"); + + sh = sh.substring(3); + fh = fh.replace(/\'/g, ""); + + const url = `https:${fh}${sh}`; + + this.sources.push({ + url: url, + isM3U8: url.includes(".m3u8"), + }); + + return this.sources; + } catch (err) { + throw new Error((err as Error).message); + } + } +} +export default StreamTape; diff --git a/src/parsers/animeAboutInfo.ts b/src/parsers/animeAboutInfo.ts new file mode 100644 index 0000000000000000000000000000000000000000..cc3f936908a0cd88bc0d8456510bdfa6de7d8115 --- /dev/null +++ b/src/parsers/animeAboutInfo.ts @@ -0,0 +1,184 @@ +import { + SRC_BASE_URL, + extractAnimes, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, + extractMostPopularAnimes, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import { type ScrapedAnimeAboutInfo } from "../types/parsers/index.js"; + +// /anime/info?id=${anime-id} +async function scrapeAnimeAboutInfo( + id: string +): Promise { + const res: ScrapedAnimeAboutInfo = { + anime: { + info: { + id: null, + name: null, + poster: null, + description: null, + stats: { + rating: null, + quality: null, + episodes: { + sub: null, + dub: null, + }, + type: null, + duration: null, + }, + }, + moreInfo: {}, + }, + seasons: [], + mostPopularAnimes: [], + relatedAnimes: [], + recommendedAnimes: [], + }; + + try { + const animeUrl: URL = new URL(id, SRC_BASE_URL); + const mainPage = await axios.get(animeUrl.href, { + headers: { + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Accept: ACCEPT_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const selector: SelectorType = "#ani_detail .container .anis-content"; + + res.anime.info.id = + $(selector) + ?.find(".anisc-detail .film-buttons a.btn-play") + 
?.attr("href") + ?.split("/") + ?.pop() || null; + res.anime.info.name = + $(selector) + ?.find(".anisc-detail .film-name.dynamic-name") + ?.text() + ?.trim() || null; + res.anime.info.description = + $(selector) + ?.find(".anisc-detail .film-description .text") + .text() + ?.split("[") + ?.shift() + ?.trim() || null; + res.anime.info.poster = + $(selector)?.find(".film-poster .film-poster-img")?.attr("src")?.trim() || + null; + + // stats + res.anime.info.stats.rating = + $(`${selector} .film-stats .tick .tick-pg`)?.text()?.trim() || null; + res.anime.info.stats.quality = + $(`${selector} .film-stats .tick .tick-quality`)?.text()?.trim() || null; + res.anime.info.stats.episodes = { + sub: + Number($(`${selector} .film-stats .tick .tick-sub`)?.text()?.trim()) || + null, + dub: + Number($(`${selector} .film-stats .tick .tick-dub`)?.text()?.trim()) || + null, + }; + res.anime.info.stats.type = + $(`${selector} .film-stats .tick`) + ?.text() + ?.trim() + ?.replace(/[\s\n]+/g, " ") + ?.split(" ") + ?.at(-2) || null; + res.anime.info.stats.duration = + $(`${selector} .film-stats .tick`) + ?.text() + ?.trim() + ?.replace(/[\s\n]+/g, " ") + ?.split(" ") + ?.pop() || null; + + // more information + $(`${selector} .anisc-info-wrap .anisc-info .item:not(.w-hide)`).each( + (i, el) => { + let key = $(el) + .find(".item-head") + .text() + .toLowerCase() + .replace(":", "") + .trim(); + key = key.includes(" ") ? key.replace(" ", "") : key; + + const value = [ + ...$(el) + .find("*:not(.item-head)") + .map((i, el) => $(el).text().trim()), + ] + .map((i) => `${i}`) + .toString() + .trim(); + + if (key === "genres") { + res.anime.moreInfo[key] = value.split(",").map((i) => i.trim()); + return; + } + if (key === "producers") { + res.anime.moreInfo[key] = value.split(",").map((i) => i.trim()); + return; + } + res.anime.moreInfo[key] = value; + } + ); + + // more seasons + const seasonsSelector: SelectorType = "#main-content .os-list a.os-item"; + $(seasonsSelector).each((i, el) => { + res.seasons.push({ + id: $(el)?.attr("href")?.slice(1)?.trim() || null, + name: $(el)?.attr("title")?.trim() || null, + title: $(el)?.find(".title")?.text()?.trim(), + poster: + $(el) + ?.find(".season-poster") + ?.attr("style") + ?.split(" ") + ?.pop() + ?.split("(") + ?.pop() + ?.split(")")[0] || null, + isCurrent: $(el).hasClass("active"), + }); + }); + + const relatedAnimeSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-realtime:nth-of-type(1) .anif-block-ul ul li"; + res.relatedAnimes = extractMostPopularAnimes($, relatedAnimeSelector); + + const mostPopularSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-realtime:nth-of-type(2) .anif-block-ul ul li"; + res.mostPopularAnimes = extractMostPopularAnimes($, mostPopularSelector); + + const recommendedAnimeSelector: SelectorType = + "#main-content .block_area.block_area_category .tab-content .flw-item"; + res.recommendedAnimes = extractAnimes($, recommendedAnimeSelector); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeAboutInfo; diff --git a/src/parsers/animeCategory.ts b/src/parsers/animeCategory.ts new file mode 100644 index 0000000000000000000000000000000000000000..c79fcc7d92039ef7d3ad64da10f56a2f25bab335 --- /dev/null +++ b/src/parsers/animeCategory.ts @@ -0,0 +1,118 
@@ +import { + SRC_BASE_URL, + extractAnimes, + ACCEPT_HEADER, + USER_AGENT_HEADER, + extractTop10Animes, + ACCEPT_ENCODING_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import { type AnimeCategories } from "../types/anime.js"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import { type ScrapedAnimeCategory } from "../types/parsers/index.js"; + +// /anime/:category?page=${page} +async function scrapeAnimeCategory( + category: AnimeCategories, + page: number = 1 +): Promise { + const res: ScrapedAnimeCategory = { + animes: [], + genres: [], + top10Animes: { + today: [], + week: [], + month: [], + }, + category, + currentPage: Number(page), + hasNextPage: false, + totalPages: 1, + }; + + try { + const scrapeUrl: URL = new URL(category, SRC_BASE_URL); + const mainPage = await axios.get(`${scrapeUrl}?page=${page}`, { + headers: { + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Accept: ACCEPT_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const selector: SelectorType = + "#main-content .tab-content .film_list-wrap .flw-item"; + + const categoryNameSelector: SelectorType = + "#main-content .block_area .block_area-header .cat-heading"; + res.category = $(categoryNameSelector)?.text()?.trim() ?? category; + + res.hasNextPage = + $(".pagination > li").length > 0 + ? $(".pagination li.active").length > 0 + ? $(".pagination > li").last().hasClass("active") + ? false + : true + : false + : false; + + res.totalPages = + Number( + $('.pagination > .page-item a[title="Last"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $('.pagination > .page-item a[title="Next"]') + ?.attr("href") + ?.split("=") + .pop() ?? 
+ $(".pagination > .page-item.active a")?.text()?.trim() + ) || 1; + + res.animes = extractAnimes($, selector); + + if (res.animes.length === 0 && !res.hasNextPage) { + res.totalPages = 0; + } + + const genreSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-genres .sb-genre-list li"; + $(genreSelector).each((i, el) => { + res.genres.push(`${$(el).text().trim()}`); + }); + + const top10AnimeSelector: SelectorType = + '#main-sidebar .block_area-realtime [id^="top-viewed-"]'; + + $(top10AnimeSelector).each((i, el) => { + const period = $(el).attr("id")?.split("-")?.pop()?.trim(); + + if (period === "day") { + res.top10Animes.today = extractTop10Animes($, period); + return; + } + if (period === "week") { + res.top10Animes.week = extractTop10Animes($, period); + return; + } + if (period === "month") { + res.top10Animes.month = extractTop10Animes($, period); + } + }); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeCategory; diff --git a/src/parsers/animeEpisodeSrcs.ts b/src/parsers/animeEpisodeSrcs.ts new file mode 100644 index 0000000000000000000000000000000000000000..104bc4db6c09fdaec3c06bd8fbb67c9a1c9148ae --- /dev/null +++ b/src/parsers/animeEpisodeSrcs.ts @@ -0,0 +1,129 @@ +import { + SRC_AJAX_URL, + SRC_BASE_URL, + retrieveServerId, + USER_AGENT_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import { load, type CheerioAPI } from "cheerio"; +import createHttpError, { type HttpError } from "http-errors"; +import { type AnimeServers, Servers } from "../types/anime.js"; +import { + RapidCloud, + StreamSB, + StreamTape, + MegaCloud, +} from "../extractors/index.js"; +import { type ScrapedAnimeEpisodesSources } from "../types/parsers/index.js"; + +// vidtreaming -> 4 +// rapidcloud -> 1 +// streamsb -> 5 +// streamtape -> 3 + +// /anime/episode-srcs?id=${episodeId}?server=${server}&category=${category (dub or sub)} +async function scrapeAnimeEpisodeSources( + episodeId: string, + server: AnimeServers = Servers.VidStreaming, + category: "sub" | "dub" = "sub" +): Promise { + if (episodeId.startsWith("http")) { + const serverUrl = new URL(episodeId); + switch (server) { + case Servers.VidStreaming: + case Servers.VidCloud: + return { + ...(await new MegaCloud().extract(serverUrl)), + }; + case Servers.StreamSB: + return { + headers: { + Referer: serverUrl.href, + watchsb: "streamsb", + "User-Agent": USER_AGENT_HEADER, + }, + sources: await new StreamSB().extract(serverUrl, true), + }; + case Servers.StreamTape: + return { + headers: { Referer: serverUrl.href, "User-Agent": USER_AGENT_HEADER }, + sources: await new StreamTape().extract(serverUrl), + }; + default: // vidcloud + return { + headers: { Referer: serverUrl.href }, + ...(await new RapidCloud().extract(serverUrl)), + }; + } + } + + const epId = new URL(`/watch/${episodeId}`, SRC_BASE_URL).href; + console.log(epId); + + try { + const resp = await axios.get( + `${SRC_AJAX_URL}/v2/episode/servers?episodeId=${epId.split("?ep=")[1]}`, + { + headers: { + Referer: epId, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + }, + } + ); + + const $: CheerioAPI = load(resp.data.html); + + let serverId: string | null = null; + + try { + console.log("THE SERVER: ", server); + + switch (server) { + case Servers.VidCloud: { + 
serverId = retrieveServerId($, 1, category); + if (!serverId) throw new Error("RapidCloud not found"); + break; + } + case Servers.VidStreaming: { + serverId = retrieveServerId($, 4, category); + console.log("SERVER_ID: ", serverId); + if (!serverId) throw new Error("VidStreaming not found"); + break; + } + case Servers.StreamSB: { + serverId = retrieveServerId($, 5, category); + if (!serverId) throw new Error("StreamSB not found"); + break; + } + case Servers.StreamTape: { + serverId = retrieveServerId($, 3, category); + if (!serverId) throw new Error("StreamTape not found"); + break; + } + } + } catch (err) { + throw createHttpError.NotFound( + "Couldn't find server. Try another server" + ); + } + + const { + data: { link }, + } = await axios.get(`${SRC_AJAX_URL}/v2/episode/sources?id=${serverId}`); + console.log("THE LINK: ", link); + + return await scrapeAnimeEpisodeSources(link, server); + } catch (err: any) { + console.log(err); + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeEpisodeSources; diff --git a/src/parsers/animeEpisodes.ts b/src/parsers/animeEpisodes.ts new file mode 100644 index 0000000000000000000000000000000000000000..41ac942330ea951f60be1d30dd252075540906e1 --- /dev/null +++ b/src/parsers/animeEpisodes.ts @@ -0,0 +1,61 @@ +import { + SRC_BASE_URL, + SRC_AJAX_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import { load, type CheerioAPI } from "cheerio"; +import createHttpError, { type HttpError } from "http-errors"; +import { type ScrapedAnimeEpisodes } from "../types/parsers/index.js"; + +// /anime/episodes/${anime-id} +async function scrapeAnimeEpisodes( + animeId: string +): Promise { + const res: ScrapedAnimeEpisodes = { + totalEpisodes: 0, + episodes: [], + }; + + try { + const episodesAjax = await axios.get( + `${SRC_AJAX_URL}/v2/episode/list/${animeId.split("-").pop()}`, + { + headers: { + Accept: ACCEPT_HEADER, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Referer: `${SRC_BASE_URL}/watch/${animeId}`, + }, + } + ); + + const $: CheerioAPI = load(episodesAjax.data.html); + + res.totalEpisodes = Number($(".detail-infor-content .ss-list a").length); + + $(".detail-infor-content .ss-list a").each((i, el) => { + res.episodes.push({ + title: $(el)?.attr("title")?.trim() || null, + episodeId: $(el)?.attr("href")?.split("/")?.pop() || null, + number: Number($(el).attr("data-number")), + isFiller: $(el).hasClass("ssl-item-filler"), + }); + }); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeEpisodes; diff --git a/src/parsers/animeGenre.ts b/src/parsers/animeGenre.ts new file mode 100644 index 0000000000000000000000000000000000000000..110398d3f8f6b67ac5e0e0557c77721d76d8c3c4 --- /dev/null +++ b/src/parsers/animeGenre.ts @@ -0,0 +1,105 @@ +import { + SRC_BASE_URL, + ACCEPT_HEADER, + extractAnimes, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, + extractMostPopularAnimes, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { 
type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import type { ScrapedGenreAnime } from "../types/parsers/index.js"; + +// /anime/genre/${name}?page=${page} +async function scrapeGenreAnime( + genreName: string, + page: number = 1 +): Promise { + const res: ScrapedGenreAnime = { + genreName, + animes: [], + genres: [], + topAiringAnimes: [], + totalPages: 1, + hasNextPage: false, + currentPage: Number(page), + }; + + // there's a typo with zoro where martial arts is marial arts + genreName = genreName === "martial-arts" ? "marial-arts" : genreName; + + try { + const genreUrl: URL = new URL( + `/genre/${genreName}?page=${page}`, + SRC_BASE_URL + ); + + const mainPage = await axios.get(genreUrl.href, { + headers: { + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Accept: ACCEPT_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const selector: SelectorType = + "#main-content .tab-content .film_list-wrap .flw-item"; + + const genreNameSelector: SelectorType = + "#main-content .block_area .block_area-header .cat-heading"; + res.genreName = $(genreNameSelector)?.text()?.trim() ?? genreName; + + res.hasNextPage = + $(".pagination > li").length > 0 + ? $(".pagination li.active").length > 0 + ? $(".pagination > li").last().hasClass("active") + ? false + : true + : false + : false; + + res.totalPages = + Number( + $('.pagination > .page-item a[title="Last"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $('.pagination > .page-item a[title="Next"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $(".pagination > .page-item.active a")?.text()?.trim() + ) || 1; + + res.animes = extractAnimes($, selector); + + if (res.animes.length === 0 && !res.hasNextPage) { + res.totalPages = 0; + } + + const genreSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-genres .sb-genre-list li"; + $(genreSelector).each((i, el) => { + res.genres.push(`${$(el).text().trim()}`); + }); + + const topAiringSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-realtime .anif-block-ul ul li"; + res.topAiringAnimes = extractMostPopularAnimes($, topAiringSelector); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeGenreAnime; diff --git a/src/parsers/animeProducer.ts b/src/parsers/animeProducer.ts new file mode 100644 index 0000000000000000000000000000000000000000..286285202968f8336328eda8c9b858e48c58ff9e --- /dev/null +++ b/src/parsers/animeProducer.ts @@ -0,0 +1,120 @@ +import { + SRC_BASE_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, + extractMostPopularAnimes, + extractAnimes, + extractTop10Animes, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import type { ScrapedProducerAnime } from "../types/parsers/index.js"; + +// /anime/producer/${name}?page=${page} +async function scrapeProducerAnimes( + producerName: string, + page: number = 1 +): Promise { + const res: ScrapedProducerAnime = { + producerName, + animes: [], + top10Animes: { + today: [], + week: [], + month: [], + }, + topAiringAnimes: [], + totalPages: 1, + hasNextPage: false, + 
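+ // totalPages and hasNextPage start from the same defaults as the category, genre and search scrapers and are re-derived from the page's .pagination markup further down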
currentPage: Number(page), + }; + + try { + const producerUrl: URL = new URL( + `/producer/${producerName}?page=${page}`, + SRC_BASE_URL + ); + + const mainPage = await axios.get(producerUrl.href, { + headers: { + Accept: ACCEPT_HEADER, + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const animeSelector: SelectorType = + "#main-content .tab-content .film_list-wrap .flw-item"; + + res.hasNextPage = + $(".pagination > li").length > 0 + ? $(".pagination li.active").length > 0 + ? $(".pagination > li").last().hasClass("active") + ? false + : true + : false + : false; + + res.totalPages = + Number( + $('.pagination > .page-item a[title="Last"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $('.pagination > .page-item a[title="Next"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $(".pagination > .page-item.active a")?.text()?.trim() + ) || 1; + + res.animes = extractAnimes($, animeSelector); + + if (res.animes.length === 0 && !res.hasNextPage) { + res.totalPages = 0; + } + + const producerNameSelector: SelectorType = + "#main-content .block_area .block_area-header .cat-heading"; + res.producerName = $(producerNameSelector)?.text()?.trim() ?? producerName; + + const top10AnimeSelector: SelectorType = + '#main-sidebar .block_area-realtime [id^="top-viewed-"]'; + + $(top10AnimeSelector).each((_, el) => { + const period = $(el).attr("id")?.split("-")?.pop()?.trim(); + + if (period === "day") { + res.top10Animes.today = extractTop10Animes($, period); + return; + } + if (period === "week") { + res.top10Animes.week = extractTop10Animes($, period); + return; + } + if (period === "month") { + res.top10Animes.month = extractTop10Animes($, period); + } + }); + + const topAiringSelector: SelectorType = + "#main-sidebar .block_area_sidebar:nth-child(2) .block_area-content .anif-block-ul ul li"; + res.topAiringAnimes = extractMostPopularAnimes($, topAiringSelector); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeProducerAnimes; diff --git a/src/parsers/animeSearch.ts b/src/parsers/animeSearch.ts new file mode 100644 index 0000000000000000000000000000000000000000..16818c6b2d59008f8a540db0df06cb21bbff81c9 --- /dev/null +++ b/src/parsers/animeSearch.ts @@ -0,0 +1,118 @@ +import { + SRC_SEARCH_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, + extractAnimes, + getSearchFilterValue, + extractMostPopularAnimes, + getSearchDateFilterValue, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import type { ScrapedAnimeSearchResult } from "../types/parsers/index.js"; +import type { SearchFilters, FilterKeys } from "../types/controllers/index.js"; + +// /anime/search?q=${query}&page=${page} +async function scrapeAnimeSearch( + q: string, + page: number = 1, + filters: SearchFilters +): Promise { + const res: ScrapedAnimeSearchResult = { + animes: [], + mostPopularAnimes: [], + currentPage: Number(page), + hasNextPage: false, + totalPages: 1, + searchQuery: q, + searchFilters: filters, + }; + + try { + const url = new URL(SRC_SEARCH_URL); + url.searchParams.set("keyword", q); + url.searchParams.set("page", `${page}`); + 
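+ // sorting starts at "default"; the loop below rewrites recognised filter keys into the site's numeric query params, and *_date filters are expanded into sy/sm/sd or ey/em/ed pairs by getSearchDateFilterValue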
url.searchParams.set("sort", "default"); + + for (const key in filters) { + if (key.includes("_date")) { + const dates = getSearchDateFilterValue( + key === "start_date", + filters[key as keyof SearchFilters] || "" + ); + if (!dates) continue; + + dates.map((dateParam) => { + const [key, val] = dateParam.split("="); + url.searchParams.set(key, val); + }); + continue; + } + + const filterVal = getSearchFilterValue( + key as FilterKeys, + filters[key as keyof SearchFilters] || "" + ); + filterVal && url.searchParams.set(key, filterVal); + } + + const mainPage = await axios.get(url.href, { + headers: { + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Accept: ACCEPT_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const selector: SelectorType = + "#main-content .tab-content .film_list-wrap .flw-item"; + + res.hasNextPage = + $(".pagination > li").length > 0 + ? $(".pagination li.active").length > 0 + ? $(".pagination > li").last().hasClass("active") + ? false + : true + : false + : false; + + res.totalPages = + Number( + $('.pagination > .page-item a[title="Last"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $('.pagination > .page-item a[title="Next"]') + ?.attr("href") + ?.split("=") + .pop() ?? + $(".pagination > .page-item.active a")?.text()?.trim() + ) || 1; + + res.animes = extractAnimes($, selector); + + if (res.animes.length === 0 && !res.hasNextPage) { + res.totalPages = 0; + } + + const mostPopularSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-realtime .anif-block-ul ul li"; + res.mostPopularAnimes = extractMostPopularAnimes($, mostPopularSelector); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeSearch; diff --git a/src/parsers/animeSearchSuggestion.ts b/src/parsers/animeSearchSuggestion.ts new file mode 100644 index 0000000000000000000000000000000000000000..acef2a4caab9e9134a781f8c06ce204688de688e --- /dev/null +++ b/src/parsers/animeSearchSuggestion.ts @@ -0,0 +1,77 @@ +import { + SRC_HOME_URL, + SRC_AJAX_URL, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import type { ScrapedAnimeSearchSuggestion } from "../types/parsers/index.js"; + +// /anime/search/suggest?q=${query} +async function scrapeAnimeSearchSuggestion( + q: string +): Promise { + const res: ScrapedAnimeSearchSuggestion = { + suggestions: [], + }; + + try { + const { data } = await axios.get( + `${SRC_AJAX_URL}/search/suggest?keyword=${encodeURIComponent(q)}`, + { + headers: { + Accept: "*/*", + Pragma: "no-cache", + Referer: SRC_HOME_URL, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + }, + } + ); + + const $: CheerioAPI = load(data.html); + const selector: SelectorType = ".nav-item:has(.film-poster)"; + + if ($(selector).length < 1) return res; + + $(selector).each((_, el) => { + const id = $(el).attr("href")?.split("?")[0].includes("javascript") + ? 
null + : $(el).attr("href")?.split("?")[0]?.slice(1); + + res.suggestions.push({ + id, + name: $(el).find(".srp-detail .film-name")?.text()?.trim() || null, + jname: + $(el).find(".srp-detail .film-name")?.attr("data-jname")?.trim() || + $(el).find(".srp-detail .alias-name")?.text()?.trim() || + null, + poster: $(el) + .find(".film-poster .film-poster-img") + ?.attr("data-src") + ?.trim(), + moreInfo: [ + ...$(el) + .find(".film-infor") + .contents() + .map((_, el) => $(el).text().trim()), + ].filter((i) => i), + }); + }); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeAnimeSearchSuggestion; diff --git a/src/parsers/episodeServers.ts b/src/parsers/episodeServers.ts new file mode 100644 index 0000000000000000000000000000000000000000..97b4237179b818fed3f3fea5a1db51357b8ac9d7 --- /dev/null +++ b/src/parsers/episodeServers.ts @@ -0,0 +1,75 @@ +import { + SRC_BASE_URL, + SRC_AJAX_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; +import type { ScrapedEpisodeServers } from "../types/parsers/index.js"; + +// /anime/servers?episodeId=${id} +async function scrapeEpisodeServers( + episodeId: string +): Promise { + const res: ScrapedEpisodeServers = { + sub: [], + dub: [], + episodeId, + episodeNo: 0, + }; + + try { + const epId = episodeId.split("?ep=")[1]; + + const { data } = await axios.get( + `${SRC_AJAX_URL}/v2/episode/servers?episodeId=${epId}`, + { + headers: { + Accept: ACCEPT_HEADER, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Referer: new URL(`/watch/${episodeId}`, SRC_BASE_URL).href, + }, + } + ); + + const $: CheerioAPI = load(data.html); + + const epNoSelector: SelectorType = ".server-notice strong"; + res.episodeNo = Number($(epNoSelector).text().split(" ").pop()) || 0; + + $(`.ps_-block.ps_-block-sub.servers-sub .ps__-list .server-item`).each( + (_, el) => { + res.sub.push({ + serverName: $(el).find("a").text().toLowerCase().trim(), + serverId: Number($(el)?.attr("data-server-id")?.trim()) || null, + }); + } + ); + + $(`.ps_-block.ps_-block-sub.servers-dub .ps__-list .server-item`).each( + (_, el) => { + res.dub.push({ + serverName: $(el).find("a").text().toLowerCase().trim(), + serverId: Number($(el)?.attr("data-server-id")?.trim()) || null, + }); + } + ); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeEpisodeServers; diff --git a/src/parsers/estimatedSchedule.ts b/src/parsers/estimatedSchedule.ts new file mode 100644 index 0000000000000000000000000000000000000000..6fc3d1a52a0aa9f355df0f253c4f4dbd76644fbb --- /dev/null +++ b/src/parsers/estimatedSchedule.ts @@ -0,0 +1,67 @@ +import { + SRC_HOME_URL, + SRC_AJAX_URL, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import { load, type 
CheerioAPI, type SelectorType } from "cheerio"; +import { type ScrapedEstimatedSchedule } from "../types/parsers/index.js"; + +// /anime/schedule?date=${date} +async function scrapeEstimatedSchedule( + date: string +): Promise { + const res: ScrapedEstimatedSchedule = { + scheduledAnimes: [], + }; + + try { + const estScheduleURL = + `${SRC_AJAX_URL}/schedule/list?tzOffset=-330&date=${date}` as const; + + const mainPage = await axios.get(estScheduleURL, { + headers: { + Accept: "*/*", + Referer: SRC_HOME_URL, + "User-Agent": USER_AGENT_HEADER, + "X-Requested-With": "XMLHttpRequest", + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage?.data?.html); + + const selector: SelectorType = "li"; + + if ($(selector)?.text()?.trim()?.includes("No data to display")) { + return res; + } + + $(selector).each((_, el) => { + res.scheduledAnimes.push({ + id: $(el)?.find("a")?.attr("href")?.slice(1)?.trim() || null, + time: $(el)?.find("a .time")?.text()?.trim() || null, + name: $(el)?.find("a .film-name.dynamic-name")?.text()?.trim() || null, + jname: + $(el) + ?.find("a .film-name.dynamic-name") + ?.attr("data-jname") + ?.trim() || null, + }); + }); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeEstimatedSchedule; diff --git a/src/parsers/homePage.ts b/src/parsers/homePage.ts new file mode 100644 index 0000000000000000000000000000000000000000..979e7e9e8f8601eaab6f41021f751ec82a1fafbb --- /dev/null +++ b/src/parsers/homePage.ts @@ -0,0 +1,203 @@ +import { + SRC_HOME_URL, + ACCEPT_HEADER, + USER_AGENT_HEADER, + ACCEPT_ENCODING_HEADER, + extractTop10Animes, + extractAnimes, +} from "../utils/index.js"; +import axios, { AxiosError } from "axios"; +import createHttpError, { type HttpError } from "http-errors"; +import type { ScrapedHomePage } from "../types/parsers/index.js"; +import { load, type CheerioAPI, type SelectorType } from "cheerio"; + +// /anime/home +async function scrapeHomePage(): Promise { + const res: ScrapedHomePage = { + spotlightAnimes: [], + trendingAnimes: [], + latestEpisodeAnimes: [], + topUpcomingAnimes: [], + top10Animes: { + today: [], + week: [], + month: [], + }, + topAiringAnimes: [], + genres: [], + }; + + try { + const mainPage = await axios.get(SRC_HOME_URL as string, { + headers: { + "User-Agent": USER_AGENT_HEADER, + "Accept-Encoding": ACCEPT_ENCODING_HEADER, + Accept: ACCEPT_HEADER, + }, + }); + + const $: CheerioAPI = load(mainPage.data); + + const spotlightSelector: SelectorType = + "#slider .swiper-wrapper .swiper-slide"; + + $(spotlightSelector).each((i, el) => { + const otherInfo = $(el) + .find(".deslide-item-content .sc-detail .scd-item") + .map((i, el) => $(el).text().trim()) + .get() + .slice(0, -1); + + res.spotlightAnimes.push({ + rank: + Number( + $(el) + .find(".deslide-item-content .desi-sub-text") + ?.text() + .trim() + .split(" ")[0] + .slice(1) + ) || null, + id: $(el) + .find(".deslide-item-content .desi-buttons a") + ?.last() + ?.attr("href") + ?.slice(1) + ?.trim(), + name: $(el) + .find(".deslide-item-content .desi-head-title.dynamic-name") + ?.text() + .trim(), + description: $(el) + .find(".deslide-item-content .desi-description") + ?.text() + ?.split("[") + ?.shift() + ?.trim(), + poster: $(el) + .find(".deslide-cover .deslide-cover-img .film-poster-img") + ?.attr("data-src") 
+ ?.trim(), + jname: $(el) + .find(".deslide-item-content .desi-head-title.dynamic-name") + ?.attr("data-jname") + ?.trim(), + episodes: { + sub: + Number( + $(el) + .find( + ".deslide-item-content .sc-detail .scd-item .tick-item.tick-sub" + ) + ?.text() + ?.trim() + ) || null, + dub: + Number( + $(el) + .find( + ".deslide-item-content .sc-detail .scd-item .tick-item.tick-dub" + ) + ?.text() + ?.trim() + ) || null, + }, + otherInfo, + }); + }); + + const trendingSelector: SelectorType = + "#trending-home .swiper-wrapper .swiper-slide"; + + $(trendingSelector).each((i, el) => { + res.trendingAnimes.push({ + rank: parseInt( + $(el).find(".item .number")?.children()?.first()?.text()?.trim() + ), + name: $(el) + .find(".item .number .film-title.dynamic-name") + ?.text() + ?.trim(), + id: $(el).find(".item .film-poster")?.attr("href")?.slice(1)?.trim(), + poster: $(el) + .find(".item .film-poster .film-poster-img") + ?.attr("data-src") + ?.trim(), + }); + }); + + const latestEpisodeSelector: SelectorType = + "#main-content .block_area_home:nth-of-type(1) .tab-content .film_list-wrap .flw-item"; + res.latestEpisodeAnimes = extractAnimes($, latestEpisodeSelector); + + const topUpcomingSelector: SelectorType = + "#main-content .block_area_home:nth-of-type(3) .tab-content .film_list-wrap .flw-item"; + res.topUpcomingAnimes = extractAnimes($, topUpcomingSelector); + + const genreSelector: SelectorType = + "#main-sidebar .block_area.block_area_sidebar.block_area-genres .sb-genre-list li"; + $(genreSelector).each((i, el) => { + res.genres.push(`${$(el).text().trim()}`); + }); + + const mostViewedSelector: SelectorType = + '#main-sidebar .block_area-realtime [id^="top-viewed-"]'; + $(mostViewedSelector).each((i, el) => { + const period = $(el).attr("id")?.split("-")?.pop()?.trim(); + + if (period === "day") { + res.top10Animes.today = extractTop10Animes($, period); + return; + } + if (period === "week") { + res.top10Animes.week = extractTop10Animes($, period); + return; + } + if (period === "month") { + res.top10Animes.month = extractTop10Animes($, period); + } + }); + + const topAiringSelector: SelectorType = + "#anime-featured .row div:nth-of-type(1) .anif-block-ul ul li"; + $(topAiringSelector).each((i, el) => { + const otherInfo = $(el) + .find(".fd-infor .fdi-item") + .map((i, el) => $(el).text().trim()) + .get(); + + res.topAiringAnimes.push({ + id: $(el) + .find(".film-detail .film-name .dynamic-name") + ?.attr("href") + ?.slice(1) + ?.trim(), + name: $(el) + .find(".film-detail .film-name .dynamic-name") + ?.attr("title") + ?.trim(), + jname: $(el) + .find(".film-detail .film-name .dynamic-name") + ?.attr("data-jname") + ?.trim(), + poster: $(el) + .find(".film-poster a .film-poster-img") + ?.attr("data-src") + ?.trim(), + otherInfo, + }); + }); + + return res; + } catch (err: any) { + if (err instanceof AxiosError) { + throw createHttpError( + err?.response?.status || 500, + err?.response?.statusText || "Something went wrong" + ); + } + throw createHttpError.InternalServerError(err?.message); + } +} + +export default scrapeHomePage; diff --git a/src/parsers/index.ts b/src/parsers/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..395833a31e672dbfc2443b0bac3db2b145168ff0 --- /dev/null +++ b/src/parsers/index.ts @@ -0,0 +1,25 @@ +import scrapeHomePage from "./homePage.js"; +import scrapeGenreAnime from "./animeGenre.js"; +import scrapeAnimeSearch from "./animeSearch.js"; +import scrapeAnimeEpisodes from "./animeEpisodes.js"; +import scrapeAnimeCategory from 
"./animeCategory.js"; +import scrapeProducerAnimes from "./animeProducer.js"; +import scrapeEpisodeServers from "./episodeServers.js"; +import scrapeAnimeAboutInfo from "./animeAboutInfo.js"; +import scrapeEstimatedSchedule from "./estimatedSchedule.js"; +import scrapeAnimeEpisodeSources from "./animeEpisodeSrcs.js"; +import scrapeAnimeSearchSuggestion from "./animeSearchSuggestion.js"; + +export { + scrapeHomePage, + scrapeGenreAnime, + scrapeAnimeSearch, + scrapeAnimeEpisodes, + scrapeAnimeCategory, + scrapeEpisodeServers, + scrapeProducerAnimes, + scrapeAnimeAboutInfo, + scrapeEstimatedSchedule, + scrapeAnimeEpisodeSources, + scrapeAnimeSearchSuggestion, +}; diff --git a/src/routes/index.ts b/src/routes/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..7046ae1dff0700bbcb9e04559caad226fa1c2468 --- /dev/null +++ b/src/routes/index.ts @@ -0,0 +1,55 @@ +import { Router, type IRouter } from "express"; +import { + getGenreAnime, + getAnimeSearch, + getHomePageInfo, + getAnimeCategory, + getAnimeEpisodes, + getEpisodeServers, + getProducerAnimes, + getAnimeAboutInfo, + getEstimatedSchedule, + getAnimeEpisodeSources, + getAnimeSearchSuggestion, +} from "../controllers/index.js"; + +const router: IRouter = Router(); + +// /anime +router.get("/", (_, res) => res.redirect("/")); + +// /anime/home +router.get("/home", getHomePageInfo); + +// /anime/info?id=${anime-id} +router.get("/info", getAnimeAboutInfo); + +// /anime/genre/${name}?page=${page} +router.get("/genre/:name", getGenreAnime); + +// /anime/search?q=${query}&page=${page} +router.get("/search", getAnimeSearch); + +// /anime/search/suggest?q=${query} +router.get("/search/suggest", getAnimeSearchSuggestion); + +// /anime/episodes/${anime-id} +router.get("/episodes/:animeId", getAnimeEpisodes); + +// /anime/servers?episodeId=${id} +router.get("/servers", getEpisodeServers); + +// episodeId=steinsgate-3?ep=230 +// /anime/episode-srcs?id=${episodeId}?server=${server}&category=${category (dub or sub)} +router.get("/episode-srcs", getAnimeEpisodeSources); + +// /anime/schedule?date=${date} +router.get("/schedule", getEstimatedSchedule); + +// /anime/producer/${name}?page=${page} +router.get("/producer/:name", getProducerAnimes); + +// /anime/:category?page=${page} +router.get("/:category", getAnimeCategory); + +export default router; diff --git a/src/server.ts b/src/server.ts new file mode 100644 index 0000000000000000000000000000000000000000..3520958c58e11ffb191b4bba46fd7cfa07a3ad68 --- /dev/null +++ b/src/server.ts @@ -0,0 +1,60 @@ +import https from "https"; +import morgan from "morgan"; +import express from "express"; +import { resolve } from "path"; +import { config } from "dotenv"; + +import corsConfig from "./config/cors.js"; +import { ratelimit } from "./config/ratelimit.js"; +import errorHandler from "./config/errorHandler.js"; +import notFoundHandler from "./config/notFoundHandler.js"; + +import animeRouter from "./routes/index.js"; + +config(); +const app: express.Application = express(); +const PORT: number = Number(process.env.PORT) || 4000; + +app.use(morgan("dev")); +app.use(corsConfig); + +// CAUTION: For personal deployments, "refrain" from having an env +// named "ANIWATCH_API_HOSTNAME". You may face rate limitting +// and other issues if you do. 
+const ISNT_PERSONAL_DEPLOYMENT = Boolean(process?.env?.ANIWATCH_API_HOSTNAME); +if (ISNT_PERSONAL_DEPLOYMENT) { + app.use(ratelimit); +} + +app.use(express.static(resolve("public"))); +app.get("/health", (_, res) => res.sendStatus(200)); +app.use("/anime", animeRouter); + +app.use(notFoundHandler); +app.use(errorHandler); + +// NOTE: this env is "required" for vercel deployments +if (!Boolean(process?.env?.IS_VERCEL_DEPLOYMENT)) { + app.listen(PORT, () => { + console.log(`⚔️ api @ http://localhost:${PORT}`); + }); + + // NOTE: remove the `if` block below for personal deployments + if (ISNT_PERSONAL_DEPLOYMENT) { + // don't sleep + const intervalTime = 9 * 60 * 1000; // 9mins + setInterval(() => { + console.log("HEALTHCHECK ;)", new Date().toLocaleString()); + https + .get( + new URL("/health", `https://${process.env.ANIWATCH_API_HOSTNAME}`) + .href + ) + .on("error", (err) => { + console.error(err.message); + }); + }, intervalTime); + } +} + +export default app; diff --git a/src/types/anime.ts b/src/types/anime.ts new file mode 100644 index 0000000000000000000000000000000000000000..7365c5e37caade4798218059442b6ea50b6dfa41 --- /dev/null +++ b/src/types/anime.ts @@ -0,0 +1,113 @@ +export interface Anime { + id: string | null; + name: string | null; + poster: string | null; + duration: string | null; + type: string | null; + rating: string | null; + episodes: { + sub: number | null; + dub: number | null; + }; +} + +type CommonAnimeProps = "id" | "name" | "poster"; + +export interface Top10Anime extends Pick { + rank: number | null; +} + +export type Top10AnimeTimePeriod = "day" | "week" | "month"; + +export interface MostPopularAnime + extends Pick { + jname: string | null; +} + +export interface SpotlightAnime + extends MostPopularAnime, + Pick { + description: string | null; +} + +export interface TrendingAnime + extends Pick, + Pick {} + +export interface LatestEpisodeAnime extends Anime {} + +export interface TopUpcomingAnime extends Anime {} + +export interface TopAiringAnime extends MostPopularAnime {} + +export interface AnimeGeneralAboutInfo + extends Pick, + Pick { + stats: { + quality: string | null; + } & Pick; +} + +export interface RecommendedAnime extends Anime {} + +export interface RelatedAnime extends MostPopularAnime {} + +export interface Season extends Pick { + isCurrent: boolean; + title: string | null; +} + +export interface AnimeSearchSuggestion + extends Omit { + moreInfo: Array; +} + +export interface AnimeEpisode extends Pick { + episodeId: string | null; + number: number; + isFiller: boolean; +} + +export interface SubEpisode { + serverName: string; + serverId: number | null; +} +export interface DubEpisode extends SubEpisode {} + +export type AnimeCategories = + | "most-favorite" + | "most-popular" + | "subbed-anime" + | "dubbed-anime" + | "recently-updated" + | "recently-added" + | "top-upcoming" + | "top-airing" + | "movie" + | "special" + | "ova" + | "ona" + | "tv" + | "completed"; + +export type AnimeServers = + | "vidstreaming" + | "megacloud" + | "streamsb" + | "streamtape" + | "vidcloud"; + +export enum Servers { + VidStreaming = "vidstreaming", + MegaCloud = "megacloud", + StreamSB = "streamsb", + StreamTape = "streamtape", + VidCloud = "vidcloud", + AsianLoad = "asianload", + GogoCDN = "gogocdn", + MixDrop = "mixdrop", + UpCloud = "upcloud", + VizCloud = "vizcloud", + MyCloud = "mycloud", + Filemoon = "filemoon", +} diff --git a/src/types/controllers/animeAboutInfo.ts b/src/types/controllers/animeAboutInfo.ts new file mode 100644 index 
0000000000000000000000000000000000000000..aa65efd72d9ce2674f8d6df5ea377532d49303a5 --- /dev/null +++ b/src/types/controllers/animeAboutInfo.ts @@ -0,0 +1,3 @@ +export type AnimeAboutInfoQueryParams = { + id?: string; +}; diff --git a/src/types/controllers/animeCategory.ts b/src/types/controllers/animeCategory.ts new file mode 100644 index 0000000000000000000000000000000000000000..7de08767b1324cd792539fb67879dbe98161b4c9 --- /dev/null +++ b/src/types/controllers/animeCategory.ts @@ -0,0 +1,7 @@ +export type CategoryAnimePathParams = { + category?: string; +}; + +export type CategoryAnimeQueryParams = { + page?: string; +}; diff --git a/src/types/controllers/animeEpisodeSrcs.ts b/src/types/controllers/animeEpisodeSrcs.ts new file mode 100644 index 0000000000000000000000000000000000000000..0e82539a2e3405785c5d3962e33728228d368178 --- /dev/null +++ b/src/types/controllers/animeEpisodeSrcs.ts @@ -0,0 +1,7 @@ +import { type AnimeServers } from "../anime.js"; + +export type AnimeEpisodeSrcsQueryParams = { + id?: string; + server?: AnimeServers; + category?: "sub" | "dub"; +}; diff --git a/src/types/controllers/animeEpisodes.ts b/src/types/controllers/animeEpisodes.ts new file mode 100644 index 0000000000000000000000000000000000000000..481eb876a28909c22990f2183d6c6cad0ac19b5a --- /dev/null +++ b/src/types/controllers/animeEpisodes.ts @@ -0,0 +1,3 @@ +export type AnimeEpisodePathParams = { + animeId?: string; +}; diff --git a/src/types/controllers/animeGenre.ts b/src/types/controllers/animeGenre.ts new file mode 100644 index 0000000000000000000000000000000000000000..038d15e77daf866203f37f03a03b6f7c7a1d7b84 --- /dev/null +++ b/src/types/controllers/animeGenre.ts @@ -0,0 +1,7 @@ +export type GenreAnimePathParams = { + name?: string; +}; + +export type GenreAnimeQueryParams = { + page?: string; +}; diff --git a/src/types/controllers/animeProducer.ts b/src/types/controllers/animeProducer.ts new file mode 100644 index 0000000000000000000000000000000000000000..7f8a9a7f97da0a82563950f15e7e3a3ee96dc7af --- /dev/null +++ b/src/types/controllers/animeProducer.ts @@ -0,0 +1,7 @@ +export type AnimeProducerPathParams = { + name?: string; +}; + +export type AnimeProducerQueryParams = { + page?: string; +}; diff --git a/src/types/controllers/animeSearch.ts b/src/types/controllers/animeSearch.ts new file mode 100644 index 0000000000000000000000000000000000000000..fccb24367c49aa024e2f6509cfb597ca86546b54 --- /dev/null +++ b/src/types/controllers/animeSearch.ts @@ -0,0 +1,20 @@ +export type AnimeSearchQueryParams = { + q?: string; + page?: string; + type?: string; + status?: string; + rated?: string; + score?: string; + season?: string; + language?: string; + start_date?: string; + end_date?: string; + sort?: string; + genres?: string; +}; + +export type SearchFilters = Omit; + +export type FilterKeys = Partial< + keyof Omit +>; diff --git a/src/types/controllers/animeSearchSuggestion.ts b/src/types/controllers/animeSearchSuggestion.ts new file mode 100644 index 0000000000000000000000000000000000000000..491daa042dba0f04ecf6146500b72c7d230b671c --- /dev/null +++ b/src/types/controllers/animeSearchSuggestion.ts @@ -0,0 +1,3 @@ +export type AnimeSearchSuggestQueryParams = { + q?: string; +}; diff --git a/src/types/controllers/episodeServers.ts b/src/types/controllers/episodeServers.ts new file mode 100644 index 0000000000000000000000000000000000000000..d711a400426b6a2a6ad2ddf8b76245325a3d1e46 --- /dev/null +++ b/src/types/controllers/episodeServers.ts @@ -0,0 +1,3 @@ +export type EpisodeServersQueryParams = { + 
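+ // e.g. "steinsgate-0-92?ep=2055", the same "<anime-id>?ep=<number>" format accepted by /anime/episode-srcs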
episodeId?: string; +}; diff --git a/src/types/controllers/estimatedSchedule.ts b/src/types/controllers/estimatedSchedule.ts new file mode 100644 index 0000000000000000000000000000000000000000..e732aaab44a1080909fbc6a911d002bb7da74471 --- /dev/null +++ b/src/types/controllers/estimatedSchedule.ts @@ -0,0 +1,3 @@ +export type EstimatedScheduleQueryParams = { + date?: string; +}; diff --git a/src/types/controllers/index.ts b/src/types/controllers/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..14876e289f5517843e962032c44ce73412cb0ec4 --- /dev/null +++ b/src/types/controllers/index.ts @@ -0,0 +1,10 @@ +export type * from "./animeGenre.js"; +export type * from "./animeCategory.js"; +export type * from "./animeProducer.js"; +export type * from "./animeSearch.js"; +export type * from "./animeEpisodes.js"; +export type * from "./episodeServers.js"; +export type * from "./animeAboutInfo.js"; +export type * from "./animeEpisodeSrcs.js"; +export type * from "./estimatedSchedule.js"; +export type * from "./animeSearchSuggestion.js"; diff --git a/src/types/extractor.ts b/src/types/extractor.ts new file mode 100644 index 0000000000000000000000000000000000000000..71c6be15b69aaabe4feb474dfe3253773bd2670f --- /dev/null +++ b/src/types/extractor.ts @@ -0,0 +1,18 @@ +export interface Video { + url: string; + quality?: string; + isM3U8?: boolean; + size?: number; + [x: string]: unknown; +} + +export interface Subtitle { + id?: string; + url: string; + lang: string; +} + +export interface Intro { + start: number; + end: number; +} diff --git a/src/types/parsers/animeAboutInfo.ts b/src/types/parsers/animeAboutInfo.ts new file mode 100644 index 0000000000000000000000000000000000000000..274e7c2b2fdcb4a4f0f2e5ac7e0434b4c1aec316 --- /dev/null +++ b/src/types/parsers/animeAboutInfo.ts @@ -0,0 +1,19 @@ +import type { + Season, + RelatedAnime, + RecommendedAnime, + AnimeGeneralAboutInfo, +} from "../anime.js"; +import { type HttpError } from "http-errors"; +import { type ScrapedAnimeSearchResult } from "./animeSearch.js"; + +export interface ScrapedAnimeAboutInfo + extends Pick { + anime: { + info: AnimeGeneralAboutInfo; + moreInfo: Record; + }; + seasons: Array; + relatedAnimes: Array | HttpError; + recommendedAnimes: Array | HttpError; +} diff --git a/src/types/parsers/animeCategory.ts b/src/types/parsers/animeCategory.ts new file mode 100644 index 0000000000000000000000000000000000000000..89c56f0da9ec363dd7db0aafd25ec2739276d82b --- /dev/null +++ b/src/types/parsers/animeCategory.ts @@ -0,0 +1,22 @@ +import type { HttpError } from "http-errors"; +import type { Anime, Top10Anime } from "../anime.js"; + +export interface ScrapedAnimeCategory { + animes: Array | HttpError; + genres: Array; + top10Animes: { + today: Array | HttpError; + week: Array | HttpError; + month: Array | HttpError; + }; + category: string; + totalPages: number; + currentPage: number; + hasNextPage: boolean; +} + +export type CommonAnimeScrapeTypes = + | "animes" + | "totalPages" + | "hasNextPage" + | "currentPage"; diff --git a/src/types/parsers/animeEpisodeSrcs.ts b/src/types/parsers/animeEpisodeSrcs.ts new file mode 100644 index 0000000000000000000000000000000000000000..ef58ed3c96211d113b1fdd208a6fca92942ddb46 --- /dev/null +++ b/src/types/parsers/animeEpisodeSrcs.ts @@ -0,0 +1,12 @@ +import type { Intro, Subtitle, Video } from "../extractor.js"; + +export interface ScrapedAnimeEpisodesSources { + headers?: { + [k: string]: string; + }; + intro?: Intro; + subtitles?: Subtitle[]; + sources: Video[]; + download?: 
string; + embedURL?: string; +} diff --git a/src/types/parsers/animeEpisodes.ts b/src/types/parsers/animeEpisodes.ts new file mode 100644 index 0000000000000000000000000000000000000000..3587573e525d228320caf47b63ba21dd12ecebb9 --- /dev/null +++ b/src/types/parsers/animeEpisodes.ts @@ -0,0 +1,6 @@ +import { type AnimeEpisode } from "../anime.js"; + +export interface ScrapedAnimeEpisodes { + totalEpisodes: number; + episodes: Array; +} diff --git a/src/types/parsers/animeGenre.ts b/src/types/parsers/animeGenre.ts new file mode 100644 index 0000000000000000000000000000000000000000..10f8b858c9bc9ce2011d920e6ba7bd7f1690415d --- /dev/null +++ b/src/types/parsers/animeGenre.ts @@ -0,0 +1,11 @@ +import type { + ScrapedAnimeCategory, + CommonAnimeScrapeTypes, +} from "./animeCategory.js"; +import { type ScrapedHomePage } from "./homePage.js"; + +export interface ScrapedGenreAnime + extends Pick, + Pick { + genreName: string; +} diff --git a/src/types/parsers/animeProducer.ts b/src/types/parsers/animeProducer.ts new file mode 100644 index 0000000000000000000000000000000000000000..bcd784e77fcfe7edb29c846fbd8675649cf97e3e --- /dev/null +++ b/src/types/parsers/animeProducer.ts @@ -0,0 +1,8 @@ +import type { ScrapedHomePage } from "./homePage.js"; +import type { ScrapedAnimeCategory } from "./animeCategory.js"; + +export interface ScrapedProducerAnime + extends Omit, + Pick { + producerName: string; +} diff --git a/src/types/parsers/animeSearch.ts b/src/types/parsers/animeSearch.ts new file mode 100644 index 0000000000000000000000000000000000000000..7e641eaba0835f6c955931f61dae72ea5aa01a48 --- /dev/null +++ b/src/types/parsers/animeSearch.ts @@ -0,0 +1,14 @@ +import type { + ScrapedAnimeCategory, + CommonAnimeScrapeTypes, +} from "./animeCategory.js"; +import type { HttpError } from "http-errors"; +import type { MostPopularAnime } from "../anime.js"; +import type { SearchFilters } from "../controllers/animeSearch.js"; + +export interface ScrapedAnimeSearchResult + extends Pick { + mostPopularAnimes: Array | HttpError; + searchQuery: string; + searchFilters: SearchFilters; +} diff --git a/src/types/parsers/animeSearchSuggestion.ts b/src/types/parsers/animeSearchSuggestion.ts new file mode 100644 index 0000000000000000000000000000000000000000..4c262a4250d7d7c0c1a0a536897c1c91754566ed --- /dev/null +++ b/src/types/parsers/animeSearchSuggestion.ts @@ -0,0 +1,6 @@ +import type { HttpError } from "http-errors"; +import type { AnimeSearchSuggestion } from "../anime.js"; + +export interface ScrapedAnimeSearchSuggestion { + suggestions: Array | HttpError; +} diff --git a/src/types/parsers/episodeServers.ts b/src/types/parsers/episodeServers.ts new file mode 100644 index 0000000000000000000000000000000000000000..2d8cb5e1c5adbfbc99cd00920e291665b199f19e --- /dev/null +++ b/src/types/parsers/episodeServers.ts @@ -0,0 +1,8 @@ +import type { SubEpisode, DubEpisode } from "../anime.js"; + +export interface ScrapedEpisodeServers { + sub: SubEpisode[]; + dub: DubEpisode[]; + episodeNo: number; + episodeId: string; +} diff --git a/src/types/parsers/estimatedSchedule.ts b/src/types/parsers/estimatedSchedule.ts new file mode 100644 index 0000000000000000000000000000000000000000..f62dd14d596b2fce73513a5464c3f6bfd6aafacb --- /dev/null +++ b/src/types/parsers/estimatedSchedule.ts @@ -0,0 +1,10 @@ +type EstimatedSchedule = { + id: string | null; + time: string | null; + name: string | null; + jname: string | null; +}; + +export type ScrapedEstimatedSchedule = { + scheduledAnimes: Array; +}; diff --git 
a/src/types/parsers/homePage.ts b/src/types/parsers/homePage.ts new file mode 100644 index 0000000000000000000000000000000000000000..5641a9a92759dc76d92a69cb2ef5b185e0da7b40 --- /dev/null +++ b/src/types/parsers/homePage.ts @@ -0,0 +1,18 @@ +import type { + TrendingAnime, + SpotlightAnime, + TopAiringAnime, + TopUpcomingAnime, + LatestEpisodeAnime, +} from "../anime.js"; +import type { HttpError } from "http-errors"; +import type { ScrapedAnimeCategory } from "./animeCategory.js"; + +export interface ScrapedHomePage + extends Pick { + spotlightAnimes: Array | HttpError; + trendingAnimes: Array | HttpError; + latestEpisodeAnimes: Array | HttpError; + topUpcomingAnimes: Array | HttpError; + topAiringAnimes: Array | HttpError; +} diff --git a/src/types/parsers/index.ts b/src/types/parsers/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..057e2fcbee3c556625ab149fe02fd23acbb950b3 --- /dev/null +++ b/src/types/parsers/index.ts @@ -0,0 +1,25 @@ +import type { ScrapedHomePage } from "./homePage.js"; +import type { ScrapedGenreAnime } from "./animeGenre.js"; +import type { ScrapedAnimeEpisodes } from "./animeEpisodes.js"; +import type { ScrapedAnimeCategory } from "./animeCategory.js"; +import type { ScrapedProducerAnime } from "./animeProducer.js"; +import type { ScrapedEpisodeServers } from "./episodeServers.js"; +import type { ScrapedAnimeAboutInfo } from "./animeAboutInfo.js"; +import type { ScrapedAnimeSearchResult } from "./animeSearch.js"; +import type { ScrapedEstimatedSchedule } from "./estimatedSchedule.js"; +import type { ScrapedAnimeEpisodesSources } from "./animeEpisodeSrcs.js"; +import type { ScrapedAnimeSearchSuggestion } from "./animeSearchSuggestion.js"; + +export type { + ScrapedHomePage, + ScrapedGenreAnime, + ScrapedAnimeEpisodes, + ScrapedProducerAnime, + ScrapedAnimeCategory, + ScrapedEpisodeServers, + ScrapedAnimeAboutInfo, + ScrapedAnimeSearchResult, + ScrapedEstimatedSchedule, + ScrapedAnimeEpisodesSources, + ScrapedAnimeSearchSuggestion, +}; diff --git a/src/utils/constants.ts b/src/utils/constants.ts new file mode 100644 index 0000000000000000000000000000000000000000..861505046c3b52d62e292433842b212d1e8fb5ef --- /dev/null +++ b/src/utils/constants.ts @@ -0,0 +1,129 @@ +import { config } from "dotenv"; + +config(); + +export const ACCEPT_ENCODING_HEADER = "gzip, deflate, br"; +export const USER_AGENT_HEADER = + "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4692.71 Safari/537.36"; +export const ACCEPT_HEADER = + "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9"; + +// previously aniwatch.to || aniwatchtv.to +const DOMAIN = process.env.DOMAIN || "hianime.to"; + +export const SRC_BASE_URL = `https://${DOMAIN}`; +export const SRC_AJAX_URL = `${SRC_BASE_URL}/ajax`; +export const SRC_HOME_URL = `${SRC_BASE_URL}/home`; +export const SRC_SEARCH_URL = `${SRC_BASE_URL}/search`; + +// +export const genresIdMap: Record = { + action: 1, + adventure: 2, + cars: 3, + comedy: 4, + dementia: 5, + demons: 6, + drama: 8, + ecchi: 9, + fantasy: 10, + game: 11, + harem: 35, + historical: 13, + horror: 14, + isekai: 44, + josei: 43, + kids: 15, + magic: 16, + "martial-arts": 17, + mecha: 18, + military: 38, + music: 19, + mystery: 7, + parody: 20, + police: 39, + psychological: 40, + romance: 22, + samurai: 21, + school: 23, + "sci-fi": 24, + seinen: 42, + shoujo: 25, + "shoujo-ai": 26, + shounen: 27, + "shounen-ai": 28, + 
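+ // these numeric ids feed the site's "genres" query parameter (joined with commas by getGenresFilterVal in utils/methods.ts)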
"slice-of-life": 36, + space: 29, + sports: 30, + "super-power": 31, + supernatural: 37, + thriller: 41, + vampire: 32, +} as const; + +export const typeIdMap: Record = { + all: 0, + movie: 1, + tv: 2, + ova: 3, + ona: 4, + special: 5, + music: 6, +} as const; + +export const statusIdMap: Record = { + all: 0, + "finished-airing": 1, + "currently-airing": 2, + "not-yet-aired": 3, +} as const; + +export const ratedIdMap: Record = { + all: 0, + g: 1, + pg: 2, + "pg-13": 3, + r: 4, + "r+": 5, + rx: 6, +} as const; + +export const scoreIdMap: Record = { + all: 0, + appalling: 1, + horrible: 2, + "very-bad": 3, + bad: 4, + average: 5, + fine: 6, + good: 7, + "very-good": 8, + great: 9, + masterpiece: 10, +} as const; + +export const seasonIdMap: Record = { + all: 0, + spring: 1, + summer: 2, + fall: 3, + winter: 4, +} as const; + +export const languageIdMap: Record = { + all: 0, + sub: 1, + dub: 2, + "sub-&-dub": 3, +} as const; + +export const sortIdMap: Record = { + default: "default", + "recently-added": "recently_added", + "recently-updated": "recently_updated", + score: "score", + "name-a-z": "name_az", + "released-date": "released_date", + "most-watched": "most_watched", +} as const; +// diff --git a/src/utils/index.ts b/src/utils/index.ts new file mode 100644 index 0000000000000000000000000000000000000000..b514c85e6b10c1b239d56a4aace6c3e180b78dfa --- /dev/null +++ b/src/utils/index.ts @@ -0,0 +1,2 @@ +export * from "./methods.js"; +export * from "./constants.js"; diff --git a/src/utils/methods.ts b/src/utils/methods.ts new file mode 100644 index 0000000000000000000000000000000000000000..9d89822172a5d0869710ffc9ec1a288f749c7ab9 --- /dev/null +++ b/src/utils/methods.ts @@ -0,0 +1,288 @@ +import type { + Anime, + Top10Anime, + MostPopularAnime, + Top10AnimeTimePeriod, +} from "../types/anime.js"; +import type { CheerioAPI, SelectorType } from "cheerio"; +import { + genresIdMap, + languageIdMap, + ratedIdMap, + scoreIdMap, + seasonIdMap, + sortIdMap, + statusIdMap, + typeIdMap, +} from "./constants.js"; +import { type FilterKeys } from "../types/controllers/animeSearch.js"; +import createHttpError, { HttpError } from "http-errors"; + +export const extractAnimes = ( + $: CheerioAPI, + selector: SelectorType +): Array | HttpError => { + try { + const animes: Array = []; + + $(selector).each((i, el) => { + const animeId = + $(el) + .find(".film-detail .film-name .dynamic-name") + ?.attr("href") + ?.slice(1) + .split("?ref=search")[0] || null; + + animes.push({ + id: animeId, + name: $(el) + .find(".film-detail .film-name .dynamic-name") + ?.text() + ?.trim(), + poster: + $(el) + .find(".film-poster .film-poster-img") + ?.attr("data-src") + ?.trim() || null, + duration: $(el) + .find(".film-detail .fd-infor .fdi-item.fdi-duration") + ?.text() + ?.trim(), + type: $(el) + .find(".film-detail .fd-infor .fdi-item:nth-of-type(1)") + ?.text() + ?.trim(), + rating: $(el).find(".film-poster .tick-rate")?.text()?.trim() || null, + episodes: { + sub: + Number( + $(el) + .find(".film-poster .tick-sub") + ?.text() + ?.trim() + .split(" ") + .pop() + ) || null, + dub: + Number( + $(el) + .find(".film-poster .tick-dub") + ?.text() + ?.trim() + .split(" ") + .pop() + ) || null, + }, + }); + }); + + return animes; + } catch (err: any) { + throw createHttpError.InternalServerError( + err?.message || "Something went wrong" + ); + } +}; + +export const extractTop10Animes = ( + $: CheerioAPI, + period: Top10AnimeTimePeriod +): Array | HttpError => { + try { + const animes: Array = []; + const selector = 
`#top-viewed-${period} ul li`; + + $(selector).each((i, el) => { + animes.push({ + id: + $(el) + .find(".film-detail .dynamic-name") + ?.attr("href") + ?.slice(1) + .trim() || null, + rank: Number($(el).find(".film-number span")?.text()?.trim()) || null, + name: $(el).find(".film-detail .dynamic-name")?.text()?.trim() || null, + poster: + $(el) + .find(".film-poster .film-poster-img") + ?.attr("data-src") + ?.trim() || null, + episodes: { + sub: + Number( + $(el) + .find(".film-detail .fd-infor .tick-item.tick-sub") + ?.text() + ?.trim() + ) || null, + dub: + Number( + $(el) + .find(".film-detail .fd-infor .tick-item.tick-dub") + ?.text() + ?.trim() + ) || null, + }, + }); + }); + + return animes; + } catch (err: any) { + throw createHttpError.InternalServerError( + err?.message || "Something went wrong" + ); + } +}; + +export const extractMostPopularAnimes = ( + $: CheerioAPI, + selector: SelectorType +): Array | HttpError => { + try { + const animes: Array = []; + + $(selector).each((i, el) => { + animes.push({ + id: + $(el) + .find(".film-detail .dynamic-name") + ?.attr("href") + ?.slice(1) + .trim() || null, + name: $(el).find(".film-detail .dynamic-name")?.text()?.trim() || null, + poster: + $(el) + .find(".film-poster .film-poster-img") + ?.attr("data-src") + ?.trim() || null, + jname: + $(el) + .find(".film-detail .film-name .dynamic-name") + .attr("data-jname") + ?.trim() || null, + + episodes: { + sub: + Number($(el)?.find(".fd-infor .tick .tick-sub")?.text()?.trim()) || + null, + dub: + Number($(el)?.find(".fd-infor .tick .tick-dub")?.text()?.trim()) || + null, + }, + type: + $(el) + ?.find(".fd-infor .tick") + ?.text() + ?.trim() + ?.replace(/[\s\n]+/g, " ") + ?.split(" ") + ?.pop() || null, + }); + }); + + return animes; + } catch (err: any) { + throw createHttpError.InternalServerError( + err?.message || "Something went wrong" + ); + } +}; + +export function retrieveServerId( + $: CheerioAPI, + index: number, + category: "sub" | "dub" +) { + return ( + $(`.ps_-block.ps_-block-sub.servers-${category} > .ps__-list .server-item`) + ?.map((_, el) => + $(el).attr("data-server-id") == `${index}` ? $(el) : null + ) + ?.get()[0] + ?.attr("data-id") || null + ); +} + +function getGenresFilterVal(genreNames: string[]): string | undefined { + if (genreNames.length < 1) { + return undefined; + } + return genreNames.map((name) => genresIdMap[name]).join(","); +} + +export function getSearchFilterValue( + key: FilterKeys, + rawValue: string +): string | undefined { + rawValue = rawValue.trim(); + if (!rawValue) return undefined; + + switch (key) { + case "genres": { + return getGenresFilterVal(rawValue.split(",")); + } + case "type": { + const val = typeIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "status": { + const val = statusIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "rated": { + const val = ratedIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "score": { + const val = scoreIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "season": { + const val = seasonIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "language": { + const val = languageIdMap[rawValue] ?? 0; + return val === 0 ? undefined : `${val}`; + } + case "sort": { + return sortIdMap[rawValue] ?? 
undefined; + } + default: + return undefined; + } +} + +// this fn tackles both start_date and end_date +export function getSearchDateFilterValue( + isStartDate: boolean, + rawValue: string +): string[] | undefined { + rawValue = rawValue.trim(); + if (!rawValue) return undefined; + + const dateRegex = /^\d{4}-([0-9]|1[0-2])-([0-9]|[12][0-9]|3[01])$/; + const dateCategory = isStartDate ? "s" : "e"; + const [year, month, date] = rawValue.split("-"); + + if (!dateRegex.test(rawValue)) { + return undefined; + } + + // sample return -> [sy=2023, sm=10, sd=11] + return [ + Number(year) > 0 ? `${dateCategory}y=${year}` : "", + Number(month) > 0 ? `${dateCategory}m=${month}` : "", + Number(date) > 0 ? `${dateCategory}d=${date}` : "", + ].filter((d) => Boolean(d)); +} + +export function substringAfter(str: string, toFind: string) { + const index = str.indexOf(toFind); + return index == -1 ? "" : str.substring(index + toFind.length); +} + +export function substringBefore(str: string, toFind: string) { + const index = str.indexOf(toFind); + return index == -1 ? "" : str.substring(0, index); +} diff --git a/test/animeAboutInfo.test.ts b/test/animeAboutInfo.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..2cdc0cb8639f3ee52eb153e9daa57a39783c4a51 --- /dev/null +++ b/test/animeAboutInfo.test.ts @@ -0,0 +1,11 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeAboutInfo } from "../src/parsers/index.js"; + +test("returns information about an anime", async () => { + const data = await scrapeAnimeAboutInfo("steinsgate-3"); + + expect(data.anime.info.name).not.toEqual(null); + expect(data.recommendedAnimes).not.toEqual([]); + expect(data.mostPopularAnimes).not.toEqual([]); + expect(Object.keys(data.anime.moreInfo)).not.toEqual([]); +}); diff --git a/test/animeCategory.test.ts b/test/animeCategory.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..55ad8734d1e9b037b662a0411180de9540dde851 --- /dev/null +++ b/test/animeCategory.test.ts @@ -0,0 +1,12 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeCategory } from "../src/parsers/index.js"; + +test("returns animes belonging to a category", async () => { + const data = await scrapeAnimeCategory("subbed-anime"); + + expect(data.animes).not.toEqual([]); + expect(data.genres).not.toEqual([]); + expect(data.top10Animes.today).not.toEqual([]); + expect(data.top10Animes.week).not.toEqual([]); + expect(data.top10Animes.month).not.toEqual([]); +}); diff --git a/test/animeEpisodeSrcs.test.ts b/test/animeEpisodeSrcs.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..3aed84217bf852dcba1a9ee5f00d89dff68044b0 --- /dev/null +++ b/test/animeEpisodeSrcs.test.ts @@ -0,0 +1,13 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeEpisodeSources } from "../src/parsers/index.js"; + +test("returns anime episode streaming link(s)", async () => { + const data = await scrapeAnimeEpisodeSources( + "steinsgate-3?ep=230", + "vidstreaming", + "sub" + ); + + expect(data.sources).not.toEqual([]); + // expect(data) +}); diff --git a/test/animeEpisodes.test.ts b/test/animeEpisodes.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..f727056766d00b00516e0e008b0939483a5ca7fc --- /dev/null +++ b/test/animeEpisodes.test.ts @@ -0,0 +1,9 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeEpisodes } from "../src/parsers/index.js"; + +test("returns episodes info of an anime", async () => { + const data = await scrapeAnimeEpisodes("steinsgate-3"); + + 
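+ // a finished series is expected to report a non-zero total and a populated episode list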
expect(data.totalEpisodes).not.toEqual(0); + expect(data.episodes).not.toEqual([]); +}); diff --git a/test/animeGenre.test.ts b/test/animeGenre.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..9e6610a5c32286b159cd3e1423786ef6f3ec2dbc --- /dev/null +++ b/test/animeGenre.test.ts @@ -0,0 +1,10 @@ +import { expect, test } from "vitest"; +import { scrapeGenreAnime } from "../src/parsers/index.js"; + +test("returns animes belonging to a genre", async () => { + const data = await scrapeGenreAnime("shounen", 2); + + expect(data.animes).not.toEqual([]); + expect(data.genres).not.toEqual([]); + expect(data.topAiringAnimes).not.toEqual([]); +}); diff --git a/test/animeProducer.test.ts b/test/animeProducer.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..ae17c836a5bcd3c03a09136cc6ea3602e32c9fe1 --- /dev/null +++ b/test/animeProducer.test.ts @@ -0,0 +1,12 @@ +import { expect, test } from "vitest"; +import { scrapeProducerAnimes } from "../src/parsers/index.js"; + +test("returns animes produced by a producer", async () => { + const data = await scrapeProducerAnimes("toei-animation", 2); + + expect(data.animes).not.toEqual([]); + expect(data.topAiringAnimes).not.toEqual([]); + expect(data.top10Animes.today).not.toEqual([]); + expect(data.top10Animes.week).not.toEqual([]); + expect(data.top10Animes.month).not.toEqual([]); +}); diff --git a/test/animeSearch.test.ts b/test/animeSearch.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..efd264183e3f519011df4502cfcdd0f46c55e960 --- /dev/null +++ b/test/animeSearch.test.ts @@ -0,0 +1,9 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeSearch } from "../src/parsers/index.js"; + +test("returns animes related to search query", async () => { + const data = await scrapeAnimeSearch("monster", 2); + + expect(data.animes).not.toEqual([]); + expect(data.mostPopularAnimes).not.toEqual([]); +}); diff --git a/test/animeSearchSuggestion.test.ts b/test/animeSearchSuggestion.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..f3a875aeb34d6d6bfefa434788fa4f73cc63b660 --- /dev/null +++ b/test/animeSearchSuggestion.test.ts @@ -0,0 +1,8 @@ +import { expect, test } from "vitest"; +import { scrapeAnimeSearchSuggestion } from "../src/parsers/index.js"; + +test("returns animes search suggestions related to search query", async () => { + const data = await scrapeAnimeSearchSuggestion("one piece"); + + expect(data.suggestions).not.toEqual([]); +}); diff --git a/test/episodeServers.test.ts b/test/episodeServers.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..73cc5cabb2d4a420e3f19341f550d4511a4c4411 --- /dev/null +++ b/test/episodeServers.test.ts @@ -0,0 +1,11 @@ +import { expect, test } from "vitest"; +import { scrapeEpisodeServers } from "../src/parsers/index.js"; + +test("returns episode source servers", async () => { + const data = await scrapeEpisodeServers("steinsgate-0-92?ep=2055"); + + expect(data.episodeId).not.toEqual(null); + expect(data.episodeNo).not.toEqual(0); + expect(data.sub).not.toEqual([]); + expect(data.dub).not.toEqual([]); +}); diff --git a/test/estimatedSchedule.test.ts b/test/estimatedSchedule.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..cd5f4171fbe84505947cdc3d254b53c1065344fe --- /dev/null +++ b/test/estimatedSchedule.test.ts @@ -0,0 +1,8 @@ +import { expect, test } from "vitest"; +import { scrapeEstimatedSchedule } from "../src/parsers/index.js"; + +test("returns estimated schedule 
anime release", async () => { + const data = await scrapeEstimatedSchedule("2024-03-30"); + + expect(data.scheduledAnimes).not.toEqual([]); +}); diff --git a/test/homePage.test.ts b/test/homePage.test.ts new file mode 100644 index 0000000000000000000000000000000000000000..192a060c92c7cbb40d5eeba24f5f0a9d3e7e698d --- /dev/null +++ b/test/homePage.test.ts @@ -0,0 +1,17 @@ +import { expect, test } from "vitest"; +import { scrapeHomePage } from "../src/parsers/index.js"; + +test("returns anime information present in homepage", async () => { + const data = await scrapeHomePage(); + + expect(data.spotlightAnimes).not.toEqual([]); + expect(data.trendingAnimes).not.toEqual([]); + expect(data.latestEpisodeAnimes).not.toEqual([]); + expect(data.topUpcomingAnimes).not.toEqual([]); + expect(data.topAiringAnimes).not.toEqual([]); + expect(data.genres).not.toEqual([]); + + expect(data.top10Animes.today).not.toEqual([]); + expect(data.top10Animes.week).not.toEqual([]); + expect(data.top10Animes.month).not.toEqual([]); +}); diff --git a/tsconfig.json b/tsconfig.json new file mode 100644 index 0000000000000000000000000000000000000000..278db51ddec3a3dd83006b3d961b4522f16995d3 --- /dev/null +++ b/tsconfig.json @@ -0,0 +1,38 @@ +{ + "compilerOptions": { + "esModuleInterop": true, + "skipLibCheck": true, + "target": "ES2022", + "verbatimModuleSyntax": true, + "allowJs": true, + "resolveJsonModule": true, + "moduleDetection": "force", + // + "strict": true, + // "noUncheckedIndexedAccess": true, + // + "moduleResolution": "NodeNext", + "module": "NodeNext", + "outDir": "./dist", + "rootDir": "./", + "sourceMap": true, + // + "declaration": true, + "removeComments": true, + "forceConsistentCasingInFileNames": true, + "strictFunctionTypes": true, + "lib": [ + "ES2022" + ] + }, + "include": [ + "./src", + "./api" + ], + "ts-node": { + "esm": true + }, + "exclude": [ + "node_modules", + ], +} \ No newline at end of file diff --git a/vercel.json b/vercel.json new file mode 100644 index 0000000000000000000000000000000000000000..2bc8c24ea1103ca6ebebf848359ca2a02e1c9e9c --- /dev/null +++ b/vercel.json @@ -0,0 +1,8 @@ +{ + "rewrites": [ + { + "source": "(.*)", + "destination": "/api" + } + ] +} \ No newline at end of file diff --git a/vitest.config.ts b/vitest.config.ts new file mode 100644 index 0000000000000000000000000000000000000000..9b2eb0fcf9b73db9e01f3c19b7b3d591fa1f1499 --- /dev/null +++ b/vitest.config.ts @@ -0,0 +1,9 @@ +import { defineConfig } from "vitest/config"; + +export default defineConfig({ + test: { + name: "aniwatch-api", + environment: "node", + testTimeout: 6000, + }, +});