repo | readme | description | topics | releases | contributors | pulls | commits | issues | branches | workflows |
---|---|---|---|---|---|---|---|---|---|---|
MycroftAI/mimic3 | Mimic 3 A fast and local neural text to speech system developed by Mycroft for the Mark II . Available voices Documentation How does it work? Quickstart Mycroft TTS Plugin
```sh
# Install system packages
sudo apt-get install libespeak-ng1

# Ensure that you're using the latest pip
mycroft-pip install --upgrade pip

# Install plugin
mycroft-pip install mycroft-plugin-tts-mimic3[all]

# Activate plugin
mycroft-config set tts.module mimic3_tts_plug

# Start mycroft
mycroft-start all
```
See documentation for more details. Web Server
```sh
mkdir -p "${HOME}/.local/share/mycroft/mimic3"
chmod a+rwx "${HOME}/.local/share/mycroft/mimic3"
docker run \
-it \
-p 59125:59125 \
-v "${HOME}/.local/share/mycroft/mimic3:/home/mimic3/.local/share/mycroft/mimic3" \
'mycroftai/mimic3'
```
Visit http://localhost:59125 or from another terminal:
```sh
curl -X POST --data 'Hello world.' --output - localhost:59125/api/tts | aplay
```
See documentation for more details. Command-Line Tool
```sh
# Install system packages
sudo apt-get install libespeak-ng1

# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip3 install --upgrade pip
pip3 install mycroft-mimic3-tts[all]
```
Now you can run:
```sh
mimic3 'Hello world.' | aplay
```
Use mimic3-server and mimic3 --remote ... for repeated usage (much faster). See documentation for more details. License Mimic 3 is available under the AGPL v3 license | A fast local neural text to speech engine for Mycroft | [] | 3 | 10 | 9 | 248 | 39 | 1 | 0 |
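Following the Mimic 3 README above: for the repeated-usage workflow it mentions, the server/client split might look like this. A hedged sketch — it assumes mimic3-server listens on its default local port and that --remote targets it; consult mimic3 --help for the exact flags:
```sh
# Start the synthesis server once; it keeps the voice loaded in memory.
mimic3-server &

# Each subsequent call reuses the warm server instead of reloading the model.
mimic3 --remote 'Hello world.' | aplay
mimic3 --remote 'A second line, synthesized much faster.' | aplay
```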
pocketbase/pocketbase | PocketBase is an open source Go backend, consisting of: embedded database ( SQLite ) with realtime subscriptions built-in files and users management convenient Admin dashboard UI and simple REST-ish API For documentation and examples, please visit https://pocketbase.io/docs. [!WARNING]
Please keep in mind that PocketBase is still under active development
and therefore full backward compatibility is not guaranteed before reaching v1.0.0. API SDK clients The easiest way to interact with the API is to use one of the official SDK clients: JavaScript - pocketbase/js-sdk ( browser and node ) Dart - pocketbase/dart-sdk ( web, mobile, desktop ) Overview Use as standalone app You could download the prebuilt executable for your platform from the Releases page .
Once downloaded, extract the archive and run ./pocketbase serve in the extracted directory. The prebuilt executables are based on the examples/base/main.go file and come with the JS VM plugin enabled by default, which allows you to extend PocketBase with JavaScript ( for more details please refer to Extend with JavaScript ). Use as a Go framework/toolkit PocketBase is distributed as a regular Go library package which allows you to build
your own custom app specific business logic and still have a single portable executable at the end. Here is a minimal example: Install Go 1.21+ ( if you haven't already ) Create a new project directory with the following main.go file inside it:
```go
package main

import (
	"log"
	"net/http"

	"github.com/labstack/echo/v5"
	"github.com/pocketbase/pocketbase"
	"github.com/pocketbase/pocketbase/apis"
	"github.com/pocketbase/pocketbase/core"
)

func main() {
	app := pocketbase.New()

	app.OnBeforeServe().Add(func(e *core.ServeEvent) error {
		// add new "GET /hello" route to the app router (echo)
		e.Router.AddRoute(echo.Route{
			Method: http.MethodGet,
			Path:   "/hello",
			Handler: func(c echo.Context) error {
				return c.String(200, "Hello world!")
			},
			Middlewares: []echo.MiddlewareFunc{
				apis.ActivityLogger(app),
			},
		})

		return nil
	})

	if err := app.Start(); err != nil {
		log.Fatal(err)
	}
}
``` To init the dependencies, run go mod init myapp && go mod tidy . To start the application, run go run main.go serve . To build a statically linked executable, you can run CGO_ENABLED=0 go build and then start the created executable with ./myapp serve . [!NOTE]
PocketBase embeds SQLite, but doesn't require CGO. If CGO is enabled (i.e. CGO_ENABLED=1 ), it will use the mattn/go-sqlite3 driver; otherwise it falls back to modernc.org/sqlite .
Enable CGO only if you really need to squeeze the read/write query performance at the expense of complicating cross compilation. For more details please refer to Extend with Go . Building and running the repo main.go example To build the minimal standalone executable, like the prebuilt ones in the releases page, you can simply run go build inside the examples/base directory: Install Go 1.21+ ( if you haven't already ) Clone/download the repo Navigate to examples/base Run GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build ( https://go.dev/doc/install/source#environment ) Start the created executable by running ./base serve . Note that the supported build targets by the pure Go SQLite driver at the moment are: darwin amd64
darwin arm64
freebsd amd64
freebsd arm64
linux 386
linux amd64
linux arm
linux arm64
linux ppc64le
linux riscv64
linux s390x
windows amd64
windows arm64 Testing PocketBase comes with a mixed bag of unit and integration tests.
To run them, use the standard go test command: sh
go test ./... Check also the Testing guide to learn how to write your own custom application tests. Security If you discover a security vulnerability within PocketBase, please send an e-mail to support at pocketbase.io . All reports will be promptly addressed, and you'll be credited accordingly. Contributing PocketBase is a free and open source project licensed under the MIT License .
You are free to do whatever you want with it, even offering it as a paid service. You can help continue its development by: Contribute to the source code Suggest new features and report issues PRs for new OAuth2 providers, bug fixes, code optimizations and documentation improvements are more than welcome. But please refrain from creating PRs for new features without previously discussing the implementation details.
PocketBase has a roadmap and I try to work on issues in a specific order, and such PRs often come out of nowhere and skew all the initial planning with tedious back-and-forth communication. Don't get upset if I close your PR, even if it is well executed and tested. This doesn't mean that it will never be merged.
Later we can always refer to it and/or take pieces of your implementation when the time comes to work on the issue (don't worry, you'll be credited in the release notes). | Open Source realtime backend in 1 file | authentication,backend,realtime,golang | 138 | 48 | 199 | 1,397 | 40 | 3 | 1 |
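To consolidate the framework steps from the PocketBase README above into one session (a sketch — the module name myapp is arbitrary, and the commands are the ones quoted in the README):
```sh
# Initialize the module and fetch dependencies for main.go.
go mod init myapp && go mod tidy

# Run the app directly during development.
go run main.go serve

# Build a statically linked executable using the pure Go SQLite driver.
CGO_ENABLED=0 go build
./myapp serve
```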
huggingface/diffusers | 🤗 Diffusers is the go-to library for state-of-the-art pretrained diffusion models for generating images, audio, and even 3D structures of molecules. Whether you're looking for a simple inference solution or training your own diffusion models, 🤗 Diffusers is a modular toolbox that supports both. Our library is designed with a focus on [usability over performance](https://huggingface.co/docs/diffusers/conceptual/philosophy#usability-over-performance), [simple over easy](https://huggingface.co/docs/diffusers/conceptual/philosophy#simple-over-easy), and [customizability over abstractions](https://huggingface.co/docs/diffusers/conceptual/philosophy#tweakable-contributorfriendly-over-abstraction).
🤗 Diffusers offers three core components:
- State-of-the-art [diffusion pipelines](https://huggingface.co/docs/diffusers/api/pipelines/overview) that can be run in inference with just a few lines of code.
- Interchangeable noise [schedulers](https://huggingface.co/docs/diffusers/api/schedulers/overview) for different diffusion speeds and output quality.
- Pretrained [models](https://huggingface.co/docs/diffusers/api/models/overview) that can be used as building blocks, and combined with schedulers, for creating your own end-to-end diffusion systems.
## Installation
We recommend installing 🤗 Diffusers in a virtual environment from PyPI or Conda. For more details about installing [PyTorch](https://pytorch.org/get-started/locally/) and [Flax](https://flax.readthedocs.io/en/latest/#installation), please refer to their official documentation.
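For example, a virtual-environment install might look like this (a minimal sketch; the environment name .env is arbitrary):
```sh
# Create and activate an isolated environment.
python -m venv .env
source .env/bin/activate

# Install Diffusers with the PyTorch extras (see the commands below).
pip install --upgrade diffusers[torch]
```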
### PyTorch
With `pip` (official package):
```bash
pip install --upgrade diffusers[torch]
```
With `conda` (maintained by the community):
```sh
conda install -c conda-forge diffusers
```
### Flax
With `pip` (official package):
```bash
pip install --upgrade diffusers[flax]
```
### Apple Silicon (M1/M2) support
Please refer to the [How to use Stable Diffusion in Apple Silicon](https://huggingface.co/docs/diffusers/optimization/mps) guide.
## Quickstart
Generating outputs is super easy with 🤗 Diffusers. To generate an image from text, use the `from_pretrained` method to load any pretrained diffusion model (browse the [Hub](https://huggingface.co/models?library=diffusers&sort=downloads) for 25,000+ checkpoints):
```python
from diffusers import DiffusionPipeline
import torch
pipeline = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16)
pipeline.to("cuda")
pipeline("An image of a squirrel in Picasso style").images[0]
```
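The snippet above moves the pipeline to "cuda", so it assumes a CUDA-capable GPU. If you're unsure whether your PyTorch build can see one, a quick check from the shell (a one-line sketch) is:
```sh
python -c "import torch; print(torch.cuda.is_available())"
```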
You can also dig into the models and schedulers toolbox to build your own diffusion system:
```python
from diffusers import DDPMScheduler, UNet2DModel
from PIL import Image
import torch
scheduler = DDPMScheduler.from_pretrained("google/ddpm-cat-256")
model = UNet2DModel.from_pretrained("google/ddpm-cat-256").to("cuda")
scheduler.set_timesteps(50)
sample_size = model.config.sample_size
noise = torch.randn((1, 3, sample_size, sample_size), device="cuda")
input = noise
for t in scheduler.timesteps:
    with torch.no_grad():
        noisy_residual = model(input, t).sample
    prev_noisy_sample = scheduler.step(noisy_residual, t, input).prev_sample
    input = prev_noisy_sample
image = (input / 2 + 0.5).clamp(0, 1)
image = image.cpu().permute(0, 2, 3, 1).numpy()[0]
image = Image.fromarray((image * 255).round().astype("uint8"))
image
```
Check out the [Quickstart](https://huggingface.co/docs/diffusers/quicktour) to launch your diffusion journey today!
## How to navigate the documentation
| **Documentation** | **What can I learn?** |
|---------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Tutorial](https://huggingface.co/docs/diffusers/tutorials/tutorial_overview) | A basic crash course for learning how to use the library's most important features like using models and schedulers to build your own diffusion system, and training your own diffusion model. |
| [Loading](https://huggingface.co/docs/diffusers/using-diffusers/loading_overview) | Guides for how to load and configure all the components (pipelines, models, and schedulers) of the library, as well as how to use different schedulers. |
| [Pipelines for inference](https://huggingface.co/docs/diffusers/using-diffusers/pipeline_overview) | Guides for how to use pipelines for different inference tasks, batched generation, controlling generated outputs and randomness, and how to contribute a pipeline to the library. |
| [Optimization](https://huggingface.co/docs/diffusers/optimization/opt_overview) | Guides for how to optimize your diffusion model to run faster and consume less memory. |
| [Training](https://huggingface.co/docs/diffusers/training/overview) | Guides for how to train a diffusion model for different tasks with different training techniques. |
## Contribution
We ❤️ contributions from the open-source community!
If you want to contribute to this library, please check out our [Contribution guide](https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md).
You can look out for [issues](https://github.com/huggingface/diffusers/issues) you'd like to tackle to contribute to the library.
- See [Good first issues](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) for general opportunities to contribute
- See [New model/pipeline](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22New+pipeline%2Fmodel%22) to contribute exciting new diffusion models / diffusion pipelines
- See [New scheduler](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22New+scheduler%22)
Also, say 👋 in our public Discord channel . We discuss the hottest trends about diffusion models, help each other with contributions, personal projects or just hang out ☕.
## Popular Tasks & Pipelines
| Task | Pipeline | 🤗 Hub |
|---|---|---|
| Unconditional Image Generation | DDPM | google/ddpm-ema-church-256 |
| Text-to-Image | Stable Diffusion Text-to-Image | runwayml/stable-diffusion-v1-5 |
| Text-to-Image | unCLIP | kakaobrain/karlo-v1-alpha |
| Text-to-Image | DeepFloyd IF | DeepFloyd/IF-I-XL-v1.0 |
| Text-to-Image | Kandinsky | kandinsky-community/kandinsky-2-2-decoder |
| Text-guided Image-to-Image | ControlNet | lllyasviel/sd-controlnet-canny |
| Text-guided Image-to-Image | InstructPix2Pix | timbrooks/instruct-pix2pix |
| Text-guided Image-to-Image | Stable Diffusion Image-to-Image | runwayml/stable-diffusion-v1-5 |
| Text-guided Image Inpainting | Stable Diffusion Inpainting | runwayml/stable-diffusion-inpainting |
| Image Variation | Stable Diffusion Image Variation | lambdalabs/sd-image-variations-diffusers |
| Super Resolution | Stable Diffusion Upscale | stabilityai/stable-diffusion-x4-upscaler |
| Super Resolution | Stable Diffusion Latent Upscale | stabilityai/sd-x2-latent-upscaler |

## Popular libraries using 🧨 Diffusers
- https://github.com/microsoft/TaskMatrix
- https://github.com/invoke-ai/InvokeAI
- https://github.com/apple/ml-stable-diffusion
- https://github.com/Sanster/lama-cleaner
- https://github.com/IDEA-Research/Grounded-Segment-Anything
- https://github.com/ashawkey/stable-dreamfusion
- https://github.com/deep-floyd/IF
- https://github.com/bentoml/BentoML
- https://github.com/bmaltais/kohya_ss
- 11,000+ other amazing GitHub repositories 💪
Thank you for using us ❤️.
## Credits
This library concretizes previous work by many different authors and would not have been possible without their great research and implementations. We'd like to thank, in particular, the following implementations which have helped us in our development and without which the API could not have been as polished today:
- @CompVis' latent diffusion models library, available [here](https://github.com/CompVis/latent-diffusion)
- @hojonathanho's original DDPM implementation, available [here](https://github.com/hojonathanho/diffusion), as well as the extremely useful translation into PyTorch by @pesser, available [here](https://github.com/pesser/pytorch_diffusion)
- @ermongroup's DDIM implementation, available [here](https://github.com/ermongroup/ddim)
- @yang-song's Score-VE and Score-VP implementations, available [here](https://github.com/yang-song/score_sde_pytorch)
We also want to thank @heejkoo for the very helpful overview of papers, code and resources on diffusion models, available [here](https://github.com/heejkoo/Awesome-Diffusion-Models) as well as @crowsonkb and @rromb for useful discussions and insights.
## Citation
```bibtex
@misc{von-platen-etal-2022-diffusers,
author = {Patrick von Platen and Suraj Patil and Anton Lozhkov and Pedro Cuenca and Nathan Lambert and Kashif Rasul and Mishig Davaadorj and Dhruv Nair and Sayak Paul and William Berman and Yiyi Xu and Steven Liu and Thomas Wolf},
title = {Diffusers: State-of-the-art diffusion models},
year = {2022},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/huggingface/diffusers}}
}
``` | 🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX. | deep-learning,diffusion,image-generation,pytorch,score-based-generative-modeling,image2image,text2image,stable-diffusion,stable-diffusion-diffusers,hacktoberfest | 72 | 857 | 4,369 | 4,246 | 395 | 360 | 24 |
charmbracelet/gum | Gum A tool for glamorous shell scripts. Leverage the power of Bubbles and Lip
Gloss in your scripts and aliases
without writing any Go code! The above example is running from a single shell script ( source ). Tutorial Gum provides highly configurable, ready-to-use utilities to help you write
useful shell scripts and dotfiles aliases with just a few lines of code.
Let's build a simple script to help you write Conventional Commits for your dotfiles. Ask for the commit type with gum choose:
```bash
gum choose "fix" "feat" "docs" "style" "refactor" "test" "chore" "revert"
```
[!NOTE]
This command itself will print to stdout which is not all that useful. To make use of the command later on you can save the stdout to a $VARIABLE or file.txt . Prompt for the scope of these changes:
```bash
gum input --placeholder "scope"
```
Prompt for the summary and description of changes:
```bash
gum input --value "$TYPE$SCOPE: " --placeholder "Summary of this change"
gum write --placeholder "Details of this change"
```
Confirm before committing:
```bash
gum confirm "Commit changes?" && git commit -m "$SUMMARY" -m "$DESCRIPTION"
```
Check out the complete example for combining these commands in a single script. Installation Use a package manager:
```bash
# macOS or Linux
brew install gum

# Arch Linux (btw)
pacman -S gum

# Nix
nix-env -iA nixpkgs.gum

# Windows (via WinGet or Scoop)
winget install charmbracelet.gum
scoop install charm-gum
``` Debian/Ubuntu ```bash
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
sudo apt update && sudo apt install gum
``` Fedora/RHEL ```bash
echo '[charm]
name=Charm
baseurl=https://repo.charm.sh/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo
sudo yum install gum
``` Or download it: Packages are available in Debian, RPM, and Alpine formats Binaries are available for Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD Or just install it with go : bash
go install github.com/charmbracelet/gum@latest Commands choose : Choose an option from a list of choices confirm : Ask a user to confirm an action file : Pick a file from a folder filter : Filter items from a list format : Format a string using a template input : Prompt for some input join : Join text vertically or horizontally pager : Scroll through a file spin : Display spinner while running a command style : Apply coloring, borders, spacing to text table : Render a table of data write : Prompt for long-form text log : Log messages to output Customization You can customize gum options and styles with --flags and $ENVIRONMENT_VARIABLES .
See gum <command> --help for a full view of each command's customization and configuration options. Customize with --flags :
```bash gum input --cursor.foreground "#FF0" \
--prompt.foreground "#0FF" \
--placeholder "What's up?" \
--prompt "* " \
--width 80 \
--value "Not much, hby?"
``` Customize with ENVIRONMENT_VARIABLES : ```bash
export GUM_INPUT_CURSOR_FOREGROUND="#FF0"
export GUM_INPUT_PROMPT_FOREGROUND="#0FF"
export GUM_INPUT_PLACEHOLDER="What's up?"
export GUM_INPUT_PROMPT="* "
export GUM_INPUT_WIDTH=80

# --flags can override values set with environment
gum input
``` Input Prompt for input with a simple command. bash
gum input > answer.txt
gum input --password > password.txt Write Prompt for some multi-line text ( ctrl+d to complete text entry). bash
gum write > story.txt Filter Filter a list of values with fuzzy matching: bash
echo Strawberry >> flavors.txt
echo Banana >> flavors.txt
echo Cherry >> flavors.txt
gum filter < flavors.txt > selection.txt Select multiple options with the --limit flag or --no-limit flag. Use tab or ctrl+space to select, enter to confirm. bash
cat flavors.txt | gum filter --limit 2
cat flavors.txt | gum filter --no-limit Choose Choose an option from a list of choices. bash
echo "Pick a card, any card..."
CARD=$(gum choose --height 15 {{A,K,Q,J},{10..2}}" "{♠,♥,♣,♦})
echo "Was your card the $CARD?" You can also select multiple items with the --limit or --no-limit flag, which determines
the maximum number of items that can be chosen. bash
cat songs.txt | gum choose --limit 5
cat foods.txt | gum choose --no-limit --header "Grocery Shopping" Confirm Confirm whether to perform an action. Exits with code 0 (affirmative) or 1 (negative) depending on selection. bash
gum confirm && rm file.txt || echo "File not removed" File Prompt the user to select a file from the file tree. bash
$EDITOR $(gum file $HOME) Pager Scroll through a long document with line numbers and a fully customizable viewport. bash
gum pager < README.md Spin Display a spinner while running a script or command. The spinner will
automatically stop after the given command exits. To view or pipe the command's output, use the --show-output flag. bash
gum spin --spinner dot --title "Buying Bubble Gum..." -- sleep 5 Available spinner types include: line , dot , minidot , jump , pulse , points , globe , moon , monkey , meter , hamburger . Table Select a row from some tabular data. bash
gum table < flavors.csv | cut -d ',' -f 1 Style Pretty print any string with any layout with one command. bash
gum style \
--foreground 212 --border-foreground 212 --border double \
--align center --width 50 --margin "1 2" --padding "2 4" \
'Bubble Gum (1¢)' 'So sweet and so fresh!' Join Combine text vertically or horizontally. Use this command with gum style to
build layouts and pretty output. Tip: Always wrap the output of gum style in quotes to preserve newlines
( \n ) when using it as an argument in the join command. ```bash
I=$(gum style --padding "1 5" --border double --border-foreground 212 "I")
LOVE=$(gum style --padding "1 4" --border double --border-foreground 57 "LOVE")
BUBBLE=$(gum style --padding "1 8" --border double --border-foreground 255 "Bubble")
GUM=$(gum style --padding "1 5" --border double --border-foreground 240 "Gum") I_LOVE=$(gum join "$I" "$LOVE")
BUBBLE_GUM=$(gum join "$BUBBLE" "$GUM")
gum join --align center --vertical "$I_LOVE" "$BUBBLE_GUM"
``` Format format processes and formats bodies of text. gum format can parse markdown,
template strings, and named emojis.
```bash
# Format some markdown
gum format -- "# Gum Formats" "- Markdown" "- Code" "- Template" "- Emoji"
echo "# Gum Formats\n- Markdown\n- Code\n- Template\n- Emoji" | gum format

# Syntax highlight some code
cat main.go | gum format -t code

# Render text any way you want with templates
echo '{{ Bold "Tasty" }} {{ Italic "Bubble" }} {{ Color "99" "0" " Gum " }}' \
| gum format -t template

# Display your favorite emojis!
echo 'I :heart: Bubble Gum :candy:' | gum format -t emoji
```
For more information on template helpers, see the Termenv
docs . For a full list of
named emojis see the GitHub API . Log log logs messages to the terminal at different levels, with styling provided by
the charmbracelet/log library.
```bash
# Log some debug information.
gum log --structured --level debug "Creating file..." name file.txt
# DEBUG Unable to create file. name=temp.txt

# Log some error.
gum log --structured --level error "Unable to create file." name file.txt
# ERROR Unable to create file. name=temp.txt

# Include a timestamp.
gum log --time rfc822 --level error "Unable to create file."
```
See the Go time package for acceptable --time formats. See charmbracelet/log for more usage. Examples How to use gum in your daily workflows: See the examples directory for more real world use cases. Write a commit message: bash
git commit -m "$(gum input --width 50 --placeholder "Summary of changes")" \
-m "$(gum write --width 80 --placeholder "Details of changes")" Open files in your $EDITOR bash
$EDITOR $(gum filter) Connect to a tmux session bash
SESSION=$(tmux list-sessions -F \#S | gum filter --placeholder "Pick session...")
tmux switch-client -t $SESSION || tmux attach -t $SESSION Pick a commit hash from git history bash
git log --oneline | gum filter | cut -d' ' -f1 # | copy Simple skate password selector. skate list -k | gum filter | xargs skate get Uninstall packages bash
brew list | gum choose --no-limit | xargs brew uninstall Clean up git branches bash
git branch | cut -c 3- | gum choose --no-limit | xargs git branch -D Checkout GitHub pull requests with gh bash
gh pr list | cut -f1,2 | gum choose | cut -f1 | xargs gh pr checkout Copy command from shell history bash
gum filter < $HISTFILE --height 20 sudo replacement bash
alias please="gum input --password | sudo -nS" Feedback We’d love to hear your thoughts on this project. Feel free to drop us a note! Twitter The Fediverse Discord License MIT Part of Charm . Charm热爱开源 • Charm loves open source | A tool for glamorous shell scripts 🎀 | bash,shell | 15 | 51 | 252 | 447 | 139 | 5 | 5 |
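Tying together the tutorial commands from the top of the gum README, the complete Conventional Commits helper might read as follows. A sketch under the tutorial's assumptions — the variable names are the tutorial's own, and the scope parenthesization is one reasonable convention; see gum's examples directory for the maintained script:
```bash
#!/bin/sh
TYPE=$(gum choose "fix" "feat" "docs" "style" "refactor" "test" "chore" "revert")
SCOPE=$(gum input --placeholder "scope")

# Wrap the scope in parentheses only if one was entered (assumed convention).
test -n "$SCOPE" && SCOPE="($SCOPE)"

SUMMARY=$(gum input --value "$TYPE$SCOPE: " --placeholder "Summary of this change")
DESCRIPTION=$(gum write --placeholder "Details of this change")

gum confirm "Commit changes?" && git commit -m "$SUMMARY" -m "$DESCRIPTION"
```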
nvim-lua/kickstart.nvim | kickstart.nvim Introduction A starting point for Neovim that is: Small Single-file Completely Documented NOT a Neovim distribution, but instead a starting point for your configuration. Installation Install Neovim Kickstart.nvim targets only the latest 'stable' and latest 'nightly' of Neovim.
If you are experiencing issues, please make sure you have the latest versions. Install External Dependencies External Requirements:
- Basic utils: git , make , unzip , C Compiler ( gcc )
- ripgrep - Clipboard tool (xclip/xsel/win32yank or other depending on platform)
- A Nerd Font : optional, provides various icons
- if you have it set vim.g.have_nerd_font in init.lua to true
- Language Setup:
- If you want to write TypeScript, you need npm - If you want to write Golang, you will need go - etc. NOTE See Install Recipes for additional Windows and Linux specific notes
and quick install snippets Install Kickstart NOTE Backup your previous configuration (if any exists) Neovim's configurations are located under the following paths, depending on your OS: | OS | PATH |
| :- | :--- |
| Linux, MacOS | $XDG_CONFIG_HOME/nvim , ~/.config/nvim |
| Windows (cmd)| %userprofile%\AppData\Local\nvim\ |
| Windows (powershell)| $env:USERPROFILE\AppData\Local\nvim\ | Recommended Step Fork this repo
so that you have your own copy that you can modify, then install by cloning the
fork to your machine using one of the commands below, depending on your OS. NOTE Your fork's url will be something like this: https://github.com/<your_github_username>/kickstart.nvim.git Clone kickstart.nvim NOTE If following the recommended step above (i.e., forking the repo), replace nvim-lua with <your_github_username> in the commands below Linux and Mac ```sh
git clone https://github.com/nvim-lua/kickstart.nvim.git "${XDG_CONFIG_HOME:-$HOME/.config}"/nvim
``` Windows If you're using `cmd.exe`:
```
git clone https://github.com/nvim-lua/kickstart.nvim.git %userprofile%\AppData\Local\nvim\
```
If you're using `powershell.exe`
```
git clone https://github.com/nvim-lua/kickstart.nvim.git $env:USERPROFILE\AppData\Local\nvim\
``` Post Installation Start Neovim sh
nvim That's it! Lazy will install all the plugins you have. Use :Lazy to view
current plugin status. Hit q to close the window. Read through the init.lua file in your configuration folder for more
information about extending and exploring Neovim. That also includes
examples of adding popularly requested plugins. Getting Started The Only Video You Need to Get Started with Neovim FAQ What should I do if I already have a pre-existing Neovim configuration? You should back it up and then delete all associated files. This includes your existing init.lua and the Neovim files in ~/.local which can be deleted with rm -rf ~/.local/share/nvim/ Can I keep my existing configuration in parallel to kickstart? Yes! You can use NVIM_APPNAME=nvim-NAME to maintain multiple configurations. For example, you can install the kickstart
configuration in ~/.config/nvim-kickstart and create an alias: alias nvim-kickstart='NVIM_APPNAME="nvim-kickstart" nvim' When you run Neovim using the nvim-kickstart alias it will use the alternative
config directory and the matching local directory ~/.local/share/nvim-kickstart . You can apply this approach to any Neovim
distribution that you would like to try out. What if I want to "uninstall" this configuration: See lazy.nvim uninstall information Why is the kickstart init.lua a single file? Wouldn't it make sense to split it into multiple files? The main purpose of kickstart is to serve as a teaching tool and a reference
configuration that someone can easily use to git clone as a basis for their own.
As you progress in learning Neovim and Lua, you might consider splitting init.lua into smaller parts. A fork of kickstart that does this while maintaining the
same functionality is available here: kickstart-modular.nvim Discussions on this topic can be found here: Restructure the configuration Reorganize init.lua into a multi-file setup Install Recipes Below you can find OS specific install instructions for Neovim and dependencies. After installing all the dependencies continue with the Install Kickstart step. Windows Installation Windows with Microsoft C++ Build Tools and CMake Installation may require installing build tools and updating the run command for `telescope-fzf-native`
See `telescope-fzf-native` documentation for [more details](https://github.com/nvim-telescope/telescope-fzf-native.nvim#installation)
This requires:
- Install CMake and the Microsoft C++ Build Tools on Windows
```lua
{'nvim-telescope/telescope-fzf-native.nvim', build = 'cmake -S. -Bbuild -DCMAKE_BUILD_TYPE=Release && cmake --build build --config Release && cmake --install build --prefix build' }
``` Windows with gcc/make using chocolatey Alternatively, one can install gcc and make which don't require changing the config,
the easiest way is to use choco:
1. install [chocolatey](https://chocolatey.org/install)
either follow the instructions on the page or use winget,
run in cmd as **admin**:
```
winget install --accept-source-agreements chocolatey.chocolatey
```
2. install all requirements using choco, exit previous cmd and
open a new one so that choco path is set, and run in cmd as **admin**:
```
choco install -y neovim git ripgrep wget fd unzip gzip mingw make
``` WSL (Windows Subsystem for Linux) ```
wsl --install
wsl
sudo add-apt-repository ppa:neovim-ppa/unstable -y
sudo apt update
sudo apt install make gcc ripgrep unzip git xclip neovim
``` Linux Install Ubuntu Install Steps ```
sudo add-apt-repository ppa:neovim-ppa/unstable -y
sudo apt update
sudo apt install make gcc ripgrep unzip git xclip neovim
``` Debian Install Steps ```
sudo apt update
sudo apt install make gcc ripgrep unzip git xclip curl
# Now we install nvim
curl -LO https://github.com/neovim/neovim/releases/latest/download/nvim-linux64.tar.gz
sudo rm -rf /opt/nvim-linux64
sudo mkdir -p /opt/nvim-linux64
sudo chmod a+rX /opt/nvim-linux64
sudo tar -C /opt -xzf nvim-linux64.tar.gz
# make it available in /usr/local/bin, distro installs to /usr/bin
sudo ln -sf /opt/nvim-linux64/bin/nvim /usr/local/bin/
``` Fedora Install Steps ```
sudo dnf install -y gcc make git ripgrep fd-find unzip neovim
``` Arch Install Steps ```
sudo pacman -S --noconfirm --needed gcc make git ripgrep fd unzip neovim
``` | A launch point for your personal nvim configuration | [] | 0 | 113 | 563 | 277 | 13 | 13 | 1 |
sismo-core/sismo-badges | Sismo Protocol Contracts Made by Sismo This repository contains the smart contracts of the Sismo Protocol. There are three core contracts: core/AttestationsRegistry.sol : The registry stores all attestations. It is owned by the governance that authorizes/unauthorizes issuers to record in it core/Attester.sol The standard abstract contract must be inherited by attesters. Attesters are issuers of attestations. They verify user requests and build attestations that will be recorded in the registry core/Badges.sol Reads the registry. Stateless Non Transferable Token view of attestations (ERC1155) It also contains implementations of attesters in attesters/ :
- HydraS1SimpleAttester.sol : ZK Attester using the Hydra S1 Proving Scheme and the notion of nullifiers. Users must provide a ZK Proof along with their request to generate attestations
- HydraS1AccountboundAttester.sol : Accountbound version of the Hydra S1 Simple Attester. (Users can update at will where the attestation is stored) Sismo protocol A complete overview of the protocol is available in our documentation Deployed contracts Deployed contracts can be found here Usage Installation yarn Compile contracts Compile contracts using hardhat yarn compile Test Launch all tests yarn test Print storage layout yarn storage-layout Deploy on local chain Terminal tab 1 yarn chain Terminal tab 2 yarn deploy:local Create a new Attester To develop a new attester, you must inherit the core/Attester.sol abstract contract and implement the following functions: _verifyRequest(request, proofData) : You must implement the user request verification against the proof provided by the user buildAttestations(request, proofData) : You must build the attestations that will be recorded from a verified user request Other optional hook functions that can be implemented: _beforeRecordAttestations(request, proofData) _afterRecordAttestations(request, proofData) The /attesters/hydra-s1/HydraS1SimpleAttester.sol is a good example of an attester implementing those functions. A guide is offered in our documentation. Feel free to open a PR with your new attester in /attester ! License Distributed under the MIT License. Contribute Please, feel free to open issues, PRs or simply provide feedback! Contact Prefer Discord or Twitter | Contracts of the Sismo Badge Minting Protocol | did,ethereum,zkp,attestations,smart-contracts | 29 | 7 | 69 | 307 | 1 | 32 | 2 |
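Collecting the development commands quoted in the Sismo README above into one session (a sketch; the two local-chain steps run in separate terminals, as the README notes):
```sh
yarn                 # install dependencies
yarn compile         # compile contracts using hardhat
yarn test            # launch all tests
yarn storage-layout  # print storage layout

# Deploy on a local chain (two terminals):
yarn chain           # terminal tab 1
yarn deploy:local    # terminal tab 2
```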
SagerNet/sing-box | sing-box The universal proxy platform. Documentation https://sing-box.sagernet.org Support https://community.sagernet.org/c/sing-box/ License ```
Copyright (C) 2022 by nekohasekai contact-sagernet@sekai.icu This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version. This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details. You should have received a copy of the GNU General Public License
along with this program. If not, see http://www.gnu.org/licenses/ . In addition, no derivative work may use the name or imply association
with this application without prior consent.
``` | The universal proxy platform | [] | 268 | 60 | 306 | 1,394 | 46 | 26 | 5 |
InterviewReady/system-design-resources | System Design Resources These are the best resources for System Design on the Internet. Table of Contents Video Processing Cluster and Workflow Management Intra-Service Messaging Message Queue Antipattern Service Mesh Practical System Design Distributed File System Time Series Databases Rate Limiting In Memory Database - Redis Network Protocols Chess Engine Design Subscription Management System Google Docs API Design NoSQL Database Internals NoSQL Database Algorithms Database Replication Containers and Docker Capacity Estimation Publisher Subscriber Event Driven Architectures Software Architectures Microservices Distributed Transactions consistency Patterns Load Balancing Alerts and Anomaly Detection Distributed Logging Metrics and Text Search Engine Single Point of Failure Location Based Services Batch Processing Real Time Stream Processing Caching Distributed Consensus Authorization Content Delivery Network Testing Distributed Systems System Design Resources Video Processing Transcoding Videos at Scale Facebook Video Broadcasting Netflix Video Encoding at Scale Netflix Shot based encoding Cluster and Workflow Management Facebook Cluster Management Google Autopilot - Autoscaling Netflix Workflow Orchestration Opensource Workflow Management Meta Hardware Management Meta Capacity Assignment Amazon EC2 Intra-Service Messaging What is a message queue AirBnb Idempotency Nginx Service Mesh Meta Async Task Computing Message Queue Antipattern DB as queue Antipattern Using a database as a message queue Anti-pattern of DB as a queue Drawbacks of DB as a queue Service Mesh Kubernetes Service Mesh Kubernetes Sidecar Service Mesh NginX Service Mesh Data Plane and Control Plane Practical System Design Facebook Messenger Optimisations YouTube Architecture YouTube scalability 2012 Distributed Design Patterns Monolith to Microservice Zerodha Tech Stack Distributed File System Open Source Distributed File System Amazon S3 Performance hacks Amazon S3 object expiration Time Series Databases Pinterest Time Series Database Uber Time Series DB TimeSeries Relational DB Facebook Gorilla Time Series DB Rate Limiting Circuit Breaker Algorithm Uber Rate Limiter In Memory Database - Redis Redis Official Documentation Learn Redis through Redis University Redis Open Source Repo Redis Architecture Network Protocols What is HTTP QUIC Protocol TCP Protocol algorithms (First 10 pages are important) WebRTC WebSockets Dynamic Source Routing using QUIC Chess Engine Design Chess Engine Building Subscription Management System Subscription Manager Google Docs Operational Transform Google Docs Lumiere API Design API Design at Airbnb Swagger APIs NoSQL Database Internals Cassandra Architecture Google BigTable Architecture Amazon Dynamo DB Internals Design Patterns in Amazon Dynamo DB Internals of Amazon Dynamo DB NoSQL Database Algorithms Hyperloglog Algorithm Log Structured Merge Tree Sorted String Tables and Compaction Strategies Leveled Compaction Cassandra Scylla DB Compaction Indexing in Cassandra Database Replication Database replication Netflix Data replication - Change Data Capture LinkedIn Logging Usecases Uber Trillions of indexes in LedgerStore Containers and Docker Facebook Twine Containerization CloudFlare Containerization Docker Architecture Capacity Estimation Google Capacity Estimation Scalability at YouTube 2012 Back of envelope Calculations at AWS Capacity Estimation Publisher Subscriber Oracle Publisher Subscriber Amazon Pub Sub Messaging Asynchronous processing Async 
Request Response Event Driven Architectures Martin Fowler- Event Driven Architecture Event Driven Architecture Software Architectures Hexagonal Architecture Hexagonal architecture (Alistair Cockburn) The Clean Code by Robert C. Martin (Uncle Bob) CQRS DomainDrivenDesign Microservices Monolith Architecture Monoliths vs Microservices Microservices Uber Nanoservices antipattern Uber Domain oriented microservice Distributed Transactions consistency Patterns Transactional outbox SAGAS Long lived transactions (LLTs) Load Balancing Load Balancer with Sticky Sessions NetScaler what is load balancing Nginx Load Balancing Consistent hashing Alerts and Anomaly Detection Outlier Detection Anomaly Detection Uber Real Time Monitoring and Root Cause Analysis Argos Microsoft Anomaly Detection Facebook Data Engineering LinkedIn Real Time Alerting LinkedIn Isolation Forests Distributed Logging Uber Distributed Request Tracing Pintrest Logging Google Monitoring Infrastructure Metrics and Text Search Engine Facebook real-time text search engine Elastic Search Time Based Querying Elastic Search Aggregation Single Point of Failure Avoiding Single Points of Failure Netflix Multi-Region Availability Oracle Single Points of failure DNS single point of failure 2004 DNS traffic management by Shopify Sharding Location Based Services Google S2 library Batch Processing Map Reduce Architecture Real Time Stream Processing LinkedIn Brooklin- Real-time data streaming Netflix Real Time Stream Processing KSQLDB for Kafka Netflix Psyberg Caching Google Guava Cache Caching (See the README) Caching Microsoft Caching Guide Caching patterns Distributed consensus Paxos Raft Authorization Designing an Authorization Model for an Enterprise The Architectural Patterns of Cloud-native Authorization Systems Content Delivery Network AWS CloudFront CDN with S3 Testing Distributed Systems Deterministic Testing TLA+ by Leslie Lamport Jepsen System Design Resources Designing Data-Intensive Applications Book WhitePapers InterviewReady Videos System Design Online Judge | These are the best resources for System Design on the Internet | cache,fault-tolerance,scalability,system-design | 0 | 12 | 20 | 74 | 0 | 1 | 1 |
charmbracelet/vhs | VHS Write terminal GIFs as code for integration testing and demoing your CLI tools. The above example was generated with VHS ( view source ). Tutorial To get started, install VHS and create a new .tape file. sh
vhs new demo.tape Open the .tape file with your favorite $EDITOR . sh
vim demo.tape Tape files consist of a series of commands . The commands are
instructions for VHS to perform on its virtual terminal. For a list of all
possible commands see the command reference .
```elixir
# Where should we write the GIF?
Output demo.gif

# Set up a 1200x600 terminal with 46px font.
Set FontSize 46
Set Width 1200
Set Height 600

# Type a command in the terminal.
Type "echo 'Welcome to VHS!'"

# Pause for dramatic effect...
Sleep 500ms

# Run the command by pressing enter.
Enter

# Admire the output for a bit.
Sleep 5s
```
Once you've finished, save the file and feed it into VHS. sh
vhs demo.tape All done! You should see a new file called demo.gif (or whatever you named
the Output ) in the directory. For more examples see the examples/ directory. Installation [!NOTE]
VHS requires ttyd and ffmpeg to be installed and available on your PATH . Use a package manager:
```sh
# macOS or Linux
brew install vhs

# Arch Linux (btw)
pacman -S vhs

# Nix
nix-env -iA nixpkgs.vhs

# Windows using scoop
scoop install vhs
```
Or, use Docker to run VHS directly, dependencies included: sh
docker run --rm -v $PWD:/vhs ghcr.io/charmbracelet/vhs <cassette>.tape Or, download it: Packages are available in Debian and RPM formats Binaries are available for Linux, macOS, and Windows Or, just install it with go : sh
go install github.com/charmbracelet/vhs@latest Windows, Debian, Ubuntu, Fedora, RHEL, Void Instructions * Debian / Ubuntu
```sh
# Debian/Ubuntu
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
# Install ttyd from https://github.com/tsl0922/ttyd/releases
sudo apt update && sudo apt install vhs ffmpeg
```
* Fedora / RHEL
```sh
echo '[charm]
name=Charm
baseurl=https://repo.charm.sh/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo
# Install ttyd from https://github.com/tsl0922/ttyd/releases
sudo yum install vhs ffmpeg
```
* Void
```sh
sudo xbps-install vhs
```
* Windows
```sh
winget install charmbracelet.vhs
# or scoop
scoop install vhs
``` Record Tapes VHS has the ability to generate tape files from your terminal actions! To record to a tape file, run: bash
vhs record > cassette.tape Perform any actions you want and then exit the terminal session to stop
recording. You may want to manually edit the generated .tape file to add
settings or modify actions. Then, you can generate the GIF: bash
vhs cassette.tape Publish Tapes VHS allows you to publish your GIFs to our servers for easy sharing with your
friends and colleagues. Specify which file you want to share, then use the publish sub-command to host it on vhs.charm.sh . The output will provide you
with links to share your GIF via browser, HTML, and Markdown. bash
vhs publish demo.gif The VHS Server VHS has an SSH server built in! When you self-host VHS you can access it as
though it were installed locally. VHS will have access to commands and
applications on the host, so you don't need to install them on your machine. To start the server run: sh
vhs serve Configuration Options * `VHS_PORT`: The port to listen on (`1976`)
* `VHS_HOST`: The host to listen on (`localhost`)
* `VHS_GID`: The Group ID to run the server as (current user's GID)
* `VHS_UID`: The User ID to run the server as (current user's UID)
* `VHS_KEY_PATH`: The path to the SSH key to use (`.ssh/vhs_ed25519`)
* `VHS_AUTHORIZED_KEYS_PATH`: The path to the authorized keys file (empty, publicly accessible) Then, simply access VHS from a different machine via ssh : sh
ssh vhs.example.com < demo.tape > demo.gif VHS Command Reference [!NOTE]
You can view all VHS documentation on the command line with vhs manual . There are a few basic types of VHS commands: Output <path> : specify file output Require <program> : specify required programs for tape file Set <Setting> Value : set recording settings Type "<characters>" : emulate typing Left Right Up Down : arrow keys Backspace Enter Tab Space : special keys Ctrl[+Alt][+Shift]+<char> : press control + key and/or modifier Sleep <time> : wait for a certain amount of time Hide : hide commands from output Show : stop hiding commands from output Screenshot : screenshot the current frame Copy/Paste : copy and paste text from clipboard. Source : source commands from another tape Env <Key> Value : set environment variables Output The Output command allows you to specify the location and file format
of the render. You can specify more than one output in a tape file which
will render them to the respective locations. elixir
Output out.gif
Output out.mp4
Output out.webm
Output frames/ # a directory of frames as a PNG sequence Require The Require command allows you to specify dependencies for your tape file.
These are useful to fail early if a required program is missing from the $PATH , and it is certain that the VHS execution will not work as expected. Require commands must be defined at the top of a tape file, before any non-
setting or non-output command.
```elixir
# A tape file that requires gum and glow to be in the $PATH
Require gum
Require glow
``` Settings The Set command allows you to change global aspects of the terminal, such as
the font settings, window dimensions, and GIF output location. Setting must be administered at the top of the tape file. Any setting (except TypingSpeed ) applied after a non-setting or non-output command will be
ignored. Set Shell Set the shell with the Set Shell <shell> command elixir
Set Shell fish Set Font Size Set the font size with the Set FontSize <number> command. elixir
Set FontSize 10
Set FontSize 20
Set FontSize 40 Set Font Family Set the font family with the Set FontFamily "<font>" command elixir
Set FontFamily "Monoflow" Set Width Set the width of the terminal with the Set Width command. elixir
Set Width 300 Set Height Set the height of the terminal with the Set Height command. elixir
Set Height 1000 Set Letter Spacing Set the spacing between letters (tracking) with the Set LetterSpacing Command. elixir
Set LetterSpacing 20 Set Line Height Set the spacing between lines with the Set LineHeight Command. elixir
Set LineHeight 1.8 Set Typing Speed elixir
Set TypingSpeed 500ms # 500ms
Set TypingSpeed 1s # 1s Set the typing speed of seconds per key press. For example, a typing speed of 0.1 would result in a 0.1s ( 100ms ) delay between each character being typed. This setting can also be overwritten per command with the @<time> syntax. elixir
Set TypingSpeed 0.1
Type "100ms delay per character"
Type@500ms "500ms delay per character" Set Theme Set the theme of the terminal with the Set Theme command. The theme value
should be a JSON string with the base 16 colors and foreground + background. elixir
Set Theme { "name": "Whimsy", "black": "#535178", "red": "#ef6487", "green": "#5eca89", "yellow": "#fdd877", "blue": "#65aef7", "magenta": "#aa7ff0", "cyan": "#43c1be", "white": "#ffffff", "brightBlack": "#535178", "brightRed": "#ef6487", "brightGreen": "#5eca89", "brightYellow": "#fdd877", "brightBlue": "#65aef7", "brightMagenta": "#aa7ff0", "brightCyan": "#43c1be", "brightWhite": "#ffffff", "background": "#29283b", "foreground": "#b3b0d6", "selection": "#3d3c58", "cursor": "#b3b0d6" } You can also set themes by name: elixir
Set Theme "Catppuccin Frappe" See the full list by running vhs themes , or in THEMES.md . Set Padding Set the padding (in pixels) of the terminal frame with the Set Padding command. elixir
Set Padding 0 Set Margin Set the margin (in pixels) of the video with the Set Margin command. elixir
Set Margin 60
Set MarginFill "#6B50FF" Set Window Bar Set the type of window bar (Colorful, ColorfulRight, Rings, RingsRight) on the terminal window with the Set WindowBar command. elixir
Set WindowBar Colorful Set Border Radius Set the border radius (in pixels) of the terminal window with the Set BorderRadius command.
```elixir
# You'll likely want to add a Margin + MarginFill if you use BorderRadius.
Set Margin 20
Set MarginFill "#674EFF"
Set BorderRadius 10
``` Set Framerate Set the rate at which VHS captures frames with the Set Framerate command. elixir
Set Framerate 60 Set Playback Speed Set the playback speed of the final render. elixir
Set PlaybackSpeed 0.5 # Make output 2 times slower
Set PlaybackSpeed 1.0 # Keep output at normal speed (default)
Set PlaybackSpeed 2.0 # Make output 2 times faster Set Loop Offset Set the offset for when the GIF loop should begin. This allows you to make the
first frame of the GIF (generally used for previews) more interesting. elixir
Set LoopOffset 5 # Start the GIF at the 5th frame
Set LoopOffset 50% # Start the GIF halfway through Set Cursor Blink Set whether the cursor should blink. Enabled by default. elixir
Set CursorBlink false Type Use Type to emulate key presses. That is, you can use Type to script typing
in a terminal. Type is handy for both entering commands and interacting with
prompts and TUIs in the terminal. The command takes a string argument of the
characters to type. You can set the standard typing speed with Set TypingSpeed and override it in places with a @time argument.
```elixir
# Type something
Type "Whatever you want"

# Type something really slowly!
Type@500ms "Slow down there, partner."
``` Escape single and double quotes with backticks. elixir
Type `VAR="Escaped"` Keys Key commands take an optional @time and optional repeat count for repeating
the key press every interval of <time> . Key[@<time>] [count] Backspace Press the backspace key with the Backspace command. elixir
Backspace 18 Ctrl You can access the control modifier and send control sequences with the Ctrl command. elixir
Ctrl+R Enter Press the enter key with the Enter command. elixir
Enter 2 Arrow Keys Press any of the arrow keys with the Up , Down , Left , Right commands. elixir
Up 2
Down 2
Left
Right
Left
Right
Type "B"
Type "A" Tab Enter a tab with the Tab command. elixir
Tab@500ms 2 Space Press the space bar with the Space command. elixir
Space 10 Page Up / Down Press the Page Up / Down keys with the PageUp or PageDown commands. elixir
PageUp 3
PageDown 5 Sleep The Sleep command allows you to continue capturing frames without interacting
with the terminal. This is useful when you need to wait on something to
complete while including it in the recording like a spinner or loading state.
The command takes a number argument in seconds. elixir
Sleep 0.5 # 500ms
Sleep 2 # 2s
Sleep 100ms # 100ms
Sleep 1s # 1s Hide The Hide command instructs VHS to stop capturing frames. It's useful to pause
a recording to perform hidden commands. elixir
Hide This command is helpful for performing any setup and cleanup required to record
a GIF, such as building the latest version of a binary and removing the binary
once the demo is recorded. ```elixir
Output example.gif

# Setup
Hide
Type "go build -o example . && clear"
Enter
Show

# Recording...
Type 'Running ./example'
...
Enter

# Cleanup
Hide
Type 'rm example'
``` Show The Show command instructs VHS to begin capturing frames, again. It's useful
after a Hide command to resume frame recording for the output. elixir
Hide
Type "You won't see this being typed."
Show
Type "You will see this being typed." Screenshot The Screenshot command captures the current frame (png format). ```elixir At any point... Screenshot examples/screenshot.png
``` Copy / Paste The Copy and Paste copy and paste the string from clipboard. elixir
Copy "https://github.com/charmbracelet"
Type "open "
Sleep 500ms
Paste Env Env command sets the environment variable via key-value pair. ```elixir
Env HELLO "WORLD" Type "echo $HELLO"
Enter
Sleep 1s
``` Source The source command allows you to execute commands from another tape. elixir
Source config.tape Continuous Integration You can hook up VHS to your CI pipeline to keep your GIFs up-to-date with
the official VHS GitHub Action: ⚙️ charmbracelet/vhs-action VHS can also be used for integration testing. Use the .txt or .ascii output
to generate golden files. Store these files in a git repository to ensure there
are no diffs between runs of the tape file. elixir
Output golden.ascii Syntax Highlighting There’s a tree-sitter grammar for .tape files available for editors that
support syntax highlighting with tree-sitter: 🌳 charmbracelet/tree-sitter-vhs It works great with Neovim, Emacs, and so on! Feedback We’d love to hear your thoughts on this project. Feel free to drop us a note! Twitter The Fediverse Discord License MIT Part of Charm . Charm热爱开源 • Charm loves open source | Your CLI home video recorder 📼 | gif,recording,terminal,video,cli,command-line,ascii,vhs | 11 | 47 | 279 | 636 | 71 | 13 | 5 |
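Building on the integration-testing note in the VHS README above, a CI check might look like this — a hedged sketch assuming a tape whose Output is golden.ascii and that the previous golden file is committed to the repository:
```sh
# Re-render the tape, then fail the build if the ASCII output drifted.
vhs golden.tape
git diff --exit-code golden.ascii
```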
ChrisTitusTech/winutil | Chris Titus Tech's Windows Utility This utility is a compilation of Windows tasks I perform on each Windows system I use. It is meant to streamline installs , debloat with tweaks , troubleshoot with config , and fix Windows updates . I am extremely picky about any contributions to keep this project clean and efficient. Usage Winutil must be run in Admin mode because it performs system-wide tweaks. To achieve this, open PowerShell or Windows Terminal as an administrator. Here are a few ways to do it: Right-Click Method: Right-click on the start menu. Choose "Windows PowerShell (Admin)" (for Windows 10) or "Terminal (Admin)" (for Windows 11). Search and Launch Method: Press the Windows key. Type "PowerShell" or "Terminal" (for Windows 11). Press Ctrl + Shift + Enter to launch it with administrator privileges. Launch Command Simple way irm https://christitus.com/win | iex Courtesy of the issue raised at: #144 or by executing: iwr -useb https://christitus.com/win | iex If for some reason this site is not reachable from your country, please try running it directly from GitHub (replace RELEASE_TAG with the release you are interested in, for example v2024.06.05 ) irm "https://github.com/ChrisTitusTech/winutil/releases/download/RELEASE_TAG/winutil.ps1" | iex Automation Some features are available through automation. This allows you to save your config file, pass it to Winutil, walk away, and come back to a finished system. Here is how you can set it up currently with Winutil >24.01.15 On the Install Tab, click "Get Installed"; this will get all installed apps supported by Winutil on the system. Click on the Settings cog in the upper right corner and choose Export; choose a file name and location, and this will export the settings file. Copy this file to a USB or somewhere you can use after Windows installation. Use the Microwin tab to create a custom Windows image. Install the Windows image. In the new Windows install, open PowerShell in admin mode and run the command to automatically apply tweaks and install apps from the config file. iex "& { $(irm christitus.com/win) } -Config [path-to-your-config] -Run" Have a cup of coffee! Come back when it's done. Issues: If you are unable to resolve christitus.com/win and are getting errors launching the tool, it might be due to India blocking GitHub's content domain and preventing downloads. You may use a VPN or change your DNS provider to Google/Cloudflare/etc. Source: https://timesofindia.indiatimes.com/gadgets-news/github-content-domain-blocked-for-these-indian-users-reports/articleshow/96687992.cms Windows Security (formerly Defender) and other anti-virus software are known to block the script. The script gets flagged because it requires administrator privileges & makes drastic system changes. If you are having TLS 1.2 issues, or are having trouble resolving christitus.com/win then run with the following command: ``` ``` If you are still having issues try changing your DNS provider to 1.1.1.1 || 1.0.0.1 or 8.8.8.8 || 8.8.4.4 Support To morally and mentally support the project, make sure to leave a ⭐️! EXE Wrapper for $10 @ https://www.cttstore.com/windows-toolbox Tutorial Overview Install Install Selection: Organize programs by category and facilitate installation by enabling users to select programs and initiate the installation process with a single click. Upgrade All: Upgrade all existing programs to their latest versions, ensuring users have the most up-to-date and feature-rich software.
Uninstall Selection: Effortlessly uninstall selected programs, providing users with a streamlined way to remove unwanted software from their system. Get Installed: Retrieve a comprehensive list of installed programs on the system, offering users visibility into the software currently installed on their computer. Import / Export: Enable users to import or export the selection list of programs, allowing them to save their preferred program configurations or share them with others. This feature promotes convenience and flexibility in managing program selections across different systems. Tweaks Recommended Selection: Provides pre-defined templates tailored for desktop, laptop, and minimal configurations, allowing users to select recommended settings and optimizations specific to their system type. Essential Tweaks: Offers a collection of essential tweaks aimed at improving system performance, privacy, and resource utilization. These tweaks include creating a system restore point, disabling telemetry, Wi-Fi Sense, location tracking, and HomeGroup, and setting services to manual, among others. Advanced Tweaks: Encompasses a range of advanced power-user tweaks to further optimize the system. These tweaks include removing OneDrive and Edge, and disabling User Account Control (UAC) and the notification panel, among others. Toggles: Adds easy-to-use, one-click shortcuts for toggling dark mode, NumLock on startup, file extensions, sticky keys, among others. Additional Tweaks: Introduces various other tweaks such as enabling dark mode, changing DNS settings, adding an Ultimate Performance mode, and creating shortcuts for WinUtil tools. These tweaks provide users with additional customization options to tailor their system to their preferences. Config Features: Allows users to easily install various essential components and features to enhance their Windows experience. These features include installing .NET Frameworks, enabling Hyper-V virtualization, enabling legacy media support for Windows Media Player and DirectPlay, enabling NFS (Network File System) for network file sharing, and enabling Windows Subsystem for Linux (WSL) for running Linux applications on Windows. Fixes: Provides a range of helpful fixes to address common issues and improve system stability. This includes setting up autologon for seamless login experiences, resetting Windows updates to resolve update-related problems, performing a system corruption scan to detect and repair corrupted files, and resetting network settings to troubleshoot network connectivity issues. Legacy Windows Panels: Includes access to legacy Windows panels from Windows 7, allowing users to access familiar and powerful tools. These panels include Control Panel for managing system settings, Network Connections for configuring network adapters and connections, Power Panel for adjusting power and sleep settings, Sound Settings for managing audio devices and settings, System Properties for viewing and modifying system information, and User Accounts for managing user profiles and account settings. Updates: Default (Out of Box) Settings: Provides the default settings that come with Windows for updates. Security (Recommended) Settings: Offers recommended settings, including a slight delay of feature updates by 2 years and installation of security updates 4 days after release. Disable All Updates (Not Recommended!): Allows users to disable all Windows updates, but it's not recommended due to potential security risks. 
Video and Written Article walkthrough @ https://christitus.com/windows-tool/ Issues If you encounter any challenges or problems with the script, I kindly request that you submit them via the "Issues" tab on the GitHub repository. By filling out the provided template, you can provide specific details about the issue, allowing me to promptly address any bugs or consider feature requests. Contribute Code Pull Requests are now handled directly on the MAIN branch. This was done since we can now select specific releases to launch via releases in GitHub. If making a code change, you can submit a PR to the main branch, but I am very selective about these. Do not use a code formatter, make massive amounts of line changes, or bundle multiple feature changes into one PR. EACH FEATURE CHANGE SHOULD BE ITS OWN Pull Request! When creating pull requests, it is essential to thoroughly document all changes made. This includes documenting any additions made to the tweaks section and ensuring that corresponding undo measures are in place to remove the newly added tweaks if necessary. Failure to adhere to this format may result in denial of the pull request. Additionally, comprehensive documentation is required for all code changes. Any code lacking sufficient documentation may also be denied. By following these guidelines, we can maintain a high standard of quality and ensure that the codebase remains organized and well-documented. NOTE: When creating a function please include "WPF" or "WinUtil" in the name so that it can be loaded into the runspace. Thanks to all Contributors Thanks a lot for spending your time helping Winutil grow. Keep rocking 🍻. GitHub Stats | Chris Titus Tech's Windows Utility - Install Programs, Tweaks, Fixes, and Updates | [] | 5 | 150 | 763 | 250 | 58 | 1 | 4 |
sczhou/CodeFormer | Towards Robust Blind Face Restoration with Codebook Lookup Transformer (NeurIPS 2022) Paper | Project Page | Video Shangchen Zhou , Kelvin C.K. Chan , Chongyi Li , Chen Change Loy S-Lab, Nanyang Technological University :star: If CodeFormer is helpful to your images or projects, please help star this repo. Thanks! :hugs: Update 2023.07.20 : Integrated into :panda_face: OpenXLab . Try out online demo! 2023.04.19 : :whale: Training codes and config files are publicly available now. 2023.04.09 : Add features of inpainting and colorization for cropped and aligned face images. 2023.02.10 : Include dlib as a new face detector option; it produces a more accurate face identity. 2022.10.05 : Support video input --input_path [YOUR_VIDEO.mp4] . Try it to enhance your videos! :clapper: 2022.09.14 : Integrated into :hugs: Hugging Face . Try out online demo! 2022.09.09 : Integrated into :rocket: Replicate . Try out online demo! More TODO [x] Add training code and config files [x] Add checkpoint and script for face inpainting [x] Add checkpoint and script for face colorization [x] ~~Add background image enhancement~~ :panda_face: Try Enhancing Old Photos / Fixing AI-arts Face Restoration Face Color Enhancement and Restoration Face Inpainting Dependencies and Installation Pytorch >= 1.7.1 CUDA >= 10.1 Other required packages in requirements.txt ``` git clone this repository git clone https://github.com/sczhou/CodeFormer
cd CodeFormer create new anaconda env conda create -n codeformer python=3.8 -y
conda activate codeformer install python dependencies pip3 install -r requirements.txt
python basicsr/setup.py develop
conda install -c conda-forge dlib (only for face detection or cropping with dlib)
``` Quick Inference Download Pre-trained Models: Download the facelib and dlib pretrained models from [ Releases | Google Drive | OneDrive ] to the weights/facelib folder. You can manually download the pretrained models OR download by running the following command: python scripts/download_pretrained_models.py facelib
python scripts/download_pretrained_models.py dlib (only for dlib face detector) Download the CodeFormer pretrained models from [ Releases | Google Drive | OneDrive ] to the weights/CodeFormer folder. You can manually download the pretrained models OR download by running the following command: python scripts/download_pretrained_models.py CodeFormer Prepare Testing Data: You can put the testing images in the inputs/TestWhole folder. If you would like to test on cropped and aligned faces, you can put them in the inputs/cropped_faces folder. You can get the cropped and aligned faces by running the following command:
``` you may need to install dlib via: conda install -c conda-forge dlib python scripts/crop_align_face.py -i [input folder] -o [output folder]
``` Testing: [Note] If you want to compare CodeFormer in your paper, please run the following command indicating --has_aligned (for cropped and aligned face), as the command for the whole image will involve a process of face-background fusion that may damage hair texture on the boundary, which leads to unfair comparison. Fidelity weight w lies in [0, 1]. Generally, smaller w tends to produce a higher-quality result, while larger w yields a higher-fidelity result (a small weight-sweep sketch follows at the end of this section). The results will be saved in the results folder. 🧑🏻 Face Restoration (cropped and aligned face)
``` For cropped and aligned faces (512x512) python inference_codeformer.py -w 0.5 --has_aligned --input_path [image folder]|[image path]
``` :framed_picture: Whole Image Enhancement
``` For whole image Add '--bg_upsampler realesrgan' to enhance the background regions with Real-ESRGAN Add '--face_upsample' to further upsample restored faces with Real-ESRGAN python inference_codeformer.py -w 0.7 --input_path [image folder]|[image path]
``` :clapper: Video Enhancement
``` For Windows/Mac users, please install ffmpeg first conda install -c conda-forge ffmpeg For video clips Video path should end with '.mp4'|'.mov'|'.avi' python inference_codeformer.py --bg_upsampler realesrgan --face_upsample -w 1.0 --input_path [video path]
``` 🌈 Face Colorization (cropped and aligned face)
``` For cropped and aligned faces (512x512) Colorize black and white or faded photo python inference_colorization.py --input_path [image folder]|[image path]
``` 🎨 Face Inpainting (cropped and aligned face)
``` For cropped and aligned faces (512x512) Inputs could be masked by white brush using an image editing app (e.g., Photoshop) (check out the examples in inputs/masked_faces) python inference_inpainting.py --input_path [image folder]|[image path]
``` Training: The training commands can be found in the documents: English | 简体中文 . Citation If our work is useful for your research, please consider citing: @inproceedings{zhou2022codeformer,
author = {Zhou, Shangchen and Chan, Kelvin C.K. and Li, Chongyi and Loy, Chen Change},
title = {Towards Robust Blind Face Restoration with Codebook Lookup TransFormer},
booktitle = {NeurIPS},
year = {2022}
} License This project is licensed under NTU S-Lab License 1.0 . Redistribution and use should follow this license. Acknowledgement This project is based on BasicSR . Some codes are brought from Unleashing Transformers , YOLOv5-face , and FaceXLib . We also adopt Real-ESRGAN to support background image enhancement. Thanks for their awesome works. Contact If you have any questions, please feel free to reach me out at shangchenzhou@gmail.com . | [NeurIPS 2022] Towards Robust Blind Face Restoration with Codebook Lookup Transformer | codebook,codeformer,face-enhancement,face-restoration,pytorch,super-resolution,vqgan,restoration | 1 | 3 | 44 | 71 | 215 | 1 | 0 |
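As referenced in the testing notes above, here is a small sketch of sweeping the CodeFormer fidelity weight w. It simply re-runs the documented inference_codeformer.py command over a few arbitrary values of w via Python's subprocess; running from the repo root with inputs in inputs/cropped_faces (the repo's example folder) is an assumption you may need to adjust.
```python
# Hedged sketch: sweep the fidelity weight `w` over a few sample points,
# reusing the inference command documented above. Run from the repo root.
import subprocess

for w in (0.3, 0.5, 0.7):  # arbitrary sample points in [0, 1]
    subprocess.run(
        ["python", "inference_codeformer.py", "-w", str(w),
         "--has_aligned", "--input_path", "inputs/cropped_faces"],
        check=True,
    )
```
Comparing the outputs saved in the results folder across w values makes the quality-versus-fidelity trade-off easy to eyeball.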
WongKinYiu/yolov7 | Official YOLOv7 Implementation of paper - YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors Web Demo Integrated into Huggingface Spaces 🤗 using Gradio. Try out the Web Demo Performance MS COCO | Model | Test Size | AP test | AP 50 test | AP 75 test | batch 1 fps | batch 32 average time |
| :-- | :-: | :-: | :-: | :-: | :-: | :-: |
| YOLOv7 | 640 | 51.4% | 69.7% | 55.9% | 161 fps | 2.8 ms |
| YOLOv7-X | 640 | 53.1% | 71.2% | 57.8% | 114 fps | 4.3 ms |
| | | | | | | |
| YOLOv7-W6 | 1280 | 54.9% | 72.6% | 60.1% | 84 fps | 7.6 ms |
| YOLOv7-E6 | 1280 | 56.0% | 73.5% | 61.2% | 56 fps | 12.3 ms |
| YOLOv7-D6 | 1280 | 56.6% | 74.0% | 61.8% | 44 fps | 15.0 ms |
| YOLOv7-E6E | 1280 | 56.8% | 74.4% | 62.1% | 36 fps | 18.7 ms | Installation Docker environment (recommended) Expand ``` shell
# create the docker container, you can change the share memory size if you have more.
nvidia-docker run --name yolov7 -it -v your_coco_path/:/coco/ -v your_code_path/:/yolov7 --shm-size=64g nvcr.io/nvidia/pytorch:21.08-py3
# apt install required packages
apt update
apt install -y zip htop screen libgl1-mesa-glx
# pip install required packages
pip install seaborn thop
# go to code folder
cd /yolov7
``` Testing yolov7.pt yolov7x.pt yolov7-w6.pt yolov7-e6.pt yolov7-d6.pt yolov7-e6e.pt shell
python test.py --data data/coco.yaml --img 640 --batch 32 --conf 0.001 --iou 0.65 --device 0 --weights yolov7.pt --name yolov7_640_val You will get the results: Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.51206
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.69730
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.55521
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.35247
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.55937
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.66693
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.38453
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.63765
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.68772
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.53766
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.73549
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.83868 To measure accuracy, download COCO-annotations for Pycocotools to the ./coco/annotations/instances_val2017.json Training Data preparation shell
bash scripts/get_coco.sh Download MS COCO dataset images ( train , val , test ) and labels . If you have previously used a different version of YOLO, we strongly recommend that you delete train2017.cache and val2017.cache files, and redownload labels Single GPU training ``` shell train p5 models python train.py --workers 8 --device 0 --batch-size 32 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml train p6 models python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml
``` Multiple GPU training ``` shell train p5 models python -m torch.distributed.launch --nproc_per_node 4 --master_port 9527 train.py --workers 8 --device 0,1,2,3 --sync-bn --batch-size 128 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml train p6 models python -m torch.distributed.launch --nproc_per_node 8 --master_port 9527 train_aux.py --workers 8 --device 0,1,2,3,4,5,6,7 --sync-bn --batch-size 128 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml
``` Transfer learning yolov7_training.pt yolov7x_training.pt yolov7-w6_training.pt yolov7-e6_training.pt yolov7-d6_training.pt yolov7-e6e_training.pt Single GPU finetuning for custom dataset ``` shell finetune p5 models python train.py --workers 8 --device 0 --batch-size 32 --data data/custom.yaml --img 640 640 --cfg cfg/training/yolov7-custom.yaml --weights 'yolov7_training.pt' --name yolov7-custom --hyp data/hyp.scratch.custom.yaml finetune p6 models python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/custom.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6-custom.yaml --weights 'yolov7-w6_training.pt' --name yolov7-w6-custom --hyp data/hyp.scratch.custom.yaml
``` Re-parameterization See reparameterization.ipynb Inference On video: shell
python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source yourvideo.mp4 On image: shell
python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source inference/images/horses.jpg Export Pytorch to CoreML (and inference on MacOS/iOS) Pytorch to ONNX with NMS (and inference) shell
python export.py --weights yolov7-tiny.pt --grid --end2end --simplify \
--topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640 --max-wh 640 Pytorch to TensorRT with NMS (and inference) shell
wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt
python export.py --weights ./yolov7-tiny.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640
git clone https://github.com/Linaom1214/tensorrt-python.git
python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16 Pytorch to TensorRT another way Expand ```shell
wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt
python export.py --weights yolov7-tiny.pt --grid --include-nms
git clone https://github.com/Linaom1214/tensorrt-python.git
python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16 Or use trtexec to convert ONNX to TensorRT engine /usr/src/tensorrt/bin/trtexec --onnx=yolov7-tiny.onnx --saveEngine=yolov7-tiny-nms.trt --fp16
``` Tested with: Python 3.7.13, Pytorch 1.12.0+cu113 Pose estimation code yolov7-w6-pose.pt See keypoint.ipynb . Instance segmentation (with NTU) code yolov7-mask.pt See instance.ipynb . Instance segmentation code yolov7-seg.pt YOLOv7 for instance segmentation (YOLOR + YOLOv5 + YOLACT) | Model | Test Size | AP box | AP 50 box | AP 75 box | AP mask | AP 50 mask | AP 75 mask |
| :-- | :-: | :-: | :-: | :-: | :-: | :-: | :-: |
| YOLOv7-seg | 640 | 51.4% | 69.4% | 55.8% | 41.5% | 65.5% | 43.7% | Anchor free detection head code yolov7-u6.pt YOLOv7 with decoupled TAL head (YOLOR + YOLOv5 + YOLOv6) | Model | Test Size | AP val | AP 50 val | AP 75 val |
| :-- | :-: | :-: | :-: | :-: |
| YOLOv7-u6 | 640 | 52.6% | 69.7% | 57.3% | Citation @inproceedings{wang2023yolov7,
title={{YOLOv7}: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors},
author={Wang, Chien-Yao and Bochkovskiy, Alexey and Liao, Hong-Yuan Mark},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2023}
} @article{wang2023designing,
title={Designing Network Design Strategies Through Gradient Path Analysis},
author={Wang, Chien-Yao and Liao, Hong-Yuan Mark and Yeh, I-Hau},
journal={Journal of Information Science and Engineering},
year={2023}
} Teaser YOLOv7-semantic & YOLOv7-panoptic & YOLOv7-caption YOLOv7-semantic & YOLOv7-detection & YOLOv7-depth (with NTUT) YOLOv7-3d-detection & YOLOv7-lidar & YOLOv7-road (with NTUT) Acknowledgements Expand * [https://github.com/AlexeyAB/darknet](https://github.com/AlexeyAB/darknet)
* [https://github.com/WongKinYiu/yolor](https://github.com/WongKinYiu/yolor)
* [https://github.com/WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4)
* [https://github.com/WongKinYiu/ScaledYOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4)
* [https://github.com/Megvii-BaseDetection/YOLOX](https://github.com/Megvii-BaseDetection/YOLOX)
* [https://github.com/ultralytics/yolov3](https://github.com/ultralytics/yolov3)
* [https://github.com/ultralytics/yolov5](https://github.com/ultralytics/yolov5)
* [https://github.com/DingXiaoH/RepVGG](https://github.com/DingXiaoH/RepVGG)
* [https://github.com/JUGGHM/OREPA_CVPR2022](https://github.com/JUGGHM/OREPA_CVPR2022)
* [https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose](https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose) | Implementation of paper - YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors | scaled-yolov4,yolor,yolov3,yolov4,yolov7,darknet,pytorch | 1 | 30 | 242 | 134 | 1,414 | 9 | 0 |
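As a quick sanity check of the ONNX export steps above, a minimal onnxruntime probe like the sketch below can confirm the exported model loads and runs. This is not part of the YOLOv7 repo: the plain resize + /255 preprocessing is a simplification of the repo's letterbox pipeline, and the output semantics depend on your export flags, so only raw output shapes are printed.
```python
# Minimal sketch: load an exported ONNX model and print raw output shapes.
import cv2
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("yolov7-tiny.onnx", providers=["CPUExecutionProvider"])
img = cv2.cvtColor(cv2.imread("inference/images/horses.jpg"), cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (640, 640)).astype(np.float32) / 255.0
inp = img.transpose(2, 0, 1)[None]  # NCHW batch of 1
outs = sess.run(None, {sess.get_inputs()[0].name: inp})
for meta, out in zip(sess.get_outputs(), outs):
    print(meta.name, out.shape)
```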
actualbudget/actual | Getting Started Actual is a local-first personal finance tool. It is 100% free and open-source; written in NodeJS, it has a synchronization element so that all your changes can move between devices without any heavy lifting. If you are interested in contributing, or want to know how development works, see our contributing document; we would love to have you. Want to say thanks? Click the ⭐ at the top of the page. Key Links Actual Discord community. Actual Community Documentation Installation If you are only interested in running the latest version and not contributing to the source code, you don't need to clone this repo. You can get the latest version through npm. The easy way: using a server (recommended) The easiest way to get Actual running is to use the actual-server project. That is the server for syncing changes across devices, and it comes with the latest version of Actual. The server will provide both the web project and a server for syncing. You can get up and running quickly and easily by following our Running Actual Locally Guide Documentation We have a wide range of documentation on how to use Actual; it is all available in our Community Documentation and includes topics on Budgeting, Account Management, Tips & Tricks and some documentation for developers. Code structure The Actual app is split up into a few packages: loot-core - The core application that runs on any platform desktop-client - The desktop UI desktop-electron - The desktop app More information on the project structure is available in our community documentation . Feature Requests Current feature requests can be seen here .
Vote for your favorite requests by reacting :+1: to the top comment of the request. To add new feature requests, open a new Issue of the "Feature Request" type. Sponsors Thanks to our wonderful sponsors who make Actual budget possible! | A local-first personal finance app | budgeting,finance,money,personal-finance | 26 | 133 | 1,628 | 1,590 | 91 | 21 | 15 |
formkit/auto-animate | Add motion to your apps with a single line of code. AutoAnimate is a zero-config, drop-in animation utility that adds smooth transitions to your web app. You can use it with Vue, React, Solid or any other JavaScript application. With one line of code, you can improve your interfaces, for example: Installation Install using your package manager of choice. ```bash yarn yarn add @formkit/auto-animate npm npm install @formkit/auto-animate pnpm pnpm add @formkit/auto-animate
``` Boom! Done. That was fast! 🐇 Usage 📖 View the documentation site for usage instructions . Examples 📖 View the documentation site for examples . Plugins 📖 View the documentation site for plugin instructions . Support us Is AutoAnimate saving you time? Please consider supporting us with a recurring or one-time donation ! 🙏 Contributing Thank you for your willingness to contribute to this free and open source project! When contributing, consider first discussing your desired change with the core team via GitHub issues , Discord , or other method. | A zero-config, drop-in animation utility that adds smooth transitions to your web app. You can use it with React, Vue, or any other JavaScript application. | animation,javascript,react,ui,vue | 7 | 41 | 65 | 347 | 43 | 10 | 1 |
willwulfken/MidJourney-Styles-and-Keywords-Reference | DISCLAIMER: I am not officially affiliated with MidJourney. I am simply a user/member who enjoys using their service. Styles Comparison Pages Created By Will Wulfken | A reference containing Styles and Keywords that you can use with MidJourney AI. There are also pages showing resolution comparison, image weights, and much more! | ai,artificial-intelligence,guide,midjourney,neural-network,reference,keywords,styles,prompt,ai-art | 1 | 9 | 10 | 1,229 | 1 | 2 | 0 |
Dao-AILab/flash-attention | FlashAttention This repository provides the official implementation of FlashAttention and
FlashAttention-2 from the
following papers. FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness Tri Dao, Daniel Y. Fu, Stefano Ermon, Atri Rudra, Christopher Ré Paper: https://arxiv.org/abs/2205.14135 IEEE Spectrum article about our submission to the MLPerf 2.0 benchmark using FlashAttention. FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning Tri Dao Paper: https://tridao.me/publications/flash2/flash2.pdf Usage We've been very happy to see FlashAttention being widely adopted in such a short
time after its release. This page contains a partial list of places where FlashAttention is being used. FlashAttention and FlashAttention-2 are free to use and modify (see LICENSE).
Please cite and credit FlashAttention if you use it. Installation and features Requirements:
- CUDA 11.6 and above.
- PyTorch 1.12 and above.
- Linux. Might work for Windows starting v2.3.2 (we've seen a few positive reports ) but Windows compilation still requires more testing. If you have ideas on how to set up prebuilt CUDA wheels for Windows, please reach out via Github issue. We recommend the Pytorch container from Nvidia, which has all the required tools to install FlashAttention. To install:
1. Make sure that PyTorch is installed.
2. Make sure that packaging is installed ( pip install packaging )
3. Make sure that ninja is installed and that it works correctly (e.g. ninja
--version then echo $? should return exit code 0). If not (sometimes ninja
--version then echo $? returns a nonzero exit code), uninstall then reinstall ninja ( pip uninstall -y ninja && pip install ninja ). Without ninja ,
compiling can take a very long time (2h) since it does not use multiple CPU
cores. With ninja compiling takes 3-5 minutes on a 64-core machine.
4. Then: sh
pip install flash-attn --no-build-isolation Alternatively you can compile from source: sh
python setup.py install If your machine has less than 96GB of RAM and lots of CPU cores, ninja might
run too many parallel compilation jobs that could exhaust the amount of RAM. To
limit the number of parallel compilation jobs, you can set the environment
variable MAX_JOBS : sh
MAX_JOBS=4 pip install flash-attn --no-build-isolation Interface: src/flash_attention_interface.py FlashAttention-2 currently supports:
1. Ampere, Ada, or Hopper GPUs (e.g., A100, RTX 3090, RTX 4090, H100). Support for Turing
GPUs (T4, RTX 2080) is coming soon; please use FlashAttention 1.x for Turing
GPUs for now.
2. Datatype fp16 and bf16 (bf16 requires Ampere, Ada, or Hopper GPUs).
3. All head dimensions up to 256. ~~Head dim > 192 backward requires A100/A800 or H100/H800~~. Head dim 256 backward now works on consumer GPUs (if there's no dropout) as of flash-attn 2.5.5. How to use FlashAttention The main functions implement scaled dot product attention (softmax(Q @ K^T *
softmax_scale) @ V): python
from flash_attn import flash_attn_qkvpacked_func, flash_attn_func python
flash_attn_qkvpacked_func(qkv, dropout_p=0.0, softmax_scale=None, causal=False,
window_size=(-1, -1), alibi_slopes=None, deterministic=False):
"""dropout_p should be set to 0.0 during evaluation
If Q, K, V are already stacked into 1 tensor, this function will be faster than
calling flash_attn_func on Q, K, V since the backward pass avoids explicit concatenation
of the gradients of Q, K, V.
If window_size != (-1, -1), implements sliding window local attention. Query at position i
will only attend to keys between [i - window_size[0], i + window_size[1]] inclusive.
Arguments:
qkv: (batch_size, seqlen, 3, nheads, headdim)
dropout_p: float. Dropout probability.
softmax_scale: float. The scaling of QK^T before applying softmax.
Default to 1 / sqrt(headdim).
causal: bool. Whether to apply causal attention mask (e.g., for auto-regressive modeling).
window_size: (left, right). If not (-1, -1), implements sliding window local attention.
alibi_slopes: (nheads,) or (batch_size, nheads), fp32. A bias of (-alibi_slope * |i - j|) is added to
the attention score of query i and key j.
deterministic: bool. Whether to use the deterministic implementation of the backward pass,
which is slightly slower and uses more memory. The forward pass is always deterministic.
Return:
out: (batch_size, seqlen, nheads, headdim).
""" ```python
flash_attn_func(q, k, v, dropout_p=0.0, softmax_scale=None, causal=False,
window_size=(-1, -1), alibi_slopes=None, deterministic=False):
"""dropout_p should be set to 0.0 during evaluation
Supports multi-query and grouped-query attention (MQA/GQA) by passing in KV with fewer heads
than Q. Note that the number of heads in Q must be divisible by the number of heads in KV.
For example, if Q has 6 heads and K, V have 2 heads, head 0, 1, 2 of Q will attend to head
0 of K, V, and head 3, 4, 5 of Q will attend to head 1 of K, V.
If window_size != (-1, -1), implements sliding window local attention. Query at position i
will only attend to keys between
[i + seqlen_k - seqlen_q - window_size[0], i + seqlen_k - seqlen_q + window_size[1]] inclusive. Arguments:
q: (batch_size, seqlen, nheads, headdim)
k: (batch_size, seqlen, nheads_k, headdim)
v: (batch_size, seqlen, nheads_k, headdim)
dropout_p: float. Dropout probability.
softmax_scale: float. The scaling of QK^T before applying softmax.
Default to 1 / sqrt(headdim).
causal: bool. Whether to apply causal attention mask (e.g., for auto-regressive modeling).
window_size: (left, right). If not (-1, -1), implements sliding window local attention.
alibi_slopes: (nheads,) or (batch_size, nheads), fp32. A bias of
(-alibi_slope * |i + seqlen_k - seqlen_q - j|)
is added to the attention score of query i and key j.
deterministic: bool. Whether to use the deterministic implementation of the backward pass,
which is slightly slower and uses more memory. The forward pass is always deterministic.
Return:
out: (batch_size, seqlen, nheads, headdim).
"""
``` ```python
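For concreteness, a minimal call following the shapes in the docstring above (tensor sizes here are illustrative; a supported CUDA GPU and fp16/bf16 inputs are required):
```python
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)  # same shape/dtype/device as q
v = torch.randn_like(q)
out = flash_attn_func(q, k, v, causal=True)  # (batch, seqlen, nheads, headdim)
```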
def flash_attn_with_kvcache(
q,
k_cache,
v_cache,
k=None,
v=None,
rotary_cos=None,
rotary_sin=None,
cache_seqlens: Optional[Union[(int, torch.Tensor)]] = None,
cache_batch_idx: Optional[torch.Tensor] = None,
block_table: Optional[torch.Tensor] = None,
softmax_scale=None,
causal=False,
window_size=(-1, -1), # -1 means infinite context window
rotary_interleaved=True,
alibi_slopes=None,
):
"""
If k and v are not None, k_cache and v_cache will be updated inplace with the new values from
k and v. This is useful for incremental decoding: you can pass in the cached keys/values from
the previous step, and update them with the new keys/values from the current step, and do
attention with the updated cache, all in 1 kernel. If you pass in k / v, you must make sure that the cache is large enough to hold the new values.
For example, the KV cache could be pre-allocated with the max sequence length, and you can use
cache_seqlens to keep track of the current sequence lengths of each sequence in the batch.
Also apply rotary embedding if rotary_cos and rotary_sin are passed in. The key @k will be
rotated by rotary_cos and rotary_sin at indices cache_seqlens, cache_seqlens + 1, etc.
If causal or local (i.e., window_size != (-1, -1)), the query @q will be rotated by rotary_cos
and rotary_sin at indices cache_seqlens, cache_seqlens + 1, etc.
If not causal and not local, the query @q will be rotated by rotary_cos and rotary_sin at
indices cache_seqlens only (i.e. we consider all tokens in @q to be at position cache_seqlens).
See tests/test_flash_attn.py::test_flash_attn_kvcache for examples of how to use this function.
Supports multi-query and grouped-query attention (MQA/GQA) by passing in KV with fewer heads
than Q. Note that the number of heads in Q must be divisible by the number of heads in KV.
For example, if Q has 6 heads and K, V have 2 heads, head 0, 1, 2 of Q will attend to head
0 of K, V, and head 3, 4, 5 of Q will attend to head 1 of K, V.
If causal=True, the causal mask is aligned to the bottom right corner of the attention matrix.
For example, if seqlen_q = 2 and seqlen_k = 5, the causal mask (1 = keep, 0 = masked out) is:
1 1 1 1 0
1 1 1 1 1
If seqlen_q = 5 and seqlen_k = 2, the causal mask is:
0 0
0 0
0 0
1 0
1 1
If the row of the mask is all zero, the output will be zero.
If window_size != (-1, -1), implements sliding window local attention. Query at position i
will only attend to keys between
[i + seqlen_k - seqlen_q - window_size[0], i + seqlen_k - seqlen_q + window_size[1]] inclusive.
Note: Does not support backward pass.
Arguments:
q: (batch_size, seqlen, nheads, headdim)
k_cache: (batch_size_cache, seqlen_cache, nheads_k, headdim) if there's no block_table,
or (num_blocks, page_block_size, nheads_k, headdim) if there's a block_table (i.e. paged KV cache)
page_block_size must be a multiple of 256.
v_cache: (batch_size_cache, seqlen_cache, nheads_k, headdim) if there's no block_table,
or (num_blocks, page_block_size, nheads_k, headdim) if there's a block_table (i.e. paged KV cache)
k [optional]: (batch_size, seqlen_new, nheads_k, headdim). If not None, we concatenate
k with k_cache, starting at the indices specified by cache_seqlens.
v [optional]: (batch_size, seqlen_new, nheads_k, headdim). Similar to k.
rotary_cos [optional]: (seqlen_ro, rotary_dim / 2). If not None, we apply rotary embedding
to k and q. Only applicable if k and v are passed in. rotary_dim must be divisible by 16.
rotary_sin [optional]: (seqlen_ro, rotary_dim / 2). Similar to rotary_cos.
cache_seqlens: int, or (batch_size,), dtype torch.int32. The sequence lengths of the
KV cache.
block_table [optional]: (batch_size, max_num_blocks_per_seq), dtype torch.int32.
cache_batch_idx: (batch_size,), dtype torch.int32. The indices used to index into the KV cache.
If None, we assume that the batch indices are [0, 1, 2, ..., batch_size - 1].
If the indices are not distinct, and k and v are provided, the values updated in the cache
might come from any of the duplicate indices.
softmax_scale: float. The scaling of QK^T before applying softmax.
Default to 1 / sqrt(headdim).
causal: bool. Whether to apply causal attention mask (e.g., for auto-regressive modeling).
window_size: (left, right). If not (-1, -1), implements sliding window local attention.
rotary_interleaved: bool. Only applicable if rotary_cos and rotary_sin are passed in.
If True, rotary embedding will combine dimensions 0 & 1, 2 & 3, etc. If False,
rotary embedding will combine dimensions 0 & rotary_dim / 2, 1 & rotary_dim / 2 + 1
(i.e. GPT-NeoX style).
alibi_slopes: (nheads,) or (batch_size, nheads), fp32. A bias of
(-alibi_slope * |i + seqlen_k - seqlen_q - j|)
is added to the attention score of query i and key j.
Return:
out: (batch_size, seqlen, nheads, headdim).
""" ``` To see how these functions are used in a multi-head attention layer (which
includes QKV projection, output projection), see the MHA implementation . Changelog 2.0: Complete rewrite, 2x faster Upgrading from FlashAttention (1.x) to FlashAttention-2 These functions have been renamed:
- flash_attn_unpadded_func -> flash_attn_varlen_func - flash_attn_unpadded_qkvpacked_func -> flash_attn_varlen_qkvpacked_func - flash_attn_unpadded_kvpacked_func -> flash_attn_varlen_kvpacked_func If the inputs have the same sequence lengths in the same batch, it is simpler
and faster to use these functions: python
flash_attn_qkvpacked_func(qkv, dropout_p=0.0, softmax_scale=None, causal=False) python
flash_attn_func(q, k, v, dropout_p=0.0, softmax_scale=None, causal=False) 2.1: Change behavior of causal flag If seqlen_q != seqlen_k and causal=True, the causal mask is aligned to the
bottom right corner of the attention matrix, instead of the top-left corner. For example, if seqlen_q = 2 and seqlen_k = 5, the causal mask (1 = keep, 0 = masked out) is:
v2.0:
1 0 0 0 0
1 1 0 0 0
v2.1:
1 1 1 1 0
1 1 1 1 1
If seqlen_q = 5 and seqlen_k = 2, the causal mask is:
v2.0:
1 0
1 1
1 1
1 1
1 1
v2.1:
0 0
0 0
0 0
1 0
1 1
If the row of the mask is all zero, the output will be zero. 2.2: Optimize for inference Optimize for inference (iterative decoding) when query has very small sequence
length (e.g., query sequence length = 1). The bottleneck here is to load KV
cache as fast as possible, and we split the loading across different thread
blocks, with a separate kernel to combine results. See the function flash_attn_with_kvcache with more features for inference
(perform rotary embedding, updating KV cache inplace). Thanks to the xformers team, and in particular Daniel Haziza, for this
collaboration. 2.3: Local (i.e., sliding window) attention Implement sliding window attention (i.e., local attention). Thanks to Mistral
AI and in particular Timothée Lacroix for this
contribution. Sliding window was used in the Mistral 7B model. 2.4: ALiBi (attention with linear bias), deterministic backward pass. Implement ALiBi (Press et al., 2021). Thanks to Sanghun Cho from Kakao Brain for this contribution. Implement deterministic backward pass. Thanks to engineers from Meituan for this contribution. 2.5: Paged KV cache. Support paged KV cache (i.e., PagedAttention ).
Thanks to @beginlner for this contribution. Performance We present expected speedup (combined forward + backward pass) and memory savings from using FlashAttention against PyTorch standard attention, depending on sequence length, on different GPUs (speedup depends on memory bandwidth - we see more speedup on slower GPU memory). We currently have benchmarks for these GPUs:
* A100 * H100 A100 We display FlashAttention speedup using these parameters:
* Head dimension 64 or 128, hidden dimension 2048 (i.e. either 32 or 16 heads).
* Sequence length 512, 1k, 2k, 4k, 8k, 16k.
* Batch size set to 16k / seqlen. Speedup Memory We show memory savings in this graph (note that memory footprint is the same no matter if you use dropout or masking).
Memory savings are proportional to sequence length -- since standard attention has memory quadratic in sequence length, whereas FlashAttention has memory linear in sequence length.
We see 10X memory savings at sequence length 2K, and 20X at 4K.
As a result, FlashAttention can scale to much longer sequence lengths. H100 Full model code and training script We have released the full GPT model implementation .
We also provide optimized implementations of other layers (e.g., MLP, LayerNorm,
cross-entropy loss, rotary embedding). Overall this speeds up training by 3-5x
compared to the baseline implementation from Huggingface, reaching up to 225
TFLOPs/sec per A100, equivalent to 72% model FLOPs utilization (we don't need
any activation checkpointing). We also include a training script to
train GPT2 on Openwebtext and GPT3 on The Pile. Triton implementation of FlashAttention Phil Tillet (OpenAI) has an experimental implementation of FlashAttention in Triton:
https://github.com/openai/triton/blob/master/python/tutorials/06-fused-attention.py As Triton is a higher-level language than CUDA, it might be easier to understand
and experiment with. The notations in the Triton implementation are also closer
to what's used in our paper. We also have an experimental implementation in Triton that support attention
bias (e.g. ALiBi):
https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/flash_attn_triton.py Tests We test that FlashAttention produces the same output and gradient as a reference
implementation, up to some numerical tolerance. In particular, we check that the
maximum numerical error of FlashAttention is at most twice the numerical error
of a baseline implementation in Pytorch (for different head dimensions, input
dtype, sequence length, causal / non-causal). To run the tests: sh
pytest -q -s tests/test_flash_attn.py When you encounter issues This new release of FlashAttention-2 has been tested on several GPT-style
models, mostly on A100 GPUs. If you encounter bugs, please open a GitHub Issue! Citation If you use this codebase, or otherwise found our work valuable, please cite: @inproceedings{dao2022flashattention,
title={Flash{A}ttention: Fast and Memory-Efficient Exact Attention with {IO}-Awareness},
author={Dao, Tri and Fu, Daniel Y. and Ermon, Stefano and Rudra, Atri and R{\'e}, Christopher},
booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
year={2022}
}
@inproceedings{dao2023flashattention2,
title={Flash{A}ttention-2: Faster Attention with Better Parallelism and Work Partitioning},
author={Dao, Tri},
booktitle={International Conference on Learning Representations (ICLR)},
year={2024}
} | Fast and memory-efficient exact attention | [] | 66 | 66 | 158 | 615 | 393 | 3 | 1 |
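To make the flash_attn_with_kvcache docstring above concrete, here is a minimal incremental-decoding sketch. Shapes follow the docstring; the batch/cache sizes and the four-token loop are illustrative, and a CUDA GPU is assumed.
```python
import torch
from flash_attn import flash_attn_with_kvcache

batch, max_seqlen, nheads, headdim = 2, 2048, 8, 64
# Pre-allocate the KV cache at the maximum sequence length, as suggested above.
k_cache = torch.zeros(batch, max_seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
v_cache = torch.zeros_like(k_cache)
cache_seqlens = torch.zeros(batch, dtype=torch.int32, device="cuda")

for _ in range(4):  # decode a few tokens, one at a time
    q = torch.randn(batch, 1, nheads, headdim, device="cuda", dtype=torch.float16)
    k_new = torch.randn_like(q)
    v_new = torch.randn_like(q)
    out = flash_attn_with_kvcache(
        q, k_cache, v_cache, k=k_new, v=v_new,
        cache_seqlens=cache_seqlens, causal=True,
    )  # (batch, 1, nheads, headdim); the cache is updated in place
    cache_seqlens += 1
```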
noodle-run/noodle | Noodle Rethinking Student Productivity Warning This is a work-in-progress and not the finished product. Noodle is still in active development towards a minimal viable product (MVP). Follow me on twitter @ixahmedxii for updates. ⚠️ This is a UI design mockup of what the platform will look like, it is not the current state of the project. Purpose Noodle as an idea came from the struggles that I faced during my university years. I was using multiple apps to try and stay on track with my studies, and I thought to myself, why is there no singular app that can do everything a student needs to stay on track with their studies? Like a GitHub but for students. Planned MVP Features ✍️ Note Taking 📚 Flashcards The flashcards will be generated from the notes that you take, and you will be able to quiz yourself on them. Future Features 📅 Calendar 📝 Task management 📊 Grade tracking Feel free to suggest more features by opening an issue, or join our Discord server to discuss it with the community. Star History Contributing If you would like to contribute to Noodle, please read the CONTRIBUTING.md file to get started. License Noodle is open source and available under the AGPL-3.0-or-later license. | Rethinking Student Productivity | trpc,eslint,prettier,drizzle-orm,education,javascript,next,nextjs13,productivity-tool,tailwindcss | 0 | 21 | 260 | 668 | 1 | 1 | 3 |
MatsuriDayo/nekoray | NekoRay / NekoBox For PC Qt based cross-platform GUI proxy configuration manager (backend: v2ray / sing-box) Supports Windows / Linux out of the box. Readme Translations 中文 / English / fa_IR / 日本語 / Русский Download GitHub Releases (Portable ZIP): portable format, no installer. Go to Releases, download the prebuilt binaries, and extract them to use. See the download notes if you are not sure which package to pick. Package AUR nekoray nekoray-git archlinuxcn nekoray nekoray-git Scoop Extras scoop install nekoray Changelog & Telegram Channel https://t.me/Matsuridayo Homepage & Documents https://matsuridayo.github.io Proxy SOCKS (4/4a/5) HTTP(S) Shadowsocks VMess VLESS Trojan TUIC ( sing-box ) NaïveProxy ( Custom Core ) Hysteria ( Custom Core or sing-box ) Hysteria2 ( Custom Core or sing-box ) Custom Outbound Custom Config Custom Core Subscription Raw: some widely used formats (like Shadowsocks, Clash and v2rayN) Launch parameters: see the launch parameters documentation. Running on Windows: if a missing-DLL error prevents startup, download and install the Microsoft C++ runtime. Running on Linux: see the Linux run tutorial. macOS: due to lack of maintenance, macOS downloads are no longer provided; unofficially updated builds are available from the repository below. Unofficial releases You can still compile it yourself following the tutorial below; for common problems, check past issues, and pull requests with fixes are welcome. Compile Tutorial: see the Technical documentation. Donate If this project helps you, a donation helps us keep it maintained. Donations of 50 USD or more can display your avatar on the Donation List . If you are not added here, please contact us to add it. USDT TRC20 TRhnA7SXE5Sap5gSG3ijxRmdYFiD4KRhPs XMR 49bwESYQjoRL3xmvTcjZKHEKaiGywjLYVQJMUv79bXonGiyDCs8AzE3KiGW2ytTybBCpWJUvov8SjZZEGg66a4e59GXa6k5 Credits Core: v2fly/v2ray-core ( < 3.10 ) MatsuriDayo/Matsuri ( < 3.10 ) MatsuriDayo/v2ray-core ( < 3.10 ) XTLS/Xray-core ( >= 3.10 ) MatsuriDayo/Xray-core ( >= 3.10 ) SagerNet/sing-box Matsuridayo/sing-box-extra Gui: Qv2ray Qt protobuf yaml-cpp zxing-cpp QHotkey AppImageKit | Qt based cross-platform GUI proxy configuration manager (backend: v2ray / sing-box) | linux,proxy,qt,shadowsocks,sing-box,trojan,v2ray,vless,vmess,windows | 73 | 35 | 101 | 497 | 342 | 1 | 2 |
phidatahq/phidata | phidata Build AI Assistants with memory, knowledge and tools What is phidata? Phidata is a framework for building Autonomous Assistants (aka Agents) that have long-term memory, contextual knowledge and the ability to take actions using function calling. Use phidata to turn any LLM into an AI Assistant that can:
- Search the web using DuckDuckGo, Google etc.
- Analyze data using SQL, DuckDb, etc.
- Conduct research and generate reports.
- Answer questions from PDFs, APIs, etc.
- Write scripts for movies, books, etc.
- Summarize articles, videos, etc.
- Perform tasks like sending emails, querying databases, etc.
- And much more... Why phidata? Problem: We need to turn general-purpose LLMs into specialized assistants for our use-case. Solution: Extend LLMs with memory, knowledge and tools:
- Memory: Stores chat history in a database and enables LLMs to have long-term conversations.
- Knowledge: Stores information in a vector database and provides LLMs with business context .
- Tools: Enable LLMs to take actions like pulling data from an API, sending emails or querying a database. Memory & knowledge make LLMs smarter while tools make them autonomous . How it works Step 1: Create an Assistant Step 2: Add Tools (functions), Knowledge (vectordb) and Storage (database) Step 3: Serve using Streamlit, FastApi or Django to build your AI application Installation shell
pip install -U phidata Quickstart Assistant that can search the web Create a file assistant.py ```python
from phi.assistant import Assistant
from phi.tools.duckduckgo import DuckDuckGo assistant = Assistant(tools=[DuckDuckGo()], show_tool_calls=True)
assistant.print_response("Whats happening in France?", markdown=True)
``` Install libraries, export your OPENAI_API_KEY and run the Assistant ```shell
pip install openai duckduckgo-search export OPENAI_API_KEY=sk-xxxx python assistant.py
``` Assistant that can query financial data Create a file finance_assistant.py ```python
from phi.assistant import Assistant
from phi.llm.openai import OpenAIChat
from phi.tools.yfinance import YFinanceTools assistant = Assistant(
llm=OpenAIChat(model="gpt-4o"),
tools=[YFinanceTools(stock_price=True, analyst_recommendations=True, company_info=True, company_news=True)],
show_tool_calls=True,
markdown=True,
)
assistant.print_response("What is the stock price of NVDA")
assistant.print_response("Write a comparison between NVDA and AMD, use all tools available.")
``` Install libraries and run the Assistant ```shell
pip install yfinance python finance_assistant.py
``` More information Read the docs at docs.phidata.com Chat with us on discord Examples LLM OS : Using LLMs as the CPU for an emerging Operating System. Autonomous RAG : Gives LLMs tools to search its knowledge, web or chat history. Local RAG : Fully local RAG with Llama3 on Ollama and PgVector. Investment Researcher : Generate investment reports on stocks using Llama3 and Groq. News Articles : Write News Articles using Llama3 and Groq. Video Summaries : YouTube video summaries using Llama3 and Groq. Research Assistant : Write research reports using Llama3 and Groq. Assistant that can write and run python code Show code The `PythonAssistant` can achieve tasks by writing and running python code.
- Create a file `python_assistant.py`
```python
from phi.assistant.python import PythonAssistant
from phi.file.local.csv import CsvFile
python_assistant = PythonAssistant(
files=[
CsvFile(
path="https://phidata-public.s3.amazonaws.com/demo_data/IMDB-Movie-Data.csv",
description="Contains information about movies from IMDB.",
)
],
pip_install=True,
show_tool_calls=True,
)
python_assistant.print_response("What is the average rating of movies?", markdown=True)
```
- Install pandas and run the `python_assistant.py`
```shell
pip install pandas
python python_assistant.py
``` Assistant that can analyze data using SQL Show code The `DuckDbAssistant` can perform data analysis using SQL.
- Create a file `data_assistant.py`
```python
import json
from phi.assistant.duckdb import DuckDbAssistant
duckdb_assistant = DuckDbAssistant(
semantic_model=json.dumps({
"tables": [
{
"name": "movies",
"description": "Contains information about movies from IMDB.",
"path": "https://phidata-public.s3.amazonaws.com/demo_data/IMDB-Movie-Data.csv",
}
]
}),
)
duckdb_assistant.print_response("What is the average rating of movies? Show me the SQL.", markdown=True)
```
- Install duckdb and run the `data_assistant.py` file
```shell
pip install duckdb
python data_assistant.py
``` Assistant that can generate pydantic models Show code One of our favorite LLM features is generating structured data (i.e. a pydantic model) from text. Use this feature to extract features, generate movie scripts, produce fake data etc.
Let's create a Movie Assistant to write a `MovieScript` for us.
- Create a file `movie_assistant.py`
```python
from typing import List
from pydantic import BaseModel, Field
from rich.pretty import pprint
from phi.assistant import Assistant
class MovieScript(BaseModel):
setting: str = Field(..., description="Provide a nice setting for a blockbuster movie.")
ending: str = Field(..., description="Ending of the movie. If not available, provide a happy ending.")
genre: str = Field(..., description="Genre of the movie. If not available, select action, thriller or romantic comedy.")
name: str = Field(..., description="Give a name to this movie")
characters: List[str] = Field(..., description="Name of characters for this movie.")
storyline: str = Field(..., description="3 sentence storyline for the movie. Make it exciting!")
movie_assistant = Assistant(
description="You help write movie scripts.",
output_model=MovieScript,
)
pprint(movie_assistant.run("New York"))
```
- Run the `movie_assistant.py` file
```shell
python movie_assistant.py
```
- The output is an object of the `MovieScript` class, here's how it looks:
```shell
MovieScript(
│ setting='A bustling and vibrant New York City',
│ ending='The protagonist saves the city and reconciles with their estranged family.',
│ genre='action',
│ name='City Pulse',
│ characters=['Alex Mercer', 'Nina Castillo', 'Detective Mike Johnson'],
│ storyline='In the heart of New York City, a former cop turned vigilante, Alex Mercer, teams up with a street-smart activist, Nina Castillo, to take down a corrupt political figure who threatens to destroy the city. As they navigate through the intricate web of power and deception, they uncover shocking truths that push them to the brink of their abilities. With time running out, they must race against the clock to save New York and confront their own demons.'
)
``` PDF Assistant with Knowledge & Storage Show code Lets create a PDF Assistant that can answer questions from a PDF. We'll use `PgVector` for knowledge and storage.
**Knowledge Base:** information that the Assistant can search to improve its responses (uses a vector db).
**Storage:** provides long term memory for Assistants (uses a database).
1. Run PgVector
Install [docker desktop](https://docs.docker.com/desktop/install/mac-install/) and run **PgVector** on port **5532** using:
```bash
docker run -d \
-e POSTGRES_DB=ai \
-e POSTGRES_USER=ai \
-e POSTGRES_PASSWORD=ai \
-e PGDATA=/var/lib/postgresql/data/pgdata \
-v pgvolume:/var/lib/postgresql/data \
-p 5532:5432 \
--name pgvector \
phidata/pgvector:16
```
2. Create PDF Assistant
- Create a file `pdf_assistant.py`
```python
import typer
from typing import Optional, List
from phi.assistant import Assistant
from phi.storage.assistant.postgres import PgAssistantStorage
from phi.knowledge.pdf import PDFUrlKnowledgeBase
from phi.vectordb.pgvector import PgVector2
db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"
knowledge_base = PDFUrlKnowledgeBase(
urls=["https://phi-public.s3.amazonaws.com/recipes/ThaiRecipes.pdf"],
vector_db=PgVector2(collection="recipes", db_url=db_url),
)
# Comment out after first run
knowledge_base.load()
storage = PgAssistantStorage(table_name="pdf_assistant", db_url=db_url)
def pdf_assistant(new: bool = False, user: str = "user"):
run_id: Optional[str] = None
if not new:
existing_run_ids: List[str] = storage.get_all_run_ids(user)
if len(existing_run_ids) > 0:
run_id = existing_run_ids[0]
assistant = Assistant(
run_id=run_id,
user_id=user,
knowledge_base=knowledge_base,
storage=storage,
# Show tool calls in the response
show_tool_calls=True,
# Enable the assistant to search the knowledge base
search_knowledge=True,
# Enable the assistant to read the chat history
read_chat_history=True,
)
if run_id is None:
run_id = assistant.run_id
print(f"Started Run: {run_id}\n")
else:
print(f"Continuing Run: {run_id}\n")
# Runs the assistant as a cli app
assistant.cli_app(markdown=True)
if __name__ == "__main__":
typer.run(pdf_assistant)
```
3. Install libraries
```shell
pip install -U pgvector pypdf "psycopg[binary]" sqlalchemy
```
4. Run PDF Assistant
```shell
python pdf_assistant.py
```
- Ask a question:
```
How do I make pad thai?
```
- See how the Assistant searches the knowledge base and returns a response.
- Message `bye` to exit, start the assistant again using `python pdf_assistant.py` and ask:
```
What was my last message?
```
See how the assistant now maintains storage across sessions.
- Run the `pdf_assistant.py` file with the `--new` flag to start a new run.
```shell
python pdf_assistant.py --new
``` Check out the cookbook for more examples. Next Steps Read the basics to learn more about phidata. Read about Assistants and how to customize them. Check out the cookbook for in-depth examples and code. Demos Check out the following AI Applications built using phidata: PDF AI that summarizes and answers questions from PDFs. ArXiv AI that answers questions about ArXiv papers using the ArXiv API. HackerNews AI that summarizes stories and users and shares what's new on HackerNews. Tutorials LLM OS with gpt-4o Autonomous RAG Local RAG with Llama3 Llama3 Research Assistant powered by Groq Looking to build an AI product? We've helped many companies build AI products; the general workflow is: Build an Assistant with proprietary data to perform tasks specific to your product. Connect your product to the Assistant via an API. Monitor and Improve your AI product. We also provide dedicated support and development; book a call to get started. Contributions We're an open-source project and welcome contributions; please read the contributing guide for more information. Request a feature If you have a feature request, please open an issue or make a pull request. If you have ideas on how we can improve, please create a discussion. Roadmap Our roadmap is available here .
If you have a feature request, please open an issue/discussion. | Build AI Assistants with memory, knowledge and tools. | developer-tools,python,aws,ai,llm,llmops,gpt-4 | 130 | 35 | 193 | 1,189 | 52 | 96 | 2 |
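One pattern the phidata examples above don't spell out is handing an Assistant a plain Python function as a tool. The sketch below assumes this function-as-tool pattern works as described in the phidata docs; get_server_status is a made-up function for illustration.
```python
from phi.assistant import Assistant

def get_server_status(service: str) -> str:
    """Return the status of a service (stubbed here for illustration)."""
    return f"{service}: ok"

# The Assistant can call the function when the question warrants it.
assistant = Assistant(tools=[get_server_status], show_tool_calls=True, markdown=True)
assistant.print_response("What is the status of the billing service?")
```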
imputnet/cobalt | cobalt best way to save what you love: cobalt.tools 💬 community discord server 🐦 twitter/x what's cobalt? cobalt is a media downloader that doesn't piss you off. it's fast, friendly, and doesn't have any bullshit that the modern web is filled with: no ads, trackers, or invasive analytics . paste the link, get the file, move on. it's that simple. just how it should be. supported services this list is not final and keeps expanding over time. if support for a service you want is missing, create an issue (or a pull request 👀). | service | video + audio | only audio | only video | metadata | rich file names |
| :-------- | :-----------: | :--------: | :--------: | :------: | :-------------: |
| bilibili.com & bilibili.tv | ✅ | ✅ | ✅ | ➖ | ➖ |
| dailymotion | ✅ | ✅ | ✅ | ✅ | ✅ |
| instagram posts & reels | ✅ | ✅ | ✅ | ➖ | ➖ |
| loom | ✅ | ❌ | ✅ | ✅ | ➖ |
| ok video | ✅ | ❌ | ✅ | ✅ | ✅ |
| pinterest | ✅ | ✅ | ✅ | ➖ | ➖ |
| reddit | ✅ | ✅ | ✅ | ❌ | ❌ |
| rutube | ✅ | ✅ | ✅ | ✅ | ✅ |
| soundcloud | ➖ | ✅ | ➖ | ✅ | ✅ |
| streamable | ✅ | ✅ | ✅ | ➖ | ➖ |
| tiktok | ✅ | ✅ | ✅ | ❌ | ❌ |
| tumblr | ✅ | ✅ | ✅ | ➖ | ➖ |
| twitch clips | ✅ | ✅ | ✅ | ✅ | ✅ |
| twitter/x | ✅ | ✅ | ✅ | ➖ | ➖ |
| vimeo | ✅ | ✅ | ✅ | ✅ | ✅ |
| vine archive | ✅ | ✅ | ✅ | ➖ | ➖ |
| vk videos & clips | ✅ | ❌ | ✅ | ✅ | ✅ |
| youtube videos, shorts & music | ✅ | ✅ | ✅ | ✅ | ✅ | | emoji | meaning |
| :-----: | :---------------------- |
| ✅ | supported |
| ➖ | impossible/unreasonable |
| ❌ | not supported | additional notes or features (per service) | service | notes or features |
| :-------- | :----- |
| instagram | supports reels, photos, and videos. lets you pick what to save from multi-media posts. |
| pinterest | supports photos, gifs, videos and stories. |
| reddit | supports gifs and videos. |
| rutube | supports yappy & private links. |
| soundcloud | supports private links. |
| tiktok | supports videos with or without watermark, images from slideshow without watermark, and full (original) audios. |
| twitter/x | lets you pick what to save from multi-media posts. may not be 100% reliable due to current management. |
| vimeo | audio downloads are only available for dash. |
| youtube | supports videos, music, and shorts. 8K, 4K, HDR, VR, and high FPS videos. rich metadata & dubs. h264/av1/vp9 codecs. | cobalt api cobalt has an open api that you can use in your projects for free~ . it's easy and straightforward to use; check out the docs to learn how to use it (a rough request sketch follows at the end of this section). ✅ you can use the main api instance ( api.cobalt.tools ) in your personal projects. ❌ you cannot use the free api commercially (anywhere that's gated behind paywalls or ads). host your own instance for this. we reserve the right to restrict abusive/excessive access to the main instance api. how to run your own instance if you want to run your own instance for whatever purpose, follow this guide . it's highly recommended to use a docker compose method unless you run it for development/debugging purposes. partners cobalt is sponsored by royalehosting.net ; all main instances are currently hosted on their network :) ethics and disclaimer cobalt is a tool for easing content downloads from the internet and takes zero liability . you are responsible for what you download, how you use and distribute that content. please be mindful when using content of others and always credit original creators. fair use and credits benefit everyone. cobalt is NOT a piracy tool and cannot be used as such. it can only download free, publicly accessible content. such content can be easily downloaded through any browser's dev tools. pressing one button is easier, so i made a convenient, ad-less tool for such repeated actions. cobalt license cobalt code is licensed under AGPL-3.0 . cobalt branding, mascots, and other related assets included in the repo are copyrighted and not covered by the AGPL-3.0 license. you cannot use them under the same terms. you are allowed to host an unmodified instance of cobalt with branding, but this does not give you permission to use it anywhere else, or make derivatives of it in any way. notes: mascots and other assets are a part of the branding. when making an alternative version of the project, please replace or remove all branding (including the name). you must link the original repo when using any parts of code (such as using separate processing modules in your project) or forking the project. if you make a modified version of cobalt, the codebase must be published under the same license (according to AGPL-3.0). 3rd party licenses Fluent Emoji by Microsoft (used in cobalt) is under the MIT license. Noto Sans Mono fonts (used in cobalt) are licensed under the OFL license. many update banners were taken from tenor.com . acknowledgements ffmpeg cobalt heavily relies on ffmpeg for converting and merging media files. it's an absolutely amazing piece of software offered for anyone for free, yet doesn't receive as much credit as it should. you can support ffmpeg here ! ffmpeg-static we use ffmpeg-static to get binaries for ffmpeg depending on the platform. you can support the developer via various methods listed on their github page! (linked above) youtube.js cobalt relies on youtube.js for interacting with the innertube api; it wouldn't have been possible without it. you can support the developer via various methods listed on their github page! (linked above) many others cobalt also depends on: content-disposition-header to simplify the provision of content-disposition headers. cors to manage cross-origin resource sharing within expressjs. dotenv to load environment variables from the .env file. esbuild to minify the frontend files. express as the backbone of cobalt servers. 
express-rate-limit to rate limit api endpoints. hls-parser to parse m3u8 playlists for certain services. ipaddr.js to parse ip addresses (for rate limiting). nanoid to generate unique (temporary) identifiers for each requested stream. node-cache to cache stream info in server ram for a limited amount of time. psl as the domain name parser. set-cookie-parser to parse cookies that cobalt receives from certain services. undici for making http requests. url-pattern to match provided links with supported patterns. ...and many other packages that these packages rely on. | save what you love | bilibili,downloader,reddit,social-media,twitter,vk,webapp,youtube,tiktok,youtube-music | 0 | 26 | 242 | 1,088 | 73 | 3 | 2 |
windmill-labs/windmill | Open-source developer infrastructure for internal tools (APIs, background jobs, workflows and UIs). Self-hostable alternative to Airplane, Pipedream, Superblocks and a simplified Temporal with autogenerated UIs and custom UIs to trigger workflows and scripts as internal apps. Scripts are turned into shareable UIs automatically, and can be composed together into flows or used in richer apps built with low-code. Supported script languages are: Python, TypeScript, Go, Bash, SQL, and GraphQL. Try it - Docs - Discord - Hub - Contributor's guide # Windmill - Developer platform for APIs, background jobs, workflows and UIs
Windmill is fully open-sourced (AGPLv3) and Windmill Labs offers
dedicated instance and commercial support and licenses.
![Windmill Diagram](./imgs/stacks.svg)
https://github.com/windmill-labs/windmill/assets/122811744/0b132cd1-ee67-4505-822f-0c7ee7104252
- [Windmill - Developer platform for APIs, background jobs, workflows and UIs](#windmill---developer-platform-for-apis-background-jobs-workflows-and-uis)
- [Main Concepts](#main-concepts)
- [Show me some actual script code](#show-me-some-actual-script-code)
- [CLI](#cli)
- [Running scripts locally](#running-scripts-locally)
- [Stack](#stack)
- [Fastest Self-Hostable Workflow Engine](#fastest-self-hostable-workflow-engine)
- [Security](#security)
- [Sandboxing](#sandboxing)
- [Secrets, credentials and sensitive values](#secrets-credentials-and-sensitive-values)
- [Performance](#performance)
- [Architecture](#architecture)
- [How to self-host](#how-to-self-host)
- [Docker compose](#docker-compose)
- [Kubernetes (k8s) and Helm charts](#kubernetes-k8s-and-helm-charts)
- [Run from binaries](#run-from-binaries)
- [OAuth, SSO \& SMTP](#oauth-sso--smtp)
- [Commercial license](#commercial-license)
- [Integrations](#integrations)
- [Environment Variables](#environment-variables)
- [Run a local dev setup](#run-a-local-dev-setup)
- [only Frontend](#only-frontend)
- [Backend + Frontend](#backend--frontend)
- [Contributors](#contributors)
- [Copyright](#copyright)
## Main Concepts
1. Define a minimal and generic script in Python, TypeScript, Go or Bash that
solves a specific task. The code can be defined in the
[provided Web IDE](https://www.windmill.dev/docs/code_editor) or
[synchronized with your own GitHub repo](https://www.windmill.dev/docs/advanced/cli/sync)
(e.g. through
[VS Code](https://www.windmill.dev/docs/cli_local_dev/vscode-extension)
extension):
![Step 1](./imgs/windmill-editor.png)
2. Your scripts' parameters are automatically parsed and
[generate a frontend](https://www.windmill.dev/docs/core_concepts/auto_generated_uis).
![Step 2](./imgs/windmill-run.png)
![Step 3](./imgs/windmill-result.png)
3. Make it [flow](https://www.windmill.dev/docs/flows/flow_editor)! You can
chain your scripts or scripts made by the community shared on
[WindmillHub](https://hub.windmill.dev).
![Step 3](./imgs/windmill-flow.png)
4. Build [complex UIs](https://www.windmill.dev/docs/apps/app_editor) on top of
your scripts and flows.
![Step 4](./imgs/windmill-builder.png)
Scripts and flows can also be triggered by a
[cron schedule](https://www.windmill.dev/docs/core_concepts/scheduling) (e.g.
`*/5 * * * *`) or through
[webhooks](https://www.windmill.dev/docs/core_concepts/webhooks).
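For illustration, here is a hedged sketch of triggering a deployed script over its webhook from TypeScript. The URL path, the `demo` workspace, the `u/user/my_script` script path, the `WM_TOKEN` variable, and the response shape are all assumptions for the example, not documented API details:

```ts
// POST the script's arguments as JSON to its webhook (the path is an assumption)
const res = await fetch(
  "https://app.windmill.dev/api/w/demo/jobs/run/p/u/user/my_script",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.WM_TOKEN}`, // hypothetical token variable
    },
    body: JSON.stringify({ a: 42, b: "my" }), // the script's parameters
  },
);
console.log(res.status, await res.text()); // e.g. the id of the queued job
```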
You can build your entire infra on top of Windmill!
## Show me some actual script code
```typescript
//import any dependency from npm
import * as wmill from "https://deno.land/x/windmill@v1.136.0/mod.ts";
import cowsay from "npm:cowsay@1.5.0";
// fill the type, or use the +Resource type to get a type-safe reference to a resource
type Postgresql = {
host: string;
port: number;
user: string;
dbname: string;
sslmode: string;
password: string;
};
export async function main(
a: number,
b: "my" | "enum",
c: Postgresql,
d = "inferred type string from default arg",
e = { nested: "object" }
//f: wmill.Base64
) {
const email = Deno.env.get("WM_EMAIL");
// variables are permissioned and by path
let variable = await wmill.getVariable("f/company-folder/my_secret");
const lastTimeRun = await wmill.getState();
// logs are printed and always inspectable
console.log(cowsay.say({ text: "hello " + email + " " + lastTimeRun }));
await wmill.setState(Date.now());
// return is serialized as JSON
return { foo: d, variable };
}
```
## CLI
We have a powerful CLI to interact with the Windmill platform: sync your
scripts from local files or GitHub repos, and run scripts and flows on the
instance from local commands. See
[more details](https://github.com/windmill-labs/windmill/tree/main/cli).
![CLI Screencast](./cli/vhs/output/setup.gif)
### Running scripts locally
Running a script locally is easy: you simply need to pass the right
environment variables for the `wmill` client library to fetch resources and
variables from your instance when necessary (see the sketch below).
To develop and test scripts and flows locally, we recommend using the Windmill VS
Code extension.
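As a hedged illustration of the idea (the variable names `BASE_INTERNAL_URL`, `WM_TOKEN` and `WM_WORKSPACE` below are assumptions, not guaranteed names):

```ts
// run locally with the instance details exported in the environment, e.g.
//   BASE_INTERNAL_URL=https://app.windmill.dev WM_TOKEN=... WM_WORKSPACE=demo deno run -A my_script.ts
import * as wmill from "https://deno.land/x/windmill@v1.136.0/mod.ts";

// the client then resolves variables and resources against your instance
const secret = await wmill.getVariable("f/company-folder/my_secret");
console.log(secret.length);
```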
## Stack
- Postgres as the database
- backend in Rust with the following highly-available and horizontally scalable
architecture:
- stateless API backend
- workers that pull jobs from a queue in Postgres (and later, Kafka or Redis;
upvote [#173](https://github.com/windmill-labs/windmill/issues/173) if
interested)
- frontend in Svelte
- scripts executions are sandboxed using google's
[nsjail](https://github.com/google/nsjail)
- javascript runtime is the
[deno_core rust library](https://denolib.gitbook.io/guide/) (which itself uses
the [rusty_v8](https://github.com/denoland/rusty_v8) and hence V8 underneath)
- typescript runtime is deno
- python runtime is python3
- golang runtime is 1.19.1
## Fastest Self-Hostable Workflow Engine
We have compared Windmill to other self-hostable workflow engines (Airflow,
Prefect & Temporal) and Windmill is the most performant solution for both
benchmarks: one flow composed of 40 lightweight tasks & one flow composed of 10
long-running tasks.
All methodology & results on our
[Benchmarks](https://www.windmill.dev/docs/misc/benchmarks/competitors#airflow-setup)
page.
![Fastest workflow engine](./imgs/fastest.png)
## Security
### Sandboxing
Windmill can use [nsjail](https://github.com/google/nsjail) for sandboxing. It is
secure enough for production multi-tenant use. Do not take our word for it, take
[fly.io's](https://fly.io/blog/sandboxing-and-workload-isolation/).
### Secrets, credentials and sensitive values
There is one encryption key per workspace to encrypt the credentials and secrets
stored in Windmill's K/V store.
In addition, we strongly recommend that you encrypt the whole Postgres database.
That is what we do ourselves.
## Performance
Once a job has started, there is no overhead compared to running the same script on
the node with its corresponding runner (Deno/Go/Python/Bash). The added latency
from a job being pulled from the queue, started, and then having its result sent
back to the database is ~50ms. A typical lightweight deno job will take around
100ms total.
## Architecture ## How to self-host
We only provide a docker-compose setup here. For more advanced setups, like
compiling from source or running without a Postgres superuser, see
[Self-Host documentation](https://www.windmill.dev/docs/advanced/self_host).
### Docker compose
Windmill can be deployed using 3 files:
([docker-compose.yml](./docker-compose.yml), [Caddyfile](./Caddyfile) and a
[.env](./.env)) in a single command.
Make sure Docker is started, and run:
```
curl https://raw.githubusercontent.com/windmill-labs/windmill/main/docker-compose.yml -o docker-compose.yml
curl https://raw.githubusercontent.com/windmill-labs/windmill/main/Caddyfile -o Caddyfile
curl https://raw.githubusercontent.com/windmill-labs/windmill/main/.env -o .env
docker compose up -d
```
Go to http://localhost et voilà :)
The default super-admin user is: admin@windmill.dev / changeme.
From there, you can follow the setup app and create other users.
More details in
[Self-Host Documentation](https://www.windmill.dev/docs/advanced/self_host#docker).
### Kubernetes (k8s) and Helm charts
We publish helm charts at: .
### Run from binaries
Each release includes the corresponding binaries for x86_64. You can simply
download the latest `windmill` binary using the following set of bash commands.
```bash
BINARY_NAME='windmill-amd64' # or windmill-ee-amd64 for the enterprise edition
LATEST_RELEASE=$(curl -L -s -H 'Accept: application/json' https://github.com/windmill-labs/windmill/releases/latest)
LATEST_VERSION=$(echo $LATEST_RELEASE | sed -e 's/.*"tag_name":"\([^"]*\)".*/\1/')
ARTIFACT_URL="https://github.com/windmill-labs/windmill/releases/download/$LATEST_VERSION/$BINARY_NAME"
wget "$ARTIFACT_URL" -O windmill
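# Hedged usage sketch (an addition, not from the README): make the binary
# executable and start it with env vars documented in the table below, e.g.
#   chmod +x windmill
#   DATABASE_URL='postgres://user:secret@localhost:5432/windmill' MODE=standalone ./windmill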
```
### OAuth, SSO & SMTP
Windmill Community Edition allows you to configure OAuth and SSO (including Google
Workspace SSO, Microsoft/Azure and Okta) directly from the UI in the superadmin
settings. Do note that there is a limit of 10 SSO users on the community
edition.
[See documentation](https://www.windmill.dev/docs/misc/setup_oauth).
### Commercial license
To self-host Windmill, you must respect the terms of the
[AGPLv3 license](https://www.gnu.org/licenses/agpl-3.0.en.html) which you do not
need to worry about for personal uses. For business uses, you should be fine if
you do not re-expose Windmill in any way to your users and are comfortable with
AGPLv3.
To
[re-expose any Windmill parts to your users](https://www.windmill.dev/docs/misc/white_labelling)
as a feature of your product, or to build a feature on top of Windmill, to
comply with AGPLv3 your product must be AGPLv3 or you must get a commercial
license. Contact us if you have any doubts.
In addition, a commercial license grants you a dedicated engineer to transition
your current infrastructure to Windmill, support with a tight SLA, and our global
cache sync for high performance with no dependency-cache misses on clusters from
10+ nodes to 200+ nodes.
### Integrations
In Windmill, integrations are referred to as
[resources and resource types](https://www.windmill.dev/docs/core_concepts/resources_and_types).
Each Resource has a Resource Type that defines the schema that the resource
needs to implement.
On self-hosted instances, you might want to import all the approved resource
types from [WindmillHub](https://hub.windmill.dev). A setup script will prompt
you to have them synced automatically every day.
## Environment Variables
| Environment Variable name | Default | Description | Api Server/Worker/All |
| ------------------------- | ---------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------- |
| DATABASE_URL | | The Postgres database url. | All |
| WORKER_GROUP | default | The worker group the worker belongs to and get its configuration pulled from | Worker |
| MODE | standalone | The mode of the binary. Possible values: standalone, worker, server | All |
| METRICS_ADDR | None | (ee only) The socket addr at which to expose Prometheus metrics at the /metrics path. Set to "true" to expose it on port 8001 | All |
| JSON_FMT | false | Output the logs in json format instead of logfmt | All |
| BASE_URL | http://localhost:8000 | The base url that is exposed publicly to access your instance. It is overridden by the instance settings, if any. | Server |
| SCRIPT_TOKEN_EXPIRY | 900 | The default duration period of the ephemeral-token generated at the beginning of a script | Worker |
| ZOMBIE_JOB_TIMEOUT | 30 | The timeout after which a job is considered to be zombie if the worker did not send pings about processing the job (every server check for zombie jobs every 30s) | Server |
| RESTART_ZOMBIE_JOBS | true | If true then a zombie job is restarted (in-place with the same uuid and some logs), if false the zombie job is failed | Server |
| SLEEP_QUEUE | 50 | The number of ms to sleep in between the last check for new jobs in the DB. It is multiplied by NUM_WORKERS such that on average, for one worker instance, there is one pull every SLEEP_QUEUE ms. | Worker |
| MAX_LOG_SIZE | 500000 | The maximum number of characters a job can emit (log + result) | Worker |
| DISABLE_NUSER | false | If Nsjail is enabled, disable the nsjail's `clone_newuser` setting | Worker |
| KEEP_JOB_DIR | false | Keep the job directory after the job is done. Useful for debugging. | Worker |
| LICENSE_KEY (EE only) | None | License key checked at startup for the Enterprise Edition of Windmill | Worker |
| S3_CACHE_BUCKET (EE only) | None | The S3 bucket to sync the cache of the workers to | Worker |
| SLACK_SIGNING_SECRET | None | The signing secret of your Slack app. See [Slack documentation](https://api.slack.com/authentication/verifying-requests-from-slack) | Server |
| COOKIE_DOMAIN | None | The domain of the cookie. If not set, the cookie will be set by the browser based on the full origin | Server |
| DENO_PATH | /usr/bin/deno | The path to the deno binary. | Worker |
| PYTHON_PATH | /usr/local/bin/python3 | The path to the python binary. | Worker |
| GO_PATH | /usr/bin/go | The path to the go binary. | Worker |
| GOPRIVATE | | The GOPRIVATE env variable to use private go modules | Worker |
| GOPROXY | | The GOPROXY env variable to use | Worker |
| NETRC | | The netrc content to use a private go registry | Worker |
| PIP_INDEX_URL | None | The index url to pass for pip. | Worker |
| PIP_EXTRA_INDEX_URL | None | The extra index url to pass to pip. | Worker |
| PIP_TRUSTED_HOST | None | The trusted host to pass to pip. | Worker |
| PATH | None | The path environment variable, usually inherited | Worker |
| HOME | None | The home directory to use for Go and Bash , usually inherited | Worker |
| DATABASE_CONNECTIONS | 50 (Server)/3 (Worker) | The max number of connections in the database connection pool | All |
| SUPERADMIN_SECRET | None | A token that would let the caller act as a virtual superadmin superadmin@windmill.dev | Server |
| TIMEOUT_WAIT_RESULT | 20 | The number of seconds to wait before timeout on the 'run_wait_result' endpoint | Worker |
| QUEUE_LIMIT_WAIT_RESULT | None | The maximum number of jobs in the queue before immediately rejecting the request in the 'run_wait_result' endpoint. Takes precedence over the query arg. If none is specified, there is no limit. | Worker |
| DENO_AUTH_TOKENS | None | Custom DENO_AUTH_TOKENS to pass to worker to allow the use of private modules | Worker |
| DISABLE_RESPONSE_LOGS | false | Disable response logs | Server |
## Run a local dev setup
### only Frontend
This will use the remote backend but your own frontend
with hot-code reloading.
1. Install [caddy](https://caddyserver.com)
2. Go to `frontend/`:
1. `npm install`, `npm run generate-backend-client` then `npm run dev`
2. In another shell `sudo caddy run --config CaddyfileRemote`
3. Et voilà, windmill should be available at `http://localhost/`
### Backend + Frontend
See the [./frontend/README_DEV.md](./frontend/README_DEV.md) file for all
running options.
1. Create a Postgres Database for Windmill and create an admin role inside your
Postgres setup.
The easiest way to get a working db is to run
```
cargo install sqlx-cli
env DATABASE_URL= sqlx migrate run
```
This will also avoid compile-time issues with sqlx's `query!` macro
2. Install [nsjail](https://github.com/google/nsjail) and have it accessible in
your PATH
3. Install deno and python3, have the bins at `/usr/bin/deno` and
`/usr/local/bin/python3`
4. Install [caddy](https://caddyserver.com)
5. Install the [lld linker](https://lld.llvm.org/)
6. Go to `frontend/`:
1. `npm install`, `npm run generate-backend-client` then `npm run dev`
2. You might need to set some extra heap space for the node runtime `export NODE_OPTIONS="--max-old-space-size=4096"`
3. In another shell, run `npm run build`, otherwise the backend will not find the `frontend/build` folder and will not compile.
4. In another shell `sudo caddy run --config Caddyfile`
7. Go to `backend/`:
`env DATABASE_URL= RUST_LOG=info cargo run`
8. Et voilà, windmill should be available at `http://localhost/`
## Contributors ## Copyright
Windmill Labs, Inc 2023 | Open-source developer platform to turn scripts into workflows and UIs. Fastest workflow engine (5x vs Airflow). Open-source alternative to Airplane and Retool. | low-code,open-source,platform,python,typescript,postgresql,self-hostable | 668 | 87 | 3,144 | 6,986 | 210 | 171 | 19 |
pacocoursey/cmdk | ⌘K ⌘K is a command menu React component that can also be used as an accessible combobox. You render items, it filters and sorts them automatically. ⌘K supports a fully composable API How? , so you can wrap items in other components or even as static JSX. Demo and examples: cmdk.paco.me Install bash
pnpm install cmdk Use ```tsx
import { Command } from 'cmdk'

const CommandMenu = () => {
  return (
    <Command label="Command Menu">
      <Command.Input />
      <Command.List>
        <Command.Empty>No results found.</Command.Empty>

        <Command.Group heading="Letters">
          <Command.Item>a</Command.Item>
          <Command.Item>b</Command.Item>
          <Command.Separator />
          <Command.Item>c</Command.Item>
        </Command.Group>

        <Command.Item>Apple</Command.Item>
      </Command.List>
    </Command>
  )
}
``` Or in a dialog: ```tsx
import { Command } from 'cmdk'

const CommandMenu = () => {
  const [open, setOpen] = React.useState(false)

  // Toggle the menu when ⌘K is pressed
  React.useEffect(() => {
    const down = (e) => {
      if (e.key === 'k' && (e.metaKey || e.ctrlKey)) {
        e.preventDefault()
        setOpen((open) => !open)
      }
    }

    document.addEventListener('keydown', down)
    return () => document.removeEventListener('keydown', down)
  }, [])

  return (
    <Command.Dialog open={open} onOpenChange={setOpen} label="Global Command Menu">
      <Command.Input />
      <Command.List>
        <Command.Empty>No results found.</Command.Empty>

        <Command.Group heading="Letters">
          <Command.Item>a</Command.Item>
          <Command.Item>b</Command.Item>
          <Command.Separator />
          <Command.Item>c</Command.Item>
        </Command.Group>

        <Command.Item>Apple</Command.Item>
      </Command.List>
    </Command.Dialog>
  )
}
``` Parts and styling All parts forward props, including ref , to an appropriate element. Each part has a specific data-attribute (starting with cmdk- ) that can be used for styling. Command [cmdk-root] Render this to show the command menu inline, or use Dialog to render in an elevated context. Can be controlled with the value and onValueChange props. Note Values are always trimmed with the trim() method. ```tsx
const [value, setValue] = React.useState('apple')

return (
  <Command value={value} onValueChange={setValue}>
    <Command.Input />
    <Command.List>
      <Command.Item>Orange</Command.Item>
      <Command.Item>Apple</Command.Item>
    </Command.List>
  </Command>
)
``` You can provide a custom filter function that is called to rank each item. Note that the value will be trimmed. tsx
<Command
filter={(value, search) => {
if (value.includes(search)) return 1
return 0
}}
/> A third argument, keywords , can also be provided to the filter function. Keywords act as aliases for the item value, and can also affect the rank of the item. Keywords are trimmed. tsx
<Command
filter={(value, search, keywords) => {
const extendValue = value + ' ' + keywords.join(' ')
if (extendValue.includes(search)) return 1
return 0
}}
/> Or disable filtering and sorting entirely: tsx
<Command shouldFilter={false}>
<Command.List>
{filteredItems.map((item) => {
return (
<Command.Item key={item} value={item}>
{item}
</Command.Item>
)
})}
</Command.List>
</Command> You can make the arrow keys wrap around the list (when you reach the end, it goes back to the first item) by setting the loop prop: tsx
<Command loop /> Dialog [cmdk-dialog] [cmdk-overlay] Props are forwarded to Command . Composes Radix UI's Dialog component. The overlay is always rendered. See the Radix Documentation for more information. Can be controlled with the open and onOpenChange props. ```tsx
const [open, setOpen] = React.useState(false)

return (
  <Command.Dialog open={open} onOpenChange={setOpen}>
    ...
  </Command.Dialog>
)
``` You can provide a container prop that accepts an HTML element that is forwarded to Radix UI's Dialog Portal component to specify which element the Dialog should portal into (defaults to body ). See the Radix Documentation for more information. ```tsx
const containerElement = React.useRef(null)

return (
  <>
    <Command.Dialog container={containerElement.current} />
    <div ref={containerElement} />
  </>
)
``` Input [cmdk-input] All props are forwarded to the underlying input element. Can be controlled with the value and onValueChange props. ```tsx
const [search, setSearch] = React.useState('')

return <Command.Input value={search} onValueChange={setSearch} />
``` List [cmdk-list] Contains items and groups. Animate height using the --cmdk-list-height CSS variable. css
[cmdk-list] {
min-height: 300px;
height: var(--cmdk-list-height);
max-height: 500px;
transition: height 100ms ease;
} To scroll item into view earlier near the edges of the viewport, use scroll-padding: css
[cmdk-list] {
scroll-padding-block-start: 8px;
scroll-padding-block-end: 8px;
} Item [cmdk-item] [data-disabled?] [data-selected?] Item that becomes active on pointer enter. You should provide a unique value for each item, but it will be automatically inferred from the .textContent . ```tsx
<Command.Item
  // value is implicitly "apple" because of the provided text content
  onSelect={(value) => console.log('Selected', value)}
>
  Apple
</Command.Item>
``` You can also provide a keywords prop to help with filtering. Keywords are trimmed. tsx
<Command.Item keywords={['fruit', 'apple']}>Apple</Command.Item> You can force an item to always render, regardless of filtering, by passing the forceMount prop. Group [cmdk-group] [hidden?] Groups items together with the given heading ( [cmdk-group-heading] ). tsx
<Command.Group heading="Fruit">
<Command.Item>Apple</Command.Item>
</Command.Group> Groups will not unmount from the DOM, rather the hidden attribute is applied to hide it from view. This may be relevant in your styling. You can force a group to always render, regardless of filtering, by passing the forceMount prop. Separator [cmdk-separator] Visible when the search query is empty or alwaysRender is true, hidden otherwise. Empty [cmdk-empty] Automatically renders when there are no results for the search query. Loading [cmdk-loading] You should conditionally render this with progress while loading asynchronous items. ```tsx
const [loading, setLoading] = React.useState(false)

return (
  <Command.List>
    {loading && <Command.Loading>Hang on…</Command.Loading>}
  </Command.List>
)
``` useCommandState(state => state.selectedField) Hook that composes useSyncExternalStore . Pass a function that returns a slice of the command menu state to re-render when that slice changes. This hook is provided for advanced use cases and should not be commonly used. A good use case would be to render a more detailed empty state, like so: tsx
const search = useCommandState((state) => state.search)
return <Command.Empty>No results found for "{search}".</Command.Empty> Examples Code snippets for common use cases. Nested items Often selecting one item should navigate deeper, with a more refined set of items. For example selecting "Change theme…" should show new items "Dark theme" and "Light theme". We call these sets of items "pages", and they can be implemented with simple state: ```tsx
const ref = React.useRef(null)
const [open, setOpen] = React.useState(false)
const [search, setSearch] = React.useState('')
const [pages, setPages] = React.useState([])
const page = pages[pages.length - 1]

return (
  <Command
    onKeyDown={(e) => {
      // Escape goes to previous page
      // Backspace goes to previous page when search is empty
      if (e.key === 'Escape' || (e.key === 'Backspace' && !search)) {
        e.preventDefault()
        setPages((pages) => pages.slice(0, -1))
      }
    }}
  >
    <Command.Input value={search} onValueChange={setSearch} />
    <Command.List>
      {!page && (
        <>
          <Command.Item onSelect={() => setPages([...pages, 'projects'])}>Search projects…</Command.Item>
          <Command.Item onSelect={() => setPages([...pages, 'teams'])}>Join a team…</Command.Item>
        </>
      )}

      {page === 'projects' && (
        <>
          <Command.Item>Project A</Command.Item>
          <Command.Item>Project B</Command.Item>
        </>
      )}

      {page === 'teams' && (
        <>
          <Command.Item>Team 1</Command.Item>
          <Command.Item>Team 2</Command.Item>
        </>
      )}
    </Command.List>
  </Command>
)
``` Show sub-items when searching If your items have nested sub-items that you only want to reveal when searching, render based on the search state: ```tsx
const SubItem = (props) => {
const search = useCommandState((state) => state.search)
if (!search) return null
  return <Command.Item {...props} />
}

return (
  <Command>
    <Command.Input />
    <Command.List>
      <Command.Item>Change theme…</Command.Item>
      <SubItem>Change theme to dark</SubItem>
      <SubItem>Change theme to light</SubItem>
    </Command.List>
  </Command>
)
``` Asynchronous results Render the items as they become available. Filtering and sorting will happen automatically. ```tsx
const [loading, setLoading] = React.useState(false)
const [items, setItems] = React.useState([]) React.useEffect(() => {
async function getItems() {
setLoading(true)
const res = await api.get('/dictionary')
setItems(res)
setLoading(false)
  }

  getItems()
}, [])

return (
  <Command>
    <Command.Input />
    <Command.List>
      {loading && <Command.Loading>Fetching words…</Command.Loading>}
      {items.map((item) => {
        return (
          <Command.Item key={`word-${item}`} value={item}>
            {item}
          </Command.Item>
        )
      })}
    </Command.List>
  </Command>
)
``` Use inside Popover We recommend using the Radix UI popover component. ⌘K relies on the Radix UI Dialog component, so this will reduce your bundle size a bit due to shared dependencies. bash
$ pnpm install @radix-ui/react-popover Render Command inside of the popover content: ```tsx
import * as Popover from '@radix-ui/react-popover'

return (
  <Popover.Root>
    <Popover.Trigger>Toggle popover</Popover.Trigger>

    <Popover.Content>
<Command>
<Command.Input />
<Command.List>
<Command.Item>Apple</Command.Item>
</Command.List>
</Command>
    </Popover.Content>
  </Popover.Root>
)
``` Drop in stylesheets You can find global stylesheets to drop in as a starting point for styling. See website/styles/cmdk for examples. FAQ Accessible? Yes. Labeling, aria attributes, and DOM ordering tested with Voice Over and Chrome DevTools. Dialog composes an accessible Dialog implementation. Virtualization? No. Good performance up to 2,000-3,000 items, though. Read below to bring your own. Filter/sort items manually? Yes. Pass shouldFilter={false} to Command . Better memory usage and performance. Bring your own virtualization this way. React 18 safe? Yes, required. Uses React 18 hooks like useId and useSyncExternalStore . Unstyled? Yes, use the listed CSS selectors. Hydration mismatch? No, likely a bug in your code. Ensure the open prop to Command.Dialog is false on the server. React strict mode safe? Yes. Open an issue if you notice an issue. Weird/wrong behavior? Make sure your Command.Item has a key and unique value . Concurrent mode safe? Maybe, but concurrent mode is not yet real. Uses risky approaches like manual DOM ordering. React server component? No, it's a client component. Listen for ⌘K automatically? No, do it yourself to have full control over keybind context. React Native? No, and no plans to support it. If you build a React Native version, let us know and we'll link your repository here. History Written in 2019 by Paco ( @pacocoursey ) to see if a composable combobox API was possible. Used for the Vercel command menu and autocomplete by Rauno ( @raunofreiberg ) in 2020. Re-written independently in 2022 with a simpler and more performant approach. Ideas and help from Shu ( @shuding_ ). use-descendants was extracted from the 2019 version. Testing First, install dependencies and Playwright browsers: bash
pnpm install
pnpm playwright install Then ensure you've built the library: bash
pnpm build Then run the tests using your local build against real browser engines: bash
pnpm test | Fast, unstyled command menu React component. | combobox,command-palette,radix-ui,react,command-menu | 7 | 40 | 100 | 130 | 55 | 4 | 1 |
bigscience-workshop/petals | Run large language models at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading Generate text with distributed Llama 2 (70B), Falcon (40B+), BLOOM (176B) (or their derivatives), and fine‑tune them for your own tasks — right from your desktop computer or Google Colab: ```python
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM Choose any model available at https://health.petals.dev model_name = "petals-team/StableBeluga2" # This one is fine-tuned Llama 2 (70B) Connect to a distributed network hosting model layers tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name) Run the model as if it were on your computer inputs = tokenizer("A cat sat", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0])) # A cat sat on a mat...
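# A hedged aside (an addition, not from this README): generation runs over the
# public swarm but keeps the usual transformers interface, so sampling kwargs
# such as the line below should carry over (treat the exact kwargs as an
# assumption):
# outputs = model.generate(inputs, max_new_tokens=20, do_sample=True, temperature=0.9)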
``` 🚀 Try now in Colab 🔏 Privacy. Your data will be processed with the help of other people in the public swarm. Learn more about privacy here . For sensitive data, you can set up a private swarm among people you trust. 🦙 Want to run Llama 2? Request access to its weights at the ♾️ Meta AI website and 🤗 Model Hub , then run huggingface-cli login in the terminal before loading the model. Or just try it in our chatbot app . 💬 Any questions? Ping us in our Discord ! Connect your GPU and increase Petals capacity Petals is a community-run system — we rely on people sharing their GPUs. You can check out available models and help serving one of them! As an example, here is how to host a part of Stable Beluga 2 on your GPU: 🐧 Linux + Anaconda. Run these commands for NVIDIA GPUs (or follow this for AMD): bash
conda install pytorch pytorch-cuda=11.7 -c pytorch -c nvidia
pip install git+https://github.com/bigscience-workshop/petals
python -m petals.cli.run_server petals-team/StableBeluga2 🪟 Windows + WSL. Follow this guide on our Wiki. 🐋 Docker. Run our Docker image for NVIDIA GPUs (or follow this for AMD): bash
sudo docker run -p 31330:31330 --ipc host --gpus all --volume petals-cache:/cache --rm \
learningathome/petals:main \
python -m petals.cli.run_server --port 31330 petals-team/StableBeluga2 🍏 macOS + Apple M1/M2 GPU. Install Homebrew , then run these commands: bash
brew install python
python3 -m pip install git+https://github.com/bigscience-workshop/petals
python3 -m petals.cli.run_server petals-team/StableBeluga2 📚 Learn more (how to use multiple GPUs, start the server on boot, etc.) 💬 Any questions? Ping us in our Discord ! 🦙 Want to host Llama 2? Request access to its weights at the ♾️ Meta AI website and 🤗 Model Hub , generate an 🔑 access token , then add --token YOUR_TOKEN_HERE to the python -m petals.cli.run_server command. 🔒 Security. Hosting a server does not allow others to run custom code on your computer. Learn more here . 🏆 Thank you! Once you load and host 10+ blocks, we can show your name or link on the swarm monitor as a way to say thanks. You can specify them with --public_name YOUR_NAME . How does it work? You load a small part of the model, then join a network of people serving the other parts. Single‑batch inference runs at up to 6 tokens/sec for Llama 2 (70B) and up to 4 tokens/sec for Falcon (180B) — enough for chatbots and interactive apps. You can employ any fine-tuning and sampling methods, execute custom paths through the model, or see its hidden states. You get the comforts of an API with the flexibility of PyTorch and 🤗 Transformers . 📜 Read paper 📚 See FAQ 📚 Tutorials, examples, and more Basic tutorials: Getting started: tutorial Prompt-tune Llama-65B for text semantic classification: tutorial Prompt-tune BLOOM to create a personified chatbot: tutorial Useful tools: Chatbot web app (connects to Petals via an HTTP/WebSocket endpoint): source code Monitor for the public swarm: source code Advanced guides: Launch a private swarm: guide Run a custom model: guide Benchmarks Please see Section 3.3 of our paper . 🛠️ Contributing Please see our FAQ on contributing. 📜 Citation Alexander Borzunov, Dmitry Baranchuk, Tim Dettmers, Max Ryabinin, Younes Belkada, Artem Chumachenko, Pavel Samygin, and Colin Raffel. Petals: Collaborative Inference and Fine-tuning of Large Models. arXiv preprint arXiv:2209.01188, 2022. bibtex
@article{borzunov2022petals,
title = {Petals: Collaborative Inference and Fine-tuning of Large Models},
author = {Borzunov, Alexander and Baranchuk, Dmitry and Dettmers, Tim and Ryabinin, Max and Belkada, Younes and Chumachenko, Artem and Samygin, Pavel and Raffel, Colin},
journal = {arXiv preprint arXiv:2209.01188},
year = {2022},
url = {https://arxiv.org/abs/2209.01188}
} This project is a part of the BigScience research workshop. | 🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading | bloom,deep-learning,distributed-systems,language-models,large-language-models,machine-learning,neural-networks,pytorch,volunteer-computing,pipeline-parallelism | 11 | 46 | 397 | 509 | 76 | 131 | 3 |
nerfstudio-project/nerfstudio | A collaboration friendly studio for NeRFs Quickstart Learn more Supported Features About It’s as simple as plug and play with nerfstudio! Nerfstudio provides a simple API that allows for a simplified end-to-end process of creating, training, and testing NeRFs.
The library supports a more interpretable implementation of NeRFs by modularizing each component. With more modular NeRFs, we hope to create a more user-friendly experience in exploring the technology. This is a contributor-friendly repo with the goal of building a community where users can more easily build upon each other's contributions.
Nerfstudio initially launched as an open-source project by Berkeley students in KAIR lab at Berkeley AI Research (BAIR) in October 2022 as a part of a research project ( paper ). It is currently developed by Berkeley students and community contributors. We are committed to providing learning resources to help you understand the basics of (if you're just getting started), and keep up-to-date with (if you're a seasoned veteran) all things NeRF. As researchers, we know just how hard it is to get onboarded with this next-gen technology. So we're here to help with tutorials, documentation, and more! Have feature requests? Want to add your brand-spankin'-new NeRF model? Have a new dataset? We welcome contributions ! Please do not hesitate to reach out to the nerfstudio team with any questions via Discord . Have feedback? We'd love for you to fill out our Nerfstudio Feedback Form if you want to let us know who you are, why you are interested in Nerfstudio, or provide any feedback! We hope nerfstudio enables you to build faster :hammer: learn together :books: and contribute to our NeRF community :sparkling_heart:. Sponsors Sponsors of this work include Luma AI and the BAIR commons . Quickstart The quickstart will help you get started with the default vanilla NeRF trained on the classic Blender Lego scene.
For more complex changes (e.g., running with your own data/setting up a new NeRF graph), please refer to our references . 1. Installation: Setup the environment Prerequisites You must have an NVIDIA video card with CUDA installed on the system. This library has been tested with version 11.8 of CUDA. You can find more information about installing CUDA here Create environment Nerfstudio requires python >= 3.8 . We recommend using conda to manage dependencies. Make sure to install Conda before proceeding. bash
conda create --name nerfstudio -y python=3.8
conda activate nerfstudio
pip install --upgrade pip Dependencies Install PyTorch with CUDA (this repo has been tested with CUDA 11.7 and CUDA 11.8) and tiny-cuda-nn . cuda-toolkit is required for building tiny-cuda-nn . For CUDA 11.8: ```bash
pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118 conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
``` See Dependencies in the Installation documentation for more. Installing nerfstudio Easy option: bash
pip install nerfstudio OR if you want the latest and greatest: bash
git clone https://github.com/nerfstudio-project/nerfstudio.git
cd nerfstudio
pip install --upgrade pip setuptools
pip install -e . OR if you want to skip all installation steps and directly start using nerfstudio, use the docker image: See Installation - Use docker image . 2. Training your first model! The following will train a nerfacto model, our recommended model for real world scenes. ```bash Download some test data: ns-download-data nerfstudio --capture-name=poster Train model ns-train nerfacto --data data/nerfstudio/poster
``` If everything works, you should see training progress like the following: Navigating to the link at the end of the terminal will load the webviewer. If you are running on a remote machine, you will need to port forward the websocket port (defaults to 7007). Resume from checkpoint / visualize existing run It is possible to load a pretrained model by running bash
ns-train nerfacto --data data/nerfstudio/poster --load-dir {outputs/.../nerfstudio_models} Visualize existing run Given a pretrained model checkpoint, you can start the viewer by running bash
ns-viewer --load-config {outputs/.../config.yml} 3. Exporting Results Once you have a NeRF model you can either render out a video or export a point cloud. Render Video First we must create a path for the camera to follow. This can be done in the viewer under the "RENDER" tab. Orient your 3D view to the location where you wish the video to start, then press "ADD CAMERA". This will set the first camera key frame. Continue to new viewpoints adding additional cameras to create the camera path. We provide other parameters to further refine your camera path. Once satisfied, press "RENDER" which will display a modal that contains the command needed to render the video. Kill the training job (or create a new terminal if you have lots of compute) and run the command to generate the video. Other video export options are available, learn more by running bash
ns-render --help Generate Point Cloud While NeRF models are not designed to generate point clouds, it is still possible. Navigate to the "EXPORT" tab in the 3D viewer and select "POINT CLOUD". If the crop option is selected, everything in the yellow square will be exported into a point cloud. Modify the settings as desired then run the command at the bottom of the panel in your command line. Alternatively you can use the CLI without the viewer. Learn about the export options by running bash
ns-export pointcloud --help 4. Using Custom Data Using an existing dataset is great, but likely you want to use your own data! We support various methods for using your own data. Before it can be used in nerfstudio, the camera location and orientations must be determined and then converted into our format using ns-process-data . We rely on external tools for this, instructions and information can be found in the documentation. | Data | Capture Device | Requirements | ns-process-data Speed |
| --------------------------------------------------------------------------------------------- | -------------- | ----------------------------------------------------------------- | ----------------------- |
| 📷 Images | Any | COLMAP | 🐢 |
| 📹 Video | Any | COLMAP | 🐢 |
| 🌎 360 Data | Any | COLMAP | 🐢 |
| 📱 Polycam | IOS with LiDAR | Polycam App | 🐇 |
| 📱 KIRI Engine | IOS or Android | KIRI Engine App | 🐇 |
| 📱 Record3D | IOS with LiDAR | Record3D app | 🐇 |
| 📱 Spectacular AI | IOS, OAK, others | App / sai-cli | 🐇 |
| 🖥 Metashape | Any | Metashape | 🐇 |
| 🖥 RealityCapture | Any | RealityCapture | 🐇 |
| 🖥 ODM | Any | ODM | 🐇 |
| 👓 Aria | Aria glasses | Project Aria | 🐇 |
| 🛠 Custom | Any | Camera Poses | 🐇 | 5. Advanced Options Training models other than nerfacto We provide other models than nerfacto, for example if you want to train the original nerf model, use the following command bash
ns-train vanilla-nerf --data DATA_PATH For a full list of included models run ns-train --help . Modify Configuration Each model contains many parameters that can be changed, too many to list here. Use the --help command to see the full list of configuration options. bash
ns-train nerfacto --help Tensorboard / WandB / Viewer We support four different methods to track training progress: the viewer, tensorboard, Weights and Biases, and Comet. You can specify which visualizer to use by appending --vis {viewer, tensorboard, wandb, comet, viewer+wandb, viewer+tensorboard, viewer+comet} to the training command. Simultaneously utilizing the viewer alongside wandb or tensorboard may cause stuttering issues during evaluation steps. The viewer only works for methods that are fast (i.e. nerfacto, instant-ngp); for slower methods like NeRF, use the other loggers. Learn More And that's it for getting started with the basics of nerfstudio. If you're interested in learning more on how to create your own pipelines, develop with the viewer, run benchmarks, and more, please check out some of the quicklinks below or visit our documentation directly. | Section | Description |
| ---------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------- |
| Documentation | Full API documentation and tutorials |
| Viewer | Home page for our web viewer |
| 🎒 Educational |
| Model Descriptions | Description of all the models supported by nerfstudio and explanations of component parts. |
| Component Descriptions | Interactive notebooks that explain notable/commonly used modules in various models. |
| 🏃 Tutorials |
| Getting Started | A more in-depth guide on how to get started with nerfstudio from installation to contributing. |
| Using the Viewer | A quick demo video on how to navigate the viewer. |
| Using Record3D | Demo video on how to run nerfstudio without using COLMAP. |
| 💻 For Developers |
| Creating pipelines | Learn how to easily build new neural rendering pipelines by using and/or implementing new modules. |
| Creating datasets | Have a new dataset? Learn how to run it with nerfstudio. |
| Contributing | Walk-through for how you can start contributing now. |
| 💖 Community |
| Discord | Join our community to discuss more. We would love to hear from you! |
| Twitter | Follow us on Twitter @nerfstudioteam to see cool updates and announcements |
| Feedback Form | We welcome any feedback! This is our chance to learn what you all are using Nerfstudio for. | Supported Features We provide the following support structures to make life easier for getting started with NeRFs. If you are looking for a feature that is not currently supported, please do not hesitate to contact the Nerfstudio Team on Discord ! :mag_right: Web-based visualizer that allows you to: Visualize training in real-time + interact with the scene Create and render out scenes with custom camera trajectories View different output types And more! :pencil2: Support for multiple logging interfaces (Tensorboard, Wandb), code profiling, and other built-in debugging tools :chart_with_upwards_trend: Easy-to-use benchmarking scripts on the Blender dataset :iphone: Full pipeline support (w/ Colmap, Polycam, or Record3D) for going from a video on your phone to a full 3D render. Built On <img alt="tyro logo" src="https://brentyi.github.io/tyro/_static/logo-light.svg" width="150px" /> Easy-to-use config system Developed by Brent Yi <img alt="tyro logo" src="https://user-images.githubusercontent.com/3310961/199084143-0d63eb40-3f35-48d2-a9d5-78d1d60b7d66.png" width="250px" /> Library for accelerating NeRF renders Developed by Ruilong Li Citation You can find a paper writeup of the framework on arXiv . If you use this library or find the documentation useful for your research, please consider citing: @inproceedings{nerfstudio,
title = {Nerfstudio: A Modular Framework for Neural Radiance Field Development},
author = {
Tancik, Matthew and Weber, Ethan and Ng, Evonne and Li, Ruilong and Yi, Brent
and Kerr, Justin and Wang, Terrance and Kristoffersen, Alexander and Austin,
Jake and Salahi, Kamyar and Ahuja, Abhik and McAllister, David and Kanazawa,
Angjoo
},
year = 2023,
booktitle = {ACM SIGGRAPH 2023 Conference Proceedings},
series = {SIGGRAPH '23}
} Contributors | A collaboration friendly studio for NeRFs | nerf,pytorch,3d,3d-graphics,3d-reconstruction,computer-vision,deep-learning,machine-learning,photogrammetry,gaussian-splatting | 38 | 227 | 1,706 | 1,967 | 625 | 78 | 3 |
askorama/orama | Website • Blog • Documentation • Community Slack Full-text, vector, and hybrid search with a unique API. On your browser, server, mobile app, or at the edge. In less than 2kb. Join Orama's Slack channel If you need more info, help, or want to provide general feedback on Orama, join
the Orama Slack channel Highlighted features Vector Search Hybrid Search Search Filters Geosearch Facets Fields Boosting Typo Tolerance Exact Match BM25 Stemming and tokenization in 30 languages Plugin System Installation You can install Orama using npm , yarn , pnpm , bun : sh
npm i @orama/orama Or import it directly in a browser module: ```html ``` With Deno, you can just use the same CDN URL or use npm specifiers: js
import { create, search, insert } from 'npm:@orama/orama' Read the complete documentation at https://docs.askorama.ai . Usage Orama is quite simple to use. The first thing to do is to create a new database
instance and set an indexing schema: ```js
import { create, insert, remove, search, searchVector } from '@orama/orama' const db = await create({
schema: {
name: 'string',
description: 'string',
price: 'number',
embedding: 'vector[1536]', // Vector size must be expressed during schema initialization
meta: {
rating: 'number',
},
},
})
``` Orama currently supports 10 different data types: | Type | Description | Example |
| ---------------- | --------------------------------------------------------------------------- | --------------------------------------------------------------------------- |
| string | A string of characters. | 'Hello world' |
| number | A numeric value, either float or integer. | 42 |
| boolean | A boolean value. | true |
| enum | An enum value. | 'drama' |
| geopoint | A geopoint value. | { lat: 40.7128, lon: 74.0060 } |
| string[] | An array of strings. | ['red', 'green', 'blue'] |
| number[] | An array of numbers. | [42, 91, 28.5] |
| boolean[] | An array of booleans. | [true, false, false] |
| enum[] | An array of enums. | ['comedy', 'action', 'romance'] |
| vector[<size>] | A vector of numbers to perform vector search on. | [0.403, 0.192, 0.830] | Orama will only index properties specified in the schema but will allow you to set and store additional data if needed. Once the db instance is created, you can start adding some documents: ```js
await insert(db, {
name: 'Wireless Headphones',
description: 'Experience immersive sound quality with these noise-cancelling wireless headphones.',
price: 99.99,
embedding: [...],
meta: {
rating: 4.5,
},
}) await insert(db, {
name: 'Smart LED Bulb',
description: 'Control the lighting in your home with this energy-efficient smart LED bulb, compatible with most smart home systems.',
price: 24.99,
embedding: [...],
meta: {
rating: 4.3,
},
}) await insert(db, {
name: 'Portable Charger',
description: 'Never run out of power on-the-go with this compact and fast-charging portable charger for your devices.',
price: 29.99,
embedding: [...],
meta: {
rating: 3.6,
},
})
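
// A hedged aside (an addition, not from this README): for bulk loads there is
// also a batch helper; treat the exact import and signature as assumptions.
// import { insertMultiple } from '@orama/orama'
// await insertMultiple(db, manyDocuments, 500 /* batch size */)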
``` After the data has been inserted, you can finally start to query the database. js
const searchResult = await search(db, {
term: 'headphones',
}) In the case above, you will be searching for all the documents containing the
word "headphones" , looking up in every string property specified in the schema: js
{
elapsed: {
raw: 99512,
formatted: '99μs',
},
hits: [
{
id: '41013877-56',
score: 0.925085832971998432,
document: {
name: 'Wireless Headphones',
description: 'Experience immersive sound quality with these noise-cancelling wireless headphones.',
price: 99.99,
meta: {
rating: 4.5
}
}
}
],
count: 1
} You can also restrict the lookup to a specific property: js
const searchResult = await search(db, {
term: 'immersive sound quality',
properties: ['description'],
}) Result: js
{
elapsed: {
raw: 21492,
formatted: '21μs',
},
hits: [
{
id: '41013877-56',
score: 0.925085832971998432,
document: {
name: 'Wireless Headphones',
description: 'Experience immersive sound quality with these noise-cancelling wireless headphones.',
price: 99.99,
meta: {
rating: 4.5
}
}
}
],
count: 1
} You can use non-string data to filter , group , and create facets : js
const searchResult = await search(db, {
term: 'immersive sound quality',
where: {
price: {
lte: 199.99
},
rating: {
gt: 4
}
},
}) Performing hybrid and vector search Orama is a full-text and vector search engine. This allows you to adopt different kinds of search paradigms depending on your specific use case. To perform vector or hybrid search, you can use the same search method used for full-text search. You'll just have to specify which property you want to perform vector search on, and a vector to be used to perform vector similarity: js
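// An added sketch building on the filters example above (an assumption, not
// from this README): the same query object can also request facets; the exact
// facet options below are assumptions.
// const facetedResult = await search(db, {
//   term: 'charger',
//   facets: {
//     'meta.rating': {
//       ranges: [{ from: 0, to: 4 }, { from: 4, to: 5 }]
//     }
//   }
// })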
const searchResult = await search(db, {
mode: 'vector', // or 'hybrid'
vector: {
value: [...], // OpenAI embedding or similar vector to be used as an input
property: 'embedding' // Property to search through. Mandatory for vector search
}
}) If you're using the Orama Secure AI Proxy (highly recommended), you can skip the vector configuration at search time, since the official Orama Secure AI Proxy plugin will take care of it automatically for you: ```js
import { create } from '@orama/orama'
import { pluginSecureProxy } from '@orama/plugin-secure-proxy' const secureProxy = pluginSecureProxy({
apiKey: ' ',
defaultProperty: 'embedding', // the default property to perform vector and hybrid search on
model: 'openai/text-embedding-ada-002' // the model to use to generate embeddings
}) const db = await create({
schema: {
name: 'string',
description: 'string',
price: 'number',
embedding: 'vector[1536]',
meta: {
rating: 'number',
},
},
plugins: [secureProxy]
}) const resultsHybrid = await search(db, {
mode: 'vector', // or 'hybrid'
term: 'Videogame for little kids with a passion about ice cream',
where: {
price: {
lte: 19.99
},
'meta.rating': {
gte: 4.5
}
}
})
``` Performing Geosearch Orama supports Geosearch as a search filter. It will search through all the properties specified as geopoint in the schema: ```js
import { create, insert } from '@orama/orama' const db = await create({
schema: {
name: 'string',
location: 'geopoint'
}
}) await insert(db, { name: 'Duomo di Milano', location: { lat: 45.46409, lon: 9.19192 } })
await insert(db, { name: 'Piazza Duomo', location: { lat: 45.46416, lon: 9.18945 } })
await insert(db, { name: 'Piazzetta Reale', location: { lat: 45.46339, lon: 9.19092 } }) const searchResult = await search(db, {
term: 'Duomo',
where: {
location: { // The property we want to filter by
radius: { // The filter we want to apply (in that case: "radius")
coordinates: { // The central coordinate
lat: 45.4648,
lon: 9.18998
},
unit: 'm', // The unit of measurement. The default is "m" (meters)
value: 1000, // The radius length. In that case, 1km
inside: true // Whether we want to return the documents inside or outside the radius. The default is "true"
}
}
}
})
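
// An added sketch (an assumption, not from this README): polygon-based
// Geosearch, mirroring the option shape of the radius example above.
const polygonResult = await search(db, {
  term: 'Duomo',
  where: {
    location: {
      polygon: {
        coordinates: [
          { lat: 45.4655, lon: 9.1886 },
          { lat: 45.4655, lon: 9.1930 },
          { lat: 45.4620, lon: 9.1930 },
          { lat: 45.4620, lon: 9.1886 },
          { lat: 45.4655, lon: 9.1886 } // repeat the first point to close the polygon
        ]
      }
    }
  }
})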
``` Orama Geosearch APIs support distance-based search (via radius ), or polygon-based search (via polygon ). By default, Orama will use the Haversine formula to perform Geosearch, but high-precision search can be enabled by passing the highPrecision option in your radius or polygon configuration. This will tell Orama to use the Vicenty Formulae instead, which is more precise for longer distances. Read more in the official docs . Official Docs Read the complete documentation at https://docs.askorama.ai . Official Orama Plugins Plugin Vitepress Plugin Docusaurus Plugin Analytics Plugin Astro Plugin Data Persistence Plugin Nextra Write your own plugin: https://docs.askorama.ai/open-source/plugins/writing-your-own-plugins License Orama is licensed under the Apache 2.0 license. | 🌌 Fast, dependency-free, full-text and vector search engine with typo tolerance, filters, facets, stemming, and more. Works with any JavaScript runtime, browser, server, service! | data-structures,full-text,search,typo-tolerance,algiorithm,search-engine,search-algorithm,javascript,typescript,node | 124 | 89 | 459 | 804 | 31 | 18 | 1 |
lucia-auth/lucia | Lucia Lucia is an auth library written in TypeScript that abstracts away the complexity of handling sessions. It works alongside your database to provide an API that's easy to use, understand, and extend. No more endless configuration and callbacks Fully typed Works in any runtime - Node.js, Bun, Deno, Cloudflare Workers Extensive database support out of the box ```ts
import { Lucia } from "lucia"; const lucia = new Lucia(new Adapter(db)); const session = await lucia.createSession(userId, {});
await lucia.validateSession(session.id);
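
// A hedged sketch (an addition, not from this README). The destructured shape
// below is an assumption about what validateSession resolves to:
const { session: validated, user } = await lucia.validateSession(session.id);
if (validated === null) {
  // invalid or expired session id: treat the request as signed out
}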
``` Lucia is an open source library released under the MIT license, with the help of 100+ contributors ! Resources Documentation Join the Discord server! Examples Changelog Installation npm i lucia
pnpm add lucia
yarn add lucia | Authentication, simple and clean | oauth,typescript,auth | 0 | 207 | 973 | 1,728 | 12 | 2 | 3 |
vfsfitvnm/ViMusic | ViMusic An Android application for streaming music from YouTube Music Features Play (almost) any song or video from YouTube Music Background playback Cache audio chunks for offline playback Search for songs, albums, artists, videos and playlists Bookmark artists and albums Import playlists Fetch, display and edit song lyrics or synchronized lyrics Local playlist management Reorder songs in playlist or queue Light/Dark/Dynamic theme Skip silence Sleep timer Audio normalization Android Auto Persistent queue Open YouTube/YouTube Music links ( watch , playlist , channel ) ... Installation Acknowledgments YouTube-Internal-Clients : A python script that discovers hidden YouTube API clients. Just a research project. ionicons : Premium hand-crafted icons built by Ionic, for Ionic apps and web apps everywhere. App icon based on icon created by Ilham Fitrotul Hayat - Flaticon Disclaimer This project and its contents are not affiliated with, funded, authorized, endorsed by, or in any way associated with YouTube, Google LLC or any of its affiliates and subsidiaries. Any trademark, service mark, trade name, or other intellectual property rights used in this project are owned by the respective owners. | An Android application for streaming music from YouTube Music. | android,jetpack-compose,music-player,youtube,music | 20 | 8 | 43 | 650 | 368 | 1 | 1
lucidrains/imagen-pytorch | Imagen - Pytorch Implementation of Imagen , Google's Text-to-Image Neural Network that beats DALL-E2, in Pytorch. It is the new SOTA for text-to-image synthesis. Architecturally, it is actually much simpler than DALL-E2. It consists of a cascading DDPM conditioned on text embeddings from a large pretrained T5 model (attention network). It also contains dynamic clipping for improved classifier free guidance, noise level conditioning, and a memory efficient unet design. It appears neither CLIP nor prior network is needed after all. And so research continues. AI Coffee Break with Letitia | Assembly AI | Yannic Kilcher Please join if you are interested in helping out with the replication with the LAION community Shoutouts StabilityAI for the generous sponsorship, as well as my other sponsors out there 🤗 Huggingface for their amazing transformers library. The text encoder portion is pretty much taken care of because of them Jonathan Ho for bringing about a revolution in generative artificial intelligence through his seminal paper Sylvain and Zachary for the Accelerate library, which this repository uses for distributed training Alex for einops , indispensable tool for tensor manipulation Jorge Gomes for helping out with the T5 loading code and advice on the correct T5 version Katherine Crowson , for her beautiful code , which helped me understand the continuous time version of gaussian diffusion Marunine and Netruk44 , for reviewing code, sharing experimental results, and help with debugging Marunine for providing a potential solution for a color shifting issue in the memory efficient u-nets. Thanks to Jacob for sharing experimental comparisons between the base and memory-efficient unets Marunine for finding numerous bugs, resolving an issue with resize right, and for sharing his experimental configurations and results MalumaDev for proposing the use of pixel shuffle upsampler to fix checkboard artifacts Valentin for pointing out insufficient skip connections in the unet, as well as the specific method of attention conditioning in the base-unet in the appendix BIGJUN for catching a big bug with continuous time gaussian diffusion noise level conditioning at inference time Bingbing for identifying a bug with sampling and order of normalizing and noising with low resolution conditioning image Kay for contributing one line command training of Imagen! Hadrien Reynaud for testing out text-to-video on a medical dataset, sharing his results, and identifying issues! Install bash
$ pip install imagen-pytorch Usage ```python
import torch
from imagen_pytorch import Unet, Imagen unet for imagen unet1 = Unet(
dim = 32,
cond_dim = 512,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = 3,
layer_attns = (False, True, True, True),
layer_cross_attns = (False, True, True, True)
) unet2 = Unet(
dim = 32,
cond_dim = 512,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = (2, 4, 8, 8),
layer_attns = (False, False, False, True),
layer_cross_attns = (False, False, False, True)
) imagen, which contains the unets above (base unet and super resoluting ones) imagen = Imagen(
unets = (unet1, unet2),
image_sizes = (64, 256),
timesteps = 1000,
cond_drop_prob = 0.1
).cuda() mock images (get a lot of this) and text encodings from large T5 text_embeds = torch.randn(4, 256, 768).cuda()
images = torch.randn(4, 3, 256, 256).cuda() feed images into imagen, training each unet in the cascade for i in (1, 2):
loss = imagen(images, text_embeds = text_embeds, unet_number = i)
loss.backward() do the above for many many many many steps now you can sample an image based on the text embeddings from the cascading ddpm images = imagen.sample(texts = [
'a whale breaching from afar',
'young girl blowing out candles on her birthday cake',
'fireworks with blue and green sparkles'
], cond_scale = 3.) images.shape # (3, 3, 256, 256)
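# (added sketch, not part of the original README example) persisting the
# sampled batch to disk — assumes torchvision is installed; imagen.sample
# should return values in [0, 1], which is what save_image expects
from torchvision.utils import save_image
for idx, image in enumerate(images):
    save_image(image, f'./sample-{idx}.png')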
``` For simpler training, you can directly supply text strings instead of precomputing text encodings. (Although for scaling purposes, you will definitely want to precompute the textual embeddings + mask) The number of textual captions must match the batch size of the images if you go this route. ```python mock images and text (get a lot of this) texts = [
'a child screaming at finding a worm within a half-eaten apple',
'lizard running across the desert on two feet',
'waking up to a psychedelic landscape',
'seashells sparkling in the shallow waters'
] images = torch.randn(4, 3, 256, 256).cuda() feed images into imagen, training each unet in the cascade for i in (1, 2):
loss = imagen(images, texts = texts, unet_number = i)
loss.backward()
``` With the ImagenTrainer wrapper class, the exponential moving averages for all of the U-nets in the cascading DDPM will be automatically taken care of when calling update ```python
import torch
from imagen_pytorch import Unet, Imagen, ImagenTrainer unet for imagen unet1 = Unet(
dim = 32,
cond_dim = 512,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = 3,
layer_attns = (False, True, True, True),
) unet2 = Unet(
dim = 32,
cond_dim = 512,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = (2, 4, 8, 8),
layer_attns = (False, False, False, True),
layer_cross_attns = (False, False, False, True)
) imagen, which contains the unets above (base unet and super resoluting ones) imagen = Imagen(
unets = (unet1, unet2),
text_encoder_name = 't5-large',
image_sizes = (64, 256),
timesteps = 1000,
cond_drop_prob = 0.1
).cuda() wrap imagen with the trainer class trainer = ImagenTrainer(imagen) mock images (get a lot of this) and text encodings from large T5 text_embeds = torch.randn(64, 256, 1024).cuda()
images = torch.randn(64, 3, 256, 256).cuda() feed images into imagen, training each unet in the cascade loss = trainer(
images,
text_embeds = text_embeds,
unet_number = 1, # training on unet number 1 in this example, but you will have to also save checkpoints and then reload and continue training on unet number 2
max_batch_size = 4 # auto divide the batch of 64 up into batch size of 4 and accumulate gradients, so it all fits in memory
) trainer.update(unet_number = 1) do the above for many many many many steps now you can sample an image based on the text embeddings from the cascading ddpm images = trainer.sample(texts = [
'a puppy looking anxiously at a giant donut on the table',
'the milky way galaxy in the style of monet'
], cond_scale = 3.) images.shape # (2, 3, 256, 256)
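# (added sketch, not part of the original README example) the checkpoint flow
# hinted at in the unet_number comment above: save after training unet 1,
# reload in a later run, then continue on unet 2 — the path is hypothetical
trainer.save('./checkpoint-unet1.pt')
# ... later, in a fresh process:
# trainer.load('./checkpoint-unet1.pt')
# loss = trainer(images, text_embeds = text_embeds, unet_number = 2, max_batch_size = 4)
# trainer.update(unet_number = 2)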
``` You can also train Imagen without text (unconditional image generation) as follows ```python
import torch
from imagen_pytorch import Unet, Imagen, SRUnet256, ImagenTrainer unets for unconditional imagen unet1 = Unet(
dim = 32,
dim_mults = (1, 2, 4),
num_resnet_blocks = 3,
layer_attns = (False, True, True),
layer_cross_attns = False,
use_linear_attn = True
) unet2 = SRUnet256(
dim = 32,
dim_mults = (1, 2, 4),
num_resnet_blocks = (2, 4, 8),
layer_attns = (False, False, True),
layer_cross_attns = False
) imagen, which contains the unets above (base unet and super resoluting ones) imagen = Imagen(
condition_on_text = False, # this must be set to False for unconditional Imagen
unets = (unet1, unet2),
image_sizes = (64, 128),
timesteps = 1000
) trainer = ImagenTrainer(imagen).cuda() now get a ton of images and feed it through the Imagen trainer training_images = torch.randn(4, 3, 256, 256).cuda() train each unet separately in this example, only training on unet number 1 loss = trainer(training_images, unet_number = 1)
trainer.update(unet_number = 1) do the above for many many many many steps now you can sample images unconditionally from the cascading unet(s) images = trainer.sample(batch_size = 16) # (16, 3, 128, 128)
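# (added sketch, not part of the original README example) "do the above for
# many steps" spelled out — the step count is arbitrary:
# for step in range(100_000):
#     loss = trainer(training_images, unet_number = 1)
#     trainer.update(unet_number = 1)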
``` Or train only super-resoluting unets ```python
import torch
from imagen_pytorch import Unet, NullUnet, Imagen unet for imagen unet1 = NullUnet() # add a placeholder "null" unet for the base unet unet2 = Unet(
dim = 32,
cond_dim = 512,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = (2, 4, 8, 8),
layer_attns = (False, False, False, True),
layer_cross_attns = (False, False, False, True)
) imagen, which contains the unets above (base unet and super resoluting ones) imagen = Imagen(
unets = (unet1, unet2),
image_sizes = (64, 256),
timesteps = 250,
cond_drop_prob = 0.1
).cuda() mock images (get a lot of this) and text encodings from large T5 text_embeds = torch.randn(4, 256, 768).cuda()
images = torch.randn(4, 3, 256, 256).cuda() feed images into imagen, training each unet in the cascade loss = imagen(images, text_embeds = text_embeds, unet_number = 2)
loss.backward() do the above for many many many many steps now you can sample an image based on the text embeddings as well as low resolution images lowres_images = torch.randn(3, 3, 64, 64).cuda() # starting un-resoluted images images = imagen.sample(
texts = [
'a whale breaching from afar',
'young girl blowing out candles on her birthday cake',
'fireworks with blue and green sparkles'
],
start_at_unet_number = 2, # start at unet number 2
start_image_or_video = lowres_images, # pass in low resolution images to be resoluted
cond_scale = 3.) images.shape # (3, 3, 256, 256)
``` At any time you can save and load the trainer and all associated states with the save and load methods. It is recommended you use these methods instead of manually saving with a state_dict call, as there is some device memory management done under the hood within the trainer. ex. ```python
trainer.save('./path/to/checkpoint.pt') trainer.load('./path/to/checkpoint.pt') trainer.steps # (2,) step number for each of the unets, in this case 2
``` Dataloader You can also rely on the ImagenTrainer to automatically train off DataLoader instances. You simply have to craft your DataLoader to return either images (for the unconditional case) or tuples of ('images', 'text_embeds') for text-guided generation — a hedged sketch of the text-conditioned case is shown below, followed by the original unconditional training example.
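Everything in the sketch below beyond the ('images', 'text_embeds') return convention stated above is an assumption for illustration — the class name, the folder layout, and the idea that T5 embeddings were precomputed and saved as .pt files next to each image are hypothetical, not part of the library:

```python
import torch
from pathlib import Path
from PIL import Image
from torchvision import transforms as T

class TextConditionedDataset(torch.utils.data.Dataset):  # hypothetical helper
    def __init__(self, folder, image_size):
        self.paths = sorted(Path(folder).glob('*.png'))
        self.transform = T.Compose([
            T.Resize(image_size),
            T.CenterCrop(image_size),
            T.ToTensor()
        ])

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        path = self.paths[idx]
        image = self.transform(Image.open(path).convert('RGB'))
        text_embed = torch.load(path.with_suffix('.pt'))  # assumed precomputed T5 embedding
        return image, text_embed
```

ex. unconditional training ```python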
from imagen_pytorch import Unet, Imagen, ImagenTrainer
from imagen_pytorch.data import Dataset unets for unconditional imagen unet = Unet(
dim = 32,
dim_mults = (1, 2, 4, 8),
num_resnet_blocks = 1,
layer_attns = (False, False, False, True),
layer_cross_attns = False
) imagen, which contains the unet above imagen = Imagen(
condition_on_text = False, # this must be set to False for unconditional Imagen
unets = unet,
image_sizes = 128,
timesteps = 1000
) trainer = ImagenTrainer(
imagen = imagen,
split_valid_from_train = True # whether to split the validation dataset from the training
).cuda() instantiate your dataloader, which returns the necessary inputs to the DDPM as tuple in the order of images, text embeddings, then text masks. in this case, only images is returned as it is unconditional training dataset = Dataset('/path/to/training/images', image_size = 128) trainer.add_train_dataset(dataset, batch_size = 16) working training loop for i in range(200000):
loss = trainer.train_step(unet_number = 1, max_batch_size = 4)
print(f'loss: {loss}') if not (i % 50):
valid_loss = trainer.valid_step(unet_number = 1, max_batch_size = 4)
print(f'valid loss: {valid_loss}')
if not (i % 100) and trainer.is_main: # is_main makes sure this can run in distributed
images = trainer.sample(batch_size = 1, return_pil_images = True) # returns List[Image]
images[0].save(f'./sample-{i // 100}.png') ``` Multi GPU Thanks to 🤗 Accelerate , you can do multi GPU training easily with two steps. First you need to invoke accelerate config in the same directory as your training script (say it is named train.py ) bash
$ accelerate config Next, instead of calling python train.py as you would for single GPU, you would use the accelerate CLI as so bash
$ accelerate launch train.py That's it! Command-line Imagen can also be used via CLI directly. Configuration ex. bash
$ imagen config or bash
$ imagen config --path ./configs/config.json In the config you are able to change settings for the trainer, dataset and the imagen config. The Imagen config parameters can be found here The Elucidated Imagen config parameters can be found here The Imagen Trainer config parameters can be found here For the dataset parameters all dataloader parameters can be used. Training This command allows you to train or resume training your model ex. bash
$ imagen train or bash
$ imagen train --unet 2 --epoches 10 You can pass the following arguments to the training command. --config specify the config file to use for training [default: ./imagen_config.json] --unet the index of the unet to train [default: 1] --epoches how many epochs to train for [default: 50] Sampling Be aware that when sampling, your checkpoint should have trained all unets to get a usable result. ex. ```bash
$ imagen sample --model ./path/to/model/checkpoint.pt "a squirrel raiding the birdfeeder" # image is saved to ./a_squirrel_raiding_the_birdfeeder.png ``` You can pass the following arguments to the sample command. --model specify the model file to use for sampling --cond_scale conditioning scale (classifier free guidance) in decoder --load_ema load EMA version of unets if available In order to use a saved checkpoint with this feature, you must either instantiate your Imagen instance using the config classes, ImagenConfig and ElucidatedImagenConfig , or create a checkpoint via the CLI directly. For proper training, you'll likely want to set up config-driven training anyway. ex. ```python
import torch
from imagen_pytorch import ImagenConfig, ElucidatedImagenConfig, ImagenTrainer in this example, using elucidated imagen imagen = ElucidatedImagenConfig(
unets = [
dict(dim = 32, dim_mults = (1, 2, 4, 8)),
dict(dim = 32, dim_mults = (1, 2, 4, 8))
],
image_sizes = (64, 128),
cond_drop_prob = 0.5,
num_sample_steps = 32
).create() trainer = ImagenTrainer(imagen) do your training ... then save it trainer.save('./checkpoint.pt') you should see a message informing you that ./checkpoint.pt is commandable from the terminal ``` It really should be as simple as that. You can also pass this checkpoint file around, and anyone can continue finetuning on their own data ```python
from imagen_pytorch import load_imagen_from_checkpoint, ImagenTrainer imagen = load_imagen_from_checkpoint('./checkpoint.pt') trainer = ImagenTrainer(imagen) continue training / fine-tuning ``` Inpainting Inpainting follows the formulation laid out by the recent Repaint paper . Simply pass in inpaint_images and inpaint_masks to the sample function on either Imagen or ElucidatedImagen ```python inpaint_images = torch.randn(4, 3, 512, 512).cuda() # (batch, channels, height, width)
inpaint_masks = torch.ones((4, 512, 512)).bool().cuda() # (batch, height, width) inpainted_images = trainer.sample(texts = [
'a whale breaching from afar',
'young girl blowing out candles on her birthday cake',
'fireworks with blue and green sparkles',
'dust motes swirling in the morning sunshine on the windowsill'
], inpaint_images = inpaint_images, inpaint_masks = inpaint_masks, cond_scale = 5.) inpainted_images # (4, 3, 512, 512)
``` For video, similarly pass in your videos to inpaint_videos keyword on .sample . Inpainting mask can either be the same across all frames (batch, height, width) or different (batch, frames, height, width) ```python inpaint_videos = torch.randn(4, 3, 8, 512, 512).cuda() # (batch, channels, frames, height, width)
inpaint_masks = torch.ones((4, 8, 512, 512)).bool().cuda() # (batch, frames, height, width) inpainted_videos = trainer.sample(texts = [
'a whale breaching from afar',
'young girl blowing out candles on her birthday cake',
'fireworks with blue and green sparkles',
'dust motes swirling in the morning sunshine on the windowsill'
], inpaint_videos = inpaint_videos, inpaint_masks = inpaint_masks, cond_scale = 5.) inpainted_videos # (4, 3, 8, 512, 512)
``` Experimental Tero Karras of StyleGAN fame has written a new paper with results that have been corroborated by a number of independent researchers as well as on my own machine. I have decided to create a version of Imagen , the ElucidatedImagen , so that one can use the new elucidated DDPM for text-guided cascading generation. Simply import ElucidatedImagen , and then instantiate the instance as you did before. The hyperparameters are different than the usual ones for discrete and continuous time gaussian diffusion, and can be individualized for each unet in the cascade. Ex. ```python
from imagen_pytorch import ElucidatedImagen instantiate your unets ... imagen = ElucidatedImagen(
unets = (unet1, unet2),
image_sizes = (64, 128),
cond_drop_prob = 0.1,
num_sample_steps = (64, 32), # number of sample steps - 64 for base unet, 32 for upsampler (just an example, have no clue what the optimal values are)
sigma_min = 0.002, # min noise level
sigma_max = (80, 160), # max noise level, @crowsonkb recommends double the max noise level for upsampler
sigma_data = 0.5, # standard deviation of data distribution
rho = 7, # controls the sampling schedule
P_mean = -1.2, # mean of log-normal distribution from which noise is drawn for training
P_std = 1.2, # standard deviation of log-normal distribution from which noise is drawn for training
S_churn = 80, # parameters for stochastic sampling - depends on dataset, Table 5 in paper
S_tmin = 0.05,
S_tmax = 50,
S_noise = 1.003,
).cuda() rest is the same as above ``` Text to Video This repository will also start accumulating new research around text guided video synthesis. For starters it will adopt the 3d unet architecture described by Jonathan Ho in Video Diffusion Models Update: verified working by Hadrien Reynaud ! Ex. ```python
import torch
from imagen_pytorch import Unet3D, ElucidatedImagen, ImagenTrainer unet1 = Unet3D(dim = 64, dim_mults = (1, 2, 4, 8)).cuda() unet2 = Unet3D(dim = 64, dim_mults = (1, 2, 4, 8)).cuda() elucidated imagen, which contains the unets above (base unet and super resoluting ones) imagen = ElucidatedImagen(
unets = (unet1, unet2),
image_sizes = (16, 32),
random_crop_sizes = (None, 16),
temporal_downsample_factor = (2, 1), # in this example, the first unet would receive the video temporally downsampled by 2x
num_sample_steps = 10,
cond_drop_prob = 0.1,
sigma_min = 0.002, # min noise level
sigma_max = (80, 160), # max noise level, double the max noise level for upsampler
sigma_data = 0.5, # standard deviation of data distribution
rho = 7, # controls the sampling schedule
P_mean = -1.2, # mean of log-normal distribution from which noise is drawn for training
P_std = 1.2, # standard deviation of log-normal distribution from which noise is drawn for training
S_churn = 80, # parameters for stochastic sampling - depends on dataset, Table 5 in paper
S_tmin = 0.05,
S_tmax = 50,
S_noise = 1.003,
).cuda() mock videos (get a lot of this) and text encodings from large T5 texts = [
'a whale breaching from afar',
'young girl blowing out candles on her birthday cake',
'fireworks with blue and green sparkles',
'dust motes swirling in the morning sunshine on the windowsill'
] videos = torch.randn(4, 3, 10, 32, 32).cuda() # (batch, channels, time / video frames, height, width) feed images into imagen, training each unet in the cascade for this example, only training unet 1 trainer = ImagenTrainer(imagen) you can also ignore time when training on video initially, shown to improve results in video-ddpm paper. eventually will make the 3d unet trainable with either images or video. research shows it is essential (with current data regimes) to train first on text-to-image. probably won't be true in another decade. all big data becomes small data trainer(videos, texts = texts, unet_number = 1, ignore_time = False)
trainer.update(unet_number = 1) videos = trainer.sample(texts = texts, video_frames = 20) # extrapolating to 20 frames from training on 10 frames videos.shape # (4, 3, 20, 32, 32) ``` You can also train on text-image pairs first. The Unet3D will automatically convert them to single-framed videos and learn without the temporal components (by automatically setting ignore_time = True ), whether it be 1d convolutions or causal attention across time. This is the current approach taken by all the big artificial intelligence labs (Brain, MetaAI, Bytedance) FAQ Why are my generated images not aligning well with the text? Imagen uses an algorithm called Classifier Free Guidance . When sampling, you apply a scale to the conditioning (text in this case) of greater than 1.0 . Researcher Netruk44 has reported 5-10 to be optimal, but anything greater than 10 seems to break. python
trainer.sample(texts = [
'a cloud in the shape of a roman gladiator'
], cond_scale = 5.) # <-- cond_scale is the conditioning scale, needs to be greater than 1.0 to be better than average Are there any pretrained models yet? Not at the moment but one will likely be trained and open sourced within the year, if not sooner. If you would like to participate, you can join the community of artificial neural network trainers at Laion (discord link is in the Readme above) and start collaborating. Will this technology take my job? More the reason why you should start training your own model, starting today! The last thing we need is this technology being in the hands of an elite few. Hopefully this repository reduces the work to just finding the necessary compute, and augmenting with your own curated dataset. What am I allowed to do with this repository? Anything! It is MIT licensed. In other words, you can freely copy / paste for your own research, remixed for whatever modality you can think of. Go train amazing models for profit, for science, or simply to satiate your own personal pleasure at witnessing something divine unravel in front of you. Cool Applications! Echocardiogram synthesis [Code] SOTA Hi-C contact matrix synthesis [Code] Floor plan generation Ultra High Resolution Histopathology Slides Synthetic Laparoscopic Images Designing MetaMaterials Related Works Audio diffusion from Flavio Schneider Mini Imagen from Ryan O. | AssemblyAI writeup Todo [x] use huggingface transformers for T5-small text embeddings [x] add dynamic thresholding [x] add dynamic thresholding DALLE2 and video-diffusion repository as well [x] allow for one to set T5-large (and perhaps small factory method to take in any huggingface transformer) [x] add the lowres noise level with the pseudocode in appendix, and figure out what is this sweep they do at inference time [x] port over some training code from DALLE2 [x] need to be able to use a different noise schedule per unet (cosine was used for base, but linear for SR) [x] just make one master-configurable unet [x] complete resnet block (biggan inspired? but with groupnorm) - complete self attention [x] complete conditioning embedding block (and make it completely configurable, whether it be attention, film etc) [x] consider using perceiver-resampler from https://github.com/lucidrains/flamingo-pytorch in place of attention pooling [x] add attention pooling option, in addition to cross attention and film [x] add optional cosine decay schedule with warmup, for each unet, to trainer [x] switch to continuous timesteps instead of discretized, as it seems that is what they used for all stages - first figure out the linear noise schedule case from the variational ddpm paper https://openreview.net/forum?id=2LdBqxc1Yv [x] figure out log(snr) for alpha cosine noise schedule. 
[x] suppress the transformers warning because only T5encoder is used [x] allow setting for using linear attention on layers where full attention cannot be used [x] force unets in continuous time case to use non-fouriered conditions (just pass the log(snr) through an MLP with optional layernorms), as that is what i have working locally [x] removed learned variance [x] add p2 loss weighting for continuous time [x] make sure cascading ddpm can be trained without text condition, and make sure both continuous and discrete time gaussian diffusion works [x] use primer's depthwise convs on the qkv projections in linear attention (or use token shifting before projections) - also use new dropout proposed by bayesformer, as it seems to work well with linear attention [x] explore skip layer excitation in unet decoder [x] accelerate integration [x] build out CLI tool and one-line generation of image [x] knock out any issues that arised from accelerate [x] add inpainting ability using resampler from repaint paper https://arxiv.org/abs/2201.09865 [x] build a simple checkpointing system, backed by a folder [x] add skip connection from outputs of all upsample blocks, used in unet squared paper and some previous unet works [x] add fsspec, recommended by Romain @rom1504, for cloud / local file system agnostic persistence of checkpoints [x] test out persistence in gcs with https://github.com/fsspec/gcsfs [x] extend to video generation, using axial time attention as in Ho's video ddpm paper [x] allow elucidated imagen to generalize to any shape [x] allow for imagen to generalize to any shape [x] add dynamic positional bias for the best type of length extrapolation across video time [x] move video frames to sample function, as we will be attempting time extrapolation [x] attention bias to null key / values should be a learned scalar of head dimension [x] add self-conditioning from bit diffusion paper, already coded up at ddpm-pytorch [x] add v-parameterization (https://arxiv.org/abs/2202.00512) from imagen video paper, the only thing new [x] incorporate all learnings from make-a-video (https://makeavideo.studio/) [x] build out CLI tool for training, resuming training off config file [x] allow for temporal interpolation at specific stages [x] make sure temporal interpolation works with inpainting [x] make sure one can customize all interpolation modes (some researchers are finding better results with trilinear) [x] imagen-video : allow for conditioning on preceding (and possibly future) frames of videos. 
ignore time should not be allowed in that scenario [x] make sure to automatically take care of temporal down/upsampling for conditioning video frames, but allow for an option to turn it off [x] make sure inpainting works with video [x] make sure inpainting mask for video can accept be customized per frame [ ] add flash attention [ ] reread cogvideo and figure out how frame rate conditioning could be used [ ] bring in attention expertise for self attention layers in unet3d [ ] consider bringing in NUWA's 3d convolutional attention [ ] consider transformer-xl memories in the temporal attention blocks [ ] consider perceiver-ar approach to attending to past time [ ] frame dropouts during attention for achieving both regularizing effect as well as shortened training time [ ] investigate frank wood's claims https://github.com/lucidrains/flexible-diffusion-modeling-videos-pytorch and either add the hierarchical sampling technique, or let people know about its deficiencies [ ] offer challenging moving mnist (with distractor objects) as a one-line trainable baseline for researchers to branch off of for text to video [ ] preencoding of text to memmapped embeddings [ ] be able to create dataloader iterators based on the old epoch style, also configure shuffling etc [ ] be able to also pass in arguments (instead of requiring forward to be all keyword args on model) [ ] bring in reversible blocks from revnets for 3d unet, to lessen memory burden [ ] add ability to only train super-resolution network [ ] read dpm-solver see if it is applicable to continuous time gaussian diffusion [ ] allow for conditioning video frames with arbitrary absolute times (calculate RPE during temporal attention) [ ] accommodate dream booth fine tuning [ ] add textual inversion [ ] cleanup self conditioning to be extracted at imagen instantiation [ ] make sure eventual dreambooth works with imagen-video [ ] add framerate conditioning for video diffusion [ ] make sure one can simulataneously condition on video frames as a prompt, as well as some conditioning image across all frames [ ] test and add distillation technique from consistency models Citations bibtex
@inproceedings{Saharia2022PhotorealisticTD,
title = {Photorealistic Text-to-Image Diffusion Models with Deep Language Understanding},
author = {Chitwan Saharia and William Chan and Saurabh Saxena and Lala Li and Jay Whang and Emily L. Denton and Seyed Kamyar Seyed Ghasemipour and Burcu Karagol Ayan and Seyedeh Sara Mahdavi and Raphael Gontijo Lopes and Tim Salimans and Jonathan Ho and David Fleet and Mohammad Norouzi},
year = {2022}
} bibtex
@article{Alayrac2022Flamingo,
title = {Flamingo: a Visual Language Model for Few-Shot Learning},
author = {Jean-Baptiste Alayrac et al},
year = {2022}
} bibtex
@inproceedings{Sankararaman2022BayesFormerTW,
title = {BayesFormer: Transformer with Uncertainty Estimation},
author = {Karthik Abinav Sankararaman and Sinong Wang and Han Fang},
year = {2022}
} bibtex
@article{So2021PrimerSF,
title = {Primer: Searching for Efficient Transformers for Language Modeling},
author = {David R. So and Wojciech Ma'nke and Hanxiao Liu and Zihang Dai and Noam M. Shazeer and Quoc V. Le},
journal = {ArXiv},
year = {2021},
volume = {abs/2109.08668}
} bibtex
@misc{cao2020global,
title = {Global Context Networks},
author = {Yue Cao and Jiarui Xu and Stephen Lin and Fangyun Wei and Han Hu},
year = {2020},
eprint = {2012.13375},
archivePrefix = {arXiv},
primaryClass = {cs.CV}
} bibtex
@article{Karras2022ElucidatingTD,
title = {Elucidating the Design Space of Diffusion-Based Generative Models},
author = {Tero Karras and Miika Aittala and Timo Aila and Samuli Laine},
journal = {ArXiv},
year = {2022},
volume = {abs/2206.00364}
} bibtex
@inproceedings{NEURIPS2020_4c5bcfec,
author = {Ho, Jonathan and Jain, Ajay and Abbeel, Pieter},
booktitle = {Advances in Neural Information Processing Systems},
editor = {H. Larochelle and M. Ranzato and R. Hadsell and M.F. Balcan and H. Lin},
pages = {6840--6851},
publisher = {Curran Associates, Inc.},
title = {Denoising Diffusion Probabilistic Models},
url = {https://proceedings.neurips.cc/paper/2020/file/4c5bcfec8584af0d967f1ab10179ca4b-Paper.pdf},
volume = {33},
year = {2020}
} bibtex
@article{Lugmayr2022RePaintIU,
title = {RePaint: Inpainting using Denoising Diffusion Probabilistic Models},
author = {Andreas Lugmayr and Martin Danelljan and Andr{\'e}s Romero and Fisher Yu and Radu Timofte and Luc Van Gool},
journal = {ArXiv},
year = {2022},
volume = {abs/2201.09865}
} bibtex
@misc{ho2022video,
title = {Video Diffusion Models},
author = {Jonathan Ho and Tim Salimans and Alexey Gritsenko and William Chan and Mohammad Norouzi and David J. Fleet},
year = {2022},
eprint = {2204.03458},
archivePrefix = {arXiv},
primaryClass = {cs.CV}
} bibtex
@inproceedings{rogozhnikov2022einops,
title = {Einops: Clear and Reliable Tensor Manipulations with Einstein-like Notation},
author = {Alex Rogozhnikov},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://openreview.net/forum?id=oapKSVM2bcj}
} bibtex
@misc{chen2022analog,
title = {Analog Bits: Generating Discrete Data using Diffusion Models with Self-Conditioning},
author = {Ting Chen and Ruixiang Zhang and Geoffrey Hinton},
year = {2022},
eprint = {2208.04202},
archivePrefix = {arXiv},
primaryClass = {cs.CV}
} bibtex
@misc{Singer2022,
author = {Uriel Singer},
url = {https://makeavideo.studio/Make-A-Video.pdf}
} bibtex
@article{Sunkara2022NoMS,
title = {No More Strided Convolutions or Pooling: A New CNN Building Block for Low-Resolution Images and Small Objects},
author = {Raja Sunkara and Tie Luo},
journal = {ArXiv},
year = {2022},
volume = {abs/2208.03641}
} bibtex
@article{Salimans2022ProgressiveDF,
title = {Progressive Distillation for Fast Sampling of Diffusion Models},
author = {Tim Salimans and Jonathan Ho},
journal = {ArXiv},
year = {2022},
volume = {abs/2202.00512}
} bibtex
@article{Ho2022ImagenVH,
title = {Imagen Video: High Definition Video Generation with Diffusion Models},
author = {Jonathan Ho and William Chan and Chitwan Saharia and Jay Whang and Ruiqi Gao and Alexey A. Gritsenko and Diederik P. Kingma and Ben Poole and Mohammad Norouzi and David J. Fleet and Tim Salimans},
journal = {ArXiv},
year = {2022},
volume = {abs/2210.02303}
} bibtex
@misc{gilmer2023intriguing,
title = {Intriguing Properties of Transformer Training Instabilities},
author = {Justin Gilmer and Andrea Schioppa and Jeremy Cohen},
year = {2023},
status = {to be published - one attention stabilization technique is circulating within Google Brain, being used by multiple teams}
} bibtex
@inproceedings{Hang2023EfficientDT,
title = {Efficient Diffusion Training via Min-SNR Weighting Strategy},
author = {Tiankai Hang and Shuyang Gu and Chen Li and Jianmin Bao and Dong Chen and Han Hu and Xin Geng and Baining Guo},
year = {2023}
} bibtex
@article{Zhang2021TokenST,
title = {Token Shift Transformer for Video Classification},
author = {Hao Zhang and Y. Hao and Chong-Wah Ngo},
journal = {Proceedings of the 29th ACM International Conference on Multimedia},
year = {2021}
} bibtex
@inproceedings{anonymous2022normformer,
title = {NormFormer: Improved Transformer Pretraining with Extra Normalization},
author = {Anonymous},
booktitle = {Submitted to The Tenth International Conference on Learning Representations },
year = {2022},
url = {https://openreview.net/forum?id=GMYWzWztDx5},
note = {under review}
} | Implementation of Imagen, Google's Text-to-Image Neural Network, in Pytorch | artificial-intelligence,deep-learning,text-to-image,imagination-machine,text-to-video | 350 | 21 | 50 | 522 | 100 | 1 | 1 |
tracel-ai/burn | [![Discord](https://img.shields.io/discord/1038839012602941528.svg?color=7289da&&logo=discord)](https://discord.gg/uPEBbYYDB6)
[![Current Crates.io Version](https://img.shields.io/crates/v/burn.svg)](https://crates.io/crates/burn)
[![Documentation](https://img.shields.io/badge/docs-latest-blue)](https://burn.dev/docs/burn)
[![Test Status](https://github.com/tracel-ai/burn/actions/workflows/test.yml/badge.svg)](https://github.com/tracel-ai/burn/actions/workflows/test.yml)
[![CodeCov](https://codecov.io/gh/tracel-ai/burn/branch/main/graph/badge.svg)](https://codecov.io/gh/tracel-ai/burn)
[![Blaze](https://runblaze.dev/gh/114041730602611213183421653564341667516/badge.svg)](https://runblaze.dev)
[![Rust Version](https://img.shields.io/badge/Rust-1.75.0+-blue)](https://releases.rs/docs/1.75.0)
![license](https://shields.io/badge/license-MIT%2FApache--2.0-blue)
---
**Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme
flexibility, compute efficiency and portability as its primary goals.** ## Performance Because we believe the goal of a deep learning framework is to convert computation into useful
intelligence, we have made performance a core pillar of Burn. We strive to achieve top efficiency by
leveraging multiple optimization techniques described below.
**Click on each section for more details** 👇 Automatic kernel fusion 💥 Using Burn means having your models optimized on any backend. When possible, we provide a way to
automatically and dynamically create custom kernels that minimize data relocation between different
memory spaces, extremely useful when moving memory is the bottleneck.
As an example, you could write your own GELU activation function with the high level tensor api (see
Rust code snippet below).
```rust
fn gelu_custom<B: Backend, const D: usize>(x: Tensor<B, D>) -> Tensor<B, D> {
let x = x.clone() * ((x / SQRT_2).erf() + 1);
x / 2
}
```
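For contrast — this snippet is an illustrative addition, not from the Burn docs — the same computation written in eager PyTorch typically dispatches one kernel per operation, which is exactly the round-trip to memory that automatic fusion eliminates:

```python
import math
import torch

SQRT_2 = math.sqrt(2.0)

def gelu_custom(x: torch.Tensor) -> torch.Tensor:
    # each op below (div, erf, add, mul, div) generally launches its own
    # kernel in eager mode, reading and writing the full tensor every time
    x = x * ((x / SQRT_2).erf() + 1)
    return x / 2
```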
Then, at runtime, a custom low-level kernel will be automatically created for your specific
implementation and will rival a handcrafted GPU implementation. The kernel consists of about 60
lines of WGSL ([WebGPU Shading Language](https://www.w3.org/TR/WGSL/)),
an extremely verbose lower level shader language you probably don't want to program your deep
learning models in!
> As of now, our fusion strategy is only implemented for our own WGPU backend and supports only a
> subset of operations. We plan to add more operations very soon and extend this technique to other
> future in-house backends. Asynchronous execution ❤️🔥 For [backends developed from scratch by the Burn team](#backends), an asynchronous execution style
is used, which allows to perform various optimizations, such as the previously mentioned automatic
kernel fusion.
Asynchronous execution also ensures that the normal execution of the framework does not block the
model computations, which implies that the framework overhead won't impact the speed of execution
significantly. Conversely, the intense computations in the model do not interfere with the
responsiveness of the framework. For more information about our asynchronous backends, see
[this blog post](https://burn.dev/blog/creating-high-performance-asynchronous-backends-with-burn-compute). Thread-safe building blocks 🦞 Burn emphasizes thread safety by leveraging the
[ownership system of Rust](https://doc.rust-lang.org/book/ch04-00-understanding-ownership.html).
With Burn, each module is the owner of its weights. It is therefore possible to send a module to
another thread for computing the gradients, then send the gradients to the main thread that can
aggregate them, and _voilà_, you get multi-device training.
This is a very different approach from what PyTorch does, where backpropagation actually mutates the
_grad_ attribute of each tensor parameter. This is not a thread-safe operation and therefore
requires lower level synchronization primitives, see
[distributed training](https://pytorch.org/docs/stable/distributed.html) for reference. Note that
this is still very fast, but not compatible across different backends and quite hard to implement. Intelligent memory management 🦀 One of the main roles of a deep learning framework is to reduce the amount of memory necessary to
run models. The naive way of handling memory is that each tensor has its own memory space, which is
allocated when the tensor is created then deallocated as the tensor gets out of scope. However,
allocating and deallocating data is very costly, so a memory pool is often required to achieve good
throughput. Burn offers an infrastructure that allows for easily creating and selecting memory
management strategies for backends. For more details on memory management in Burn, see
[this blog post](https://burn.dev/blog/creating-high-performance-asynchronous-backends-with-burn-compute).
Another very important memory optimization of Burn is that we keep track of when a tensor can be
mutated in-place just by using the ownership system well. Even though it is a rather small memory
optimization on its own, it adds up considerably when training or running inference with larger
models and contributes to reduce the memory usage even more. For more information, see
[this blog post about tensor handling](https://burn.dev/blog/burn-rusty-approach-to-tensor-handling). Automatic kernel selection 🎯 A good deep learning framework should ensure that models run smoothly on all hardware. However, not
all hardware share the same behavior in terms of execution speed. For instance, a matrix
multiplication kernel can be launched with many different parameters, which are highly sensitive to
the size of the matrices and the hardware. Using the wrong configuration could reduce the speed of
execution by a large factor (10 times or even more in extreme cases), so choosing the right kernels
becomes a priority.
With our home-made backends, we run benchmarks automatically and choose the best configuration for
the current hardware and matrix sizes with a reasonable caching strategy.
This adds a small overhead by increasing the warmup execution time, but stabilizes quickly after a
few forward and backward passes, saving lots of time in the long run. Note that this feature isn't
mandatory, and can be disabled when cold starts are a priority over optimized throughput. Hardware specific features 🔥 It is no secret that deep learning mostly relies on matrix multiplication as its core operation,
since this is how fully-connected neural networks are modeled.
More and more, hardware manufacturers optimize their chips specifically for matrix multiplication
workloads. For instance, Nvidia has its _Tensor Cores_ and today most cellphones have AI specialized
chips. As of this moment, we support Tensor Cores with our LibTorch and Candle backends, but not
other accelerators yet. We hope [this issue](https://github.com/gpuweb/gpuweb/issues/4195) gets
resolved at some point to bring support to our WGPU backend. Custom Backend Extension 🎒 Burn aims to be the most flexible deep learning framework. While it's crucial to maintain
compatibility with a wide variety of backends, Burn also provides the ability to extend the
functionalities of a backend implementation to suit your personal modeling requirements.
This versatility is advantageous in numerous ways, such as supporting custom operations like flash
attention or manually writing your own kernel for a specific backend to enhance performance. See
[this section](https://burn.dev/book/advanced/backend-extension/index.html) in the Burn Book 🔥 for
more details. ## Training & Inference The whole deep learning workflow is made easy with Burn, as you can monitor your training progress
with an ergonomic dashboard, and run inference everywhere from embedded devices to large GPU
clusters.
Burn was built from the ground up with training and inference in mind. It's also worth noting how
Burn, in comparison to frameworks like PyTorch, simplifies the transition from training to
deployment, eliminating the need for code changes. **Click on the following sections to expand 👇** Training Dashboard 📈 As you can see in the previous video (click on the picture!), a new terminal UI dashboard based on
the [Ratatui](https://github.com/ratatui-org/ratatui) crate allows users to follow their training
with ease without having to connect to any external application.
You can visualize your training and validation metrics updating in real-time and analyze the
lifelong progression or recent history of any registered metrics using only the arrow keys. Break
from the training loop without crashing, allowing potential checkpoints to be fully written or
important pieces of code to complete without interruption 🛡 ONNX Support 🐫 ONNX (Open Neural Network Exchange) is an open-standard format that exports both the architecture
and the weights of a deep learning model.
Burn supports the importation of models that follow the ONNX standard so you can easily port a model
you have written in another framework like TensorFlow or PyTorch to Burn to benefit from all the
advantages our framework offers.
Our ONNX support is further described in
[this section of the Burn Book 🔥](https://burn.dev/book/import/onnx-model.html).
> **Note**: This crate is in active development and currently supports a
> [limited set of ONNX operators](./crates/burn-import/SUPPORTED-ONNX-OPS.md). Importing PyTorch Models 🚚 Support for loading of PyTorch model weights into Burn’s native model architecture, ensuring
seamless integration. See
[Burn Book 🔥 section on importing PyTorch](https://burn.dev/book/import/pytorch-model.html) Inference in the Browser 🌐 Several of our backends can compile to Web Assembly: Candle and NdArray for CPU, and WGPU for GPU.
This means that you can run inference directly within a browser. We provide several examples of
this:
- [MNIST](./examples/mnist-inference-web) where you can draw digits and a small convnet tries to
find which one it is! 2️⃣ 7️⃣ 😰
- [Image Classification](./examples/image-classification-web) where you can upload images and
classify them! 🌄 Embedded: no_std support ⚙️ Burn's core components support [no_std](https://docs.rust-embedded.org/book/intro/no-std.html). This
means it can run in bare metal environments such as embedded devices without an operating system.
> As of now, only the NdArray backend can be used in a _no_std_ environment. ## Backends Burn strives to be as fast as possible on as much hardware as possible, with robust implementations.
We believe this flexibility is crucial for modern needs where you may train your models in the cloud, then deploy on customer hardware, which varies from user to user. Compared to other frameworks, Burn has a very different approach to supporting many backends. By
design, most code is generic over the Backend trait, which allows us to build Burn with swappable
backends. This makes composing backend possible, augmenting them with additional functionalities
such as autodifferentiation and automatic kernel fusion.
**We already have many backends implemented, all listed below 👇** WGPU (WebGPU): Cross-Platform GPU Backend 🌐 **The go-to backend for running on any GPU.**
Based on the most popular and well-supported Rust graphics library, [WGPU](https://wgpu.rs), this
backend automatically targets Vulkan, OpenGL, Metal, Direct X11/12, and WebGPU, by using the WebGPU
shading language [WGSL](https://www.w3.org/TR/WGSL/). It can also be
compiled to Web Assembly to run in the browser while leveraging the GPU, see
[this demo](https://antimora.github.io/image-classification/). For more information on the benefits
of this backend, see [this blog](https://burn.dev/blog/cross-platform-gpu-backend).
The WGPU backend is our first "in-house backend", which means we have complete control over its
implementation details. It is fully optimized with the
[performance characteristics mentioned earlier](#performance), as it serves as our research
playground for a variety of optimizations.
See the [WGPU Backend README](./crates/burn-wgpu/README.md) for more details. Candle: Backend using the Candle bindings 🕯 Based on [Candle by Hugging Face](https://github.com/huggingface/candle), a minimalist ML framework
for Rust with a focus on performance and ease of use, this backend can run on CPU with support for
Web Assembly or on Nvidia GPUs using CUDA.
See the [Candle Backend README](./crates/burn-candle/README.md) for more details.
> _Disclaimer:_ This backend is not fully completed yet, but can work in some contexts like
> inference. LibTorch: Backend using the LibTorch bindings 🎆 PyTorch doesn't need an introduction in the realm of deep learning. This backend leverages
[PyTorch Rust bindings](https://github.com/LaurentMazare/tch-rs), enabling you to use LibTorch C++
kernels on CPU, CUDA and Metal.
See the [LibTorch Backend README](./crates/burn-tch/README.md) for more details. NdArray: Backend using the NdArray primitive as data structure 🦐 This CPU backend is admittedly not our fastest backend, but offers extreme portability.
It is our only backend supporting _no_std_.
See the [NdArray Backend README](./crates/burn-ndarray/README.md) for more details. Autodiff: Backend decorator that brings backpropagation to any backend 🔄 Contrary to the aforementioned backends, Autodiff is actually a backend _decorator_. This means that
it cannot exist by itself; it must encapsulate another backend.
The simple act of wrapping a base backend with Autodiff transparently equips it with
autodifferentiation support, making it possible to call backward on your model.
```rust
use burn::backend::{Autodiff, Wgpu};
use burn::tensor::{Distribution, Tensor};
fn main() {
type Backend = Autodiff<Wgpu>;
let x: Tensor<Backend, 2> = Tensor::random([32, 32], Distribution::Default);
let y: Tensor<Backend, 2> = Tensor::random([32, 32], Distribution::Default).require_grad();
let tmp = x.clone() + y.clone();
let tmp = tmp.matmul(x);
let tmp = tmp.exp();
let grads = tmp.backward();
let y_grad = y.grad(&grads).unwrap();
println!("{y_grad}");
}
```
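As a point of comparison (an illustrative addition, not from the Burn docs), the PyTorch equivalent of the snippet above accumulates gradients by mutating each tensor's grad attribute in place — the very pattern the thread-safety section earlier contrasts Burn against:

```python
import torch

x = torch.randn(32, 32)
y = torch.randn(32, 32, requires_grad=True)

tmp = (x + y).matmul(x).exp()
tmp.sum().backward()  # backward needs a scalar; this mutates y.grad in place
print(y.grad)         # the gradient lives as shared state on the tensor itself
```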
Of note, it is impossible to make the mistake of calling backward on a model that runs on a backend
that does not support autodiff (for inference), as this method is only offered by an Autodiff
backend.
See the [Autodiff Backend README](./crates/burn-autodiff/README.md) for more details. Fusion: Backend decorator that brings kernel fusion to backends that support it 💥 This backend decorator enhances a backend with kernel fusion, provided that the inner backend
supports it. Note that you can compose this backend with other backend decorators such as Autodiff.
For now, only the WGPU backend has support for fused kernels.
```rust
use burn::backend::{Autodiff, Fusion, Wgpu};
use burn::tensor::{Distribution, Tensor};
fn main() {
type Backend = Autodiff<Fusion<Wgpu>>;
let x: Tensor<Backend, 2> = Tensor::random([32, 32], Distribution::Default);
let y: Tensor<Backend, 2> = Tensor::random([32, 32], Distribution::Default).require_grad();
let tmp = x.clone() + y.clone();
let tmp = tmp.matmul(x);
let tmp = tmp.exp();
let grads = tmp.backward();
let y_grad = y.grad(&grads).unwrap();
println!("{y_grad}");
}
```
Of note, we plan to implement automatic gradient checkpointing based on compute bound and memory
bound operations, which will work gracefully with the fusion backend to make your code run even
faster during training, see [this issue](https://github.com/tracel-ai/burn/issues/936).
See the [Fusion Backend README](./crates/burn-fusion/README.md) for more details. ## Getting Started Just heard of Burn? You are at the right place! Just continue reading this section and we hope you
can get on board really quickly. The Burn Book 🔥 To begin working effectively with Burn, it is crucial to understand its key components and
philosophy. This is why we highly recommend new users to read the first sections of
[The Burn Book 🔥](https://burn.dev/book/). It provides detailed examples and explanations covering
every facet of the framework, including building blocks like tensors, modules, and optimizers, all
the way to advanced usage, like coding your own GPU kernels.
> The project is constantly evolving, and we try as much as possible to keep the book up to date
> with new additions. However, we might miss some details sometimes, so if you see something weird,
> let us know! We also gladly accept Pull Requests 😄 Examples 🙏 Let's start with a code snippet that shows how intuitive the framework is to use! In the following,
we declare a neural network module with some parameters along with its forward pass.
```rust
use burn::nn;
use burn::module::Module;
use burn::tensor::backend::Backend;
use burn::tensor::Tensor;
#[derive(Module, Debug)]
pub struct PositionWiseFeedForward<B: Backend> {
linear_inner: nn::Linear<B>,
linear_outer: nn::Linear<B>,
dropout: nn::Dropout,
gelu: nn::Gelu,
}
impl<B: Backend> PositionWiseFeedForward<B> {
pub fn forward<const D: usize>(&self, input: Tensor<B, D>) -> Tensor<B, D> {
let x = self.linear_inner.forward(input);
let x = self.gelu.forward(x);
let x = self.dropout.forward(x);
self.linear_outer.forward(x)
}
}
```
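For readers coming from Python, a rough PyTorch equivalent of the module above may help — an illustrative addition, not from the Burn docs, with the constructor arguments being assumptions since the Rust snippet doesn't show its config:

```python
import torch.nn as nn

class PositionWiseFeedForward(nn.Module):
    def __init__(self, d_model: int, d_ff: int, dropout: float = 0.1):
        super().__init__()
        self.linear_inner = nn.Linear(d_model, d_ff)
        self.linear_outer = nn.Linear(d_ff, d_model)
        self.dropout = nn.Dropout(dropout)
        self.gelu = nn.GELU()

    def forward(self, x):
        x = self.linear_inner(x)
        x = self.gelu(x)
        x = self.dropout(x)
        return self.linear_outer(x)
```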
We have a somewhat large number of [examples](./examples) in the repository that show how to use
the framework in different scenarios. For more practical insights, you can clone the repository and
run any of them directly on your computer! Pre-trained Models 🤖 We keep an updated and curated list of models and examples built with Burn, see the
[tracel-ai/models repository](https://github.com/tracel-ai/models) for more details.
Don't see the model you want? Don't hesitate to open an issue, and we may prioritize it. Built a
model using Burn and want to share it? You can also open a Pull Request and add your model under the
community section! Why use Rust for Deep Learning? 🦀 Deep Learning is a special form of software where you need very high level abstractions as well as
extremely fast execution time. Rust is the perfect candidate for that use case since it provides
zero-cost abstractions to easily create neural network modules, and fine-grained control over memory
to optimize every detail.
It's important that a framework be easy to use at a high level so that its users can focus on
innovating in the AI field. However, since running models relies so heavily on computations,
performance can't be neglected.
To this day, the mainstream solution to this problem has been to offer APIs in Python, but rely on
bindings to low-level languages such as C/C++. This reduces portability, increases complexity and
creates frictions between researchers and engineers. We feel like Rust's approach to abstractions
makes it versatile enough to tackle this two languages dichotomy.
Rust also comes with the Cargo package manager, which makes it incredibly easy to build, test, and
deploy from any environment, which is usually a pain in Python.
Although Rust has the reputation of being a difficult language at first, we strongly believe it
leads to more reliable, bug-free solutions built faster (after some practice 😅)! ## Community If you are excited about the project, don't hesitate to join our
[Discord](https://discord.gg/uPEBbYYDB6)! We try to be as welcoming as possible to everybody from
any background. You can ask your questions and share what you built with the community! **Contributing**
Before contributing, please take a moment to review our
[code of conduct](https://github.com/tracel-ai/burn/tree/main/CODE-OF-CONDUCT.md). It's also highly
recommended to read the
[architecture overview](https://github.com/tracel-ai/burn/tree/main/contributor-book/src/project-architecture), which explains
some of our architectural decisions. Refer to our [contributing guide](/CONTRIBUTING.md) for more
details.
## Status
Burn is currently in active development, and there will be breaking changes. While any resulting
issues are likely to be easy to fix, there are no guarantees at this stage.
## License
Burn is distributed under the terms of both the MIT license and the Apache License (Version 2.0).
See [LICENSE-APACHE](./LICENSE-APACHE) and [LICENSE-MIT](./LICENSE-MIT) for details. Opening a pull
request is assumed to signal agreement with these licensing terms. | Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals. | autodiff,deep-learning,machine-learning,rust,scientific-computing,ndarray,tensor,neural-network,pytorch,autotune | 15 | 115 | 1,321 | 1,206 | 190 | 48 | 10 |
PlayCover/PlayCover | PlayCover Run iOS apps and games on Apple Silicon Macs with mouse, keyboard and controller support. Documentation · Discord · Website About The Project Welcome to PlayCover! This software is all about allowing you to run iOS apps and games on Apple Silicon devices running macOS 12.0 or newer. PlayCover works by putting applications through a wrapper which imitates an iPad. This allows the apps to run natively and perform very well. PlayCover also allows you to map custom touch controls to keyboard, which is not possible in alternative sideloading methods such as Sideloadly. These controls include all the essentials, from WASD, camera movement, left and right clicks, and individual keymapping, similar to a popular Android emulator’s keymapping system called Bluestacks. This software was originally designed to run Genshin Impact on your Apple Silicon device, but it can now run a wide range of applications. Unfortunately, not all games are supported, and some may have bugs. Localisations handled in Weblate . ⬆️ Back to top️ Getting Started Follow the instructions below to get Genshin Impact, and many other games, up and running in no time. Prerequisites At the moment, PlayCover can only run on Apple Silicon Macs. Devices with the following chips are supported: M1 M1 Pro M1 Max M1 Ultra M2 M2 Pro M2 Max M2 Ultra M3 M3 Pro M3 Max If you have an Intel Mac, you can explore alternatives like Bootcamp or emulators. Download You can download stable releases here , or build from source by following the instructions in the Documentation. Documentation To learn how to setup and use PlayCover, visit the documentation here . Homebrew Cask We host a Homebrew tap with the PlayCover cask . To install from it run: sh
brew install --cask PlayCover/playcover/playcover-community To uninstall:
1. Remove PlayCover using brew uninstall --cask playcover-community ;
2. Untap PlayCover/playcover with brew untap PlayCover/playcover . ⬆️ Back to top️ License Distributed under the GPLv3 License. See LICENSE for more information. Contact Lucas Lee - playcover@lucas.icu Depal - depal@playcover.io Libraries Used These open source libraries were used to create this project. inject PTFakeTouch DownloadManager DataCache SwiftUI CachedAsyncImage Thanks to @iVoider for creating such a great project! ⬆️ Back to top️ | Community fork of PlayCover | [] | 14 | 87 | 471 | 1,192 | 294 | 7 | 3 |
mage-ai/mage-ai | 🧙 A modern replacement for Airflow. Documentation 🌪️ Get a 5 min overview 🌊 Play with live tool 🔥 Get instant help ### Give your data team `magical` powers Integrate and synchronize data from 3rd party sources Build real-time and batch pipelines to transform data using Python, SQL, and R Run, monitor, and orchestrate thousands of pipelines without losing sleep 1️⃣ 🏗️ Build Have you met anyone who said they loved developing in Airflow? That’s why we designed an easy developer experience that you’ll enjoy. | | |
| --- | --- |
| Easy developer experience Start developing locally with a single command or launch a dev environment in your cloud using Terraform. Language of choice Write code in Python, SQL, or R in the same data pipeline for ultimate flexibility. Engineering best practices built-in Each step in your pipeline is a standalone file containing modular code that’s reusable and testable with data validations. No more DAGs with spaghetti code. | | ↓ 2️⃣ 🔮 Preview Stop wasting time waiting around for your DAGs to finish testing. Get instant feedback from your code each time you run it. | | |
| --- | --- |
| Interactive code Immediately see results from your code’s output with an interactive notebook UI. Data is a first-class citizen Each block of code in your pipeline produces data that can be versioned, partitioned, and cataloged for future use. Collaborate on cloud Develop collaboratively on cloud resources, version control with Git, and test pipelines without waiting for an available shared staging environment. | | ↓ 3️⃣ 🚀 Launch Don’t have a large team dedicated to Airflow? Mage makes it easy for a single developer or small team to scale up and manage thousands of pipelines. | | |
| --- | --- |
| Fast deploy Deploy Mage to AWS, GCP, or Azure with only 2 commands using maintained Terraform templates. Scaling made simple Transform very large datasets directly in your data warehouse or through a native integration with Spark. Observability Operationalize your pipelines with built-in monitoring, alerting, and observability through an intuitive UI. | | 🧙 Intro Mage is an open-source data pipeline tool for transforming and integrating data. Install Demo Tutorials Documentation Features Core design principles Core abstractions Contributing 🏃♀️ Install The recommended way to install the latest version of Mage is through Docker with the following command: bash
docker pull mageai/mageai:latest You can also install Mage using pip or conda, though this may cause dependency issues without the proper environment. bash
pip install mage-ai bash
conda install -c conda-forge mage-ai Looking for help? The fastest way to get started is by checking out our documentation here . Looking for quick examples? Open a demo project right in your browser or check out our guides . 🎮 Demo Live demo Build and run a data pipeline with our demo app . WARNING The live demo is public to everyone, please don’t save anything sensitive (e.g. passwords, secrets, etc). Demo video (5 min) Click the image to play video 👩🏫 Tutorials Load data from API, transform it, and export it to PostgreSQL Integrate Mage into an existing Airflow project Train model on Titanic dataset Set up dbt models and orchestrate dbt runs 🔮 Features | | | |
| --- | --- | --- |
| 🎶 | Orchestration | Schedule and manage data pipelines with observability. |
| 📓 | Notebook | Interactive Python, SQL, & R editor for coding data pipelines. |
| 🏗️ | Data integrations | Synchronize data from 3rd party sources to your internal destinations. |
| 🚰 | Streaming pipelines | Ingest and transform real-time data. |
| ❎ | dbt | Build, run, and manage your dbt models with Mage. | A sample data pipeline defined across 3 files ➝ Load data ➝

```python
import pandas as pd

@data_loader
def load_csv_from_file():
    return pd.read_csv('default_repo/titanic.csv')
```

Transform data ➝

```python
@transformer
def select_columns_from_df(df, *args):
    return df[['Age', 'Fare', 'Survived']]
```

Export data ➝

```python
@data_exporter
def export_titanic_data_to_disk(df) -> None:
    df.to_csv('default_repo/titanic_transformed.csv')
```
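Conceptually, Mage wires these blocks into a pipeline by dependency and passes each block's output downstream. The following is an illustrative sketch only — not Mage's real runtime API — and it assumes the same default_repo/titanic.csv path as the snippets above, with the decorators stripped:

```python
# Hand-rolled sketch of the dataflow Mage manages for you: each block's
# output becomes the next block's input.
import pandas as pd

def load_csv_from_file():
    # assumes default_repo/titanic.csv exists, as in the loader block above
    return pd.read_csv('default_repo/titanic.csv')

def select_columns_from_df(df, *args):
    return df[['Age', 'Fare', 'Survived']]

def export_titanic_data_to_disk(df) -> None:
    df.to_csv('default_repo/titanic_transformed.csv')

# Run the blocks in dependency order, threading each output downstream.
export_titanic_data_to_disk(select_columns_from_df(load_csv_from_file()))
```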
What the data pipeline looks like in the UI ➝ New? We recommend reading about blocks and learning from a hands-on tutorial . 🏔️ Core design principles Every user experience and technical design decision adheres to these principles. | | | |
| --- | --- | --- |
| 💻 | Easy developer experience | Open-source engine that comes with a custom notebook UI for building data pipelines. |
| 🚢 | Engineering best practices built-in | Build and deploy data pipelines using modular code. No more writing throwaway code or trying to turn notebooks into scripts. |
| 💳 | Data is a first-class citizen | Designed from the ground up specifically for running data-intensive workflows. |
| 🪐 | Scaling is made simple | Analyze and process large data quickly for rapid iteration. | 🛸 Core abstractions These are the fundamental concepts that Mage uses to operate. | | |
| --- | --- |
| Project | Like a repository on GitHub; this is where you write all your code. |
| Pipeline | Contains references to all the blocks of code you want to run, charts for visualizing data, and organizes the dependency between each block of code. |
| Block | A file with code that can be executed independently or within a pipeline. |
| Data product | Every block produces data after it's been executed. These are called data products in Mage. |
| Trigger | A set of instructions that determine when or how a pipeline should run. |
| Run | Stores information about when it was started, its status, when it was completed, any runtime variables used in the execution of the pipeline or block, etc. | 🙋♀️ Contributing and developing Add features and instantly improve the experience for everyone. Check out the contributing guide to set up your development environment and start building. 👨👩👧👦 Community Individually, we’re a mage. 🧙 Mage Magic is indistinguishable from advanced technology.
A mage is someone who uses magic (aka advanced technology).
Together, we’re Magers! 🧙♂️🧙 Magers ( /ˈmājər/ ) A group of mages who help each other realize their full potential!
Let’s hang out and chat together ➝ For real-time news, fun memes, data engineering topics, and more, join us on ➝ | | |
| --- | --- |
| | Twitter |
| | LinkedIn |
| | GitHub |
| | Slack | 🤔 Frequently Asked Questions (FAQs) Check out our FAQ page to find answers to some of our most asked questions. 🪪 License See the LICENSE file for licensing information. | 🧙 Build, run, and manage data pipelines for integrating and transforming data. | machine-learning,artificial-intelligence,data,data-engineering,data-science,python,elt,etl,pipelines,data-pipelines | 53 | 114 | 4,469 | 5,344 | 266 | 227 | 5 |
williamboman/mason.nvim | Portable package manager for Neovim that runs everywhere Neovim runs. Easily install and manage LSP servers, DAP servers, linters, and formatters. :help mason.nvim Latest version: v1.10.0 Table of Contents Introduction How to use installed packages Screenshots Requirements Installation Setup Extensions Commands Registries Configuration Introduction :h mason-introduction mason.nvim is a Neovim plugin that allows you to easily manage external editor tooling such as LSP servers, DAP servers,
linters, and formatters through a single interface. It runs everywhere Neovim runs (across Linux, macOS, Windows, etc.),
with only a small set of external requirements needed. Packages are installed in Neovim's data directory ( :h standard-path ) by default. Executables are
linked to a single bin/ directory, which mason.nvim will add to Neovim's PATH during setup, allowing seamless access
from Neovim builtins (shell, terminal, etc.) as well as other 3rd party plugins. For a list of all available packages, see https://mason-registry.dev/registry/list . How to use installed packages :h mason-how-to-use-packages Although many packages are perfectly usable out of the box through Neovim builtins, it is recommended to use other 3rd
party plugins to further integrate these. The following plugins are recommended: LSP: lspconfig & mason-lspconfig.nvim DAP: nvim-dap & nvim-dap-ui Linters: null-ls.nvim or nvim-lint Formatters: null-ls.nvim or formatter.nvim Screenshots | | | |
| :----------------------------------------------------------------------------------------------------------------------------------------------------: | :----------------------------------------------------------------------------------------------------------------------------------------------: | :------------------------------------------------------------------------------------------------------------------------------------: |
| | | |
| | | | Requirements :h mason-requirements mason.nvim relaxes the minimum requirements by attempting multiple different utilities (for example, wget , curl , and Invoke-WebRequest are all perfect substitutes).
The minimum recommended requirements are: neovim >= 0.7.0 For Unix systems: git(1) curl(1) or wget(1) unzip(1) GNU tar ( tar(1) or gtar(1) depending on platform) gzip(1) For Windows systems: pwsh or powershell git GNU tar One of the following: 7zip peazip archiver winzip WinRAR Note that mason.nvim will regularly shell out to external package managers, such as cargo and npm . Depending on
your personal usage, some of these will also need to be installed. Refer to :checkhealth mason for a full list. Installation Packer

```lua
use {
    "williamboman/mason.nvim"
}
```

lazy.nvim

```lua
{
    "williamboman/mason.nvim"
}
```

vim-plug

```vim
Plug 'williamboman/mason.nvim'
```

Setup :h mason-quickstart

```lua
require("mason").setup()
```

mason.nvim is optimized to load as little as possible during setup. Lazy-loading the plugin, or somehow deferring the
setup, is not recommended. Refer to the Configuration section for information about which settings are available. Extensions Refer to the Wiki for a list of 3rd party extensions. mason-lspconfig.nvim - recommended for usage with lspconfig Commands :h mason-commands :Mason - opens a graphical status window :MasonUpdate - updates all managed registries :MasonInstall <package> ... - installs/re-installs the provided packages :MasonUninstall <package> ... - uninstalls the provided packages :MasonUninstallAll - uninstalls all packages :MasonLog - opens the mason.nvim log file in a new tab window Registries Mason's core package registry is located at mason-org/mason-registry .
Before any packages can be used, the registry needs to be downloaded. This is done automatically for you when using the
different Mason commands (e.g. :MasonInstall ), but can also be done manually by using the :MasonUpdate command. If you're utilizing Mason's Lua APIs to access packages, it's recommended to use the :h mason-registry.refresh() and/or :h mason-registry.update() functions to ensure you have the latest package information before retrieving packages. Configuration :h mason-settings You may optionally configure certain behavior of mason.nvim when calling the .setup() function. Refer to the default configuration for a list of all available settings. Example:

```lua
require("mason").setup({
    ui = {
        icons = {
            package_installed = "✓",
            package_pending = "➜",
            package_uninstalled = "✗"
        }
    }
})
```

Default configuration

```lua
---@class MasonSettings
local DEFAULT_SETTINGS = {
---@since 1.0.0
-- The directory in which to install packages.
install_root_dir = path.concat { vim.fn.stdpath "data", "mason" }, ---@since 1.0.0
-- Where Mason should put its bin location in your PATH. Can be one of:
-- - "prepend" (default, Mason's bin location is put first in PATH)
-- - "append" (Mason's bin location is put at the end of PATH)
-- - "skip" (doesn't modify PATH)
---@type '"prepend"' | '"append"' | '"skip"'
PATH = "prepend",
---@since 1.0.0
-- Controls to which degree logs are written to the log file. It's useful to set this to vim.log.levels.DEBUG when
-- debugging issues with package installations.
log_level = vim.log.levels.INFO,
---@since 1.0.0
-- Limit for the maximum amount of packages to be installed at the same time. Once this limit is reached, any further
-- packages that are requested to be installed will be put in a queue.
max_concurrent_installers = 4,
---@since 1.0.0
-- [Advanced setting]
-- The registries to source packages from. Accepts multiple entries. Should a package with the same name exist in
-- multiple registries, the registry listed first will be used.
registries = {
"github:mason-org/mason-registry",
},
---@since 1.0.0
-- The provider implementations to use for resolving supplementary package metadata (e.g., all available versions).
-- Accepts multiple entries, where later entries will be used as fallback should prior providers fail.
-- Builtin providers are:
-- - mason.providers.registry-api - uses the https://api.mason-registry.dev API
-- - mason.providers.client - uses only client-side tooling to resolve metadata
providers = {
"mason.providers.registry-api",
"mason.providers.client",
},
github = {
---@since 1.0.0
-- The template URL to use when downloading assets from GitHub.
-- The placeholders are the following (in order):
-- 1. The repository (e.g. "rust-lang/rust-analyzer")
-- 2. The release version (e.g. "v0.3.0")
-- 3. The asset name (e.g. "rust-analyzer-v0.3.0-x86_64-unknown-linux-gnu.tar.gz")
download_url_template = "https://github.com/%s/releases/download/%s/%s",
},
pip = {
---@since 1.0.0
-- Whether to upgrade pip to the latest version in the virtual environment before installing packages.
upgrade_pip = false,
---@since 1.0.0
-- These args will be added to `pip install` calls. Note that setting extra args might impact intended behavior
-- and is not recommended.
--
-- Example: { "--proxy", "https://proxyserver" }
install_args = {},
},
ui = {
---@since 1.0.0
-- Whether to automatically check for new versions when opening the :Mason window.
check_outdated_packages_on_open = true,
---@since 1.0.0
-- The border to use for the UI window. Accepts same border values as |nvim_open_win()|.
border = "none",
---@since 1.0.0
-- Width of the window. Accepts:
-- - Integer greater than 1 for fixed width.
-- - Float in the range of 0-1 for a percentage of screen width.
width = 0.8,
---@since 1.0.0
-- Height of the window. Accepts:
-- - Integer greater than 1 for fixed height.
-- - Float in the range of 0-1 for a percentage of screen height.
height = 0.9,
icons = {
---@since 1.0.0
-- The list icon to use for installed packages.
package_installed = "◍",
---@since 1.0.0
-- The list icon to use for packages that are installing, or queued for installation.
package_pending = "◍",
---@since 1.0.0
-- The list icon to use for packages that are not installed.
package_uninstalled = "◍",
},
keymaps = {
---@since 1.0.0
-- Keymap to expand a package
toggle_package_expand = "<CR>",
---@since 1.0.0
-- Keymap to install the package under the current cursor position
install_package = "i",
---@since 1.0.0
-- Keymap to reinstall/update the package under the current cursor position
update_package = "u",
---@since 1.0.0
-- Keymap to check for new version for the package under the current cursor position
check_package_version = "c",
---@since 1.0.0
-- Keymap to update all installed packages
update_all_packages = "U",
---@since 1.0.0
-- Keymap to check which installed packages are outdated
check_outdated_packages = "C",
---@since 1.0.0
-- Keymap to uninstall a package
uninstall_package = "X",
---@since 1.0.0
-- Keymap to cancel a package installation
cancel_installation = "<C-c>",
---@since 1.0.0
-- Keymap to apply language filter
apply_language_filter = "<C-f>",
---@since 1.1.0
-- Keymap to toggle viewing package installation log
toggle_package_install_log = "<CR>",
---@since 1.8.0
-- Keymap to toggle the help view
toggle_help = "g?",
},
},
}
``` 👋 didn't find what you were looking for? Try looking in the help docs :help mason.nvim ! | Portable package manager for Neovim that runs everywhere Neovim runs. Easily install and manage LSP servers, DAP servers, linters, and formatters. | lspinstall,lua,manager,mason,neovim,nvim,nvim-lsp-installer,package,package-manager,packages | 20 | 175 | 917 | 1,598 | 154 | 3 | 5 |
markdoc/markdoc | A powerful, flexible, Markdown-based authoring framework. Markdoc is a Markdown -based syntax and toolchain for creating custom documentation sites and experiences. We designed Markdoc to power Stripe's public docs , our largest and most complex content site. Installation To get started with Markdoc, first install the library:

```shell
npm install @markdoc/markdoc
```

or

```shell
yarn add @markdoc/markdoc
```

and import it in your app:

```js
const Markdoc = require('@markdoc/markdoc');
```

or if you are using ESM

```js
import Markdoc from '@markdoc/markdoc';
```

then use Markdoc in your app or tool:

```js
const doc = `
# Markdoc README

{% image src="/logo.svg" /%}
`;

const ast = Markdoc.parse(doc);
const content = Markdoc.transform(ast);
return Markdoc.renderers.react(content, React);
```

Check out our docs for more guidance on how to use Markdoc. TypeScript This is the minimal tsconfig.json required to use Markdoc in your TypeScript project:

```json
{
"compilerOptions": {
"moduleResolution": "node",
"target": "esnext", // works with es2015 or greater
"esModuleInterop": true
}
}
```

React If you are using React, install Markdoc with:

```sh
npm install @markdoc/markdoc react @types/react
```

Contributing Contributions and feedback are welcome and encouraged. Check out our contributing guidelines on how to do so. Development Run npm install Run npm run build Run the tests using npm test Code of conduct This project has adopted the Stripe Code of conduct . License This project uses the MIT license . Credits Special shout out to: @marcioAlmada for providing us with the @markdoc GitHub org. @koomen for gifting us https://markdoc.dev. | A powerful, flexible, Markdown-based authoring framework. | authoring,documentation,markdoc,markdown,react,toolchain | 24 | 24 | 141 | 130 | 21 | 6 | 2
bluesky-social/social-app | Bluesky Social App Welcome friends! This is the codebase for the Bluesky Social app. Get the app itself: Web: bsky.app iOS: App Store Android: Play Store Development Resources This is a React Native application, written in the TypeScript programming language. It builds on the atproto TypeScript packages (like @atproto/api ), the code for which is also open source, but in a different git repository . There is a small amount of Go language source code (in ./bskyweb/ ), for a web service that returns the React Native Web application. The Build Instructions are a good place to get started with the app itself. The Authenticated Transfer Protocol ("AT Protocol" or "atproto") is a decentralized social media protocol. You don't need to understand AT Protocol to work with this application, but it can help. Learn more at: Overview and Guides Github Discussions 👈 Great place to ask questions Protocol Specifications Blogpost on self-authenticating data structures The Bluesky Social application encompasses a set of schemas and APIs built in the overall AT Protocol framework. The namespace for these "Lexicons" is app.bsky.* . Contributions While we do accept contributions, we prioritize high quality issues and pull requests. Adhering to the below guidelines will ensure a more timely review. Rules: We may not respond to your issue or PR. We may close an issue or PR without much feedback. We may lock discussions or contributions if our attention is getting DDOSed. We're not going to provide support for build issues. Guidelines: Check for existing issues before filing a new one please. Open an issue and give some time for discussion before submitting a PR. Stay away from PRs like... Changing "Post" to "Skeet." Refactoring the codebase, eg to replace mobx with redux or something. Adding entirely new features without prior discussion. Remember, we serve a wide community of users. Our day to day involves us constantly asking "which top priority is our top priority." If you submit well-written PRs that solve problems concisely, that's an awesome contribution. Otherwise, as much as we'd love to accept your ideas and contributions, we really don't have the bandwidth. That's what forking is for! Forking guidelines You have our blessing 🪄✨ to fork this application! However, it's very important to be clear to users when you're giving them a fork. Please be sure to: Change all branding in the repository and UI to clearly differentiate from Bluesky. Change any support links (feedback, email, terms of service, etc) to your own systems. Replace any analytics or error-collection systems with your own so we don't get super confused. Security disclosures If you discover any security issues, please send an email to security@bsky.app. The email is automatically CCed to the entire team and we'll respond promptly. Are you a developer interested in building on atproto? Bluesky is an open social network built on the AT Protocol, a flexible technology that will never lock developers out of the ecosystems that they help build. With atproto, third-party integration can be as seamless as first-party through custom feeds, federated services, clients, and more. License (MIT) See ./LICENSE for the full license. P.S. We ❤️ you and all of the ways you support us. Thank you for making Bluesky a great place! | The Bluesky Social application for Web, iOS, and Android | [] | 9 | 169 | 2,976 | 3,998 | 698 | 515 | 11
fbelavenuto/arpl | Automated Redpill Loader Project archived, please use: https://github.com/RROrg/rr Chinese documentation (中文说明) This particular project was created to facilitate my testing with Redpill and I decided to share it with other users. I'm Brazilian and my English is not good, so I apologize for my translations. I tried to make the system as user-friendly as possible, to make life easier. The loader automatically detects which device is being used, SATA DoM or USB, detecting its VID and PID correctly. redpill-lkm has been edited to allow booting the kernel without setting the variables related to network interfaces, so the loader (and user) doesn't have to worry about that. Jun's code that patches the zImage and Ramdisk is embedded; if "zImage" or "rd.gz" is changed by some small update, the loader re-applies the patches. The most important kernel modules are built into the DSM ramdisk image for automatic peripheral detection. Important considerations Some users have experienced an excessively long time to boot. In this case it is highly recommended to use an SSD for the loader in the case of the DoM option, or a fast USB flash drive; You must have at least 4GB of RAM, both on bare metal and in VMs; The DSM kernel is compatible with SATA ports, not SAS/SCSI/etc. For device-tree models only SATA ports work. For the other models, other types of disks may work; It is possible to use HBA cards, however SMART and serial numbers are only functional on DS3615xs, DS3617xs and DS3622xs+ models. Use General To use this project, download the latest image available and burn it to a USB stick or SATA disk-on-module. Set the PC to boot from the burned media and follow the information on the screen. The loader will automatically increase the size of the last partition and use this space as cache if it is larger than 2GiB. Accessing the loader Via terminal Call the "menu.sh" command from the computer itself. Via web From another machine on the same network, open the address shown on the screen ( http://<ip>:7681 ) in a browser. Via ssh From another machine on the same network, use an SSH client with username root and password Redp1lL-1s-4weSomE Using the loader The menu system is dynamic and I hope it is intuitive enough that the user can use it without any problems. There is no need to configure the VID/PID (if using a USB stick) or define the MAC addresses of the network interfaces. If the user wants to modify the MAC address of any interface, use the "Change MAC" option in the "cmdline" menu. If a model is chosen that uses the Device-tree system to define the HDs, there is no need to configure anything. In the case of models that do not use device-tree, the configurations must be done manually, and for this there is an option in the "cmdline" menu to display the SATA controllers, DUMMY ports and ports in use, to assist in the creation of the "SataPortMap", "DiskIdxMap" and "sata_remap" if necessary. Another important point is that the loader detects whether or not the CPU has the MOVBE instruction and does not display the models that require it. So if the DS918+ and DVA3221 models are not displayed, it is because of the CPU's lack of support for MOVBE instructions. You can disable this restriction and test at your own risk. I developed a simple patch to no longer display the DUMMY port error on models without device-tree, so the user can install without having to worry about it.
Use proxy The proxy is only supported in the terminal. For example, if you have a ClashX instance (default port 7890) on the LAN, follow these steps: Enable 'Allow connect from LAN' (ClashX only) Get the IP address of the proxy host (the computer running ClashX) Open the ARPL terminal (you can access ARPL via SSH) and declare the proxy variables:

```sh
declare -x https_proxy="http://ipaddress:7890/"
declare -x http_proxy="http://ipaddress:7890/"
declare -x all_proxy="socks5://ipaddress:7890/"
```

Change ipaddress to your proxy host's address, then call "menu.sh" and the proxy will take effect. Quickstart guide After booting the loader, the following screen should appear. Type menu.sh and press <ENTER> : If you prefer, you can access it via the web: Select the "model" option and choose the model you prefer: Select the "Buildnumber" option and choose the first option: Go to the "Serial" menu and choose "Generate a random serial number". Select the "Build" option and wait for the loader to be generated: Select the "Boot" option and wait for the DSM to boot: The DSM kernel does not display messages on the screen, so it is necessary to continue configuring DSM through the browser by accessing the address http://<ip> .
There are several tutorials available on the internet on how to configure DSM, which will not be covered here. Tutorials An ARPL user (Rikkie) created a tutorial on installing ARPL on a Proxmox server:
https://hotstuff.asia/2023/01/03/xpenology-with-arpl-on-proxmox-the-easy-way/ Troubles/questions/etc Please search the forums at https://xpenology.com/forum to see whether your question/problem has already been discussed and resolved. If you can't find a solution, use GitHub issues. Thanks All code was based on the work of TTG, pocopico, jumkey and others involved in continuing TTG's original redpill-load project. More information will be added in the future. | Automated Redpill Loader | [] | 51 | 8 | 213 | 547 | 245 | 1 | 1
edoardottt/awesome-hacker-search-engines | Awesome Hacker Search Engines A curated list of awesome search engines useful during Penetration testing, Vulnerability assessments, Red/Blue Team operations, Bug Bounty and more General • Servers • Vulnerabilities • Exploits • Attack surface • Code • Mail addresses • Domains • URLs • DNS • Certificates • WiFi networks • Device Info • Credentials • Leaks • Hidden Services • Social Networks • Phone numbers • Images • Threat Intelligence • Web History • Surveillance cameras • Crypto • People General Search Engines Google Bing Yahoo! Yandex Ask Baidu You SearXNG EXALead DuckDuckGo Swisscows Naver AOL Brave Yep Gibiru Kagi Stract Servers Shodan - Search Engine for the Internet of Everything Censys Search - Search Engine for every server on the Internet to reduce exposure and improve security Onyphe.io - Cyber Defense Search Engine for open-source and cyber threat intelligence data ZoomEye - Global cyberspace mapping GreyNoise - The source for understanding internet noise Natlas - Scaling Network Scanning Netlas.io - Discover, Research and Monitor any Assets Available Online FOFA - Cyberspace mapping Quake - Cyberspace surveying and mapping system Hunter - Internet Search Engines For Security Researchers ODIN - One of the most powerful search engines for Scanned Internet Assets Vulnerabilities NIST NVD - US National Vulnerability Database MITRE CVE - Identify, define, and catalog publicly disclosed cybersecurity vulnerabilities GitHub Advisory Database - Security vulnerability database inclusive of CVEs and GitHub originated security advisories cloudvulndb.org - The Open Cloud Vulnerability & Security Issue Database osv.dev - Open Source Vulnerabilities Vulners.com - Your Search Engine for Security Intelligence opencve.io - Easiest way to track CVE updates and be alerted about new vulnerabilities security.snyk.io - Open Source Vulnerability Database Mend Vulnerability Database - The largest open source vulnerability DB Rapid7 - DB - Vulnerability & Exploit Database CVEDetails - The ultimate security vulnerability datasource VulnIQ - Vulnerability intelligence and management solution SynapsInt - The unified OSINT research tool Aqua Vulnerability Database - Vulnerabilities and weaknesses in open source applications and cloud native infrastructure Vulmon - Vulnerability and exploit search engine VulDB - Number one vulnerability database ScanFactory - Realtime Security Monitoring Trend Micro Zero Day Initiative - Publicly disclosed vulnerabilities discovered by Zero Day Initiative researchers Google Project Zero - Vulnerabilities including Zero Days Trickest CVE Repository - Gather and update all available and newest CVEs with their PoC cnvd.org.cn - Chinese National Vulnerability Database InTheWild.io - Check CVEs in our free, open source feed of exploited vulnerabilities Vulnerability Lab - Vulnerability research, bug bounties and vulnerability assessments Red Hat Security Advisories - Information about security flaws that affect Red Hat products and services in the form of security advisories Cisco Security Advisories - Security advisories and vulnerability information for Cisco products, including network equipment and software Microsoft Security Response Center - Reports of security vulnerabilities affecting Microsoft products and services VARIoT - VARIoT IoT Vulnerabilities Database Exploits Exploit-DB - Exploit Database Sploitus - Convenient central place for identifying the newest exploits Rapid7 - DB - Vulnerability & Exploit Database Vulmon - 
Vulnerability and exploit search engine packetstormsecurity.com - Information Security Services, News, Files, Tools, Exploits, Advisories and Whitepapers 0day.today - Ultimate database of exploits and vulnerabilities LOLBAS - Living Off The Land Binaries, Scripts and Libraries GTFOBins - Curated list of Unix binaries that can be used to bypass local security restrictions in misconfigured systems Payloads All The Things - A list of useful payloads and bypasses for Web Application Security XSS Payloads - The wonderland of JavaScript unexpected usages, and more exploitalert.com - Database of Exploits Reverse Shell generator - Online Reverse Shell generator with Local Storage functionality, URI & Base64 Encoding, MSFVenom Generator, and Raw Mode HackerOne hacktivity - See the latest hacker activity on HackerOne Bugcrowd Crowdstream - Showcase of accepted and disclosed submissions on Bugcrowd programs GTFOArgs - Curated list of Unix binaries that can be manipulated for argument injection shell-storm.org/shellcode - Shellcodes database for study cases Hacking the Cloud - Encyclopedia of the attacks/tactics/techniques that offensive security professionals can use on their next cloud exploitation adventure LOLDrivers - Open-source project that brings together vulnerable, malicious, and known malicious Windows drivers PwnWiki - Collection of TTPs (tools, tactics, and procedures) for what to do after access has been gained CVExploits Search - Your comprehensive database for CVE exploits from across the internet VARIoT - VARIoT IoT exploits database LOOBins - Detailed information on various built-in macOS binaries and how they can be used by threat actors for malicious purposes Coalition Exploit Scoring System - Model that dynamically scores new and existing vulnerabilities to reflect their exploit likelihood WADComs - Interactive cheat sheet containing a curated list of offensive security tools and their respective commands to be used against Windows/AD environments LOLAPPS - Compendium of applications that can be used to carry out day-to-day exploitation Living off the Hardware - Resource collection that provides guidance on identifying and utilizing malicious hardware and malicious devices Living Off the Pipeline - How development tools commonly used in CI/CD pipelines can be used to achieve arbitrary code execution hackyx.io - The aim of this project is to easily find any resource related to IT security like CTF writeups, articles or Bug Bounty reports Attack Surface FullHunt.io - Attack surface database of the entire Internet BinaryEdge - We scan the web and gather data for you Censys ASM - Attack Surface Management Solutions RedHunt Labs - Discover your Attack Surface, Continuously SecurityTrails - The Total Internet Inventory overcast-security.com - We make tracking your external attack surface easy IPInfo.io - The trusted source for IP address data IPData.co - IP Geolocation and Threat Intelligence API NetworksDB - Information about the public IPv4 and IPv6 addresses, networks and domains owned by companies and organisations across the world ASNlookup - Quickly lookup updated information about specific Autonomous System Number (ASN), Organization, CIDR, or registered IP addresses (IPv4 and IPv6) among other relevant data BGPtools - Browse the Internet ecosystem BGPview - Debug and investigate information about IP addresses, ASN, IXs, BGP, ISPs, Prefixes and Domain names BigDataCloud - The API provides comprehensive location and network data RADb - The world's largest public routing registry 
Deepinfo - Empower your security with the most comprehensive Internet data CloudFlare Radar - Global Internet traffic, attack, and technology trends and insights Code GitHub Code Search - Search globally across all of GitHub, or scope your search to a particular repository or organization GitLab Code Search - Advanced search for faster, more efficient search across the entire GitLab instance Sourceforge - Complete Open-Source and Business Software Platform grep.app - Search across a half million git repos publicwww.com - Find any alphanumeric snippet, signature or keyword in the web pages HTML, JS and CSS code SearchCode - Search 75 billion lines of code from 40 million projects NerdyData - Find companies based on their website's tech stack or code RepoSearch - Source code search engine that helps you find implementation details, example usages or just analyze code SourceGraph - Understand and search across your entire codebase HotExamples - Search code examples from over 1 million projects WP Directory - Lightning fast regex searching of code in the WordPress Plugin and Theme Directories GitHub Gists - Instantly share code, notes, and snippets CodeBerg - Collaboration platform and Git hosting for free and open source software, content and projects Fedora Pagure - Open Source software code hosting system LaunchPad - Software collaboration platform that provides: Bug tracking, Code hosting, Code reviews, Ubuntu package building and hosting, Translations... repo.or.cz - Public Git hosting site gitorious.org - Read-only mirror of the former gitorious.org code hosting website Sourcehut - Collection of tools useful for software development android.googlesource.com - Git repositories on android deps.dev - Service developed and hosted by Google to help developers better understand the structure, construction, and security of open source software packages WebFinery - Search the source code of the web Google Code Archive - Data found on the Google Code Project Hosting Service, which was turned down in early 2016 Snipplr - Code snippet search engine that allows users to search and share code snippets across various programming languages and frameworks Postman Public Collections - Explore the best APIs, collections, workspaces in the world on the Postman Public API Network ScriptMafia - Download full nulled scripts SearchFTPs - The most advanced FTP Search Engine service maintained by members Ecosyste.ms - An open API service providing package, version and dependency metadata of many open source software ecosystems and registries SwaggerHub - Search public APIs and Domains in SwaggerHub Mail Addresses Hunter.io - Find professional email addresses in seconds PhoneBook - Lists all domains, email addresses, or URLs for the given input domain IntelligenceX - Search engine and data archive Reacher.email - Open-Source Email Verification RocketReach - Your first-degree connection to any professional email-format.com - Find the email address formats in use at thousands of companies EmailHippo - Email address verification technology ThatsThem - Reverse email lookup verify-email.org - Checks whether the mailbox exists or not Melissa - Emailcheck - Check email addresses and verify they are live VoilaNorbert - I can find anyone's email address SynapsInt - The unified OSINT research tool skymem.info - Find email addresses of companies and people findemails.com - Find Anyone's Email Address in Seconds Experte email finder - Find the right email address, even if you only know the name and the company EmailSherlock 
- Search for the Person behind the Email address and find our reputation score Anymail Finder - Find verified emails Tomba.io - With 430+ million email addresses indexed, effective search filters, and deliverability checks, Tomba's email finder is its most powerful tool Snov Email Finder Find any email. Anywhere EmailSearch.io - Find any emails and phones from a domain, Linkedin, name, and company Email Permutator+ - Find potential email addresses permutating different combinations Emailrep.io - Simple Email Reputation Mailboxvalidator - Secure and reliable email validation service to check for invalid email addresses ContactOut - Most accurate email finder for personal and work email outreach validemail.io - Validate email addresses for deliverability with our Email Validation API Domains PhoneBook - Lists all domains, email addresses, or URLs for the given input domain IntelligenceX - Search engine and data archive Omnisint - Subdomain enumeration Riddler - Allows you to search in a high quality dataset RobTex - Various kinds of research of IP numbers, Domain names, etc CentralOps - DomainDossier - Investigate domains and IP addresses DomainIQ - Comprehensive Domain Intelligence whois.domaintools.com - Industry’s fastest domain discovery engine and broadest, most accurate data grayhatwarfare.com - domains - How to search URLs exposed by Shortener services whoisology.com - Deep Connections Between Domain Names & Their Owners who.is - WHOIS Search, Domain Name, Website, and IP Tools pentest-tools.com - Discover subdomains and determine the attack surface of an organization BuiltWith - Find out what websites are Built With MoonSearch - Backlinks checker & SEO Report sitereport.netcraft.com - Find out the infrastructure and technologies used by any site SynapsInt - The unified OSINT research tool statscrop.com - Millions of amazing websites across the web are being analyzed with StatsCrop securityheaders.com - Scan your site now visualsitemapper.com - Create a visual map of your site similarweb.com - The easiest and fastest tool to find out what's really going on online buckets.grayhatwarfare.com - Public buckets C99.nl - Over 57 quality API's and growing! 
wannabe1337.xyz - Online Tools subdomainfinder.c99.nl - Scanner that scans an entire domain to find as many subdomains as possible AnubisDB - Subdomain enumeration and information gathering tool HypeStat - Free statistics and analytics service, where you can find information about every website Private Key Project - Information security tools from Private Key Project SiteDossier - Profiles for millions of sites on the web SpyOnWeb - Quick and convenient search for the websites that probably belong to the same owner HaveIBeenSquatted - Check if a domain has been typosquatted expireddomains.net - ExpiredDomains.net gathers all the information you need to find good Expired Domains that are Pending Delete and you can Backorder URLs PhoneBook - Lists all domains, email addresses, or URLs for the given input domain IntelligenceX - Search engine and data archive URLScan - A sandbox for the web HackerTarget - Collect information about IP Addresses, Networks, Web Pages and DNS records MOZ Link Explorer - The world's best backlink checker with over 40 trillion links shorteners.grayhatwarfare.com - Search URLs exposed by Shortener services CommonCrawl Index - Open repository of web crawl data URLVoid - Check the online reputation/safety of a website Norton SafeWeb - Look up a site, Get our rating CheckPhish - Real-time URL and website scanner web-check.xyz - All-in-one OSINT tool, for quickly checking a websites data TinyScan - Effortlessly Dive into URL Details DNS DNSDumpster - DNS recon & research, find & lookup dns records Chaos - Enhance research and analyse changes around DNS for better insights RapidDNS - DNS query tool which make querying subdomains or sites of a same ip easy DNSdb - Passive DNS historical database Omnisint - Reverse DNS lookup HackerTarget - Collect information about IP Addresses, Networks, Web Pages and DNS records passivedns.mnemonic.no - Web interface for querying passive DNS data collected in our malware lab ptrarchive.com - Over 230 billion reverse DNS entries from 2008 to the present dnshistory.org - Domain Name System Historical Record Archive DNSTwister - The anti-phishing domain name search engine and DNS monitoring service DNSviz - Tool for visualizing the status of a DNS zone C99.nl - Over 57 quality API's and growing wannabe1337.xyz - Online Tools DNSlytics - Find out everything about a domain name, IP address or provider dnsrepo.noc.org - DNS Database Repository Search DNSSpy - Monitor, validate and verify your DNS configurations ZETAlytics - We offer unrivalled geographic diversity and exclusive global network visibility in searchable datasets for use by cyber security analysts AskDNS - Lookup Connected Domain Names and IP Addresses 360 PassiveDNS.CN - Biggest public available db in China designed for security and research purpose MXtoolbox - All of your MX record, DNS, blacklist and SMTP diagnostics in one integrated tool NSLookup.io - Find all DNS records for a domain name using this online tool Robtex DNS Lookup - Get detailed information on the nameservers associated with a domain name DNSMap - Worldwide DNS Propagation Checker Validin - Massive collection of DNS records with free DNS history search dnslookup.pro - Advanced DNS Record Analysis & Troubleshooting Certificates Crt.sh - Certificate Search CTSearch - Certificate Transparency Search Tool tls.bufferover.run - Quickly find certificates in IPv4 space CertSpotter - Monitors your domains for expiring, unauthorized, and invalid SSL certificates SynapsInt - The unified OSINT research tool Censys Search 
- Certificates - Certificates Search ciphersuite.info - TLS Ciphersuite Search. Search for a particular cipher suite by using IANA, OpenSSL or GnuTLS name format certificatedetails - Online certificate viewer. Inspect and download certificates from your browser FacebookCT - Search for certificates issued for a given domain and subscribe to notifications from Facebook regarding new certificates certs.io - Search TLS certificates across the internet. ODIN Certificates Search - ODIN Certificates Search WiFi Networks Wigle.net - Maps and database of 802.11 wireless networks with statistics wifimap.io - Connect to all Free WiFi Hotspots using WiFi Map App all over the World! wificafespots.com - Free WiFi Cafe Spots wifispc.com - Free map of Wi-Fi passwords anywhere you go! openwifimap.net - HTML5 map with OpenWiFiMap data mylnikov.org - Public API implementation of Wi-Fi Geo-Location database Device Information MACVendorLookup.com - Look up the vendor for a specific MAC Address macvendors.com - Find MAC Address Vendors. Now macaddress.io - MAC address vendor lookup maclookup.app - Find the vendor name of a device by entering an OUI or a MAC address macvendors.co - Get vendor name of your network device using its mac address Credentials Have I Been Pwned - Check if your email or phone is in a data breach Dehashed - Free deep-web scans and protection against credential leaks LeakCheck.io - Make sure your credentials haven't been compromised crackstation.net -Massive pre-computed lookup tables to crack password hashes HashKiller - Pre-cracked Hashes, easily searchable LeakedPassword - Search across multiple data breaches to see if your pass has been compromised BugMeNot - Find and share logins Hashes.com - Decrypt and crack your MD5, SHA1, SHA256, MySQL, MD5 Email, SHA256 Email, and NTLM hashes for free online Hashmob - The Largest Password Recovery Community WhiteIntel - Check if a company or its customers was victim of an information stealer malware ntlm.pw - Instantly look up NTLM hashes and resolve them to plaintext passwords using our database with 8B+ entries Hudson Rock - Use Hudson Rock’s free cybercrime intelligence tools to learn how compromised credentials are impacting your business Leaks WikiLeaks - News leaks and classified media provided by anonymous sources Leak-Lookup - Search across thousands of data breaches Snusbase - Stay on top of the latest database breaches breachdirectory.org - Check if your information was exposed in a data breach BreachForums - Breaches, Data leaks, databases and more Siph0n Breach DB (onionsite) - Breaches, Data leaks, Exploits Exposed Forum - The premier Databreach discussion & leaks forum Distributed Denial of Secrets - Journalist 501(c)(3) non-profit devoted to publishing and archiving leaks Have I Been Zuckered - Facebook Data Breach Checker Cryptome - Documents for publication that are prohibited by governments worldwide Hidden Services AHMIA - Search hidden services on the Tor network thehiddenwiki.org - The darknet guide tor.link - Free anonymous deepweb / Darknet search engine deepweblinks.net - Onion Links onionengine.com - A search engine for services accessible on the Tor network OnionLand - Discover Hidden Services and access to Tor's onion sites Social Networks These can be useful for osint and social engineering. 
Facebook Instagram YouTube Twitter/X LinkedIn Reddit Pinterest Tumblr Flickr SnapChat Whatsapp Quora TikTok Vimeo Medium WeChat VK Weibo Tinder Threads Phone Numbers NumLookup - Free reverse phone lookup SpyDialer - Free Reverse Lookup Search WhitePages - Find people, contact info & background checks National Cellular Directory - Begin your comprehensive people search now Phone Validator - Is it a cell phone or is it a landline or is it a fake? Free Carrier Lookup - Enter a phone number and we'll return the carrier name RocketReach - Your first-degree connection to any professional sync.me - Find out who called EmobileTracker - Track Mobile Owner Name, Location and Mobile Service Provider Reverse Phone Lookup - Find Out The Owner Of A Phone Number ThatsThem - Reverse phone lookup thisnumber.com - International Phone Directories usphonebook.com - Free Reverse Phone Number Lookup truepeoplesearch.com - Get current address, cell phone number, email address, relatives, friends and a lot more Tellows - Who is calling? The phone number reverse search SynapsInt - The unified OSINT research tool C99.nl - Over 57 quality API's and growing ValidNumber.com - Free reverse phone lookup service to let you identify a caller associated with any 10-digit phone number from the US and Canada CellIdFinder - Nonprofit project which helps you to find GSM BTS by MCC, MNC, LAC and CellID OldPhoneBook - Instantly search a large selection from the past 20 years of USA phone listings Spokeo - Search by name, phone, address, or email to confidentially lookup information about people you know Intelius Phone Lookup - Look up a phone number to find owner information, carrier details, and more ZabaSearch Phone Lookup - Reverse Phone Lookup Tool Can Uncover Personal Information, Social Media Data, Online Activity, Photos, and More AnyWho Phone Lookup - Find out information associated with a phone number Radaris Phone Lookup - Look up any phone number to see its owner and identify who's calling or texting you Reverse Phone Lookup - Reverse phone number lookup with millions of listings including name or address Images Google Image Search - The most comprehensive image search on the web Baidu Image - Baidu Image Search Yahoo Image - Yahoo Image Search Yandex Image - Yandex Image Search Facecheck.id - Search for people by photo and verify you are talking to the person they claim to be Bing Visual Search - See it, search it Reverse Image Search - Super-fast image finder that helps you find similar images online Reverse Image - Find Where Images Appear Online Pixsy - Find and fight image theft Pimeyes - Face Search Engine, Reverse Image Search Pictriev - Find look-alike celebrities on the web using the face recognition Karmadecay - Reverse image search of Reddit.com Infringement Report - The web's best image copyright infringement search tool Tineye - Image search and recognition company Flickr - Home to tens of billions of photos and 2 million groups Sogou - Chinese technology company that offers a search engine Jimpl - Online photo metadata and EXIF data viewer Same Energy - Find beautiful images Pixabay - Stunning royalty-free images & royalty-free stock FotoForensics - Tools and training for digital picture analysis, including error level analysis, metadata, and tutorials Exif data - Online application that lets you take a deeper look at your favorite images Image Identify - Image recognition site, just drag your image & identify Threat Intelligence MITRE ATT&CK - Globally-accessible knowledge base of adversary tactics and 
techniques PulseDive - Threat intelligence made easy ThreatCrowd - A Search Engine for Threats ThreatMiner - Data Mining for Threat Intelligence VirusTotal - Analyze suspicious files, domains, IPs and URLs to detect malware and other breaches vx-underground.org - The largest collection of malware source code, samples, and papers on the internet bazaar.abuse.ch - Malware sample database feodotracker.abuse.ch - List of botnet Command&Control servers sslbl.abuse.ch - All malicious SSL certificates urlhaus.abuse.ch - Propose new malware urls threatfox.abuse.ch - Indicator Of Compromise (IOC) database yaraify.abuse.ch - Scan suspicious files such as malware samples or process dumps against a large repository of YARA rules Rescure - Curated cyber threat intelligence for everyone otx.alienvault - The World's First Truly Open Threat Intelligence Community urlquery.net - Service for detecting and analyzing web-based malware socradar.io - Extension to your SOC team VirusShare - System currently contains 48 million malware samples PassiveTotal - Security intelligence that scales security operations and response malapi.io - Windows APIs used for malicious purposes filesec.io - Latest file extensions being used by attackers leakix.net - Search engine indexing public information and an open reporting platform linked to the results tria.ge - Fully automated solution for high-volume malware analysis using advanced sandboxing technology Polyswarm - Launchpad for new technologies and innovative threat detection methods Cisco Talos - The threat intelligence organization at the center of the Cisco Security portfolio scamsearch.io - Find your scammer online & report them CyberCampaigns - Threat Actor information and Write-Ups ORKL - The Community Driven Cyber Threat Intelligence Library Maltiverse - Data from more than 100 different Threat Intelligence sources Inquest Labs - Threat intelligence from hundreds of public, private, and internal sources to develop new FDR signatures and rules PhishTank - Collaborative clearing house for data and information about phishing on the Internet IntelOwl - Open Source Intelligence, or OSINT solution to get threat intelligence data about a specific file, an IP or a domain from a single API at scale Lupovis - Analyze and collect data on Internet-wide scans and attacks in real-time. 
We use this data to identify and classify malicious actors AbuseIPDB - Check the report history of any IP address to see if anyone else has reported malicious activities Sucuri SiteCheck - Check websites for known malware, viruses, blacklisting status, website errors, out-of-date software, and malicious code Spamhaus - Protect and investigate using IP and domain reputation data ThreatBook - One step ahead of your adversary with high-fidelity, efficient and actionable cyber threat intelligence ShadowServer - Nonprofit security organization working altruistically behind the scenes to make the Internet more secure for everyone Team Cymru - Global leader in cyber threat intelligence and attack surface management BeVigil - Search engine for mobile application security testing CIRCL - The Computer Incident Response Center Luxembourg is a government-driven initiative designed to gather, review, report and respond to computer security threats and incidents MetaDefender Cloud - Advanced threat detection and prevention platform Cybersixgill - Threat intelligence platform that provides access to a wide range of cybersecurity information, including dark web monitoring and threat actor analysis Hybrid Analysis - Free malware analysis service for the community that detects and analyzes unknown threats using a unique Hybrid Analysis technology IBM X-Force Exchange - Threat intelligence sharing platform enabling research on security threats, aggregation of intelligence, and collaboration with peers The DFIR Report - Real Intrusions by Real Attackers, The Truth Behind the Intrusion Detection.FYI - Search Sigma rules WhoisXMLAPI - Domain & IP Data Intelligence for Greater Enterprise Security APIVoid - Threat analysis centered on IP and Domain reputation, along with additional services AnyRun - Browse thousands of malware samples in our database Filescan.io - Search reports for file name, URL, IP, Domain or Hash MalShare - Community driven public malware repository that works to provide free access to malware samples Kaspersky TIP - Scan files, domains, IP addresses, and URLs for threats, malware, viruses Malwares.com - Search malwares online ApkLab - Mobile threat intelligence platform designed to provide the most relevant information for Android security researchers Scumware - Find latest reports about malware and other threats Living off the False Positive - Autogenerated collection of false positives sourced from some of the most popular rule sets HijackLibs - Project for tracking publicly disclosed DLL Hijacking opportunities bootloaders.io - Curated list of known malicious bootloaders for various operating systems WTFBins - Catalogue benign applications that exhibit suspicious behavior. 
These binaries can emit noise and false positives in threat hunting and automated detections LOFLCAB - Document every cmdlet, binary, script, and WMI class that can be used for Living Off the Foreign Land techniques OpSecFailure - Site that lists how individuals messed up their opsec, no personal info is shared on this site TrailDiscover - An evolving repository of CloudTrail events with detailed descriptions, MITRE ATT&CK insights, real-world incidents, references and security implications Web History Web Archive - Explore more than 702 billion web pages saved over time Archive.ph - Create a copy of a webpage that will always be up even if the original link is down CachedPages - Get the cached page of any URL stored.website - View cached web pages/website CommonCrawl - Open repository of web crawl data UK Web Archive - Collects millions of websites each year, preserving them for future generations Arquivo - Non-profit service that maintains information published on the web of interest to the Portuguese community Archive-It - An archive of digital government and non-government organization (NGO) documents and reports HAW - Croatian Web Archive Surveillance cameras Insecam.org - The world biggest directory of online surveillance security cameras Surveillance under Surveillance - Cameras and guards watching you almost everywhere World Cams - Live Streaming Webcams Like Never Seen Before Skylinewebcams - Live HD webcams broadcasting from the world's best attractions and destinations WebKams - Live Web Cameras Everywhere WorldCam - Webcams from around the world Webcam Hopper - Live Webcams from around the world Live Traffic - Real-time monitoring of Europe’s live traffic cameras Geocam.ru - Webcams of the world Moldova's borders webcams - Official list of webcams at various border crossings around Moldova Earth Cam - Leading network of live streaming webcams for tourism and entertainment Webcam Taxi - Live Virtual Travel LiveWorldWebcams - Live streaming webcams from around the world Crypto ChainAbuse - Report malicious crypto activity BlockChair - Blockchain explorer, analytics and web services BlockCypher - Search the block chain People TruePeopleSearch - Free people search tool. Search billions of public records TruthFinder - A people search is a quick and simple way to find information on someone by name BeenVerified - BeenVerified's mission is to give people easy and affordable access to public record information ZabaSearch - Free* People Search and Public Information Search Engine PeekYou - Fast People Search Made Easy PeopleFinders - People Search Pipl - The #1 source for identity & trust That's Them - Find Someone's Contact Details By Name snitch.name - Social White Pages application with helps you Search for People's Profiles on Social Sites Webmii - People search engine FastPeopleSearch - Find a person by name, phone number, or street address Sorted By Name - A curated collection of links to genealogy details mentioned on other websites, or acquired by the webmaster Radaris - Find People Fast and Free Addresses.com - Free People Search and Public Information Search Engine Advanced Background Checks - Free People Search Yasni - Search phone, email, address for any name. News, pictures & links for any person. 
Find anyone on the internet with the world's largest free people search USA Data Search - The USA official website provides access to public data that can be searched and viewed by anyone AnyWho - Finding People, Places, and Businesses Lullar - Profile Search by Email, First Last Name or Username Ancestry - The largest for-profit genealogy company in the world, it operates a network of genealogical, historical records, and related genetic genealogy websites genealogy.com - Source for family history buffs to find genealogical research originally posted in GenForum and our most popular genealogy articles US Search - Access to details about the people in your life. Access public records, contact information, background checks & more Find My Past - Discover your global ancestors by searching millions of records across the world FamilySearch - Search for your ancestors in birth certificates, marriage registrations, census records, and other documents iTools - Wink People Search - Free people search. Find people on social networks and across the Web Intelius - Leading provider of public data about people and their connections to others Unclassified DorkSearch - Speed up your Dorking usersearch.org - Find someone by username or email on Social Networks, Dating Sites, Forums, Crypto Forums, Chat Sites and Blogs Pastebin - Website where you can store text online for a set period of time Wappalyzer - Instant access to website technology stacks, company and contact details, social media profiles, email verification and more Awakari - Real-Time Search from unlimited sources like RSS, Fediverse, Telegram, etc. Filter events by keywords, numeric conditions, condition groups Not working / Paused DNS.BufferOver.run NetoGraph - Captures and indexes detailed, low-level snapshots of website behaviour Hashdd - Known Good Cryptographic Hashes If you want to propose changes, just open an issue or a pull request . edoardoottavianelli.it to contact me. | A curated list of awesome search engines useful during Penetration testing, Vulnerability assessments, Red/Blue Team operations, Bug Bounty and more | awesome,awesome-list,hacking,search-engine,osint,awesome-lists,dns,hacking-tools,exploit,security | 0 | 51 | 121 | 672 | 4 | 1 | 2 |
mxgmn/MarkovJunior | MarkovJunior MarkovJunior is a probabilistic programming language where programs are combinations of rewrite rules and inference is performed via constraint propagation. MarkovJunior is named after mathematician Andrey Andreyevich Markov , who defined and studied what are now called Markov algorithms . In its basic form, a MarkovJunior program is an ordered list of rewrite rules. For example, MazeBacktracker (animation on the left below) is a list of 2 rewrite rules:
1. RBB=GGR or "replace red-black-black with green-green-red".
2. RGG=WWR or "replace red-green-green with white-white-red". On each execution step the MJ interpreter finds the first rule in the list that has a match on the grid, finds all matches for that rule, and applies that rule to a random match. In the maze backtracker example, the interpreter first applies a bunch of RBB=GGR rules. But eventually the green self-avoiding walk gets stuck. At this point the first rule has no matches, so the interpreter applies the second rule RGG=WWR until the walk gets unstuck. Then it can apply the first rule again, and so on. The interpreter stops when there are no matches for any rule. Probabilistic inference in MarkovJunior allows one to impose constraints on the future state, and generate only those runs that lead to the constrained future. For example, inference in Sokoban rules {RWB=BRW RB=BR} makes a group of (red) agents organize (white) crates into specified shapes. Using these ideas, we construct many probabilistic generators of dungeons, architecture, puzzles and fun simulations. Additional materials:
1. Xml syntax overview .
2. Higher resolution screenshots and more seeds: ModernHouse , SeaVilla , Apartemazements , CarmaTower , Escheresque , PillarsOfEternity , Surface , Knots .
3. Unofficial technical notes by Dan Ogles and code documentation by Andrew Kay. Markov algorithms A Markov algorithm over an alphabet A is an ordered list of rules. Each rule is a string of the form x=y , where x and y are words in A , and some rules may be marked as halt rules. Application of a Markov algorithm to a word w proceeds as follows:
1. Find the first rule x=y where x is a substring of w . If there are no such rules, then halt.
2. Replace the leftmost x in w by y .
3. If the found rule was a halt rule, then halt. Otherwise, go to step 1. For example, consider this Markov algorithm over the alphabet {0, 1, x} (ε is the empty word):
1=0x
x0=0xx
0=ε
If we apply it to the string 110 , we get this sequence of strings: 110 -> 0x10 -> 0x0x0 -> 00xxx0 -> 00xx0xx -> 00x0xxxx -> 000xxxxxx -> 00xxxxxx -> 0xxxxxx -> xxxxxx In general, this algorithm converts a binary representation of a number into its unary representation (a runnable interpreter sketch for such rule lists appears at the end of this section). Markov's student Vilnis Detlovs proved that for any Turing machine there exists a Markov algorithm that computes the same function. In comparison, grammars are unordered sets of rewrite rules, and L-systems are rewrite rules that are applied in parallel. For more interesting examples of Markov algorithms, check Markov's book or see the greatest common divisor example in the comment section or the multiplication example on Wikipedia. How would one generalize Markov algorithms to multiple dimensions? First, in multiple dimensions there are no natural ways to insert a string into another string, so the left and right sides of our rewrite rules should have the same size. Second, there are no natural ways to choose the leftmost match. Possible options are:
* Choose a random match. This is what MJ's (exists) nodes do.
* Choose all matches. There is a problem with this option however because different matches can overlap and have conflicts. Possible solutions are:
* Greedily choose a maximal subset of non-conflicting matches. This is what MJ's {forall} nodes do.
* Consider all matches in superposition. That is, instead of separate values, keep waves in each grid cell - boolean vectors that tell which spacetime patterns are forbidden and which are not. And this is how MJ performs inference. We lose Turing completeness because our new procedure is not deterministic, but practice shows that this formalism still allows one to describe a huge range of interesting random processes.
Rewrite rules The simplest MarkovJunior program is probably (B=W) . It contains just a single rule B=W . On each turn, this program converts a random black square into a white square. (B=W) | (WB=WW) | (WBB=WAW) | (WBB=WAW) Growth model (WB=WW) is more interesting. On each turn it replaces a black-white pair of adjacent cells BW with a white-white pair WW . In other words, on each turn it picks a random black cell adjacent to some white cell and colors it white. This model is almost identical to the Eden growth model : on each turn both models choose among the same set of black cells. They differ only in probability distributions: a uniform distribution over black cells adjacent to white cells is not the same as a uniform distribution over pairs of adjacent black and white cells. Model (WBB=WAW) generates a maze, with a single line of code! Compare it with an implementation in a conventional language. Any MarkovJunior model can be run in any number of dimensions without changes. On the right you can see the end result of MazeGrowth in 3d, rendered in MagicaVoxel . By default, we use the PICO-8 palette : Model (RBB=WWR) is a self-avoiding random walk . Note that self-avoiding walks in 3d are longer on average than in 2d. In general, comparing the behaviors of similar random processes in different dimensions is a fascinating topic. A classic result of George Pólya says that a random walk in 2d returns to its initial position with probability one, while in 3d this is no longer the case. (RBB=WWR) | LoopErasedWalk | (RB=WR RW=WR) We can put several rules into one rulenode . For example, (RBB=WWR RBW=GWP PWG=PBU UWW=BBU UWP=BBR) is a loop-erased random walk . Trail model (RB=WR RW=WR) generates decent connected caves . Model (RBB=WWR R*W=W*R) is known as the Aldous-Broder maze generation algorithm . The wildcard symbol * in the input means that any color is allowed to be in the square. The wildcard symbol in the output means that the color doesn't change after the application of the rule. The Aldous-Broder algorithm takes many more turns on average to generate a maze than MazeGrowth, for example, but it has a nice property that MazeGrowth doesn't have: each maze has the same probability of being generated. In other words, MazeTrail is an unbiased maze generation algorithm, or it samples mazes (or spanning trees) with the uniform distribution. Wilson's algorithm is a more efficient unbiased maze generation algorithm. Compare its MarkovJunior implementation with an implementation in a conventional language!
1. Prepare constraints: mark bottom cells with a separate bottom color, mark the remaining border cells (sides and top) with a separate border color. Border cells should map to Empty, bottom cells should map to all tiles except Down.
2. Run WFC Paths tileset to generate closed stairy cycles.
3. Randomize light sources.
4. Drop columns from corners of flat tiles.
5. Retract double columns, columns that touch ground and columns that touch stairs, except columns growing from corners of the Turn tiles.
6. Grow windows between neighboring columns.
7. Merge windows into bigger rectangles. We do this in several steps:
1. Detect uneven patterns of windows when window corners touch window midpoints.
2. Mark these patterns and propagate the markings through the whole lengths of window sides.
3. Merge unmarked pairs of window sides.
8. Turn the remaining 1x1 windows into walls. A more interesting way to combine nodes is to put them into a Markov node . Markov nodes substantially expand what we can do, because they allow us to return to past nodes. When a Markov node is active, the interpreter finds its first child node that matches and applies it. On the next turn, it finds the first matching node in the list again, and so on. The simplest example of Markov node use is MazeBacktracker , explained in the top section. One of my favorite examples that motivated the development of MarkovJunior is Bob Nystrom's dungeon generation algorithm . It goes as follows:
1. Draw a grid {PBB=**P} .
2. Spawn a bunch of rooms (room.png) .
3. Generate a maze on the rest of the grid. We can use any maze generation algorithm, but MazeBacktracker is preferred because it produces fewer branching points.
4. Make the resulting configuration of rooms and corridors connected. This can be elegantly done with a Markov node ({GWW=**G}(GBW=*WG)) .
5. Make some additional connections (GBG=*W* #5) , so the resulting dungeon has cycles. Dungeons without cycles are pretty boring, since the player has to return through already explored zones.
6. Retract dead ends {BBB/BWB=BBB/BBB} . Like in REFAL, Markov nodes can be nested: once we go into a child node, we ignore outer nodes until the child branch completes.
Inference Probabilistic inference in MarkovJunior allows one to impose constraints on the future state, and generate only those runs that lead to the constrained future. In other words, inference connects 2 given states (or partially observed states) with a chain of rewrite rules. The simplest example of inference use is connecting 2 points with a path. In the self-avoiding walk model (RBB=WWR) we can observe a given square on the grid to become R red. Then the interpreter would generate only those walks that lead to the observed square. We can set the interpreter to follow the goal more strictly or less strictly by varying the temperature parameter. By default, temperature is set to zero. Coldest | Cold | Hot | Hottest Another thing we can do is to observe all odd grid squares becoming white or red. Then the interpreter would generate self-avoiding walks that cover the entire grid. We can engage inference for any rewrite rules. For example, inference for stair-drawing rules connects 2 points with a stairy path. Inference for rule R**/**B=B**/**R generates paths that a chess knight can take. Inference in the CrossCountry model connects 2 points with a path taking terrain costs into account. Inference for the Sokoban ruleset {RB=BR RWB=BRW} solves Sokoban puzzles or even multiagent Sokoban puzzles ! Inference in MarkovJunior is done via unidirectional (fast) or bidirectional (slow, but more powerful) constraint propagation. Unidirectional constraint propagation for rewrite rules can be described equivalently in terms of rule propagation fields which generalize Dijkstra fields for arbitrary rewrite rules. Dijkstra fields are a popular technique in grid-based procedural generation ( 1 , 2 , 3 ). They in turn generalize distance fields used in computer graphics. If constraint propagation completes, it doesn't necessarily mean that the goal state is achievable. But if the propagation fails, then we know for sure that the goal is not achievable. This allows us to catch states where a crate is pushed to the wrong wall in Sokoban, or where the grid-covering walk splits the grid into 2 disconnected parts. In addition to this boolean heuristic, it's worth looking at the minimal number of turns required for constraint propagation to complete. This integer-valued heuristic is admissible , and we use it in A* search to sample paths made of rewrite rules between 2 given states.
Open problems Program synthesis for procedural generation . William Chyr's talk "Level Design in Impossible Geometry" is not at all about procedural generation, yet I find one slide to be very characteristic of pcg practice. William compares his earlier and later approaches to level design. The earlier one produced chaotic levels, while the later approach produced more structured, more intentional levels based on one central idea. Later levels weren't simpler, yet they were more memorable and easier for players to perceive. To me, the left level looks like it was generated procedurally! It has a very similar feel to my procedural voxel puzzles . Can we make generators that produce levels that are more like the one on the right? This problem may seem AI-complete. But I'd argue it is very similar to classic genetic programming problems like Koza's lawnmower problem . For example, take a simple procgen task of generating Hamiltonian paths on the grid . 
Even for small grid sizes like 29x29 this task is already computationally demanding. But do we really need to sample from all possible paths in practice? If we give this task to a human, they would probably draw a spiral or a zigzag curve - these are much more memorable and intentional designs than a random Hamiltonian path, plus they generalize to any grid size. To summarize, we can ask the system either to find a random Hamiltonian path or to find a short program that generates Hamiltonian paths. In the first case the result would look like the left level on the slide, and in the second case like the right level. Solving the latter program synthesis problem would create more memorable and intentional generators. Model synthesis from examples . Markov algorithms seem to be a perfect environment for program/model synthesis: no variables, ifs or whiles, nodes can be easily moved around without breaking correctness, models are easy to make differentiable. Random MJ programs are often fun and can produce human-relatable results and behaviors. Can we synthesize a MJ model from a result, or a set of results? Given a maze, is it possible to determine (or assign probabilities) whether it was generated by MazeGrowth or MazeBacktracker ? Solve the Abstraction and Reasoning Challenge by inferring MarkovJunior models. Adjoint problem: use insights from the ARC challenge to build a better DSL for procedural generation on a grid. Custom algorithms that run in the wave space . To unite the advantages of constructive and constraint-based procedural generation. Related: custom algorithms (MJ rewrite rules) with custom energy functions like Ising energy or ConvChain energy. Generalize the notion of a pattern. Investigate MJ-like processes on other (possibly nonregular) grids or arbitrary graphs. Experiment with interactive extensions of Markov algorithms. It's possible to turn any MJ model into a game by assigning specific rewrite rules or nodes to key presses. Push the state of the art in grid-based procedural generation. ModernHouse does not yet reach the structural variety of human-designed houses like Sims 2 houses . Use more subtle constraints.
Comments Compared to Turing machines and lambda calculus, Markov algorithms are probably the shortest and simplest way to rigorously define what an algorithm is. Exercise: prove that the following Markov algorithm finds the greatest common divisor of 2 numbers written in a unary representation. For example, if we apply it to 111111*1111111111 we get 11 .
1a=a1
1*1=a*
1*=*b
b=1
a=c
c=1
*=ε (halt)
Fast pattern matching. The MarkovJunior interpreter samples matches uniformly, but it doesn't scan the whole grid every turn. To keep pattern matching fast, the interpreter remembers previously found matches and searches only around the places that got changed. When a rulenode is encountered for the first time, the MJ interpreter uses a multidimensional version of the Boyer–Moore algorithm . Stochastic relaxation. Markov nodes have a very nice representation as limits of differentiable nodes. Consider an unordered set of rewrite rules where each rule r is assigned a weight w(r) . On each step the interpreter finds all matches for all rules and chooses a random match according to the Boltzmann distribution p(r) ~ exp(-w(r)/t) . Then in the freezing limit t->0 we get a Markov node, ordered by weights. What's good about this construction is that for any t>0 and for a typical score function, the score's average over multiple runs would be a continuous (and smooth for practical purposes) function of the weights. This means that one can find the optimal weights by gradient descent and then freeze the system to get the final discrete program. Read this essay by Boris Kushner about A. A. Markov and his work in constructive mathematics. Used work Main used work:
1. Andrey A. Markov, The Theory of Algorithms , 1951. Markov used these ideas earlier in 1947 in his proof of the algorithmic undecidability of the word problem in semigroups. See also a later book with a more detailed treatment. I would be grateful for links to English translations in open access.
2. Guilherme S. Tows, Imagegram , 2009. MarkovJunior takes forall-nodes from Imagegram.
3. Valentin Turchin, REFAL language , 1968. MJ takes the idea of nested Markov nodes from REFAL.
4. Brian Walker et al., The incredible power of Dijkstra maps , 2010. A discussion in the roguelike community that contains many techniques of using Dijkstra maps/distance fields for procedural generation and NPC AI. Later writeups: 1 , 2 . We generalize Dijkstra maps to arbitrary rewrite rules.
5. Pavlos S. Efraimidis, Paul Spirakis, Weighted Random Sampling , 2005.
6. Work used in custom nodes: Model Synthesis , Wave Function Collapse Algorithm , ConvChain Algorithm .
7. Classic algorithms: constraint propagation , constraint solving algorithms , graph traversal , A* search . Related work:
1. Daniel Ritchie, Probabilistic Programming for Procedural Modeling and Design , 2016.
2. Lingfeng Yang, From Execution Traces to Specialized Inference , 2015. Sources of examples:
1. BasicKeys and Keys are adaptations of graph grammars formulated by Joris Dormans, Engineering Emergence: Applied Theory for Game Design , 2012. Which in turn are development of the earlier work by David Adams, Automatic Generation of Dungeons for Computer Games , 2002. I use a variation of these models to generate key-lock-bridge puzzles in SeaVilla .
1. CarmaTower is a proceduralization of a voxel scene by Antoine Lendrevie.
1. The NystromDungeon model is a MarkovJunior port of Bob Nystrom's dungeon generator .
1. HamiltonianPath algorithm is adapted from this article. Compare it with an implementation in a conventional language.
1. Room shapes in DungeonGrowth are taken from the r/proceduralgeneration post . Note that MJ interpreter automatically performs the optimizations described in the post.
1. The Wilson model is a rewrite rule formulation of the Wilson's algorithm . Compare it with an implementation in a conventional language.
1. MazeGrowth model is also known as maze generation via random traversal. Compare it with an implementation in a conventional language.
1. Growth is closely related to the Eden growth model .
1. BernoulliPercolation is a well studied model in a percolation theory .
1. NestedGrowth is taken from Imagegram .
1. SmoothTrail is adapted from 128_mhz's tweet .
1. SokobanLevel1 seems to be the first level from Hiroyuki Imabayashi's Sokoban puzzle. SokobanLevel2 is the level 452 from Ionic Catalysts XI set.
1. RainbowGrowth was proposed by mure .
1. MultiHeadedWalk , MultiHeadedDungeon and MultiHeadedWalkDungeon are based on the idea by Ilya Kudritsky .
1. Island model is by Guillaume Fiette .
1. LostCity , Forest and Texture models are based on the model by Andrew Kay . Voxel scenes were rendered in MagicaVoxel by ephtracy . Special thanks to Brian Bucklew for demonstrating the power of Dijkstra fields to me in roguelike level generation and Kevin Chapelier for a number of good suggestions. The font used in GUI is Tamzen . How to build MarkovJunior interpreter is a console application that depends only on the standard library. Get .NET Core for Windows, Linux or macOS and run dotnet run --configuration Release MarkovJunior.csproj Alternatively, download and run the latest release for Windows. Generated results are put into the output folder. Edit models.xml to change model parameters. Open .vox files with MagicaVoxel . Notable ports, forks and spinoffs Yuu made a TypeScript version of MarkovJunior that runs on the web , extended the language and added the ability to bind nodes to keypresses. Aseaday is porting MarkovJunior to JavaScript. Andrew Kay added XML documentation to C# source code. Dan Ogles wrote MarkovJunior technical notes with the focus on fields and inference. Andrew Kay designed MJr , a compiled language based on pattern rewriting. Funding MarkovJunior development was funded by
1. Embark Studios 2. Oskar Stålberg 3. Freehold Games 4. Bob Burrough | Probabilistic language based on pattern matching and constraint propagation, 153 examples | algorithms,csharp,gamedev,language,probabilistic-programming,procedural-generation,voxel,markovjunior,cellular-automata | 2 | 5 | 11 | 31 | 5 | 1 | 0 |
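To make the Markov-algorithm semantics described above concrete, here is a minimal Python sketch of a one-dimensional interpreter. This is only an illustration of the rule-application loop, not the project's code (the actual MarkovJunior codebase is C#):

```python
def run_markov(rules, word, max_steps=100_000):
    """Apply an ordered list of (x, y, is_halt) rewrite rules to a word:
    find the first rule whose left side occurs in the word, rewrite the
    leftmost occurrence, and repeat until a halt rule fires or no rule matches."""
    for _ in range(max_steps):
        for x, y, is_halt in rules:
            i = word.find(x)
            if i != -1:
                word = word[:i] + y + word[i + len(x):]
                if is_halt:
                    return word
                break
        else:  # no rule matched anywhere: the algorithm halts
            return word
    raise RuntimeError("step limit exceeded")

# The binary-to-unary example from the Markov algorithms section:
print(run_markov([("1", "0x", False), ("x0", "0xx", False), ("0", "", False)], "110"))
# -> xxxxxx
```

Note that this sketch always rewrites the leftmost match of the first matching rule, as in classic Markov algorithms; MarkovJunior's (exists) nodes instead pick a random match, which is what makes the runs probabilistic.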
facebookresearch/metaseq | Metaseq A codebase for working with Open Pre-trained Transformers , originally forked from fairseq . Community Integrations Using OPT with 🤗 Transformers The OPT 125M--66B models are now available in Hugging Face Transformers . You can access them under the facebook organization on the Hugging Face Hub Using OPT-175B with Alpa The OPT 125M--175B models are now supported in the Alpa project , which
enables serving OPT-175B with more flexible parallelisms on older generations of GPUs, such as 40GB A100, V100, T4, M60, etc. Using OPT with Colossal-AI The OPT models are now supported in Colossal-AI , which helps users to efficiently and quickly deploy OPT model training and inference, reducing large AI model budgets and scaling down the labor cost of learning and deployment. Using OPT with CTranslate2 The OPT 125M--66B models can be executed with CTranslate2 , which is a fast inference engine for Transformer models. The project integrates the SmoothQuant technique to allow 8-bit quantization of OPT models. See the usage example to get started. Using OPT with FasterTransformer The OPT models can be served with FasterTransformer , a highly optimized inference framework written and maintained by NVIDIA. We provide instructions to convert OPT checkpoints into FasterTransformer format and a usage example with some benchmark results. Using OPT with DeepSpeed The OPT models can be finetuned using DeepSpeed . See the DeepSpeed-Chat example to get started. Getting Started in Metaseq Follow setup instructions here to get started. Documentation on workflows Training API Background Info Background & relationship to fairseq Chronicles of training OPT-175B Support If you have any questions, bug reports, or feature requests regarding either the codebase or the models released in the projects section, please don't hesitate to post on our Github Issues page . Please remember to follow our Code of Conduct . Contributing We welcome PRs from the community! You can find information about contributing to metaseq in our Contributing document. The Team Metaseq is currently maintained by the CODEOWNERS: Susan Zhang , Naman Goyal , Punit Singh Koura , Moya Chen , Kurt Shuster , David Esiobu , Igor Molybog , Peter Albert , Andrew Poulton , Nikolay Bashlykov , Binh Tang , Uriel Singer , Yuchen Zhang , Armen Aghajanyan , Lili Yu , and Adam Polyak . License The majority of metaseq is licensed under the MIT license; however, portions of the project are available under separate license terms:
* Megatron-LM is licensed under the Megatron-LM license | Repo for external large-scale work | [] | 0 | 86 | 455 | 309 | 103 | 132 | 1 |
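For the Hugging Face integration mentioned above, usage follows the standard transformers API; for example, a minimal sketch using the smallest OPT checkpoint hosted under the facebook organization:

```python
from transformers import pipeline

# OPT checkpoints are hosted under the "facebook" organization on the Hub
generator = pipeline("text-generation", model="facebook/opt-125m")
print(generator("Hello, I am a language model", max_new_tokens=20))
```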
modelscope/modelscope | [![PyPI](https://img.shields.io/pypi/v/modelscope)](https://pypi.org/project/modelscope/) [![license](https://img.shields.io/github/license/modelscope/modelscope.svg)](https://github.com/modelscope/modelscope/blob/master/LICENSE)
[![open issues](https://isitmaintained.com/badge/open/modelscope/modelscope.svg)](https://github.com/modelscope/modelscope/issues)
[![GitHub pull-requests](https://img.shields.io/github/issues-pr/modelscope/modelscope.svg)](https://GitHub.com/modelscope/modelscope/pull/)
[![GitHub latest commit](https://badgen.net/github/last-commit/modelscope/modelscope)](https://GitHub.com/modelscope/modelscope/commit/)
[![Leaderboard](https://img.shields.io/badge/ModelScope-Check%20Your%20Contribution-orange)](https://opensource.alibaba.com/contribution_leaderboard/details?projectValue=modelscope) English | 中文 | 日本語 # Introduction
[ModelScope]( https://www.modelscope.cn) is built upon the notion of “Model-as-a-Service” (MaaS). It seeks to bring together the most advanced machine learning models from the AI community, and streamlines the process of leveraging AI models in real-world applications. The core ModelScope library open-sourced in this repository provides the interfaces and implementations that allow developers to perform model inference, training and evaluation.
In particular, with rich layers of API abstraction, the ModelScope library offers a unified experience to explore state-of-the-art models spanning domains such as CV, NLP, Speech, Multi-Modality, and Scientific-computation. Model contributors from different areas can integrate models into the ModelScope ecosystem through the layered APIs, allowing easy and unified access to their models. Once integrated, model inference, fine-tuning, and evaluation can be done with only a few lines of code. At the same time, flexibility is also provided so that different components of the model applications can be customized wherever necessary.
Apart from harboring implementations of a wide range of different models, the ModelScope library also enables the necessary interactions with ModelScope backend services, particularly with the Model-Hub and Dataset-Hub. Such interactions allow the management of various entities (models and datasets) to be performed seamlessly under the hood, including entity lookup, version control, cache management, and many others.
# Models and Online Accessibility
Hundreds of models are made publicly available on [ModelScope]( https://www.modelscope.cn) (700+ and counting), covering the latest developments in areas such as NLP, CV, Audio, Multi-modality, and AI for Science, etc. Many of these models represent the SOTA in their specific fields, and made their open-source debut on ModelScope. Users can visit ModelScope([modelscope.cn](http://www.modelscope.cn)) and experience first-hand how these models perform, with just a few clicks. Immediate developer-experience is also possible through the ModelScope Notebook, which is backed by a ready-to-use CPU/GPU development environment in the cloud - only one click away on [ModelScope](https://www.modelscope.cn). Some representative examples include:
LLM:
* [Yi-1.5-34B-Chat](https://modelscope.cn/models/01ai/Yi-1.5-34B-Chat/summary)
* [Qwen1.5-110B-Chat](https://modelscope.cn/models/qwen/Qwen1.5-110B-Chat/summary)
* [DeepSeek-V2-Chat](https://modelscope.cn/models/deepseek-ai/DeepSeek-V2-Chat/summary)
* [Ziya2-13B-Chat](https://modelscope.cn/models/Fengshenbang/Ziya2-13B-Chat/summary)
* [Meta-Llama-3-8B-Instruct](https://modelscope.cn/models/LLM-Research/Meta-Llama-3-8B-Instruct/summary)
* [Phi-3-mini-128k-instruct](https://modelscope.cn/models/LLM-Research/Phi-3-mini-128k-instruct/summary)
Multi-Modal:
* [Qwen-VL-Chat](https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary)
* [Yi-VL-6B](https://modelscope.cn/models/01ai/Yi-VL-6B/summary)
* [InternVL-Chat-V1-5](https://modelscope.cn/models/AI-ModelScope/InternVL-Chat-V1-5/summary)
* [deepseek-vl-7b-chat](https://modelscope.cn/models/deepseek-ai/deepseek-vl-7b-chat/summary)
* [OpenSoraPlan](https://modelscope.cn/models/AI-ModelScope/Open-Sora-Plan-v1.0.0/summary)
* [OpenSora](https://modelscope.cn/models/luchentech/OpenSora-STDiT-v1-HQ-16x512x512/summary)
* [I2VGen-XL](https://modelscope.cn/models/iic/i2vgen-xl/summary)
CV:
* [DamoFD Face Detection Key Point Model - 0.5G](https://modelscope.cn/models/damo/cv_ddsar_face-detection_iclr23-damofd/summary)
* [BSHM Portrait Matting](https://modelscope.cn/models/damo/cv_unet_image-matting/summary)
* [DCT-Net Portrait Cartoonization - 3D](https://modelscope.cn/models/damo/cv_unet_person-image-cartoon-3d_compound-models/summary)
* [DCT-Net Portrait Cartoonization Model - 3D](https://modelscope.cn/models/damo/face_chain_control_model/summary)
* [DuGuang - Text Recognition - Line Recognition Model - Chinese and English - General Domain](https://modelscope.cn/models/damo/cv_convnextTiny_ocr-recognition-general_damo/summary)
* [DuGuang - Text Detection - Line Detection Model - Chinese and English - General Domain](https://modelscope.cn/models/damo/cv_resnet18_ocr-detection-line-level_damo/summary)
* [LaMa Image Inpainting](https://modelscope.cn/models/damo/cv_fft_inpainting_lama/summary)
Audio:
* [Paraformer Speech Recognition - Chinese - General - 16k - Offline - Large - Long Audio Version](https://modelscope.cn/models/damo/speech_paraformer-large-vad-punc_asr_nat-zh-cn-16k-common-vocab8404-pytorch/summary)
* [FSMN Voice Endpoint Detection - Chinese - General - 16k - onnx](https://modelscope.cn/models/damo/speech_fsmn_vad_zh-cn-16k-common-onnx/summary)
* [Monotonic-Aligner Speech Timestamp Prediction - 16k - Offline](https://modelscope.cn/models/damo/speech_timestamp_prediction-v1-16k-offline/summary)
* [CT-Transformer Punctuation - Chinese - General - onnx](https://modelscope.cn/models/damo/punc_ct-transformer_zh-cn-common-vocab272727-onnx/summary)
* [Speech Synthesis - Chinese - Multiple Emotions Domain - 16k - Multiple Speakers](https://modelscope.cn/models/damo/speech_sambert-hifigan_tts_zh-cn_16k/summary)
* [CAM++ Speaker Verification - Chinese - General - 200k-Spkrs](https://modelscope.cn/models/damo/speech_campplus_sv_zh-cn_16k-common/summary)
AI for Science:
* [uni-fold-monomer](https://modelscope.cn/models/DPTech/uni-fold-monomer/summary)
* [uni-fold-multimer](https://modelscope.cn/models/DPTech/uni-fold-multimer/summary)
**Note:** Most models on ModelScope are public and can be downloaded without account registration on the modelscope website([www.modelscope.cn](www.modelscope.cn)); please refer to the instructions for [model download](https://modelscope.cn/docs/%E6%A8%A1%E5%9E%8B%E7%9A%84%E4%B8%8B%E8%BD%BD) for downloading models with the API provided by the modelscope library or git.
# QuickTour
We provide unified interface for inference using `pipeline`, fine-tuning and evaluation using `Trainer` for different tasks.
For any given task with any type of input (image, text, audio, video...), inference pipeline can be implemented with only a few lines of code, which will automatically load the underlying model to get inference result, as is exemplified below:
```python
>>> from modelscope.pipelines import pipeline
>>> word_segmentation = pipeline('word-segmentation',model='damo/nlp_structbert_word-segmentation_chinese-base')
>>> word_segmentation('今天天气不错,适合出去游玩')
{'output': '今天 天气 不错 , 适合 出去 游玩'}
```
Given an image, portrait matting (aka. background-removal) can be accomplished with the following code snippet:
![image](data/resource/portrait_input.png)
```python
>>> import cv2
>>> from modelscope.pipelines import pipeline
>>> portrait_matting = pipeline('portrait-matting')
>>> result = portrait_matting('https://modelscope.oss-cn-beijing.aliyuncs.com/test/images/image_matting.png')
>>> cv2.imwrite('result.png', result['output_img'])
```
The output image with the background removed is:
![image](data/resource/portrait_output.png)
Fine-tuning and evaluation can also be done with a few more lines of code to set up the training dataset and trainer, with the heavy lifting of training and evaluating a model encapsulated in the implementation of the `trainer.train()` and
`trainer.evaluate()` interfaces.
For example, the gpt3 base model (1.3B) can be fine-tuned with the chinese-poetry dataset, resulting in a model that can be used for chinese-poetry generation.
```python
>>> from modelscope.metainfo import Trainers
>>> from modelscope.msdatasets import MsDataset
>>> from modelscope.trainers import build_trainer
>>> train_dataset = MsDataset.load('chinese-poetry-collection', split='train').remap_columns({'text1': 'src_txt'})
>>> eval_dataset = MsDataset.load('chinese-poetry-collection', split='test').remap_columns({'text1': 'src_txt'})
>>> max_epochs = 10
>>> tmp_dir = './gpt3_poetry'
>>> kwargs = dict(
model='damo/nlp_gpt3_text-generation_1.3B',
train_dataset=train_dataset,
eval_dataset=eval_dataset,
max_epochs=max_epochs,
work_dir=tmp_dir)
>>> trainer = build_trainer(name=Trainers.gpt3_trainer, default_args=kwargs)
>>> trainer.train()
```
# Why should I use ModelScope library
1. A unified and concise user interface is abstracted for different tasks and different models. Model inferences and training can be implemented by as few as 3 and 10 lines of code, respectively. It is convenient for users to explore models in different fields in the ModelScope community. All models integrated into ModelScope are ready to use, which makes it easy to get started with AI, in both educational and industrial settings.
2. ModelScope offers a model-centric development and application experience. It streamlines the support for model training, inference, export and deployment, and facilitates users to build their own MLOps based on the ModelScope ecosystem.
3. For the model inference and training process, a modular design is put in place, and a wealth of functional module implementations are provided, which is convenient for users to customize their own model inference, training and other processes.
4. For distributed model training, especially for large models, it provides rich training strategy support, including data parallel, model parallel, hybrid parallel and so on.
# Installation
## Docker
ModelScope Library currently supports popular deep learning frameworks for model training and inference, including PyTorch, TensorFlow and ONNX. All releases are tested and run on Python 3.7+, PyTorch 1.8+, TensorFlow 1.15 or TensorFlow 2.0+.
To allow out-of-box usage for all the models on ModelScope, official docker images are provided for all releases. Based on the docker image, developers can skip all environment installation and configuration and use it directly. Currently, the latest version of the CPU image and GPU image can be obtained from:
CPU docker image
```shell
# py37
registry.cn-hangzhou.aliyuncs.com/modelscope-repo/modelscope:ubuntu20.04-py37-torch1.11.0-tf1.15.5-1.6.1
# py38
registry.cn-hangzhou.aliyuncs.com/modelscope-repo/modelscope:ubuntu20.04-py38-torch2.0.1-tf2.13.0-1.9.5
```
GPU docker image
```shell
# py37
registry.cn-hangzhou.aliyuncs.com/modelscope-repo/modelscope:ubuntu20.04-cuda11.3.0-py37-torch1.11.0-tf1.15.5-1.6.1
# py38
registry.cn-hangzhou.aliyuncs.com/modelscope-repo/modelscope:ubuntu20.04-cuda11.8.0-py38-torch2.0.1-tf2.13.0-1.9.5
```
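Once an image is pulled, a typical way to start working inside it is an interactive container; for example (a sketch, assuming the GPU image above and an installed NVIDIA container toolkit):

```shell
docker run --gpus all -it \
  registry.cn-hangzhou.aliyuncs.com/modelscope-repo/modelscope:ubuntu20.04-cuda11.8.0-py38-torch2.0.1-tf2.13.0-1.9.5 \
  /bin/bash
```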
## Setup Local Python Environment
One can also set up a local ModelScope environment using pip and conda. ModelScope supports Python 3.7 and above.
We suggest [anaconda](https://docs.anaconda.com/anaconda/install/) for creating local python environment:
```shell
conda create -n modelscope python=3.8
conda activate modelscope
```
PyTorch or TensorFlow can be installed separately according to each model's requirements.
* Install pytorch [doc](https://pytorch.org/get-started/locally/)
* Install tensorflow [doc](https://www.tensorflow.org/install/pip)
After installing the necessary machine-learning framework, you can install modelscope library as follows:
If you only want to play around with the modelscope framework, or try out model/dataset download, you can install the core modelscope components:
```shell
pip install modelscope
```
If you want to use multi-modal models:
```shell
pip install modelscope[multi-modal]
```
If you want to use nlp models:
```shell
pip install modelscope[nlp] -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html
```
If you want to use cv models:
```shell
pip install modelscope[cv] -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html
```
If you want to use audio models:
```shell
pip install modelscope[audio] -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html
```
If you want to use science models:
```shell
pip install modelscope[science] -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html
```
`Notes`:
1. Currently, some audio-task models only support python3.7, tensorflow1.15.4 Linux environments. Most other models can be installed and used on Windows and Mac (x86).
2. Some models in the audio field use the third-party library SoundFile for wav file processing. On the Linux system, users need to manually install libsndfile of SoundFile([doc link](https://github.com/bastibe/python-soundfile#installation)). On Windows and MacOS, it will be installed automatically without user operation. For example, on Ubuntu, you can use following commands:
```shell
sudo apt-get update
sudo apt-get install libsndfile1
```
3. Some models in computer vision need mmcv-full, you can refer to mmcv [installation guide](https://github.com/open-mmlab/mmcv#installation), a minimal installation is as follows:
```shell
pip uninstall mmcv # if you have installed mmcv, uninstall it
pip install -U openmim
mim install mmcv-full
```
# Learn More
We provide additional documentations including:
* [More detailed Installation Guide](https://modelscope.cn/docs/%E7%8E%AF%E5%A2%83%E5%AE%89%E8%A3%85)
* [Introduction to tasks](https://modelscope.cn/docs/%E4%BB%BB%E5%8A%A1%E7%9A%84%E4%BB%8B%E7%BB%8D)
* [Use pipeline for model inference](https://modelscope.cn/docs/%E6%A8%A1%E5%9E%8B%E7%9A%84%E6%8E%A8%E7%90%86Pipeline)
* [Finetuning example](https://modelscope.cn/docs/%E6%A8%A1%E5%9E%8B%E7%9A%84%E8%AE%AD%E7%BB%83Train)
* [Preprocessing of data](https://modelscope.cn/docs/%E6%95%B0%E6%8D%AE%E7%9A%84%E9%A2%84%E5%A4%84%E7%90%86)
* [Evaluation](https://modelscope.cn/docs/%E6%A8%A1%E5%9E%8B%E7%9A%84%E8%AF%84%E4%BC%B0)
* [Contribute your own model to ModelScope](https://modelscope.cn/docs/ModelScope%E6%A8%A1%E5%9E%8B%E6%8E%A5%E5%85%A5%E6%B5%81%E7%A8%8B%E6%A6%82%E8%A7%88)
# License
This project is licensed under the [Apache License (Version 2.0)](https://github.com/modelscope/modelscope/blob/master/LICENSE). | ModelScope: bring the notion of Model-as-a-Service to life. | nlp,cv,speech,multi-modal,science,deep-learning,machine-learning,python | 28 | 94 | 377 | 2,387 | 73 | 97 | 5 |
ast-grep/ast-grep | ast-grep(sg) ast-grep(sg) is a CLI tool for code structural search, lint, and rewriting. Introduction ast-grep is an AST-based tool to search code by pattern code. Think of it as your old friend grep , but matching AST nodes instead of text.
You can write patterns as if you are writing ordinary code. It will match all code that has the same syntactical structure.
You can use the $ sign + uppercase letters as a wildcard, e.g. $MATCH , to match any single AST node. Think of it as the regex dot . , except it is not textual. Try the online playground for a taste! Demo See more screenshots on the website . Installation You can install it from npm , pip , cargo , homebrew , scoop or MacPorts ! ```bash
npm install --global @ast-grep/cli
pip install ast-grep-cli
cargo install ast-grep --locked

# install via homebrew, thanks @henryhchchc
brew install ast-grep

# install via scoop, thanks @brian6932
scoop install main/ast-grep

# install via MacPorts
sudo port install ast-grep
```

Or you can build ast-grep from source. You need to install rustup, clone the repository and then

```bash
cargo install --path ./crates/cli --locked
```

Packages are available on other platforms too. Command line usage example ast-grep has the following form:

```bash
sg --pattern 'var code = $PATTERN' --rewrite 'let code = new $PATTERN' --lang ts
```

Example Rewrite code in null coalescing operator:

```bash
sg -p '$A && $A()' -l ts -r '$A?.()'
```

Rewrite Zodios:

```bash
sg -p 'new Zodios($URL, $CONF as const,)' -l ts -r 'new Zodios($URL, $CONF)' -i
```

Implement eslint rule using YAML. Sponsor If you find ast-grep interesting and useful for your work, please buy me a coffee so I can spend more time on the project! Feature Highlight ast-grep's core is an algorithm to search and replace code based on an abstract syntax tree produced by tree-sitter.
It can help you do lightweight static analysis and massive-scale code manipulation in an intuitive way. Key highlights: An intuitive pattern to find and replace AST.
ast-grep's pattern looks like ordinary code you would write every day (you could say the pattern is isomorphic to code). jQuery-like API for AST traversal and manipulation. YAML configuration to write new linting rules or code modification. Written in a compiled language, with tree-sitter based parsing and utilizing multiple cores. Beautiful command line interface :) ast-grep's vision is to democratize abstract syntax tree magic and to liberate one from cumbersome AST programming! If you are an open-source library author, ast-grep can help your library users adopt breaking changes more easily. If you are a tech lead in your team, ast-grep can help you enforce code best practices tailored to your business need. If you are a security researcher, ast-grep can help you write rules much faster. | ⚡A CLI tool for code structural search, lint and rewriting. Written in Rust | codemod,linter,ast,babel,command-line,command-line-tool,grep,tree-sitter,codereview,rust | 106 | 40 | 768 | 2,409 | 48 | 4 | 5
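As a taste of the YAML configuration mentioned in the highlights above, a lint rule encoding the null-coalescing rewrite from the earlier example could look roughly like this (a sketch based on the rule schema in the ast-grep documentation; consult the docs for the exact keys):

```yaml
id: prefer-optional-call
language: TypeScript
severity: warning
message: Prefer optional call over a `&&` guard
rule:
  pattern: $A && $A()
fix: $A?.()
```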
elebumm/RedditVideoMakerBot | Reddit Video Maker Bot 🎥 All done WITHOUT video editing or asset compiling. Just pure ✨programming magic✨. Created by Lewis Menelaws & TMRRW Video Explainer Motivation 🤔 These videos on TikTok, YouTube and Instagram get MILLIONS of views across all platforms and require very little effort.
The only original thing being done is the editing and gathering of all materials... ... but what if we can automate that process? 🤔 Disclaimers 🚨 At the moment , this repository won't attempt to upload this content through this bot. It will give you a file that
you will then have to upload manually. This is for the sake of avoiding any sort of community guideline issues. Requirements Python 3.10 Playwright (this should be installed automatically during installation) Installation 👩💻 Clone this repository Run pip install -r requirements.txt Run python -m playwright install and python -m playwright install-deps EXPERIMENTAL!!!! On macOS and Linux (debian, arch, fedora and centos, and based on those), you can run an install script that will automatically install steps 1 to 3. (requires bash) bash <(curl -sL https://raw.githubusercontent.com/elebumm/RedditVideoMakerBot/master/install.sh) This can also be used to update the installation Run python main.py Visit the Reddit Apps page , and set up an app that is a "script". Paste any URL in redirect URL. Ex: https://jasoncameron.dev The bot will ask you to fill in your details to connect to the Reddit API, and configure the bot to your liking Enjoy 😎 If you need to reconfigure the bot, simply open the config.toml file and delete the lines that need to be changed. On the next run of the bot, it will help you reconfigure those options. (Note: if you get an error installing or running the bot, first try rerunning the command with a three after the name, e.g. python3 or pip3.) If you want to read a more detailed guide about the bot, please refer to the documentation
cxli233/FriendsDontLetFriends | Friends Don't Let Friends Make Bad Graphs Friends don't let friends make certain types of data visualization - What are they and why are they bad. Author: Chenxin Li, postdoctoral associate at Center for Applied Genetic Technologies, University of Georgia. Contact: Chenxin.Li@uga.edu | @ChenxinLi2 This is an opinionated essay about good and bad practices in data visualization.
Examples and explanations are below. The Scripts/ directory contains .Rmd files that generate the graphics shown below.
It requires R, RStudio, and the rmarkdown package. R: R Download RStudio: RStudio Download rmarkdown can be installed using the install packages interface in RStudio Table of contents Friends Don't Let Friends Make Bar Plots For Mean Separation Friends Don't Let Friends Make Violin Plots for Small Sample Sizes Friends Don't Let Friends Use Bidirectional Color Scales for Unidirectional Data Friends Don't Let Friends Make Bar Plot Meadow Friends Don't Let Friends Make Heatmap without Reordering Rows & Columns Friends Don't Let Friends Make Heatmap without Checking Outliers Friends Don't Let Friends Forget to Check Data Range at Each Factor Level Friends Don't Let Friends Make Network Graphs without Trying Different Layouts Friends Don't Let Friends Confuse Position and Length Based Visualizations Friends Don't Let Friends Make Pie Charts Friends Don't Let Friends Make Concentric Donuts Friends Don't Let Friends Use Red/green and Rainbow for Color Scales Friends Don't Let Friends Forget to Reorder Stacked Bar Plot Friends Don't Let Friends Mix Stacked Bars and Mean separation 1. Friends Don't Let Friends Make Bar Plots for Means Separation This has to be the first one.
Means separation plots are some of the most common in scientific publications.
We have two or more groups, each of which contains multiple observations; they may have different means, variances, and distributions.
The task of the visualization is to show the means and the spread (dispersion) of the data. In this example, two groups have similar means and standard deviations, but quite different distributions. Are they really "the same"? Just don't use bar plots for means separation, or at least check a couple of things before settling on a bar plot. It's worth mentioning that I was inspired by many researchers who have tweeted on the limitations of bar graphs.
Here is a publication: Weissgerber et al., 2015, PLOS Biology . 2. Friends Don't Let Friends Make Violin Plots for Small Sample Sizes This is quite common in the literature as well, but unfortunately, violin plots (or any sort of smoothed distribution curves) make no sense for small n. Distributions and quartiles can vary widely with small n, even if the underlying observations are similar.
Distribution and quartiles are only meaningful with large n.
I did an experiment before, where I sampled the same normal distribution several times and computed the quartiles for each sample.
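That experiment is easy to reproduce; here is a minimal sketch (in Python rather than the repo's R, purely for illustration) that samples the same normal distribution at increasing n and measures how much the first quartile jumps around:

```python
import numpy as np

rng = np.random.default_rng(42)
for n in (5, 10, 25, 50, 100, 250):
    q1 = [np.percentile(rng.normal(size=n), 25) for _ in range(1000)]
    print(f"n={n:>4}: SD of the 1st quartile across 1000 samples = {np.std(q1):.3f}")
```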
The quartiles only stabilize when n gets larger than 50. 3. Friends Don't Let Friends Use Bidirectional Color Scales for Unidirectional Data Excuse my language, but this is a truly data visualization sin, and again quite common.
I can understand why this error is common, because it appears that many of us have not spent a lot of thought on this issue. Color scales are pretty, but we have to be extra careful.
When color scales (or color gradients) are used to represent numerical data, the darkest and lightest colors should have special meanings.
You can decide what those special meanings are: e.g., max, min, mean, zero. But they should represent something meaningful.
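In practice this means anchoring the color scale explicitly; for example, a Python/matplotlib sketch (for illustration only; the idea is the same in any plotting library):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import TwoSlopeNorm

data = np.random.default_rng(0).random((10, 10))  # unidirectional data in [0, 1]
fig, (ax1, ax2) = plt.subplots(1, 2)

# Unidirectional data: a sequential scale whose endpoints mean something (0 and the max)
ax1.imshow(data, cmap="viridis", vmin=0, vmax=data.max())

# A bidirectional (diverging) scale only makes sense when the data cross a
# meaningful center, e.g. z-scores centered at 0
z = (data - data.mean()) / data.std()
ax2.imshow(z, cmap="RdBu_r", norm=TwoSlopeNorm(vcenter=0))
plt.show()
```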
A data visualization sin for heat maps/color gradients is when the lightest or darkest colors are some arbitrary numbers. This is as bad as the longest bar in a bar chart not being the largest value. Can you imagine that? 4. Friends Don't Let Friends Make Bar Plot Meadow We talked about no bar charts for mean separation, but this is a different issue.
It has to do with presenting results of a multi-factorial experiment.
Bar plot meadows are very common in scientific publications and unfortunately also ineffective in communicating the results. Data from: Matand et al., 2020, BMC Plant Biology Bar plot meadows are common because multi-factorial experiments are common.
However, a bar plot meadow is poorly designed for its purpose.
Communicating the results of a multi-factorial experiment requires thoughtful design regarding grouping/faceting by factors of interest. In this example, I focus on comparing the effect of Treatment & Explant on Response at the level of each Variety .
However, if the focus is the effect of Treatment & Variety on Response at the level of each Explant , then it will require a different layout. 5. Friends Don't Let Friends Make Heatmap without (Considering) Reordering Rows & Columns Heatmaps are very common in scientific publications, and very very common in omics papers.
However, for heatmaps to be effective, we have to consider the ordering of rows & columns. In this example, I have cells as columns and features as rows. Grids are showing z scores.
It is impossible to get anything useful out of the heatmap without reordering rows and columns.
We can reorder rows and columns using clustering, but that is not the only way.
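For instance, hierarchical clustering gives a reasonable ordering almost for free; here is a minimal sketch (Python/scipy for illustration; the repo's own tutorial, linked below, covers the R side):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(0)
mat = rng.normal(size=(30, 20))  # features x cells, e.g. z-scores

# Order rows and columns so that similar features / cells end up adjacent
row_order = leaves_list(linkage(mat, method="average"))
col_order = leaves_list(linkage(mat.T, method="average"))
ordered = mat[row_order][:, col_order]

# See also point 6 below: capping the color scale at e.g. the 5th/95th
# percentiles keeps a few outliers from flattening all the other colors
vmin, vmax = np.percentile(ordered, [5, 95])
```

(Libraries like seaborn's clustermap perform this reordering for you.)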
Of course, if the rows and columns are mapping to physical entities (rows and columns of a 96-well plate), then you can't reorder them.
But it is a very good idea to at least consider reordering rows and columns. Data from: Li et al., 2022, BioRxiv Bonus: heatmaps can be very pretty ...if you are good at reordering rows/columns and choosing color gradients.
Here is an example "abstract aRt" generated from simulated data. R code for this aRt piece can be found here . For a tutorial on how to reorder rows and columns of a heatmap, see this markdown file . 6. Friends Don't Let Friends Make Heatmap without Checking Outliers Outliers in heatmap can really change how we perceive and interpret the visualization.
This generalizes to all sorts of visualizations that use colors to represent numeric data.
Let me show you an example: In this example, I have 2 observations. For each observation, I measured 20 features.
Without checking for outliers, it may appear that the 2 observations are overall similar, except at 2 features.
However, after maxing out the color scale around the 95th percentile of the data, it reveals that the two observations are distinct across all features. 7. Friends Don't Let Friends Forget to Check Data Range at Each Factor Level This is a common issue that many of us have encountered.
In a multifactor experiment, sometimes the range of the response variable changes widely between different factor levels. This hypothetical experiment measured 3 compounds across 2 groups (control vs. treatment).
Without checking data range for each compound, you will likely have missed that the treatment had a strong effect on compound 1.
This is because the concentration of compound 1 has a much narrower range than the other compounds in this experiment. 8. Friends Don't Let Friends Make Network Graphs without Trying Different Layouts Network graphs are common in scientific publications. They are super useful in presenting relationship data.
However, the appearance (not the topology) of the network can make a huge difference in determining if a network graph is effective. Layouts can drastically change the appearance of networks, making them easier or harder to interpret.
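Trying several layouts is cheap; for example, a Python/networkx sketch (for illustration only):

```python
import networkx as nx
import matplotlib.pyplot as plt

G = nx.les_miserables_graph()  # any relationship data works here
fig, axes = plt.subplots(1, 3, figsize=(12, 4))
layouts = {"spring": nx.spring_layout,
           "kamada-kawai": nx.kamada_kawai_layout,
           "circular": nx.circular_layout}
for ax, (name, layout) in zip(axes, layouts.items()):
    nx.draw(G, pos=layout(G), ax=ax, node_size=20)  # same topology, very different look
    ax.set_title(name)
plt.show()
```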
Here are 3 network graphs from the same data. They look very different from each other.
Data from: Li et al., 2022, BioRxiv Here are 9 different layouts for the same network. They can look very different. The R script to make this animation is available here 9. Friends Don't Let Friends Confuse Position-based Visualizations with Length-based Visualizations This is always the elephant in the room and the essence of many misleading visualizations.
In this example, I measured a response variable across 3 time points.
Two of the following graphs are fine, but one of them is a data visualization crime. Can you see why? In dot and line plots, values are represented by positions along the x and y axis.
The same idea applies to other position based visualizations, such as box plots.
In bar plots, values are represented by the distance from the x axis, and thus the length of the bar. The 3rd graph is not 0-based, which makes the bar length at time point 2 about 3x longer than that at time point 1.
In fact, the true difference in means is closer to 1.6x.
I hope you can see how confusing length and position based visualizations can lead to misleading graphs. Watch out for bar plots with broken axis A broken axis may be useful for depicting data across a wide range of numeric values.
(Alternatively, a log-scaled axis can be used instead.)
Broken axes are fine for position based graphics, because the data are represented by positions along the axis.
However, we must be very careful with bar plots that have broken axis. Here is an example. In this example, two graphs (left vs. right) are showing the same data.
However, by changing where the axis is broken, one can make certain bars look longer or shorter.
In this example, the length of bar "d" can look really different.
The illusion of bar "d" being very short on the right graph boils down to the bar plot being a length-based graphic, not a position-based graphic. Example R code for broken axis can be found here . 10. Friends Don't Let Friends Make Pie Chart The pie chart is a common type of visualization for fractional data, where fractions add up to 100%.
This is achieved by dividing a circle into sectors, and the sectors add up to a full circle.
Pie charts have been criticized, because humans are much worse at reading angles and areas than lengths.
Here is a blog post that explores that. In this example, we have two groups, each of which contains 4 sub-categories.
In classic pie charts, the angles (and thus arc lengths & sector area) represent the data.
The problem is that it is very difficult to compare between groups.
We can visually simplify the pie chart into donut charts, where the data are now represented by arc lengths.
However, if we want to use lengths to represent the data, why don't we just unwrap the donut and make stacked bars?
In stacked bar graphs, bars are shown side-by-side and thus easier to compare across groups. Fun fact: the scripts underlying stacked bars are much simpler than those underlying the pie charts and donut charts.
If you want to produce sub-optimal graph types with ggplot, you actually have to work extra hard. 11. Friends Don't Let Friends Make Concentric Donuts In this example, we have 3 groups, each of which contains two sub-categories (Type I or Type II). In concentric donuts, you might be tempted to say the data are represented by the arc lengths, which is in fact inaccurate .
The arc lengths on the outer rings are much longer than those in the inner rings.
Group 2 and Group 3 have the same exact values, but the arc lengths of Group 3 are much longer.
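The arithmetic makes this obvious: for a fixed share of the circle, arc length scales with the ring's radius (arc = 2πr × fraction). A quick check:

```python
import math

fraction = 0.25                       # the same 25% share on two rings
for r in (1.0, 2.0):                  # inner vs. outer ring radius
    print(f"radius {r}: arc length = {2 * math.pi * r * fraction:.2f}")
# radius 1.0: arc length = 1.57
# radius 2.0: arc length = 3.14
```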
In fact the data are represented by the arc angles , which we are bad at reading. Since outer rings are longer, the ordering of the groups (which group goes to which ring) has a big impact on the impression of the plot.
It can lead to the apparent paradox where larger values have shorter arcs.
The better (and simpler!) alternative is to just unwrap the donuts and make a good old stacked bar plot.
BTW, this is also my main issue with circos plots and other circular plot layouts. 12. Friends Don't Let Friends Use Red/Green and Rainbow color scales Deuteranomaly is the most common type of red/green colorblindness, occurring in 1/16 male and 1/256 female.
Any color scale that uses shades of red and shades of green at the same time will be a problem for a person with red/green colorblindness (third column of the figure).
In addition, red/green and rainbow scales do not preserve information well at all when printed in black and white (grey scale, second column of the figure).
Many scientific software packages still use red/green or rainbow as the default color scales, which drives me crazy.
More "modern" color scales, such as viridis are both colorblind-friendly and grey scale-safe (third row of figure).
13. Friends Don't Let Friends Forget to Reorder Stacked Bar Plot

Stacked bar plots are useful for visualizing proportion data.
Stacked bar plots are commonly used to visualize community structure or population structure or admixture analysis.
This kind of visualization boils down to a collection of samples, where each sample contains multiple classes of members.
However, when we have many samples and many classes, stacked bar plots need to be optimized to be effective.
And by "optimize" I mean the grouping and ordering of samples. Here we have an example data with 100 samples and 8 classes of member.
Due to the number of samples and classes, it is very hard to discern anything from this graph without optimizing the order of bars. What the heck am I looking at?
After reordering the bars, wow, that really made a difference, don't you think?
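One simple ordering heuristic, sketched in Python on hypothetical data (the repo's own tutorial, linked below, does this in R with more care):

```python
import numpy as np

rng = np.random.default_rng(0)
props = rng.dirichlet(np.ones(8), size=100)  # 100 samples x 8 classes; rows sum to 1

# Sort samples by their dominant class, then by that class's share,
# so that similar community profiles end up next to each other.
dominant = props.argmax(axis=1)
order = np.lexsort((-props.max(axis=1), dominant))
props_sorted = props[order]  # plot these rows as stacked bars, left to right
```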
For a tutorial on how to optimize a stacked bar plot, see this script.

14. Friends Don't Let Friends Mix Stacked Bars and Mean Separation

Sometimes a visualization becomes confusing and ineffective when it tries to do too many things at once.
One such example is mixing stacked bar plots and mean separation plots.
One displays proportional data adding up to 100%, the other displays the difference in means and dispersion around means.
These are very distinct tasks in data visualization. In this hypothetical experiment, we had blueberry plants assigned to two groups.
One group was the control; the other was treated with a chemical to make fruit development faster. Each group had 5 plants.
The response of the treatment was divided into 3 categories:
light green fruits, light blue fruits, and dark blue fruits.
100 fruits from each plant were examined and the number of fruits in each category was counted.
The percentage of fruits in each category was calculated and reported.
The question of the study is: did the chemical treatment work? The first stacked bar plot is fine as the standard way to visualize proportion data.
It is clear that all categories add up to 100%,
and the chemical treatment strongly shifted the color profile towards the most developed stage (dark blue). The middle stacked bar plot is problematic,
mainly because it is trying to do two distinct data visualization tasks at once.
When error bars and dots are overlaid onto the stacked bars,
it becomes unclear which error bars and dots are being compared.
Due to the nature of stacked bars, the error bars and dots of the upper stacks have to be shifted upwards,
and thus the y-axis is no longer straightforward to interpret for those error bars and dots. Finally, if the main point of the visualization is mean separation and dispersion around the mean,
the third graph is the better choice.
There is no ambiguity on which comparisons are being made.
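A hypothetical sketch of such a mean-separation layout in Python/matplotlib (the repo's actual figures are made in R; the data here are random):

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)
categories = ["light green", "light blue", "dark blue"]
fig, axes = plt.subplots(1, 3, figsize=(7, 2.5), sharey=True)

for ax, cat in zip(axes, categories):
    for i, group in enumerate(["control", "treated"]):
        vals = rng.normal(30 + 15 * i, 5, size=5)     # 5 plants per group
        ax.scatter([i] * 5, vals, alpha=0.6)          # individual plants
        ax.errorbar(i, vals.mean(), yerr=vals.std(),  # mean and dispersion
                    fmt="_", color="black", capsize=4)
    ax.set_xticks([0, 1], labels=["ctrl", "trt"])
    ax.set_xlim(-0.5, 1.5)
    ax.set_title(cat)

axes[0].set_ylabel("% of fruits")
plt.tight_layout()
plt.show()
```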
As shown in the first stacked bar plot,
the chemical treatment strongly increases the proportion of dark blue fruits,
at the expense of lighter-colored fruits.

Conclusion (?)

That's it for now. I will update this when I have the time (and inspiration) to produce more examples.
Not sure what the next one will be, but stay tuned! | Friends don't let friends make certain types of data visualization - What are they and why are they bad. | data-visualization,r | 9 | 1 | 7 | 95 | 2 | 1 | 0 |
formbricks/formbricks | Formbricks Harvest user-insights, build irresistible experiences. Website | Join Discord community Trusted by

## ✨ About Formbricks

Formbricks provides a free and open source surveying platform. Gather feedback at every point in the user journey with beautiful in-app, website, link and email surveys. Build on top of Formbricks or leverage prebuilt data analysis capabilities.
**Try it out in the cloud at [formbricks.com](https://app.formbricks.com/auth/signup)**
## 💪 Mission: Empower your team, craft an irresistible experience.
Formbricks is both a free and open source survey platform - and a privacy-first experience management platform. Use in-app, website, link and email surveys to gather user and customer insights at every point of their journey. Leverage Formbricks Insight Platform or build your own. Life's too short for mediocre UX.
### Table of Contents
- [Features](#features)
- [Getting Started](#getting-started)
- [Cloud Version](#cloud-version)
- [Self-hosted Version](#self-hosted-version)
- [Development](#development)
- [Contribution](#contribution)
- [Contact](#contact-us)
- [Security](#security)
- [License](#license)

### Features
- 📲 Create **conversion-optimized surveys** with our no-code editor with several question types.
- 📚 Choose from a variety of best-practice **templates**.
- 👩🏻 Launch and **target your surveys to specific user groups** without changing your application code.
- 🔗 Create shareable **link surveys**.
- 👨👩👦 Invite your organization members to **collaborate** on your surveys.
- 🔌 Integrate Formbricks with **Slack, Notion, Zapier, n8n and more**.
- 🔒 All **open source**, transparent and self-hostable.
### Built on Open Source
- 💻 [Typescript](https://www.typescriptlang.org/)
- 🚀 [Next.js](https://nextjs.org/)
- ⚛️ [React](https://reactjs.org/)
- 🎨 [TailwindCSS](https://tailwindcss.com/)
- 📚 [Prisma](https://prisma.io/)
- 🔒 [Auth.js](https://authjs.dev/)
- 🧘♂️ [Zod](https://zod.dev/)
- 🐛 [Vitest](https://vitest.dev/)

## 🚀 Getting started
We've got several options depending on your needs to help you quickly get started with Formbricks.

### ☁️ Cloud Version
Formbricks has a hosted cloud offering with a generous free plan to get you up and running as quickly as possible. To get started, please visit [formbricks.com](https://app.formbricks.com/auth/signup).

### 🐳 Self-hosting Formbricks
Formbricks is available open source under the AGPLv3 license. You can host Formbricks on your own servers using Docker without a subscription.
If you opt for self-hosting Formbricks, here are a few options to consider:
#### Docker
To get started with self-hosting with Docker, take a look at our [self-hosting docs](https://formbricks.com/docs/self-hosting/deployment).
#### Community-managed One Click Hosting
##### Railway
You can deploy Formbricks on [Railway](https://railway.app) using the button below.
[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/new/template/PPDzCd)
##### RepoCloud
You can also deploy Formbricks on [RepoCloud](https://repocloud.io) using the button below.
[![Deploy on RepoCloud](https://d16t0pc4846x52.cloudfront.net/deploy.png)](https://repocloud.io/details/?app_id=254)

## 👨💻 Development
### Prerequisites
Here is what you need to be able to run Formbricks:
- [Node.js](https://nodejs.org/en) (Version: >=18.x)
- [Pnpm](https://pnpm.io/)
- [Docker](https://www.docker.com/) - to run PostgreSQL and MailHog
### Local Setup
To get started locally, we've got a [guide to help you](https://formbricks.com/docs/contributing/setup).
### Gitpod Setup
1. Click the button below to open this project in Gitpod.
2. This will open a fully configured workspace in your browser with all the necessary dependencies already installed.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/formbricks/formbricks)

## ✍️ Contribution
We are very happy if you are interested in contributing to Formbricks 🤗
Here are a few options:
- Star this repo.
- Create issues every time you feel something is missing or goes wrong.
- Upvote issues with 👍 reaction so we know what the demand for a particular issue is to prioritize it within the roadmap.
Please check out [our contribution guide](https://formbricks.com/docs/contributing/introduction) and our [list of open issues](https://github.com/formbricks/formbricks/issues) for more information.
## All Thanks To Our Contributors

## 📆 Contact us

Let's have a chat about your survey needs and get you started.

## 🔒 Security

We take security very seriously. If you come across any security vulnerabilities, please disclose them by sending an email to security@formbricks.com. We appreciate your help in making our platform as secure as possible and are committed to working with you to resolve any issues quickly and efficiently. See [`SECURITY.md`](./SECURITY.md) for more information.

## 👩⚖️ License
### The AGPL Formbricks Core
The Formbricks core application is licensed under the [AGPLv3 Open Source License](https://github.com/formbricks/formbricks/blob/main/LICENSE). The core application is fully functional and includes everything you need to design & run link surveys, website surveys and in-app surveys. You can use the software for free for personal and commercial use. You're also allowed to create and distribute modified versions as long as you document the changes you make incl. date. The AGPL license requires you to publish your modified version under the AGPLv3 license as well.
### The Enterprise Edition
Additional to the AGPL licensed Formbricks core, this repository contains code licensed under an Enterprise license. The [code](https://github.com/formbricks/formbricks/tree/main/packages/ee) and [license](https://github.com/formbricks/formbricks/blob/main/packages/ee/LICENSE) for the enterprise functionality can be found in the `/packages/ee` folder of this repository. This additional functionality is not part of the AGPLv3 licensed Formbricks core and is designed to meet the needs of larger teams and enterprises. This advanced functionality is already included in the Docker images, but you need an [Enterprise License Key](https://formbricks.com/docs/self-hosting/enterprise) to unlock it.
### White-Labeling Formbricks and Other Licensing Needs
We currently do not offer Formbricks white-labeled. Any other needs? [Send us an email](mailto:hola@formbricks.com).
### Why charge for Enterprise Features?
The Enterprise Edition and White-Label Licenses allow us to fund the development of Formbricks sustainably. It guarantees that the open-source surveying infrastructure we're building will be around for decades to come. 🔼 Back to top | Open Source Survey Platform | forms,survey,typescript,nextjs,react,tailwindcss,form,reactjs,survey-analysis,survey-data | 51 | 182 | 2,121 | 3,128 | 73 | 129 | 18 |
SoftFever/OrcaSlicer | Orca Slicer Orca Slicer is an open source slicer for FDM printers. Join community: OrcaSlicer Official Discord Server Main features Auto calibrations for all printers Sandwich(inner-outer-inner) mode - an improved version of the External perimeters first mode Precise wall Polyholes conversion support SuperSlicer Wiki: Polyholes Klipper support More granular controls More features can be found in change notes Download Stable Release 📥 Download the Latest Stable Release Visit our GitHub Releases page for the latest stable version of Orca Slicer, recommended for most users. Nightly Builds 🌙 Download the Latest Nightly Build Explore the latest developments in Orca Slicer with our nightly builds. Feedback on these versions is highly appreciated. How to install Windows :
1. Download the installer for your preferred version from the releases page .
- For convenience there is also a portable build available.
- If you have trouble running the build, you might need to install the following runtimes:
  - MicrosoftEdgeWebView2RuntimeInstallerX64 - Details of this runtime - Alternative Download Link Hosted by Microsoft
  - vcredist2019_x64 - Alternative Download Link Hosted by Microsoft - This file may already be available on your computer if you've installed Visual Studio. Check the following location: %VCINSTALLDIR%Redist\MSVC\v142

Mac :
1. Download the DMG for your computer: arm64 version for Apple Silicon and x86_64 for Intel CPU. 2. Drag OrcaSlicer.app to Application folder.
3. If you want to run a build from a PR, you also need to follow the instructions below:
- Option 1 (You only need to do this once. After that the app can be opened normally.):
- Step 1: Hold cmd and right click the app, from the context menu choose Open .
- Step 2: A warning window will pop up, click Open - Option 2:
Execute this command in terminal: `xattr -dr com.apple.quarantine /Applications/OrcaSlicer.app`
```console
softfever@mac:~$ xattr -dr com.apple.quarantine /Applications/OrcaSlicer.app
```
- Option 3:
- Step 1: open the app, a warning window will pop up
![image](./SoftFever_doc/mac_cant_open.png)
- Step 2: in `System Settings` -> `Privacy & Security`, click `Open Anyway`:
![image](./SoftFever_doc/mac_security_setting.png)
</details> Linux(Ubuntu) :
1. If you run into trouble executing it, try this command in the terminal: chmod +x /path_to_appimage/OrcaSlicer_Linux.AppImage

How to compile

Windows 64-bit
- Tools needed: Visual Studio 2019, Cmake, git, git-lfs, Strawberry Perl.
  - You will require cmake version 3.14 or later, which is available on their website.
  - Strawberry Perl is available on their github repository.
- Run build_release.bat in x64 Native Tools Command Prompt for VS 2019
- Note: Don't forget to run git lfs pull after cloning the repository to download tools on Windows

Mac 64-bit
- Tools needed: Xcode, Cmake, git, gettext, libtool, automake, autoconf, texinfo
- You can install most of them by running brew install cmake gettext libtool automake autoconf texinfo
- run build_release_macos.sh
- To build and debug in XCode:
  - run XCode.app
  - open build_`arch`/OrcaSlicer.xcodeproj
  - menu bar: Product => Scheme => OrcaSlicer
  - menu bar: Product => Scheme => Edit Scheme...
  - Run => Info tab => Build Configuration: RelWithDebInfo
  - Run => Options tab => Document Versions: uncheck Allow debugging when browsing versions
  - menu bar: Product => Run

Ubuntu
- Dependencies (will be auto installed with the shell script): libmspack-dev libgstreamerd-3-dev libsecret-1-dev libwebkit2gtk-4.0-dev libosmesa6-dev libssl-dev libcurl4-openssl-dev eglexternalplatform-dev libudev-dev libdbus-1-dev extra-cmake-modules libgtk2.0-dev libglew-dev cmake git texinfo
- run 'sudo ./BuildLinux.sh -u'
- run './BuildLinux.sh -dsir'

Note: If you're running Klipper, it's recommended to add the following configuration to your printer.cfg file.
```
# Enable object exclusion
[exclude_object]

# Enable arcs support
[gcode_arcs]
resolution: 0.1
```
Orca Slicer incorporates a lot of features from SuperSlicer by @supermerill
Orca Slicer's logo is designed by community member Justin Levine(@freejstnalxndr) License Orca Slicer is licensed under the GNU Affero General Public License, version 3. Orca Slicer is based on Bambu Studio by BambuLab. Bambu Studio is licensed under the GNU Affero General Public License, version 3. Bambu Studio is based on PrusaSlicer by PrusaResearch. PrusaSlicer is licensed under the GNU Affero General Public License, version 3. PrusaSlicer is owned by Prusa Research. PrusaSlicer is originally based on Slic3r by Alessandro Ranellucci. Slic3r is licensed under the GNU Affero General Public License, version 3. Slic3r was created by Alessandro Ranellucci with the help of many other contributors. The GNU Affero General Public License, version 3 ensures that if you use any part of this software in any way (even behind a web server), your software must be released under the same license. Orca Slicer includes a pressure advance calibration pattern test adapted from Andrew Ellis' generator, which is licensed under GNU General Public License, version 3. Ellis' generator is itself adapted from a generator developed by Sineos for Marlin, which is licensed under GNU General Public License, version 3. The bambu networking plugin is based on non-free libraries from Bambulab. It is optional to the Orca Slicer and provides extended functionalities for Bambulab printer users. | G-code generator for 3D printers (Bambu, Prusa, Voron, VzBot, RatRig, Creality, etc.) | 3d-printer,3d-printing,makers | 57 | 222 | 1,243 | 23,557 | 912 | 10 | 8 |
n3r4zzurr0/svg-spinners | SVG Spinners (CSS & SMIL) All spinners are displayed inside a 24 x 24 dp view box. The main content rests inside the live area of 22 dp with a padding of 1dp. Few points to consider: SMIL animations (both inline and referenced via an img tag) won't start playing until the page has completely loaded whereas CSS animations will start playing while the page is loading. In webkit based browsers, both SMIL and CSS animations, when referenced via an img tag, produce an unusual behavior on page zoom levels other than 100%. Using them inline seems completely fine and consistent across browsers. Preview Rings Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (428) SMIL (384) CSS (531) SMIL (487) CSS (434) SMIL (390) CSS (537) SMIL (493) CSS (483) SMIL (439) CSS (586) SMIL (542) CSS (620) SMIL (739) Dots Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (635) SMIL (686) CSS (482) SMIL (599) CSS (631) SMIL (2973) CSS (409) SMIL (375) CSS (471) SMIL (503) CSS (422) SMIL (459) CSS (948) SMIL (692) CSS (1494) SMIL (2875) CSS (1504) SMIL (2587) CSS (535) SMIL (484) CSS (1693) SMIL (2714) CSS (399) SMIL (357) Bars Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (514) SMIL (625) CSS (895) SMIL (1891) CSS (548) SMIL (1244) CSS (825) SMIL (1891) CSS (1150) SMIL (894) Blocks Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (1182) SMIL (2133) CSS (524) SMIL (1082) CSS (646) SMIL (1579) CSS (2457) SMIL (4106) Pulses Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (301) SMIL (381) CSS (400) SMIL (797) CSS (499) SMIL (1149) CSS (503) SMIL (1135) CSS (461) SMIL (683) CSS (657) SMIL (1361) CSS (853) SMIL (1994) CSS (856) SMIL (1973) Other Preview CSS (Size in bytes) SMIL (Size in bytes) CSS (453) SMIL (870) CSS (565) SMIL (530) CSS (377) SMIL (333) CSS (385) SMIL (341) CSS (774) SMIL (1028) CSS (1064) SMIL (853) CSS (398) SMIL (354) CSS (999) SMIL (1238) CSS (989) SMIL (1172) CSS (1321) SMIL (1276) Adaptation React Component Library by dephraiim License MIT © Utkarsh Verma | A collection of 24 x 24 dp SVG spinners! (CSS & SMIL) | svg-animated-icons | 0 | 3 | 3 | 28 | 4 | 1 | 0 |
meituan/YOLOv6 | English | 简体中文 YOLOv6 Implementation of paper:
- YOLOv6 v3.0: A Full-Scale Reloading 🔥
- YOLOv6: A Single-Stage Object Detection Framework for Industrial Applications

What's New

- [2023.09.15] Release YOLOv6-Segmentation. 🚀 Performance
- [2023.04.28] Release YOLOv6Lite models on mobile or CPU. ⭐️ Mobile Benchmark
- [2023.03.10] Release YOLOv6-Face. 🔥 Performance
- [2023.03.02] Update base models to version 3.0.
- [2023.01.06] Release P6 models and enhance the performance of P5 models. ⭐️ Benchmark
- [2022.11.04] Release base models to simplify the training and deployment process.
- [2022.09.06] Customized quantization methods. 🚀 Quantization Tutorial
- [2022.09.05] Release M/L models and update N/T/S models with enhanced performance.
- [2022.06.23] Release N/T/S models with excellent performance.

Benchmark

| Model | Size | mAP val 0.5:0.95 | Speed T4 trt fp16 b1 (fps) | Speed T4 trt fp16 b32 (fps) | Params (M) | FLOPs (G) |
| :----------------------------------------------------------- | ---- | :----------------------- | --------------------------------------- | ---------------------------------------- | -------------------- | ------------------- |
| YOLOv6-N | 640 | 37.5 | 779 | 1187 | 4.7 | 11.4 |
| YOLOv6-S | 640 | 45.0 | 339 | 484 | 18.5 | 45.3 |
| YOLOv6-M | 640 | 50.0 | 175 | 226 | 34.9 | 85.8 |
| YOLOv6-L | 640 | 52.8 | 98 | 116 | 59.6 | 150.7 |
| | | | | |
| YOLOv6-N6 | 1280 | 44.9 | 228 | 281 | 10.4 | 49.8 |
| YOLOv6-S6 | 1280 | 50.3 | 98 | 108 | 41.4 | 198.0 |
| YOLOv6-M6 | 1280 | 55.2 | 47 | 55 | 79.6 | 379.5 |
| YOLOv6-L6 | 1280 | 57.2 | 26 | 29 | 140.4 | 673.4 | Table Notes - All checkpoints are trained with self-distillation except for YOLOv6-N6/S6 models trained to 300 epochs without distillation.
- Results of the mAP and speed are evaluated on [COCO val2017](https://cocodataset.org/#download) dataset with the input resolution of 640×640 for P5 models and 1280x1280 for P6 models.
- Speed is tested with TensorRT 7.2 on T4.
- Refer to [Test speed](./docs/Test_speed.md) tutorial to reproduce the speed results of YOLOv6.
- Params and FLOPs of YOLOv6 are estimated on deployed models.

Legacy models

| Model | Size | mAP val 0.5:0.95 | Speed T4 trt fp16 b1 (fps) | Speed T4 trt fp16 b32 (fps) | Params (M) | FLOPs (G) |
| :----------------------------------------------------------- | ---- | :------------------------------------ | --------------------------------------- | ---------------------------------------- | -------------------- | ------------------- |
| [**YOLOv6-N**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6n.pt) | 640 | 35.9 300e 36.3 400e | 802 | 1234 | 4.3 | 11.1 |
| [**YOLOv6-T**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6t.pt) | 640 | 40.3 300e 41.1 400e | 449 | 659 | 15.0 | 36.7 |
| [**YOLOv6-S**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6s.pt) | 640 | 43.5 300e 43.8 400e | 358 | 495 | 17.2 | 44.2 |
| [**YOLOv6-M**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6m.pt) | 640 | 49.5 | 179 | 233 | 34.3 | 82.2 |
| [**YOLOv6-L-ReLU**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6l_relu.pt) | 640 | 51.7 | 113 | 149 | 58.5 | 144.0 |
| [**YOLOv6-L**](https://github.com/meituan/YOLOv6/releases/download/0.2.0/yolov6l.pt) | 640 | 52.5 | 98 | 121 | 58.5 | 144.0 |
- Speed is tested with TensorRT 7.2 on T4.
### Quantized model 🚀
| Model | Size | Precision | mAP val 0.5:0.95 | Speed T4 trt b1 (fps) | Speed T4 trt b32 (fps) |
| :-------------------- | ---- | --------- | :----------------------- | ---------------------------------- | ----------------------------------- |
| **YOLOv6-N RepOpt** | 640 | INT8 | 34.8 | 1114 | 1828 |
| **YOLOv6-N** | 640 | FP16 | 35.9 | 802 | 1234 |
| **YOLOv6-T RepOpt** | 640 | INT8 | 39.8 | 741 | 1167 |
| **YOLOv6-T** | 640 | FP16 | 40.3 | 449 | 659 |
| **YOLOv6-S RepOpt** | 640 | INT8 | 43.3 | 619 | 924 |
| **YOLOv6-S** | 640 | FP16 | 43.5 | 377 | 541 |
- Speed is tested with TensorRT 8.4 on T4.
- Precision is measured on models trained for 300 epochs.

Mobile Benchmark

| Model | Size | mAP val 0.5:0.95 | sm8350 (ms) | mt6853 (ms) | sdm660 (ms) | Params (M) | FLOPs (G) |
| :----------------------------------------------------------- | ---- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
| YOLOv6Lite-S | 320*320 | 22.4 | 7.99 | 11.99 | 41.86 | 0.55 | 0.56 |
| YOLOv6Lite-M | 320*320 | 25.1 | 9.08 | 13.27 | 47.95 | 0.79 | 0.67 |
| YOLOv6Lite-L | 320*320 | 28.0 | 11.37 | 16.20 | 61.40 | 1.09 | 0.87 |
| YOLOv6Lite-L | 320*192 | 25.0 | 7.02 | 9.66 | 36.13 | 1.09 | 0.52 |
| YOLOv6Lite-L | 224*128 | 18.9 | 3.63 | 4.99 | 17.76 | 1.09 | 0.24 |

Table Notes

- From the perspective of model size and input image ratio, we have built a series of models on the mobile terminal to facilitate flexible applications in different scenarios.
- All checkpoints are trained with 400 epochs without distillation.
- Results of the mAP and speed are evaluated on [COCO val2017](https://cocodataset.org/#download) dataset, and the input resolution is the Size in the table.
- Speed is tested on MNN 2.3.0 AArch64 with 2 threads by arm82 acceleration. The inference warm-up is performed 10 times, and the cycle is performed 100 times.
- Qualcomm 888(sm8350), Dimensity 720(mt6853) and Qualcomm 660(sdm660) correspond to chips with different performances at the high, middle and low end respectively, which can be used as a reference for model capabilities under different chips.
- Refer to the [Test NCNN Speed](./docs/Test_NCNN_speed.md) tutorial to reproduce the NCNN speed results of YOLOv6Lite.

Quick Start

Install

```shell
git clone https://github.com/meituan/YOLOv6
cd YOLOv6
pip install -r requirements.txt
```

Reproduce our results on COCO

Please refer to [Train COCO Dataset](./docs/Train_coco_data.md).

Finetune on custom data

Single GPU
```shell
# P5 models
python tools/train.py --batch 32 --conf configs/yolov6s_finetune.py --data data/dataset.yaml --fuse_ab --device 0
# P6 models
python tools/train.py --batch 32 --conf configs/yolov6s6_finetune.py --data data/dataset.yaml --img 1280 --device 0
```
Multi GPUs (DDP mode recommended)
```shell
# P5 models
python -m torch.distributed.launch --nproc_per_node 8 tools/train.py --batch 256 --conf configs/yolov6s_finetune.py --data data/dataset.yaml --fuse_ab --device 0,1,2,3,4,5,6,7
# P6 models
python -m torch.distributed.launch --nproc_per_node 8 tools/train.py --batch 128 --conf configs/yolov6s6_finetune.py --data data/dataset.yaml --img 1280 --device 0,1,2,3,4,5,6,7
```
- fuse_ab: add anchor-based auxiliary branch and use Anchor Aided Training Mode (Not supported on P6 models currently)
- conf: select config file to specify network/optimizer/hyperparameters. We recommend applying yolov6n/s/m/l_finetune.py when training on your custom dataset.
- data: prepare dataset and specify dataset paths in data.yaml ( [COCO](http://cocodataset.org), [YOLO format coco labels](https://github.com/meituan/YOLOv6/releases/download/0.1.0/coco2017labels.zip) ); a sketch of a typical data.yaml is shown after the directory tree below
- make sure your dataset structure is as follows:
```
├── coco
│ ├── annotations
│ │ ├── instances_train2017.json
│ │ └── instances_val2017.json
│ ├── images
│ │ ├── train2017
│ │ └── val2017
│ ├── labels
│ │ ├── train2017
│ │ ├── val2017
│ ├── LICENSE
│ ├── README.txt
```
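For reference, a data.yaml for the structure above might look roughly like this. This is a sketch only; the field names follow the common YOLO-format convention, so check data/coco.yaml in the repo for the exact schema:

```yaml
train: ../coco/images/train2017  # path to training images
val: ../coco/images/val2017      # path to validation images
is_coco: True                    # set True only for the COCO dataset
nc: 80                           # number of classes
names: ['person', 'bicycle', 'car']  # one name per class; truncated here for brevity
```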
YOLOv6 supports different input resolution modes. For details, see [How to Set the Input Size](./docs/About_training_size.md).

Resume training

If your training process is corrupted, you can resume training by
```
# single GPU training.
python tools/train.py --resume
# multi GPU training.
python -m torch.distributed.launch --nproc_per_node 8 tools/train.py --resume
```
The above command will automatically find the latest checkpoint in the YOLOv6 directory, then resume the training process.
You can also specify a checkpoint path with the `--resume` parameter:
```
# remember to replace /path/to/your/checkpoint/path to the checkpoint path which you want to resume training.
--resume /path/to/your/checkpoint/path
```
This will resume from the specific checkpoint you provide.

Evaluation

Reproduce mAP on COCO val2017 dataset with 640×640 or 1280x1280 resolution
```shell
# P5 models
python tools/eval.py --data data/coco.yaml --batch 32 --weights yolov6s.pt --task val --reproduce_640_eval
# P6 models
python tools/eval.py --data data/coco.yaml --batch 32 --weights yolov6s6.pt --task val --reproduce_640_eval --img 1280
```
- verbose: set True to print mAP of each classes.
- do_coco_metric: set True / False to enable / disable pycocotools evaluation method.
- do_pr_metric: set True / False to print or not to print the precision and recall metrics.
- config-file: specify a config file to define all the eval params, for example: [yolov6n_with_eval_params.py](configs/experiment/yolov6n_with_eval_params.py)

Inference

First, download a pretrained model from the YOLOv6 [release](https://github.com/meituan/YOLOv6/releases/tag/0.4.0) or use your trained model to do inference.
Second, run inference with `tools/infer.py`
```shell
# P5 models
python tools/infer.py --weights yolov6s.pt --source img.jpg / imgdir / video.mp4
# P6 models
python tools/infer.py --weights yolov6s6.pt --img 1280 1280 --source img.jpg / imgdir / video.mp4
```
If you want to inference on local camera or web camera, you can run:
```shell
# P5 models
python tools/infer.py --weights yolov6s.pt --webcam --webcam-addr 0
# P6 models
python tools/infer.py --weights yolov6s6.pt --img 1280 1280 --webcam --webcam-addr 0
```
`webcam-addr` can be a local camera number id or an rtsp address.

Deployment

* [ONNX](./deploy/ONNX)
* [OpenCV Python/C++](./deploy/ONNX/OpenCV)
* [OpenVINO](./deploy/OpenVINO)
* [TensorRT](./deploy/TensorRT)
* [NCNN](./deploy/NCNN)
* [Android](./deploy/NCNN/Android)

Tutorials

* [User Guide(zh_CN)](https://yolov6-docs.readthedocs.io/zh_CN/latest/)
* [Train COCO Dataset](./docs/Train_coco_data.md)
* [Train custom data](./docs/Train_custom_data.md)
* [Test speed](./docs/Test_speed.md)
* [Tutorial of Quantization for YOLOv6](./docs/Tutorial%20of%20Quantization.md)

Third-party resources

* YOLOv6 Training with Amazon Sagemaker: [yolov6-sagemaker](https://github.com/ashwincc/yolov6-sagemaker) from [ashwincc](https://github.com/ashwincc)
* YOLOv6 NCNN Android app demo: [ncnn-android-yolov6](https://github.com/FeiGeChuanShu/ncnn-android-yolov6) from [FeiGeChuanShu](https://github.com/FeiGeChuanShu)
* YOLOv6 ONNXRuntime/MNN/TNN C++: [YOLOv6-ORT](https://github.com/DefTruth/lite.ai.toolkit/blob/main/lite/ort/cv/yolov6.cpp), [YOLOv6-MNN](https://github.com/DefTruth/lite.ai.toolkit/blob/main/lite/mnn/cv/mnn_yolov6.cpp) and [YOLOv6-TNN](https://github.com/DefTruth/lite.ai.toolkit/blob/main/lite/tnn/cv/tnn_yolov6.cpp) from [DefTruth](https://github.com/DefTruth)
* YOLOv6 TensorRT Python: [yolov6-tensorrt-python](https://github.com/Linaom1214/TensorRT-For-YOLO-Series) from [Linaom1214](https://github.com/Linaom1214)
* YOLOv6 TensorRT Windows C++: [yolort](https://github.com/zhiqwang/yolov5-rt-stack/tree/main/deployment/tensorrt-yolov6) from [Wei Zeng](https://github.com/Wulingtian)
* [YOLOv6 web demo](https://huggingface.co/spaces/nateraw/yolov6) on [Huggingface Spaces](https://huggingface.co/spaces) with [Gradio](https://github.com/gradio-app/gradio). [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/nateraw/yolov6)
* [Interactive demo](https://yolov6.dagshubusercontent.com/) on [DagsHub](https://dagshub.com) with [Streamlit](https://github.com/streamlit/streamlit)
* Tutorial: [How to train YOLOv6 on a custom dataset](https://blog.roboflow.com/how-to-train-yolov6-on-a-custom-dataset/) * YouTube Tutorial: [How to train YOLOv6 on a custom dataset](https://youtu.be/fFCWrMFH2UY)
* Demo of YOLOv6 inference on Google Colab [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mahdilamb/YOLOv6/blob/main/inference.ipynb)
* Blog post: [YOLOv6 Object Detection – Paper Explanation and Inference](https://learnopencv.com/yolov6-object-detection/)

FAQ (Continuously updated)

If you have any questions, you are welcome to join our WeChat group to discuss and exchange.
clovaai/donut | # Donut 🍩 : Document Understanding Transformer
[![Paper](https://img.shields.io/badge/Paper-arxiv.2111.15664-red)](https://arxiv.org/abs/2111.15664)
[![Conference](https://img.shields.io/badge/ECCV-2022-blue)](#how-to-cite)
[![Demo](https://img.shields.io/badge/Demo-Gradio-brightgreen)](#demo)
[![Demo](https://img.shields.io/badge/Demo-Colab-orange)](#demo)
[![PyPI](https://img.shields.io/pypi/v/donut-python?color=green&label=pip%20install%20donut-python)](https://pypi.org/project/donut-python)
[![Downloads](https://static.pepy.tech/personalized-badge/donut-python?period=total&units=international_system&left_color=grey&right_color=brightgreen&left_text=Downloads)](https://pepy.tech/project/donut-python)
Official Implementation of Donut and SynthDoG | [Paper](https://arxiv.org/abs/2111.15664) | [Slide](https://docs.google.com/presentation/d/1gv3A7t4xpwwNdpxV_yeHzEOMy-exJCAz6AlAI9O5fS8/edit?usp=sharing) | [Poster](https://docs.google.com/presentation/d/1m1f8BbAm5vxPcqynn_MbFfmQAlHQIR5G72-hQUFS2sk/edit?usp=sharing)

Introduction

Donut 🍩, **Do**cume**n**t **u**nderstanding **t**ransformer, is a new method of document understanding that utilizes an OCR-free end-to-end Transformer model. Donut does not require off-the-shelf OCR engines/APIs, yet it shows state-of-the-art performance on various visual document understanding tasks, such as visual document classification or information extraction (a.k.a. document parsing).
In addition, we present SynthDoG 🐶, **Synth**etic **Do**cument **G**enerator, which helps the model pre-training to be flexible on various languages and domains. Our academic paper, which describes our method in detail and provides full experimental results and analyses, can be found here: OCR-free Document Understanding Transformer. Geewook Kim, Teakgyu Hong, Moonbin Yim, JeongYeon Nam, Jinyoung Park, Jinyeong Yim, Wonseok Hwang, Sangdoo Yun, Dongyoon Han, Seunghyun Park. In ECCV 2022.

Pre-trained Models and Web Demos

Gradio web demos are available!
- You can run the demo with ./app.py file.
- Sample images are available at ./misc and more receipt images are available at CORD dataset link .
- Web demos are available from the links in the following table.
- Note: We have updated the Google Colab demo (as of June 15, 2023) to ensure it works properly.

| Task | Sec/Img | Score | Trained Model | Demo |
|---|---|---|---|---|
| CORD (Document Parsing) | 0.7 / 0.7 / 1.2 | 91.3 / 91.1 / 90.9 | donut-base-finetuned-cord-v2 (1280) / donut-base-finetuned-cord-v1 (1280) / donut-base-finetuned-cord-v1-2560 | gradio space web demo , google colab demo (updated at 23.06.15) |
| Train Ticket (Document Parsing) | 0.6 | 98.7 | donut-base-finetuned-zhtrainticket | google colab demo (updated at 23.06.15) |
| RVL-CDIP (Document Classification) | 0.75 | 95.3 | donut-base-finetuned-rvlcdip | gradio space web demo , google colab demo (updated at 23.06.15) |
| DocVQA Task1 (Document VQA) | 0.78 | 67.5 | donut-base-finetuned-docvqa | gradio space web demo , google colab demo (updated at 23.06.15) | The links to the pre-trained backbones are here:
- donut-base : trained with 64 A100 GPUs (~2.5 days), number of layers (encoder: {2,2,14,2}, decoder: 4), input size 2560x1920, swin window size 10, IIT-CDIP (11M) and SynthDoG (English, Chinese, Japanese, Korean, 0.5M x 4).
- donut-proto : (preliminary model) trained with 8 V100 GPUs (~5 days), number of layers (encoder: {2,2,18,2}, decoder: 4), input size 2048x1536, swin window size 8, and SynthDoG (English, Japanese, Korean, 0.4M x 3). Please see our paper for more details. SynthDoG datasets The links to the SynthDoG-generated datasets are here: synthdog-en : English, 0.5M. synthdog-zh : Chinese, 0.5M. synthdog-ja : Japanese, 0.5M. synthdog-ko : Korean, 0.5M. To generate synthetic datasets with our SynthDoG, please see ./synthdog/README.md and our paper for details. Updates 2023-06-15 We have updated all Google Colab demos to ensure its proper working. 2022-11-14 New version 1.0.9 is released ( pip install donut-python --upgrade ). See 1.0.9 Release Notes . 2022-08-12 Donut 🍩 is also available at huggingface/transformers 🤗 (contributed by @NielsRogge ). donut-python loads the pre-trained weights from the official branch of the model repositories. See 1.0.5 Release Notes . 2022-08-05 A well-executed hands-on tutorial on donut 🍩 is published at Towards Data Science (written by @estaudere ). 2022-07-20 First Commit, We release our code, model weights, synthetic data and generator. Software installation bash
pip install donut-python

or clone this repository and install the dependencies:

```bash
git clone https://github.com/clovaai/donut.git
cd donut/
conda create -n donut_official python=3.7
conda activate donut_official
pip install .
```

We tested donut-python == 1.0.1 with:
- torch == 1.11.0+cu113
- torchvision == 0.12.0+cu113
- pytorch-lightning == 1.6.4
- transformers == 4.11.3
- timm == 0.5.4 Note : From several reported issues, we have noticed increased challenges in configuring the testing environment for donut-python due to recent updates in key dependency libraries. While we are actively working on a solution, we have updated the Google Colab demo (as of June 15, 2023) to ensure its proper working. For assistance, we encourage you to refer to the following demo links: CORD Colab Demo , Train Ticket Colab Demo , RVL-CDIP Colab Demo , DocVQA Colab Demo . Getting Started Data This repository assumes the following structure of dataset:
```bash
tree dataset_name
dataset_name
├── test
│ ├── metadata.jsonl
│ ├── {image_path0}
│ ├── {image_path1}
│ .
│ .
├── train
│ ├── metadata.jsonl
│ ├── {image_path0}
│ ├── {image_path1}
│ .
│ .
└── validation
├── metadata.jsonl
├── {image_path0}
├── {image_path1}
.
.

cat dataset_name/test/metadata.jsonl
{"file_name": {image_path0}, "ground_truth": "{\"gt_parse\": {ground_truth_parse}, ... {other_metadata_not_used} ... }"}
{"file_name": {image_path1}, "ground_truth": "{\"gt_parse\": {ground_truth_parse}, ... {other_metadata_not_used} ... }"}
.
.
```

The structure of the metadata.jsonl file is in JSON Lines text format, i.e., .jsonl. Each line consists of:
- file_name : relative path to the image file.
- ground_truth : string format (JSON dumped); the dictionary contains either gt_parse or gt_parses. Other fields (metadata) can be added to the dictionary but will not be used.

donut interprets all tasks as a JSON prediction problem. As a result, all donut model trainings share the same pipeline. For training and inference, the only thing to do is to prepare gt_parse or gt_parses for the task in the format described below.

For Document Classification

The gt_parse follows the format of {"class" : {class_name}}, for example, {"class" : "scientific_report"} or {"class" : "presentation"}.
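Whatever the task, each metadata.jsonl line can be produced the same way; here is a minimal hypothetical Python sketch (file names and parses are made up):

```python
import json

samples = [
    ("images/report_0.png", {"class": "scientific_report"}),
    ("images/slides_0.png", {"class": "presentation"}),
]

with open("metadata.jsonl", "w") as f:
    for file_name, gt_parse in samples:
        f.write(json.dumps({
            "file_name": file_name,
            # ground_truth is itself a JSON-dumped string, not a nested object
            "ground_truth": json.dumps({"gt_parse": gt_parse}),
        }) + "\n")
```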
- Google colab demo is available here .
- Gradio web demo is available here . For Document Information Extraction The gt_parse is a JSON object that contains full information of the document image, for example, the JSON object for a receipt may look like {"menu" : [{"nm": "ICE BLACKCOFFEE", "cnt": "2", ...}, ...], ...} .
- More examples are available at CORD dataset .
- Google colab demo is available here .
- Gradio web demo is available here . For Document Visual Question Answering The gt_parses follows the format of [{"question" : {question_sentence}, "answer" : {answer_candidate_1}}, {"question" : {question_sentence}, "answer" : {answer_candidate_2}}, ...] , for example, [{"question" : "what is the model name?", "answer" : "donut"}, {"question" : "what is the model name?", "answer" : "document understanding transformer"}] .
- DocVQA Task1 has multiple answers, hence gt_parses should be a list of dictionary that contains a pair of question and answer.
- Google colab demo is available here .
- Gradio web demo is available here . For (Pseudo) Text Reading Task The gt_parse looks like {"text_sequence" : "word1 word2 word3 ... "} - This task is also a pre-training task of Donut model.
- You can use our SynthDoG 🐶 to generate synthetic images for the text reading task with a proper gt_parse. See ./synthdog/README.md for details.

Training

This is the configuration of Donut model training on the CORD dataset used in our experiment.
We ran this with a single NVIDIA A100 GPU.

```bash
python train.py --config config/train_cord.yaml \
--pretrained_model_name_or_path "naver-clova-ix/donut-base" \
--dataset_name_or_paths '["naver-clova-ix/cord-v2"]' \
--exp_version "test_experiment"
.
.
Prediction: <s_menu><s_nm>Lemon Tea (L)</s_nm><s_cnt>1</s_cnt><s_price>25.000</s_price></s_menu><s_total><s_total_price>25.000</s_total_price><s_cashprice>30.000</s_cashprice><s_changeprice>5.000</s_changeprice></s_total>
Answer: <s_menu><s_nm>Lemon Tea (L)</s_nm><s_cnt>1</s_cnt><s_price>25.000</s_price></s_menu><s_total><s_total_price>25.000</s_total_price><s_cashprice>30.000</s_cashprice><s_changeprice>5.000</s_changeprice></s_total>
Normed ED: 0.0
Prediction: <s_menu><s_nm>Hulk Topper Package</s_nm><s_cnt>1</s_cnt><s_price>100.000</s_price></s_menu><s_total><s_total_price>100.000</s_total_price><s_cashprice>100.000</s_cashprice><s_changeprice>0</s_changeprice></s_total>
Answer: <s_menu><s_nm>Hulk Topper Package</s_nm><s_cnt>1</s_cnt><s_price>100.000</s_price></s_menu><s_total><s_total_price>100.000</s_total_price><s_cashprice>100.000</s_cashprice><s_changeprice>0</s_changeprice></s_total>
Normed ED: 0.0
Prediction: <s_menu><s_nm>Giant Squid</s_nm><s_cnt>x 1</s_cnt><s_price>Rp. 39.000</s_price><s_sub><s_nm>C.Finishing - Cut</s_nm><s_price>Rp. 0</s_price><sep/><s_nm>B.Spicy Level - Extreme Hot Rp. 0</s_price></s_sub><sep/><s_nm>A.Flavour - Salt & Pepper</s_nm><s_price>Rp. 0</s_price></s_sub></s_menu><s_sub_total><s_subtotal_price>Rp. 39.000</s_subtotal_price></s_sub_total><s_total><s_total_price>Rp. 39.000</s_total_price><s_cashprice>Rp. 50.000</s_cashprice><s_changeprice>Rp. 11.000</s_changeprice></s_total>
Answer: <s_menu><s_nm>Giant Squid</s_nm><s_cnt>x1</s_cnt><s_price>Rp. 39.000</s_price><s_sub><s_nm>C.Finishing - Cut</s_nm><s_price>Rp. 0</s_price><sep/><s_nm>B.Spicy Level - Extreme Hot</s_nm><s_price>Rp. 0</s_price><sep/><s_nm>A.Flavour- Salt & Pepper</s_nm><s_price>Rp. 0</s_price></s_sub></s_menu><s_sub_total><s_subtotal_price>Rp. 39.000</s_subtotal_price></s_sub_total><s_total><s_total_price>Rp. 39.000</s_total_price><s_cashprice>Rp. 50.000</s_cashprice><s_changeprice>Rp. 11.000</s_changeprice></s_total>
Normed ED: 0.039603960396039604
Epoch 29: 100%|█████████████| 200/200 [01:49<00:00, 1.82it/s, loss=0.00327, exp_name=train_cord, exp_version=test_experiment]
```

Some important arguments:
- --config : config file path for model training.
- --pretrained_model_name_or_path : string format, model name in Hugging Face modelhub or local path.
- --dataset_name_or_paths : string format (json dumped), list of dataset names in Hugging Face datasets or local paths.
- --result_path : file path to save model outputs/artifacts.
- --exp_version : used for experiment versioning. The output files are saved at {result_path}/{exp_version}/*

Test

With the trained model, test images and ground truth parses, you can get inference results and accuracy scores.

```bash
python test.py --dataset_name_or_path naver-clova-ix/cord-v2 --pretrained_model_name_or_path ./result/train_cord/test_experiment --save_path ./result/output.json
100%|█████████████| 100/100 [00:35<00:00, 2.80it/s]
Total number of samples: 100, Tree Edit Distance (TED) based accuracy score: 0.9129639764131697, F1 accuracy score: 0.8406020841373987
```

Some important arguments:
- --dataset_name_or_path : string format, the target dataset name in Hugging Face datasets or local path.
- --pretrained_model_name_or_path : string format, the model name in Hugging Face modelhub or local path.
- --save_path : file path to save predictions and scores.

How to Cite

If you find this work useful to you, please cite:

```bibtex
@inproceedings{kim2022donut,
title = {OCR-Free Document Understanding Transformer},
author = {Kim, Geewook and Hong, Teakgyu and Yim, Moonbin and Nam, JeongYeon and Park, Jinyoung and Yim, Jinyeong and Hwang, Wonseok and Yun, Sangdoo and Han, Dongyoon and Park, Seunghyun},
booktitle = {European Conference on Computer Vision (ECCV)},
year = {2022}
}
```

License

```
MIT license

Copyright (c) 2022-present NAVER Corp.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
``` | Official Implementation of OCR-free Document Understanding Transformer (Donut) and Synthetic Document Generator (SynthDoG), ECCV 2022 | document-ai,eccv-2022,multimodal-pre-trained-model,ocr,nlp,computer-vision | 6 | 7 | 15 | 60 | 182 | 1 | 0 |
ajnart/homarr | Demo ✨ • Install 💻 • Translations 🈺 • Discord 👋 Simplify the management of your server with Homarr - a sleek, modern dashboard that puts all of your apps and services at your fingertips. With Homarr, you can access and control everything in one convenient location. Homarr seamlessly integrates with the apps you've added, providing you with valuable information and giving you complete control. Installation is a breeze, and Homarr supports a wide range of deployment methods. 🖌️ Highly customizable with an extensive drag and drop grid system ✨ Integrates seamlessly with your favorite self-hosted applications 📌 Easy and fast app management - no YAML involved 🙊 Advanced secrets' management system for enhanced security 📄 Detailed documentation and active community 🔍 Search through the web or supported integrations in an instant 🏴 Monitor your application with a built-in status system 🦞 Comprehensive built-in icon picker with over 7000 icons 🐳 Easy deployment with Docker, unRAID, and Synology 🚀 Compatible with any major consumer hardware (x86, Raspberry Pi, old laptops, ...) Homarr has a built-in collection of widgets and integrations , that connect to your applications and enable you to control them directly from the dashboard.
Each widget and integration has comprehensive documentation.
Homarr will integrate with the following applications:

📥 Torrent clients
- Deluge
- Transmission
- qBittorrent

📥 Usenet clients
- SABnzbd
- NZBGet

📺 Media servers
- Plex
- Jellyfin

📚 Media collection managers
- Sonarr
- Radarr
- Lidarr
- Readarr

🎞️ Media request managers
- Overseerr
- Jellyseerr

🚫 DNS ad-blockers
- Pihole
- AdGuard Home

Other integrations
- 🔌 Dash.
- 🐳 Docker

We're constantly adding new integrations and widgets, which will enhance your experience even further. https://user-images.githubusercontent.com/30572287/217098893-5880e7de-13d0-42c5-b505-f7921593396f.mp4
We work for fun and learning on this project. Hence, we're glad for all the help and support we can get.
Although a donation is appreciated, there are other ways you can support us. You can also support us by helping with translating the entire project to as many language as possible or contributing directly to the code or documentation. Please read our Contribution Guidelines All contributions, regardless of their size or scope, are welcome and highly appreciated! Thank you ❤️ | Customizable browser's home page to interact with your homeserver's Docker containers (e.g. Sonarr/Radarr) | homeserver,radarr,sonarr,dashboard,docker,homepage,mantine,nextjs,react | 53 | 66 | 988 | 3,627 | 172 | 36 | 4 |
GhostTroops/scan4all | 💬 README_中文 • Compile/Install/Run • Parameter Description • How to use • Scenario • POC List • Custom Scan • Best Practices

Features

- Free one id
- Multi-target web netcat for reverse shell

What is scan4all: integrated vscan, nuclei, ksubdomain, subfinder, etc.; fully automated and intelligent. A red team tool.
Code-level optimization, parameter optimization, and individual modules, such as vscan filefuzz, have been rewritten for these integrated projects.
In principle, we do not reinvent the wheel unless there are bugs or problems.
- Cross-platform: based on a golang implementation; lightweight, highly customizable, open source; supports Linux, Windows, macOS, etc.
- Supports password blasting for 23 protocols, with custom dictionaries, enabled by "priorityNmap": true: RDP, VNC, SSH, Socks5, rsh-spx, Mysql, MsSql, Oracle, Postgresql, Redis, FTP, Mongodb, SMB (also detects MS17-010: CVE-2017-0143, CVE-2017-0144, CVE-2017-0145, CVE-2017-0146, CVE-2017-0147, CVE-2017-0148, and SmbGhost: CVE-2020-0796), Telnet, Snmp, Wap-wsp (Elasticsearch), RouterOs, HTTP BasicAuth (Authorization) including Webdav and SVN (Apache Subversion) crack, Weblogic (enable nuclei via enableNuclei=true at the same time to support T3, IIOP and other detection), Tomcat, Jboss, Winrm (wsman), POP3/POP3S.
- By default, intelligent HTTP password blasting is enabled; it is activated automatically when an HTTP password is required, without manual intervention.
- Detects whether nmap exists on the system and enables nmap for fast scanning via priorityNmap=true, which is enabled by default; the optimized nmap parameters are faster than masscan.
Disadvantages of using nmap: on a poor network, the traffic volume is large, which may lead to incomplete results.
Using nmap additionally requires setting the root password in an environment variable:
```bash
export PPSSWWDD=yourRootPswd
```
More references: config/doNmapScan.sh
By default, naabu is used to complete port scanning; use -stats=true to view the scanning progress.
Can I skip port scanning?
```bash
noScan=true ./scan4all -l list.txt -v
# nmap result, default
noScan=true ./scan4all -l nmapRssuilt.xml -v
```

Fast 15000+ POC detection capabilities. PoCs include:
- nuclei POC
## Nuclei Templates Top 10 statistics

| TAG | COUNT | AUTHOR | COUNT | DIRECTORY | COUNT | SEVERITY | COUNT | TYPE | COUNT |
|-----------|-------|---------------|-------|------------------|-------|----------|-------|---------|-------|
| cve | 1430 | daffainfo | 631 | cves | 1407 | info | 1474 | http | 3858 |
| panel | 655 | dhiyaneshdk | 584 | exposed-panels | 662 | high | 1009 | file | 76 |
| edb | 563 | pikpikcu | 329 | vulnerabilities | 509 | medium | 818 | network | 51 |
| lfi | 509 | pdteam | 269 | technologies | 282 | critical | 478 | dns | 17 |
| xss | 491 | geeknik | 187 | exposures | 275 | low | 225 | | |
| wordpress | 419 | dwisiswant0 | 169 | misconfiguration | 237 | unknown | 11 | | |
| exposure | 407 | 0x_akoko | 165 | token-spray | 230 | | | | |
| cve2021 | 352 | princechaddha | 151 | workflows | 189 | | | | |
| rce | 337 | ritikchaddha | 137 | default-logins | 103 | | | | |
| wp-plugin | 316 | pussycat0x | 133 | file | 76 | | | | |

281 directories, 3922 files.
* vscan POC
* vscan POC includes: xray 2.0 300+ POC, go POC, etc.
* scan4all POC

Support 7000+ web fingerprint scanning and identification:
- httpx fingerprint
- vscan fingerprint: including eHoleFinger, localFinger, etc.
- scan4all fingerprint

Support 146 protocols and 90000+ rule port scanning, depending on the protocols and fingerprints supported by nmap.
- Fast HTTP sensitive file detection, with customizable dictionaries
- Landing page detection
- Supports multiple types of input - STDIN/HOST/IP/CIDR/URL/TXT
- Supports multiple output types - JSON/TXT/CSV/STDOUT
- Highly integratable: configurable unified storage of results to Elasticsearch [strongly recommended]
- Smart SSL analysis: in-depth analysis automatically correlates the scanning of domain names in SSL information, such as *.xxx.com, and completes subdomain traversal according to the configuration; the results automatically add the targets to the scanning list. Enable the *.xx.com subdomain traversal function with export EnableSubfinder=true, or adjust it in the configuration file.
- Automatically identifies the case of multiple IPs associated with one domain (DNS) and automatically scans all the associated IPs
- Smart processing: when the IPs of multiple domain names in the list are the same, port scans are merged to improve efficiency; http abnormal pages are handled intelligently, with fingerprint calculation and learning
- Automated supply chain identification, analysis and scanning
- Links the python3 log4j-scan: this version blocks the bug where your target information is passed to the DNS Log Server, to avoid exposing vulnerabilities; it also adds the ability to send results to Elasticsearch in batches. There will be time in the future to implement a golang version.
how to use?
```bash
mkdir ~/MyWork/;cd ~/MyWork/;git clone https://github.com/hktalent/log4j-scan
```
- Intelligently identify honeypots and skip those targets. This function is disabled by default; set EnableHoneyportDetection=true to enable it.
- Highly customizable: define your own dictionaries through the config/config.json configuration, or control more details, including but not limited to nuclei, httpx, naabu, etc.
- Supports HTTP Request Smuggling: CL-TE, TE-CL, TE-TE, CL_CL, BaseErr
- Supports passing a cookie via parameter, e.g. Cookie='PHPSession=xxxx' ./scan4all -host xxxx.com; compatible with nuclei, httpx, go-poc, x-ray POC, filefuzz and HTTP smuggling

work process

how to install

download from Releases

```bash
go install github.com/GhostTroops/scan4all@2.8.9
scan4all -h
```

how to use

Start Elasticsearch; of course, you can also use the traditional way to output results.
```bash
mkdir -p logs data
docker run --restart=always --ulimit nofile=65536:65536 -p 9200:9200 -p 9300:9300 -d --name es \
  -v $PWD/logs:/usr/share/elasticsearch/logs \
  -v $PWD/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml \
  -v $PWD/config/jvm.options:/usr/share/elasticsearch/config/jvm.options \
  -v $PWD/data:/usr/share/elasticsearch/data \
  hktalent/elasticsearch:7.16.2

# Initialize the es index; the result structure of each tool is different, so each is stored separately
./config/initEs.sh

# Search syntax; for more query methods, learn Elasticsearch yourself
# http://127.0.0.1:9200/nmap_index/_doc/_search?q=_id:192.168.0.111
# where 192.168.0.111 is the target to query
```
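As a quick illustration, the same query can be made from Python (a hypothetical sketch; it assumes the Elasticsearch container above is running and the requests package is installed):

```python
import requests

target = "192.168.0.111"  # the scanned target to look up
r = requests.get(
    "http://127.0.0.1:9200/nmap_index/_doc/_search",
    params={"q": f"_id:{target}"},
    timeout=10,
)
r.raise_for_status()
for hit in r.json().get("hits", {}).get("hits", []):
    print(hit["_id"], hit.get("_source", {}))
```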
- Please install nmap by yourself before use
<a href=https://github.com/GhostTroops/scan4all/discussions>Using Help</a>

```bash
go build

# Precise scan of a szUrl list, UrlPrecise=true
UrlPrecise=true ./scan4all -l xx.txt

# Disable adaptation to nmap and use naabu to scan its internally defined http-related ports
priorityNmap=false ./scan4all -tp http -list allOut.txt -v
```
Work Plan

- Integrate web-cache-vulnerability-scanner to realize HTTP smuggling and cache poisoning detection
- Linkage with metasploit-framework: on the premise that the system has it installed, cooperate with tmux to complete the linkage, with the macOS environment as the best practice
- Integrate more fuzzers, such as linking sqlmap
- Integrate chromedp to take screenshots of landing pages, detect front-end landing pages built with pure js and js frameworks, and run the corresponding crawlers (sensitive information detection, page crawling)
- Integrate nmap-go to improve execution efficiency, dynamically parse the result stream, and integrate it into the current task waterfall
- Integrate ksubdomain to achieve faster subdomain blasting
- Integrate spider to find more bugs
- Semi-automatic fingerprint learning to improve accuracy; specify fingerprint name, configure

Q & A

- How to use a Cookie?
- libpcap related questions

more see: discussions

References

- https://www.77169.net/html/312916.html
- https://zhuanlan.zhihu.com/p/636131542
- https://github.com/GhostTroops/scan4all/blob/main/static/Installation.md
- https://github.com/GhostTroops/scan4all/blob/main/static/NicePwn.md
- https://github.com/GhostTroops/scan4all/blob/main/static/running.md
- https://www.google.com/search?client=safari&rls=en&q=%22hktalent%22+%22scan4all%22&ie=UTF-8&oe=UTF-8#ip=1

Thanks

Donors: @freeload101 @b1win0y @BL4CKR4Y
Contributors: https://github.com/GhostTroops/scan4all/graphs/contributors

Changelog

- 2023-10-01 Optimize support for nuclei@latest
- 2022-07-28 Added substr and aes_cbc dsl helper by me nuclei v2.7.7
- 2022-07-20 fix and PR nuclei #2301 Concurrent multi-instance bug
- 2022-07-20 add web cache vulnerability scanner
- 2022-07-19 PR nuclei #2308 add dsl function: substr aes_cbc
- 2022-07-19 Add dcom protocol enumeration of network interfaces
- 2022-06-30 Embedded integrated private version nuclei-templates, a total of 3744 YAML POC; 1. Integrate Elasticsearch to store intermediate results 2. Embed the entire config directory into the program
- 2022-06-27 Optimize fuzzy matching to improve accuracy and robustness; integrate ksubdomain progress
- 2022-06-24 Optimize fingerprint algorithm; add workflow chart
- 2022-06-23 Added parameter ParseSSl to control not deeply analyzing DNS information in SSL and not scanning DNS in SSL by default; Optimization: fix the bug where nmap does not automatically add .exe; optimize the cache file size bug under Windows
- 2022-06-22 Integrated weak password detection and password blasting for 11 protocols: ftp, mongodb, mssql, mysql, oracle, postgresql, rdp, redis, smb, ssh, telnet, and optimized support for plug-in password dictionaries
- 2022-06-20 Integrate Subfinder, domain name blasting, startup parameter export EnableSubfinder=true; note that it is very slow after startup; automatic deep drilling of domain name information in the ssl certificate; allows you to define your own dictionary through the config/config.json configuration, or set the related switch
- 2022-06-17 Optimize the situation where one domain name has multiple IPs: all IPs will be port scanned, and then follow the subsequent scanning process
- 2022-06-15 This version adds several weblogic password dictionaries and webshell dictionaries obtained in past real-world engagements
- 2022-06-10 Complete the integration of the core, including of course the integration of the core templates
- 2022-06-07 Add a similarity algorithm to detect 404 pages
- 2022-06-07 Added HTTP URL list precision-scanning parameters, turned on via the environment variable UrlPrecise=true

Communication group (WeChat, QQ, Tg)

💖 Star Donation
| Wechat Pay | AliPay | Paypal | BTC Pay | BCH Pay |
| --- | --- | --- | --- | --- |
| | | paypal miracletalent@gmail.com | | | | Official repository vuls Scan: 15000+PoCs; 23 kinds of application password crack; 7000+Web fingerprints; 146 protocols and 90000+ rules Port scanning; Fuzz, HW, awesome BugBounty( ͡° ͜ʖ ͡°)... | attack,auto,golang,hacker,tools,nuclei,bugbounty,bugbounty-tools,hacktools,pentest-tool | 28 | 6 | 22 | 543 | 10 | 6 | 2 |
electric-sql/electric | Local-first sync layer for web and mobile apps. Build reactive, realtime, local-first apps directly on Postgres. # ElectricSQL
Sync for modern apps. From the inventors of CRDTs.
## Quick links
- [Website](https://electric-sql.com)
- [Documentation](https://electric-sql.com/docs)
- [Introduction](https://electric-sql.com/docs/intro/local-first)
- [Quickstart](https://electric-sql.com/docs/quickstart)
## What is ElectricSQL?
ElectricSQL is a local-first software platform that makes it easy to develop high-quality, modern apps with instant reactivity, realtime multi-user collaboration and conflict-free offline support.
[Local-first](https://www.inkandswitch.com/local-first/) is a new development paradigm where your app code talks directly to an embedded local database and data syncs in the background via active-active database replication. Because the app code talks directly to a local database, apps feel instant. Because data syncs in the background via active-active replication it naturally supports multi-user collaboration and conflict-free offline.
## How do I use it?
ElectricSQL gives you instant local-first for your Postgres. Think of it like "Hasura for local-first". Drop ElectricSQL onto an existing Postgres-based system and you get instant local-first data synced into your apps.
ElectricSQL then provides a whole developer experience for you to control what data syncs where and to work with it locally in your app code. See the [Introduction](https://electric-sql.com/docs/intro/local-first) and the [Quickstart guide](https://electric-sql.com/docs/quickstart) to get started.
## Repo structure
This is the main repository for the ElectricSQL source code. Key components include:
- [clients/typescript](https://github.com/electric-sql/electric/tree/main/clients/typescript) — TypeScript client that provides SQLite driver adapters, reactivity and a type-safe data access library
- [components/electric](https://github.com/electric-sql/electric/tree/main/components/electric) — Elixir sync service that manages active-active replication between Postgres and SQLite
- [generator](https://github.com/electric-sql/electric/tree/main/generator) — Prisma generator that creates the type safe data access library
- [protocol/satellite.proto](https://github.com/electric-sql/electric/tree/main/protocol/satellite.proto) — Protocol Buffers definition of the Satellite replication protocol
See the Makefiles for test and build instructions and the [e2e](https://github.com/electric-sql/electric/tree/main/e2e) folder for integration tests.
## Team
ElectricSQL was founded by [@thruflo](https://github.com/thruflo) and [@balegas](https://github.com/balegas), under the guidance of:
- [Marc Shapiro](https://lip6.fr/Marc.Shapiro) and [Nuno Preguiça](https://asc.di.fct.unl.pt/~nmp), two of the co-inventors of CRDTs
- [@bieniusa](https://linkedin.com/in/annette-bieniusa-b0807b145), the lead developer of [AntidoteDB](https://www.antidotedb.eu)
- [@josevalim](https://www.linkedin.com/in/josevalim), the creator of the [Elixir](https://elixir-lang.org) programming language
See the [Team](https://electric-sql.com/about/team) and [Literature](https://electric-sql.com/docs/reference/literature) pages for more details.
## Contributing
See the [Community Guidelines](https://github.com/electric-sql/electric/blob/main/CODE_OF_CONDUCT.md) including the [Guide to Contributing](https://github.com/electric-sql/electric/blob/main/CONTRIBUTING.md) and [Contributor License Agreement](https://github.com/electric-sql/electric/blob/main/CLA.md).
## Support
We have an [open community Discord](https://discord.electric-sql.com). Come and say hello and let us know if you have any questions or need any help getting things running.
It's also super helpful if you leave the project a star here at the [top of the page☝️](#start-of-content) | Local-first sync layer for web and mobile apps. Build reactive, realtime, local-first apps directly on Postgres. | local-first,sqlite,elixir,postgres,sql,crdts,offline,crdt | 101 | 41 | 1,123 | 1,182 | 46 | 110 | 11 |
srush/GPU-Puzzles | GPU Puzzles by Sasha Rush - srush_nlp GPU architectures are critical to machine learning, and seem to be
becoming even more important every day. However, you can be an expert
in machine learning without ever touching GPU code. It is hard to gain
intuition working through abstractions. This notebook is an attempt to teach beginner GPU programming in a
completely interactive fashion. Instead of providing text with
concepts, it throws you right into coding and building GPU
kernels. The exercises use NUMBA which directly maps Python
code to CUDA kernels. It looks like Python but is basically
identical to writing low-level CUDA code.
In a few hours, I think you can go from basics to
understanding the real algorithms that power 99% of deep learning
today. If you do want to read the manual, it is here: NUMBA CUDA Guide I recommend doing these in Colab, as it is easy to get started. Be
sure to make your own copy, turn on GPU mode in the settings ( Runtime / Change runtime type , then set Hardware accelerator to GPU ), and
then get to coding. (If you are into this style of puzzle, also check out my Tensor
Puzzles for PyTorch.) Walkthrough Guide python
!pip install -qqq git+https://github.com/danoneata/chalk@srush-patch-1
!wget -q https://github.com/srush/GPU-Puzzles/raw/main/robot.png https://github.com/srush/GPU-Puzzles/raw/main/lib.py python
import numba
import numpy as np
import warnings
from lib import CudaProblem, Coord python
warnings.filterwarnings(
action="ignore", category=numba.NumbaPerformanceWarning, module="numba"
) Puzzle 1: Map Implement a "kernel" (GPU function) that adds 10 to each position of vector a and stores it in vector out . You have 1 thread per position. Warning This code looks like Python but it is really CUDA! You cannot use
standard python tools like list comprehensions or ask for Numpy properties
like shape or size (if you need the size, it is given as an argument).
The puzzles only require doing simple operations, basically
+, *, simple array indexing, for loops, and if statements.
You are allowed to use local variables.
If you get an
error it is probably because you did something fancy :). Tip: Think of the function call as being run 1 time for each thread.
The only difference is that cuda.threadIdx.x changes each time. ```python
def map_spec(a):
return a + 10 def map_test(cuda):
def call(out, a) -> None:
local_i = cuda.threadIdx.x
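# `local_i` is this thread's index within the block (0..SIZE-1); the kernel
# body runs once per thread, so this index can address both `a` and `out`.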
# FILL ME IN (roughly 1 lines) return call SIZE = 4
out = np.zeros((SIZE,))
a = np.arange(SIZE)
problem = CudaProblem(
"Map", map_test, [a], out, threadsperblock=Coord(SIZE, 1), spec=map_spec
)
problem.show()
``` # Map
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0.]
Spec : [10 11 12 13] Puzzle 2 - Zip Implement a kernel that adds together each position of a and b and stores it in out .
You have 1 thread per position. ```python
def zip_spec(a, b):
return a + b def zip_test(cuda):
def call(out, a, b) -> None:
local_i = cuda.threadIdx.x
# FILL ME IN (roughly 1 lines) return call SIZE = 4
out = np.zeros((SIZE,))
a = np.arange(SIZE)
b = np.arange(SIZE)
problem = CudaProblem(
"Zip", zip_test, [a, b], out, threadsperblock=Coord(SIZE, 1), spec=zip_spec
)
problem.show()
``` # Zip
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0.]
Spec : [0 2 4 6] Puzzle 3 - Guards Implement a kernel that adds 10 to each position of a and stores it in out .
You have more threads than positions. ```python
def map_guard_test(cuda):
def call(out, a, size) -> None:
local_i = cuda.threadIdx.x
# FILL ME IN (roughly 2 lines) return call SIZE = 4
out = np.zeros((SIZE,))
a = np.arange(SIZE)
problem = CudaProblem(
"Guard",
map_guard_test,
[a],
out,
[SIZE],
threadsperblock=Coord(8, 1),
spec=map_spec,
)
problem.show()
``` # Guard
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0.]
Spec : [10 11 12 13] Puzzle 4 - Map 2D Implement a kernel that adds 10 to each position of a and stores it in out .
Input a is 2D and square. You have more threads than positions. ```python
def map_2D_test(cuda):
def call(out, a, size) -> None:
local_i = cuda.threadIdx.x
local_j = cuda.threadIdx.y
# FILL ME IN (roughly 2 lines) return call SIZE = 2
out = np.zeros((SIZE, SIZE))
a = np.arange(SIZE * SIZE).reshape((SIZE, SIZE))
problem = CudaProblem(
"Map 2D", map_2D_test, [a], out, [SIZE], threadsperblock=Coord(3, 3), spec=map_spec
)
problem.show()
``` # Map 2D
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0. 0.]
[0. 0.]]
Spec : [[10 11]
[12 13]] Puzzle 5 - Broadcast Implement a kernel that adds a and b and stores it in out .
Inputs a and b are vectors. You have more threads than positions. ```python
def broadcast_test(cuda):
def call(out, a, b, size) -> None:
local_i = cuda.threadIdx.x
local_j = cuda.threadIdx.y
# FILL ME IN (roughly 2 lines) return call SIZE = 2
out = np.zeros((SIZE, SIZE))
a = np.arange(SIZE).reshape(SIZE, 1)
b = np.arange(SIZE).reshape(1, SIZE)
problem = CudaProblem(
"Broadcast",
broadcast_test,
[a, b],
out,
[SIZE],
threadsperblock=Coord(3, 3),
spec=zip_spec,
)
problem.show()
``` # Broadcast
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0. 0.]
[0. 0.]]
Spec : [[0 1]
[1 2]] Puzzle 6 - Blocks Implement a kernel that adds 10 to each position of a and stores it in out .
You have fewer threads per block than the size of a . Tip: A block is a group of threads. The number of threads per block is limited, but we can
have many different blocks. Variable cuda.blockIdx tells us what block we are in. ```python
def map_block_test(cuda):
def call(out, a, size) -> None:
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
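# cuda.blockIdx.x selects the block, cuda.blockDim.x is the number of threads
# per block, and cuda.threadIdx.x is the offset within the block, so `i` is a
# globally unique thread index across all blocks.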
# FILL ME IN (roughly 2 lines) return call SIZE = 9
out = np.zeros((SIZE,))
a = np.arange(SIZE)
problem = CudaProblem(
"Blocks",
map_block_test,
[a],
out,
[SIZE],
threadsperblock=Coord(4, 1),
blockspergrid=Coord(3, 1),
spec=map_spec,
)
problem.show()
``` # Blocks
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0. 0. 0. 0. 0. 0.]
Spec : [10 11 12 13 14 15 16 17 18] Puzzle 7 - Blocks 2D Implement the same kernel in 2D. You have fewer threads per block
than the size of a in both directions. ```python
def map_block2D_test(cuda):
def call(out, a, size) -> None:
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
# FILL ME IN (roughly 4 lines) return call SIZE = 5
out = np.zeros((SIZE, SIZE))
a = np.ones((SIZE, SIZE)) problem = CudaProblem(
"Blocks 2D",
map_block2D_test,
[a],
out,
[SIZE],
threadsperblock=Coord(3, 3),
blockspergrid=Coord(2, 2),
spec=map_spec,
)
problem.show()
``` # Blocks 2D
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0.]]
Spec : [[11. 11. 11. 11. 11.]
[11. 11. 11. 11. 11.]
[11. 11. 11. 11. 11.]
[11. 11. 11. 11. 11.]
[11. 11. 11. 11. 11.]] Puzzle 8 - Shared Implement a kernel that adds 10 to each position of a and stores it in out .
You have fewer threads per block than the size of a . Warning : Each block can only have a constant amount of shared
memory that threads in that block can read and write to. This needs
to be a literal python constant not a variable. After writing to
shared memory you need to call cuda.syncthreads to ensure that
threads do not cross. (This example does not really need shared memory or syncthreads, but it is a demo.) ```python
TPB = 4
def shared_test(cuda):
def call(out, a, size) -> None:
shared = cuda.shared.array(TPB, numba.float32)
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x if i < size:
shared[local_i] = a[i]
cuda.syncthreads()
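# cuda.syncthreads() is a barrier: every thread in the block waits here, so
# all writes to `shared` above are visible to every thread below this point.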
# FILL ME IN (roughly 2 lines)
return call SIZE = 8
out = np.zeros(SIZE)
a = np.ones(SIZE)
problem = CudaProblem(
"Shared",
shared_test,
[a],
out,
[SIZE],
threadsperblock=Coord(TPB, 1),
blockspergrid=Coord(2, 1),
spec=map_spec,
)
problem.show()
``` # Shared
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 1 | 0 | 0 | 1 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0. 0. 0. 0. 0.]
Spec : [11. 11. 11. 11. 11. 11. 11. 11.] Puzzle 9 - Pooling Implement a kernel that sums together the last 3 positions of a and stores it in out .
You have 1 thread per position. You only need 1 global read and 1 global write per thread. Tip: Remember to be careful about syncing. ```python
def pool_spec(a):
out = np.zeros(*a.shape)
for i in range(a.shape[0]):
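# each output is the sum of the window a[i-2..i], clamped at the left edge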
out[i] = a[max(i - 2, 0) : i + 1].sum()
return out TPB = 8
def pool_test(cuda):
def call(out, a, size) -> None:
shared = cuda.shared.array(TPB, numba.float32)
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x
# FILL ME IN (roughly 8 lines) return call SIZE = 8
out = np.zeros(SIZE)
a = np.arange(SIZE)
problem = CudaProblem(
"Pooling",
pool_test,
[a],
out,
[SIZE],
threadsperblock=Coord(TPB, 1),
blockspergrid=Coord(1, 1),
spec=pool_spec,
)
problem.show()
``` # Pooling
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0. 0. 0. 0. 0.]
Spec : [ 0. 1. 3. 6. 9. 12. 15. 18.] Puzzle 10 - Dot Product Implement a kernel that computes the dot-product of a and b and stores it in out .
You have 1 thread per position. You only need 2 global reads and 1 global write per thread. Note: For this problem you don't need to worry about number of shared reads. We will
handle that challenge later. ```python
def dot_spec(a, b):
return a @ b TPB = 8
def dot_test(cuda):
def call(out, a, b, size) -> None:
shared = cuda.shared.array(TPB, numba.float32) i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x
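# One possible approach (a hint, not the only valid one): store the per-thread
# product a[i] * b[i] into `shared`, sync, reduce the shared array to a single
# value, and have one thread write the result to out[0].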
# FILL ME IN (roughly 9 lines)
return call SIZE = 8
out = np.zeros(1)
a = np.arange(SIZE)
b = np.arange(SIZE)
problem = CudaProblem(
"Dot",
dot_test,
[a, b],
out,
[SIZE],
threadsperblock=Coord(SIZE, 1),
blockspergrid=Coord(1, 1),
spec=dot_spec,
)
problem.show()
``` # Dot
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0.]
Spec : 140 Puzzle 11 - 1D Convolution Implement a kernel that computes a 1D convolution between a and b and stores it in out .
You need to handle the general case. You only need 2 global reads and 1 global write per thread. ```python
def conv_spec(a, b):
out = np.zeros(*a.shape)
len = b.shape[0]
for i in range(a.shape[0]):
out[i] = sum([a[i + j] * b[j] for j in range(len) if i + j < a.shape[0]])
return out MAX_CONV = 4
TPB = 8
TPB_MAX_CONV = TPB + MAX_CONV
def conv_test(cuda):
def call(out, a, b, a_size, b_size) -> None:
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x # FILL ME IN (roughly 17 lines)
return call Test 1 SIZE = 6
CONV = 3
out = np.zeros(SIZE)
a = np.arange(SIZE)
b = np.arange(CONV)
problem = CudaProblem(
"1D Conv (Simple)",
conv_test,
[a, b],
out,
[SIZE, CONV],
Coord(1, 1),
Coord(TPB, 1),
spec=conv_spec,
)
problem.show()
``` # 1D Conv (Simple)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0. 0. 0.]
Spec : [ 5. 8. 11. 14. 5. 0.] Test 2 python
out = np.zeros(15)
a = np.arange(15)
b = np.arange(4)
problem = CudaProblem(
"1D Conv (Full)",
conv_test,
[a, b],
out,
[15, 4],
Coord(2, 1),
Coord(TPB, 1),
spec=conv_spec,
)
problem.show() # 1D Conv (Full)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Spec : [14. 20. 26. 32. 38. 44. 50. 56. 62. 68. 74. 80. 41. 14. 0.] Puzzle 12 - Prefix Sum Implement a kernel that computes a sum over a and stores it in out .
If the size of a is greater than the block size, only store the sum of
each block. We will do this using the parallel prefix sum algorithm in shared memory.
That is, each step of the algorithm should sum together half the remaining numbers.
Follow this diagram: ```python
TPB = 8
def sum_spec(a):
out = np.zeros((a.shape[0] + TPB - 1) // TPB)
for j, i in enumerate(range(0, a.shape[-1], TPB)):
out[j] = a[i : i + TPB].sum()
return out def sum_test(cuda):
def call(out, a, size: int) -> None:
cache = cuda.shared.array(TPB, numba.float32)
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x
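# Hint (one classic pattern): after loading into `cache`, repeatedly halve a
# stride; threads with local_i < stride add cache[local_i + stride] into
# cache[local_i], syncing between steps, until cache[0] holds the block total.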
# FILL ME IN (roughly 12 lines) return call Test 1 SIZE = 8
out = np.zeros(1)
inp = np.arange(SIZE)
problem = CudaProblem(
"Sum (Simple)",
sum_test,
[inp],
out,
[SIZE],
Coord(1, 1),
Coord(TPB, 1),
spec=sum_spec,
)
problem.show()
``` # Sum (Simple)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0.]
Spec : [28.] Test 2 python
SIZE = 15
out = np.zeros(2)
inp = np.arange(SIZE)
problem = CudaProblem(
"Sum (Full)",
sum_test,
[inp],
out,
[SIZE],
Coord(2, 1),
Coord(TPB, 1),
spec=sum_spec,
)
problem.show() # Sum (Full)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [0. 0.]
Spec : [28. 77.] Puzzle 13 - Axis Sum Implement a kernel that computes a sum over each column of a and stores it in out . ```python
TPB = 8
def sum_spec(a):
out = np.zeros((a.shape[0], (a.shape[1] + TPB - 1) // TPB))
for j, i in enumerate(range(0, a.shape[-1], TPB)):
out[..., j] = a[..., i : i + TPB].sum(-1)
return out def axis_sum_test(cuda):
def call(out, a, size: int) -> None:
cache = cuda.shared.array(TPB, numba.float32)
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
local_i = cuda.threadIdx.x
batch = cuda.blockIdx.y
# FILL ME IN (roughly 12 lines) return call BATCH = 4
SIZE = 6
out = np.zeros((BATCH, 1))
inp = np.arange(BATCH * SIZE).reshape((BATCH, SIZE))
problem = CudaProblem(
"Axis Sum",
axis_sum_test,
[inp],
out,
[SIZE],
Coord(1, BATCH),
Coord(TPB, 1),
spec=sum_spec,
)
problem.show()
``` # Axis Sum
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0.]
[0.]
[0.]
[0.]]
Spec : [[ 15.]
[ 51.]
[ 87.]
[123.]] Puzzle 14 - Matrix Multiply! Implement a kernel that multiplies square matrices a and b and
stores the result in out . Tip: The most efficient algorithm here will copy a block into
shared memory before computing each of the individual row-column
dot products. This is easy to do if the matrix fits in shared
memory. Do that case first. Then update your code to compute
a partial dot-product and iteratively move the part you
copied into shared memory. You should be able to do the hard case
in 6 global reads. ```python
def matmul_spec(a, b):
return a @ b TPB = 3
def mm_oneblock_test(cuda):
def call(out, a, b, size: int) -> None:
a_shared = cuda.shared.array((TPB, TPB), numba.float32)
b_shared = cuda.shared.array((TPB, TPB), numba.float32) i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
j = cuda.blockIdx.y * cuda.blockDim.y + cuda.threadIdx.y
local_i = cuda.threadIdx.x
local_j = cuda.threadIdx.y
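# Tiling idea: stage one TPB x TPB tile of `a` and of `b` in shared memory,
# sync, accumulate the partial dot products, then slide both tiles along the
# shared dimension and repeat until the full `size` has been covered.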
# FILL ME IN (roughly 14 lines)
return call Test 1 SIZE = 2
out = np.zeros((SIZE, SIZE))
inp1 = np.arange(SIZE * SIZE).reshape((SIZE, SIZE))
inp2 = np.arange(SIZE * SIZE).reshape((SIZE, SIZE)).T problem = CudaProblem(
"Matmul (Simple)",
mm_oneblock_test,
[inp1, inp2],
out,
[SIZE],
Coord(1, 1),
Coord(TPB, TPB),
spec=matmul_spec,
)
problem.show(sparse=True)
``` # Matmul (Simple)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0. 0.]
[0. 0.]]
Spec : [[ 1 3]
[ 3 13]] Test 2 ```python
SIZE = 8
out = np.zeros((SIZE, SIZE))
inp1 = np.arange(SIZE * SIZE).reshape((SIZE, SIZE))
inp2 = np.arange(SIZE * SIZE).reshape((SIZE, SIZE)).T problem = CudaProblem(
"Matmul (Full)",
mm_oneblock_test,
[inp1, inp2],
out,
[SIZE],
Coord(3, 3),
Coord(TPB, TPB),
spec=matmul_spec,
)
problem.show(sparse=True)
``` # Matmul (Full)
Score (Max Per Thread):
| Global Reads | Global Writes | Shared Reads | Shared Writes |
| 0 | 0 | 0 | 0 | python
problem.check() Failed Tests.
Yours: [[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0.]]
Spec : [[ 140 364 588 812 1036 1260 1484 1708]
[ 364 1100 1836 2572 3308 4044 4780 5516]
[ 588 1836 3084 4332 5580 6828 8076 9324]
[ 812 2572 4332 6092 7852 9612 11372 13132]
[ 1036 3308 5580 7852 10124 12396 14668 16940]
[ 1260 4044 6828 9612 12396 15180 17964 20748]
[ 1484 4780 8076 11372 14668 17964 21260 24556]
[ 1708 5516 9324 13132 16940 20748 24556 28364]] | Solve puzzles. Learn CUDA. | cuda,machine-learning,puzzles | 0 | 3 | 8 | 42 | 9 | 1 | 0 |
holoiso-eol/holoiso | All work is being done on: https://github.com/holoiso-staging/releases. This version is completely EOL and is no longer supported. HoloISO SteamOS 3 (Holo) archiso configuration. Yes, Gabe. SteamOS functions well on a toaster. This project attempts to bring the Steam Deck's SteamOS Holo redistribution into a generic, installable format, and provide a close-to-official SteamOS experience.
This project focuses on re-implementing the proprietary (as in, runs-only-on-Deck) components that the Steam client, the OS itself, gamescope and user-created Deck applications rely on, while teaching me Linux in a fun and unique way. Click here to join the HoloISO Telegram update channel. Common Questions Is this official? No, but it may as well be 99% of the way there. Most of the code and packages are straight from Valve, with the fewest possible edits, and the ISO is built from the same rootfs bootstrap that all HoloISO installations run. I have an NVIDIA G- No. Not even questionable. If you have an NVIDIA GPU, you're on your own. The latest Valve updates for the Steam client, including the normal and Jupiter bootstraps, have broken gamepadui on NVIDIA GPUs, so no support will be provided for you. Hardware Support: CPU: - Mostly all CPUs work fine, but people report an inoperable experience on the 7xxx series. (Should be working in later builds with the linux-zen package included) WLAN/PCIe additional cards: - Any pre-2021 WLAN card works fine on Valve's 5.13 Neptune kernel, but linux-zen provides support for ALL current cards Sound: - Everything mostly works fine(tm) GPU: - AMD GPUs with RADV support (Guaranteed to work fully stable. 7xxx requires testing)
- NVIDIA GPUs (Non-functional, but might work. No support will be provided to you, don't ask about it)
- Intel GPUs (Random experience) Progress: Working stuff: - Bootup
- SteamOS OOBE (Steam Deck UI First Boot Experience)
- Deck UI (separate session)
- Deck UI (-gamepadui)
- ~~TDP/FPS limiting~~ (*0)
- Global FSR
- Shader Pre-Caching
- Switch to Desktop from plasma/to plasma without user interference.
- Valve's exclusive Vapor* appearance for KDE Plasma
- Steam Deck pacman mirrors
- Cool-looking neofetch?
- System updates Working stuff on Steam Deck compared to other distributions: - Dock Firmware updater (additionally installable in desktop by running sudo pacman -S jupiter-dock-updater-bin)
- Steam Deck BIOS, Controller firmware, OS firmware updater, support for thumbstick and haptic motor calibration, native amplifier (CS35L41) support
- New fan curve control
- TDP/Clock control (*0) Disabled for ALL systems except for Steam Deck (Valve Jupiter 1) due to VERY LOW hardcoded TDP/Clock values, especially for dGPUs. Installation process: Prerequistes: - 4GB flash drive
- More than 8 GB RAM if you plan to use "Copy-To-RAM" option to install
- AMD GPU that supports RADV Drivers instead of Radeon (Southern Islands and Sea Islands require additional kernel cmdline property)
- UEFI-enabled device
- Disabled secure boot Installation: - Flash the ISO from releases using BalenaEtcher , Rufus with DD mode, or by typing sudo dd if=SteamOS.iso of=/dev/sd(your flash drive) bs=4M status=progress oflag=sync , or by simply throwing ISO into Ventoy drive
- Boot into ISO
- Click on "Install SteamOS on this device"
- Follow on-screen instructions
- Take your favourite hot beverage, and wait 'till it installs :3 Upon booting, you'll be greeted with Steam Deck's OOBE screen, from where you'll connect to your network, and login to your Steam account, from there, you can exit to KDE Plasma seamlessly by choosing Switch to desktop in the power menu, like so . Screenshots: Credits: (Too much people xD, to be filled later!!!) Notes: This configuration includes Valve's pacman.conf repositories, holoinstall script and holoinstall post-installation binaries. This configuration builds a releng-based ISO , which is the default Arch Linux redistribution flavor. | SteamOS 3 (Holo) archiso configuration | [] | 27 | 17 | 42 | 291 | 0 | 9 | 0 |
sigoden/dufs | Dufs Dufs is a distinctive utility file server that supports static serving, uploading, searching, accessing control, webdav... Features Serve static files Download folder as zip file Upload files and folders (Drag & Drop) Create/Edit/Search files Resumable/partial uploads/downloads Access control Support https Support webdav Easy to use with curl Install With cargo cargo install dufs With docker docker run -v `pwd`:/data -p 5000:5000 --rm sigoden/dufs /data -A With Homebrew brew install dufs Binaries on macOS, Linux, Windows Download from Github Releases , unzip and add dufs to your $PATH. CLI ```
Dufs is a distinctive utility file server - https://github.com/sigoden/dufs Usage: dufs [OPTIONS] [serve-path] Arguments:
[serve-path] Specific path to serve [default: .] Options:
-c, --config Specify configuration file
-b, --bind Specify bind address or unix socket
-p, --port Specify port to listen on [default: 5000]
--path-prefix Specify a path prefix
--hidden Hide paths from directory listings, e.g. tmp, .log, .lock
-a, --auth Add auth roles, e.g. user:pass@/dir1:rw,/dir2
-A, --allow-all Allow all operations
--allow-upload Allow upload files/folders
--allow-delete Allow delete files/folders
--allow-search Allow search files/folders
--allow-symlink Allow symlink to files/folders outside root directory
--allow-archive Allow zip archive generation
--enable-cors Enable CORS, sets Access-Control-Allow-Origin: * --render-index Serve index.html when requesting a directory, returns 404 if not found index.html
--render-try-index Serve index.html when requesting a directory, returns directory listing if not found index.html
--render-spa Serve SPA(Single Page Application)
--assets Set the path to the assets directory for overriding the built-in assets
--log-format Customize http log format
--log-file Specify the file to save logs to, other than stdout/stderr
--compress Set zip compress level [default: low] [possible values: none, low, medium, high]
--completions Print shell completion script for [possible values: bash, elvish, fish, powershell, zsh]
--tls-cert Path to an SSL/TLS certificate to serve with HTTPS
--tls-key Path to the SSL/TLS certificate's private key
-h, --help Print help
-V, --version Print version
``` Examples Serve current working directory in read-only mode dufs Allow all operations like upload/delete/search/create/edit... dufs -A Only allow upload operation dufs --allow-upload Serve a specific directory dufs Downloads Serve a single file dufs linux-distro.iso Serve a single-page application like react/vue dufs --render-spa Serve a static website with index.html dufs --render-index Require username/password dufs -a admin:123@/:rw Listen on specific host:ip dufs -b 127.0.0.1 -p 80 Listen on unix socket dufs -b /tmp/dufs.socket Use https dufs --tls-cert my.crt --tls-key my.key API Upload a file sh
curl -T path-to-file http://127.0.0.1:5000/new-path/path-to-file Download a file sh
curl http://127.0.0.1:5000/path-to-file # download the file
curl http://127.0.0.1:5000/path-to-file?hash # retrieve the sha256 hash of the file Download a folder as zip file sh
curl -o path-to-folder.zip http://127.0.0.1:5000/path-to-folder?zip Delete a file/folder sh
curl -X DELETE http://127.0.0.1:5000/path-to-file-or-folder Create a directory sh
curl -X MKCOL http://127.0.0.1:5000/path-to-folder Move the file/folder to the new path sh
curl -X MOVE http://127.0.0.1:5000/path -H "Destination: http://127.0.0.1:5000/new-path" List/search directory contents sh
curl http://127.0.0.1:5000?q=Dockerfile # search for files, similar to `find -name Dockerfile`
curl http://127.0.0.1:5000?simple # output names only, similar to `ls -1`
curl http://127.0.0.1:5000?json # output paths in json format With authorization (Both basic or digest auth works) sh
curl http://127.0.0.1:5000/file --user user:pass # basic auth
curl http://127.0.0.1:5000/file --user user:pass --digest # digest auth Resumable downloads sh
curl -C- -o file http://127.0.0.1:5000/file Resumable uploads sh
upload_offset=$(curl -I -s http://127.0.0.1:5000/file | tr -d '\r' | sed -n 's/content-length: //p')
dd skip=$upload_offset if=file status=none ibs=1 | \
curl -X PATCH -H "X-Update-Range: append" --data-binary @- http://127.0.0.1:5000/file
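The same flow is easy to drive from a script. The following is a minimal Python sketch of the resumable upload above, using the third-party `requests` library (an assumption — any HTTP client works); the server address and file name are placeholders:

```python
import requests

BASE = "http://127.0.0.1:5000"  # placeholder dufs address
URL = f"{BASE}/file"            # placeholder remote path

# Ask the server how many bytes it already has (mirrors the curl -I call)
offset = int(requests.head(URL).headers.get("content-length", 0))

# Append the remaining bytes (mirrors the dd | curl -X PATCH pipeline)
with open("file", "rb") as f:
    f.seek(offset)
    resp = requests.patch(URL, data=f, headers={"X-Update-Range": "append"})
    resp.raise_for_status()
```

Advanced topics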
Dufs supports account based access control. You can control who can do what on which path with `--auth`/`-a`.
```
dufs -a admin:admin@/:rw -a guest:guest@/
dufs -a user:pass@/:rw,/dir1 -a @/
```
1. Use `@` to separate the account and paths. No account means anonymous user.
2. Use `:` to separate the username and password of the account.
3. Use `,` to separate paths.
4. Use path suffix `:rw`/`:ro` set permissions: `read-write`/`read-only`. `:ro` can be omitted.
- `-a admin:admin@/:rw`: `admin` has complete permissions for all paths.
- `-a guest:guest@/`: `guest` has read-only permissions for all paths.
- `-a user:pass@/:rw,/dir1`: `user` has read-write permissions for `/*`, has read-only permissions for `/dir1/*`.
- `-a @/`: All paths are publicly accessible; everyone can view/download them.
> There are no restrictions on using ':' and '@' characters in a password. For example, `user:pa:ss@1@/:rw` is valid, the password is `pa:ss@1`.
#### Hashed Password
DUFS supports the use of sha-512 hashed password.
Create hashed password
```
$ mkpasswd -m sha-512 -s
Password: 123456
$6$tWMB51u6Kb2ui3wd$5gVHP92V9kZcMwQeKTjyTRgySsYJu471Jb1I6iHQ8iZ6s07GgCIO69KcPBRuwPE5tDq05xMAzye0NxVKuJdYs/
```
Use hashed password
```
dufs -a 'admin:$6$tWMB51u6Kb2ui3wd$5gVHP92V9kZcMwQeKTjyTRgySsYJu471Jb1I6iHQ8iZ6s07GgCIO69KcPBRuwPE5tDq05xMAzye0NxVKuJdYs/@/:rw'
```
Two important things for hashed passwords:
1. Dufs only supports sha-512 hashed passwords, so ensure that the password string always starts with `$6$`.
2. Digest authentication does not function properly with hashed passwords.
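If `mkpasswd` is unavailable, an equivalent sha-512 crypt hash can be produced from Python's standard library — a sketch only; note that the `crypt` module is Unix-only and deprecated in recent Python releases:

```python
import crypt

# The "$6$" prefix of the result marks a sha-512 crypt hash, as dufs requires
hashed = crypt.crypt("123456", crypt.mksalt(crypt.METHOD_SHA512))
print(hashed)
```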
### Hide Paths
Dufs supports hiding paths from directory listings via the `--hidden` option, which takes a comma-separated list of glob patterns.
```
dufs --hidden .git,.DS_Store,tmp
```
> The glob used in --hidden only matches file and directory names, not paths. So `--hidden dir1/file` is invalid.
```sh
dufs --hidden '.*' # hidden dotfiles
dufs --hidden '*/' # hidden all folders
dufs --hidden '*.log,*.lock' # hidden by exts
dufs --hidden '*.log' --hidden '*.lock'
```
### Log Format
Dufs supports customizing the HTTP log format via the option `--log-format`.
The log format can use the following variables.
| variable | description |
| ------------ | ------------------------------------------------------------------------- |
| $remote_addr | client address |
| $remote_user | user name supplied with authentication |
| $request | full original request line |
| $status | response status |
| $http_ | arbitrary request header field. examples: $http_user_agent, $http_referer |
The default log format is `'$remote_addr "$request" $status'`.
```
2022-08-06T06:59:31+08:00 INFO - 127.0.0.1 "GET /" 200
```
Disable http log
```
dufs --log-format=''
```
Log user-agent
```
dufs --log-format '$remote_addr "$request" $status $http_user_agent'
```
```
2022-08-06T06:53:55+08:00 INFO - 127.0.0.1 "GET /" 200 Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36
```
Log remote-user
```
dufs --log-format '$remote_addr $remote_user "$request" $status' -a admin:admin@/ -a user1:pass1@/folder1
```
```
2022-08-06T07:04:37+08:00 INFO - 127.0.0.1 admin "GET /" 200
```
## Environment variables
All options can be set using environment variables prefixed with `DUFS_`.
```
[serve-path] DUFS_SERVE_PATH="."
--config DUFS_CONFIG=config.yaml
-b, --bind DUFS_BIND=0.0.0.0
-p, --port DUFS_PORT=5000
--path-prefix DUFS_PATH_PREFIX=/dufs
--hidden DUFS_HIDDEN=tmp,*.log,*.lock
-a, --auth DUFS_AUTH="admin:admin@/:rw|@/"
-A, --allow-all DUFS_ALLOW_ALL=true
--allow-upload DUFS_ALLOW_UPLOAD=true
--allow-delete DUFS_ALLOW_DELETE=true
--allow-search DUFS_ALLOW_SEARCH=true
--allow-symlink DUFS_ALLOW_SYMLINK=true
--allow-archive DUFS_ALLOW_ARCHIVE=true
--enable-cors DUFS_ENABLE_CORS=true
--render-index DUFS_RENDER_INDEX=true
--render-try-index DUFS_RENDER_TRY_INDEX=true
--render-spa DUFS_RENDER_SPA=true
--assets DUFS_ASSETS=./assets
--log-format DUFS_LOG_FORMAT=""
--log-file DUFS_LOG_FILE=./dufs.log
--compress DUFS_COMPRESS=low
--tls-cert DUFS_TLS_CERT=cert.pem
--tls-key DUFS_TLS_KEY=key.pem
```
## Configuration File
You can specify and use a configuration file via the option `--config <path>`.
The following are the configuration items:
```yaml
serve-path: '.'
bind: 0.0.0.0
port: 5000
path-prefix: /dufs
hidden:
- tmp
- '*.log'
- '*.lock'
auth:
- admin:admin@/:rw
- user:pass@/src:rw,/share
- '@/' # According to the YAML spec, quoting is required.
allow-all: false
allow-upload: true
allow-delete: true
allow-search: true
allow-symlink: true
allow-archive: true
enable-cors: true
render-index: true
render-try-index: true
render-spa: true
assets: ./assets/
log-format: '$remote_addr "$request" $status $http_user_agent'
log-file: ./dufs.log
compress: low
tls-cert: tests/data/cert.pem
tls-key: tests/data/key_pkcs1.pem
```
### Customize UI
Dufs allows users to customize the UI with your own assets.
```
dufs --assets my-assets-dir/
```
Your assets folder must contain an `index.html` file.
`index.html` can use the following placeholder variables to retrieve internal data.
- `__INDEX_DATA__`: directory listing data
- `__ASSETS_PREFIX__`: assets url prefix License Copyright (c) 2022-2024 dufs-developers. dufs is made available under the terms of either the MIT License or the Apache License 2.0, at your option. See the LICENSE-APACHE and LICENSE-MIT files for license details. | A file server that supports static serving, uploading, searching, accessing control, webdav... | cloud-disk,command-line,file-upload-server,static-server,rust,webdav,webdav-server,file-sharing | 51 | 21 | 188 | 364 | 1 | 1 | 2 |
bensadeh/tailspin | A log file highlighter Features 🪵 View (or tail ) any log file of any format 🍰 No setup or config required 🌈 Highlights numbers, dates, IP-addresses, UUIDs, URLs and more ⚙️ All highlight groups are customizable 🧬 Easy to integrate with other commands 🔍 Uses less under the hood for scrollback, search and filtering Table of Contents Overview Usage Installing Highlight Groups Watching folders Customizing Highlight Groups Working with stdin and stdout Using the pager less Settings Overview tailspin works by reading through a log file line by line, running a series of regexes
against each line. The regexes recognize patterns you expect to find in a logfile, like dates, numbers, severity
keywords and more. tailspin does not make any assumptions on the format or position of the items it wants to highlight. For this reason,
it requires no configuration and the highlighting will work consistently across different logfiles. Usage The binary name for tailspin is tspin . ```console Read from file and view in less tspin application.log Read from file and print to stdout tspin application.log --print Read from stdin and print to stdout echo "2021-01-01 12:00:00 [INFO] This is a log message" | tspin Read from another command and print to stdout kubectl logs [pod name] --follow | tspin
``` Installing Package Managers ```console Homebrew brew install tailspin Cargo cargo install tailspin Archlinux pacman -S tailspin Nix nix-shell -p tailspin NetBSD pkgin install tailspin FreeBSD pkg install tailspin
``` From Source console
cargo install --path . Binary will be placed in ~/.cargo/bin , make sure you add the folder to your PATH environment variable. [!IMPORTANT]
When building from source, make sure that you are using the latest version
of less . Highlight Groups Dates Keywords URLs Numbers IP Addresses Quotes Unix file paths HTTP methods UUIDs Key-value pairs Pointer addresses Unix processes Watching folders tailspin can listen for newline entries in a given folder. Watching folders is useful for monitoring log files that
are rotated. When watching folders, tailspin will start in follow mode (abort with Ctrl + C ) and will only print
newline entries which arrive after the initial start. Customizing Highlight Groups Overview Create config.toml in ~/.config/tailspin to customize highlight groups. Styles have the following shape: toml
style = { fg = "color", bg = "color", italic = false, bold = false, underline = false } To edit the different highlight groups, include them in your config.toml file. For example, to edit the date highlight group, add the following to your config.toml : toml
[date]
style = { fg = "green" } Expand the section below to see the default config for the highlight groups: Default highlight groups settings ```toml
[date]
number = { fg = "magenta" }
separator = { faint = true }
[date_word] # e.g. "Jan 01", "Mon Feb 28"
day = { fg = "magenta" }
month = { fg = "magenta" }
number = { fg = "magenta" }
[time]
time = { fg = "blue" }
zone = { fg = "red" }
separator = { faint = true }
[[keywords]]
words = ['null', 'true', 'false']
style = { fg = "red", italic = true }
[[keywords]]
words = ['GET']
style = { fg = "black", bg = "green" }
border = true
[url]
http = { fg = "red", faint = true }
https = { fg = "green", faint = true }
host = { fg = "blue", faint = true }
path = { fg = "blue" }
query_params_key = { fg = "magenta" }
query_params_value = { fg = "cyan" }
symbols = { fg = "red" }
[number]
style = { fg = "cyan" }
[ip]
number = { fg = "blue", italic = true }
letter = { fg = "magenta", italic = true }
separator = { fg = "red" }
[quotes]
style = { fg = "yellow" }
token = '"'
[path]
segment = { fg = "green", italic = true }
separator = { fg = "yellow" }
[uuid]
number = { fg = "blue", italic = true }
letter = { fg = "magenta", italic = true }
separator = { fg = "red" }
[pointer]
number = { fg = "blue", italic = true }
letter = { fg = "magenta", italic = true }
separator = { fg = "red" }
[key_value]
key = { faint = true }
separator = { fg = "white" }
[process]
name = { fg = "green" }
separator = { fg = "red" }
id = { fg = "yellow" }
``` Disabling Highlight Groups To disable a highlight group, set the disabled field for that group to true: toml
[date]
disabled = true Adding Keywords via config.toml To add custom keywords, either include them in the list of keywords or add new entries: ```toml
[[keywords]]
words = ['MyCustomKeyword']
style = { fg = "green" } [[keywords]]
words = ['null', 'true', 'false']
style = { fg = "red", italic = true }
``` Adding Keywords from the command line Sometimes it is more convenient to add highlight groups on the fly without having to edit a TOML. To add highlights from
the command line, use the --words-[red|green|yellow|blue|magenta|cyan] flag followed by a comma-separated list
of words to be highlighted. Custom regex highlighters When you need more control over the highlighting, you can use the regex highlighter. This highlighter allows you to
specify a regex and a style to be applied to the matched text. It supports one capture group `()`. When a capture group is present, the style is applied to the captured text. toml
[[regexps]]
regular_expression = 'Started (.*)\.'
style = { fg = "red" } Working with stdin and stdout By default, tailspin will open a file in the pager less . However, if you pipe something into tailspin , it will
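Under the hood this is ordinary capture-group substitution. The sketch below is not tailspin's actual code, just a Python illustration of the idea: the rule above is split into three groups so the surrounding text can be re-emitted unchanged while only the captured part is wrapped in ANSI color codes:

```python
import re
import sys

RED, RESET = "\x1b[31m", "\x1b[0m"
# Same rule as the TOML above, with extra groups around the fixed parts
pattern = re.compile(r"(Started )(.*)(\.)")

for line in sys.stdin:
    # Only group 2 -- the captured text -- is wrapped in color codes
    sys.stdout.write(pattern.sub(rf"\1{RED}\2{RESET}\3", line))
```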
print the highlighted output directly to stdout . This is similar to running tspin [file] --print . To let tailspin highlight the logs of different commands, you can pipe the output of those commands into tailspin like so: console
journalctl -f | tspin
cat /var/log/syslog | tspin
kubectl logs -f pod_name | tspin Using the pager less Overview tailspin uses less as its pager to view the highlighted log files. You can get more info on less via the man command ( man less ) or by hitting the h button to access the help screen. Navigating Navigating within less uses a set of keybindings that may be familiar to users of vim or other vi -like
editors. Here's a brief overview of the most useful navigation commands: j / k : Scroll one line up / down d / u : Scroll one half-page up / down g / G : Go to the top / bottom of the file Follow mode When you run tailspin with the -f or --follow flag, it will scroll to the bottom and print new lines to the screen
as they're added to the file. To stop following the file, interrupt with Ctrl + C . This will stop the tailing, but keep the
file open, allowing you to review the existing content. To resume following the file from within less , press Shift + F . Search Use / followed by your search query. For example, /ERROR finds the first occurrence of ERROR . After the search, n finds the next instance, and N finds the previous instance. Filtering less allows filtering lines by a keyword, using & followed by the pattern. For instance, &ERROR shows
only lines with ERROR . To only show lines containing either ERROR or WARN , use a regular expression: &\(ERROR\|WARN\) . To clear the filter, use & with no pattern. Settings console
-f, --follow Follow the contents of the file
-e, --start-at-end Start at the end of the file
-p, --print Print the output to stdout
-c, --listen-command [CMD] Listen the output (stdout) of the provided command
--config-path [PATH] Use the configuration file from the provided path
--words-[COLOR] [WORDS] Highlight the provided words with the given color
--disable-builtin-keywords Disable the highlighting of all builtin groups
--disable-booleans Disable the highlighting of booleans and nulls
--disable-severity Disable the highlighting of severity levels
--disable-rest Disable the highlighting of REST verbs | 🌀 A log file highlighter | tail,log,syntax-highlighting,less,follow,colors,logfile,file,log-file,colorizer | 19 | 11 | 83 | 676 | 6 | 1 | 3 |
simeydotme/pokemon-cards-css | Pokémon Cards Holographic effect in CSS This is a repository holder for the Pokemon Cards CSS Holographic effect. 🔥 As seen on css-tricks.com and codepen 🌟 Demo running @ https://poke-holo.simey.me/ A collection of advanced CSS styles, applied with SvelteJS. Uses CSS Transforms, Gradients, Blend-modes and Filters to simulate the various Holofoil effects found
in the Sword and Shield era of Pokemon Trading Cards. support / tip If you think this is super cool, or useful, and want to donate a little, then you are also super cool! | | | |
|--|--:|---------|
| | | £1 tip |
| | | £5 tip |
| | | £10 tip | attribution - Galaxy Holo from aschefield101 - Some backgrounds from Vecteezy | A collection of advanced CSS styles to create realistic-looking effects for the faces of Pokemon cards. | blend-modes,css,filter,gradient,holo,holographic,pokemon,svelte,sveltejs,transform | 0 | 6 | 14 | 124 | 6 | 5 | 0 |
mulaRahul/keyviz | Keyviz is free and open-source software to visualise your keystrokes and mouse actions in real time! Let your audience know what handy shortcuts/keys you're pressing during screencasts, presentations, collaborations, or whenever you need it. English | 简体中文 ⌨️ Keystrokes & 🖱️ Mouse Actions Now you can visualize mouse actions! Not only mouse clicks, you can also visualize mouse actions along with keystrokes like Cmd + Click , Alt + Drag , etc. 🎨 Stylize Don't restrain yourself to just black & white! You can customize every aspect of the visualization. The visualisation's style, size, colour (modifier and regular keys), border, icon, etc. Powerful and easy-to-use configuration options. Filter normal keys and only display shortcuts like Cmd + K (Default) Adjust the visualisation position on the screen Decide how much the visualisation lingers on the screen before animating out Switch between animation presets to animate your visualisation in & out 📥 Installation You can download the latest version of keyviz from the Github Releases page. For the installer, unzip the downloaded file, run the installer and follow the familiar steps to install keyviz. Below are the platform-specific options and requirements - 🪟 Windows
You can download keyviz directly from the [microsoft store](https://apps.microsoft.com/detail/Keyviz/9phzpj643p7l?mode=direct).
### 🥄 Scoop
```bash
scoop bucket add extras # first, add the bucket
scoop install keyviz
```
### 📦 Winget
```bash
winget install mulaRahul.Keyviz
``` *.dll missing error? If you're getting a `.dll` missing error after installing the application, you're missing the required Visual C++ redistributables. You can get the same from here [VSC++ Redist](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist?view=msvc-170). 🍎 MacOS ### 🔒 Permission
Keyviz requires **Input Monitoring** and **Accessibility** permissions. Enable the same in settings - ```
Settings > Privacy & Security > Input Monitoring/Accessibility
``` 🐧 Linux ### ❗ v2.x.x Requirements
```bash
sudo apt-get install libayatana-appindicator3-dev
```
or
```bash
sudo apt-get install appindicator3-0.1 libappindicator3-dev
``` 🛠️ Build Instructions You can always further develop/build the project by yourself. First of all ensure that you've setup Flutter on your system. If not follow this guide . After setting up flutter, clone the repository if you have git installed or download the zip and unpack the same. bash
mkdir keyviz
cd keyviz
git clone https://github.com/mulaRahul/keyviz.git . Move inside the flutter project and run the build command to create an executable - bash
flutter build windows 💖 Support As keyviz is freeware, the only way I can earn is through your generous donations. It helps free my time and work more on keyviz. | Keyviz is a free and open-source tool to visualize your keystrokes ⌨️ and 🖱️ mouse actions in real-time. | flutter,keystroke,flutter-desktop,keypress | 6 | 8 | 111 | 221 | 43 | 5 | 1 |
nxtrace/NTrace-core | NextTrace An open source visual routing tool that pursues light weight, developed using Golang. HomePage: www.nxtrace.org IAAS Sponsor We are extremely grateful to DMIT , Misaka and SnapStack for providing the network infrastructure that powers this project. How To Use Document Language: English | 简体中文 ⚠️ Please note: We welcome PR submissions from the community, but please submit your PRs to the NTrace-V1 repository instead of NTrace-core repository. Regarding the NTrace-V1 and NTrace-core repositories: Both will largely remain consistent with each other. All development work is done within the NTrace-V1 repository. The NTrace-V1 repository releases new versions first. After running stably for an undetermined period, we will synchronize that version to NTrace-core. This means that the NTrace-V1 repository serves as a "beta" or "testing" version. Please note, there are exceptions to this synchronization. If a version of NTrace-V1 encounters a serious bug, NTrace-core will skip that flawed version and synchronize directly to the next version that resolves the issue. Automated Install Linux One-click installation script shell
curl nxtrace.org/nt |bash * Arch Linux AUR installation command
* Directly download bin package (only supports amd64) ```shell
yay -S nexttrace-bin
```
* The AUR builds are maintained by ouuan Linuxbrew's installation command Same as the macOS Homebrew's installation method (homebrew-core version only supports amd64)
* Deepin installation command shell
apt install nexttrace * Termux installation command shell
pkg install nexttrace-enhanced macOS macOS Homebrew's installation command Homebrew-core version shell
brew install nexttrace * This repository's ACTIONS automatically built version (updates faster) shell
brew tap nxtrace/nexttrace && brew install nxtrace/nexttrace/nexttrace * The homebrew-core build is maintained by chenrui333, please note that this version's updates may lag behind the repository Action automatically version Windows Windows Scoop installation command Scoop-extras version powershell
scoop bucket add extras && scoop install extras/nexttrace Scoop-extra is maintained by soenggam Please note, the repositories for all of the above installation methods are maintained by open source enthusiasts. Availability and timely updates are not guaranteed. If you encounter problems, please contact the repository maintainer to solve them, or use the binary packages provided by the official build of this project. Manual Install Download the precompiled executable For users not covered by the above methods, please go directly to Release to download the compiled binary executable. Release provides compiled binary executables for many systems and different architectures. If none are available, you can compile it yourself. Some essential dependencies of this project are not fully implemented on Windows by Golang , so currently, NextTrace is in an experimental support phase on the Windows platform. Install from source After installing Go >= 1.20 yourself, you can use the following command to install shell
go install github.com/nxtrace/NTrace-core@latest Because of version-constraint conflicts, you cannot install NTrace-V1 this way. After installation, the executable is in the $GOPATH/bin directory. If you have not set GOPATH , it is in the $HOME/go/bin directory.
The binary file name is consistent with the project name. You need to replace the nexttrace command below with NTrace-core .
If you want to be consistent with the commands below, you can rename the binary after executing the go install command shell
mv $GOPATH/bin/NTrace-core $GOPATH/bin/nexttrace Get Started NextTrace uses the ICMP protocol to perform TraceRoute requests by default, which supports both IPv4 and IPv6 ```bash IPv4 ICMP Trace nexttrace 1.0.0.1 URL nexttrace http://example.com:8080/index.html?q=1 Form printing nexttrace --table 1.0.0.1 An Output Easy to Parse nexttrace --raw 1.0.0.1
nexttrace --json 1.0.0.1 IPv4/IPv6 Resolve Only, and automatically select the first IP when there are multiple IPs nexttrace --ipv4 g.co
nexttrace --ipv6 g.co IPv6 ICMP Trace nexttrace 2606:4700:4700::1111 Disable Path Visualization With the -M parameter nexttrace koreacentral.blob.core.windows.net MapTrace URL: https://api.nxtrace.org/tracemap/html/c14e439e-3250-5310-8965-42a1e3545266.html Disable MPLS display using the --disable-mpls / -e parameter or the NEXTTRACE_DISABLEMPLS environment variable nexttrace --disable-mpls example.com
export NEXTTRACE_DISABLEMPLS=1
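# For scripting, the --raw and --json outputs above are easy to post-process,
# e.g. with jq (assuming jq is installed; the JSON field names are not
# documented here, so inspect the output first):
# nexttrace --json 1.0.0.1 | jq .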
``` PS: The routing visualization drawing module was written by @tsosunchia , and the specific code can be viewed at tsosunchia/traceMap . Note that in LeoMoeAPI 2.0, due to the addition of geographical location data, we have deprecated the online query part of the OpenStreetMap API in the traceMap plugin and are using location information from our own database . The routing visualization function requires the geographical coordinates of each Hop, but third-party APIs generally do not provide this information, so this function is currently only supported when used with LeoMoeAPI. NextTrace now supports quick testing, and friends who have a one-time backhaul routing test requirement can use it ```bash IPv4 ICMP Fast Test (Beijing + Shanghai + Guangzhou + Hangzhou) in China Telecom / Unicom / Mobile / Education Network nexttrace --fast-trace You can also use TCP SYN for testing nexttrace --fast-trace --tcp You can also quickly test through a customized IP/DOMAIN list file nexttrace --file /path/to/your/iplist.txt CUSTOMIZED IP DOMAIN LIST FILE FORMAT One IP/DOMAIN per line + space + description information (optional) forExample: 106.37.67.1 BEIJING-TELECOM 240e:928:101:31a::1 BEIJING-TELECOM bj.10086.cn BEIJING-MOBILE 2409:8080:0:1::1 223.5.5.5 ``` NextTrace already supports route tracing for specified Network Devices ```bash Use eth0 network interface nexttrace --dev eth0 2606:4700:4700::1111 Use eth0 network interface's IP When using the network interface's IP for route tracing, note that the IP type to be traced should be the same as network interface's IP type (e.g. both IPv4) nexttrace --source 204.98.134.56 9.9.9.9
``` NextTrace can also use TCP and UDP protocols to perform Traceroute requests, but UDP protocols only supports IPv4 now ```bash TCP SYN Trace nexttrace --tcp www.bing.com You can specify the port by yourself [here is 443], the default port is 80 nexttrace --tcp --port 443 2001:4860:4860::8888 UDP Trace nexttrace --udp 1.0.0.1 nexttrace --udp --port 53 1.0.0.1
``` NextTrace also supports some advanced functions, such as ttl control, concurrent probe packet count control, mode switching, etc. ```bash Send 2 probe packets per hop nexttrace --queries 2 www.hkix.net No concurrent probe packets, only one probe packet is sent at a time nexttrace --parallel-requests 1 www.hkix.net Start Trace with TTL of 5, end at TTL of 10 nexttrace --first 5 --max-hops 10 www.decix.net In addition, an ENV is provided to set whether to hide the destination IP export NEXTTRACE_ENABLEHIDDENDSTIP=1 Turn off the IP reverse parsing function nexttrace --no-rdns www.bbix.net Set the payload size to 1024 bytes nexttrace --psize 1024 example.com Set the payload size and DF flag for TCP Trace nexttrace --psize 1024 --dont-fragment --tcp example.com Feature: print Route-Path diagram Route-Path diagram example: AS6453 Tata Communication「Singapore『Singapore』」 ╭╯ ╰AS9299 Philippine Long Distance Telephone Co.「Philippines『Metro Manila』」 ╭╯ ╰AS36776 Five9 Inc.「Philippines『Metro Manila』」 ╭╯ ╰AS37963 Aliyun「ALIDNS.COM『ALIDNS.COM』」 nexttrace --route-path www.time.com.my Disable color output nexttrace --nocolor 1.1.1.1 or use ENV export NO_COLOR=1
``` NextTrace supports users to select their own IP API (currently supports: LeoMoeAPI , IP.SB , IPInfo , IPInsight , IPAPI.com , Ip2region , IPInfoLocal , CHUNZHEN ) ```bash You can specify the IP database by yourself [IP-API.com here], if not specified, LeoMoeAPI will be used nexttrace --data-provider ip-api.com Note There are frequency limits for free queries of the ipinfo and IPInsight APIs. You can purchase services from these providers to remove the limits If necessary, you can clone this project, add the token provided by ipinfo or IPInsight and compile it yourself Note For the offline database IPInfoLocal, please download it manually and rename it to ipinfoLocal.mmdb. (You can download it from here: https://ipinfo.io/signup?ref=free-database-downloads) For the offline database Ip2region, you can download it manually and rename it to ip2region.db, or let NextTrace download it automatically Fill the token to: ipgeo/tokens.go Please be aware: Due to the serious abuse of IP.SB, you will often not be able to query IP data from this source IP-API.com has a stricter restriction on API calls, if you can't query IP data from this source, please try again in a few minutes The Chunzhen IP database defaults to using http://127.0.0.1:2060 as the query interface. To customize it, please use environment variables export NEXTTRACE_CHUNZHENURL=http://127.0.0.1:2060 You can use https://github.com/freshcn/qqwry to build your own Chunzhen IP database service You can also specify the default IP database by setting an environment variable export NEXTTRACE_DATAPROVIDER=ipinfo
``` NextTrace supports mixed parameters and shortened parameters ```bash
Example:
nexttrace --data-provider IPAPI.com --max-hops 20 --tcp --port 443 --queries 5 --no-rdns 1.1.1.1
nexttrace --tcp --queries 2 --parallel-requests 1 --table --route-path 2001:4860:4860::8888 Equivalent to:
nexttrace -d ip-api.com -m 20 -T -p 443 -q 5 -n 1.1.1.1
nexttrace -T -q 2 --parallel-requests 1 -t -P 2001:4860:4860::8888
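# Short/long flag mapping used in the examples above (per the usage table below):
# -d = --data-provider, -m = --max-hops, -T = --tcp, -p = --port,
# -q = --queries, -n = --no-rdns, -t = --table, -P = --route-path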
``` IP Database We use bgp.tools as a data provider for routing tables. The NextTrace backend is now open-source: https://github.com/sjlleo/nexttrace-backend NextTrace LeoMoeAPI now uses a Proof of Work (PoW) mechanism to prevent abuse; NextTrace introduces the powclient library as the client-side component. Both the PoW client and server are open source, and everyone is welcome to use them. (Please direct any PoW module-related questions to the corresponding repositories) GitHub - tsosunchia/powclient: Proof of Work CLIENT for NextTrace GitHub - tsosunchia/powserver: Proof of Work SERVER for NextTrace All NextTrace IP geolocation API demos can be found here For the full usage list, please refer to the usage menu ```shell
Usage: nexttrace [-h|--help] [-4|--ipv4] [-6|--ipv6] [-T|--tcp] [-U|--udp]
[-F|--fast-trace] [-p|--port <integer>] [-q|--queries <integer>] [--parallel-requests <integer>] [-m|--max-hops <integer>] [-d|--data-provider
(Ip2region|ip2region|IP.SB|ip.sb|IPInfo|ipinfo|IPInsight|ipinsight|IPAPI.com|ip-api.com|IPInfoLocal|ipinfolocal|chunzhen|LeoMoeAPI|leomoeapi|disable-geoip)]
[--pow-provider (api.nxtrace.org|sakura)] [-n|--no-rdns]
[-a|--always-rdns] [-P|--route-path] [-r|--report] [--dn42]
[-o|--output] [-t|--table] [--raw] [-j|--json] [-c|--classic]
[-f|--first <integer>] [-M|--map] [-e|--disable-mpls]
[-v|--version] [-s|--source "<value>"] [-D|--dev "<value>"]
[-z|--send-time <integer>] [-i|--ttl-time <integer>]
[--timeout <integer>] [--psize <integer>]
[_positionalArg_nexttrace_32 "<value>"] [--dot-server
(dnssb|aliyun|dnspod|google|cloudflare)] [-g|--language
(en|cn)] [--file "<value>"] [-C|--nocolor] Arguments: -h --help Print help information
-4 --ipv4 Use IPv4 only
-6 --ipv6 Use IPv6 only
-T --tcp Use TCP SYN for tracerouting (default port
is 80)
-U --udp Use UDP SYN for tracerouting (default port
is 33494)
-F --fast-trace One-Key Fast Trace to China ISPs
-p --port Set the destination port to use. With
default of 80 for "tcp", 33494 for "udp"
-q --queries Set the number of probes per each hop.
Default: 3
--parallel-requests Set ParallelRequests number. It should be
1 when there is a multi-routing. Default:
18
-m --max-hops Set the max number of hops (max TTL to be
reached). Default: 30
-d --data-provider Choose IP Geograph Data Provider [IP.SB,
IPInfo, IPInsight, IP-API.com, Ip2region,
IPInfoLocal, CHUNZHEN, disable-geoip].
Default: LeoMoeAPI
--pow-provider Choose PoW Provider [api.nxtrace.org,
sakura] For China mainland users, please
use sakura. Default: api.nxtrace.org
-n --no-rdns Do not resolve IP addresses to their
domain names
-a --always-rdns Always resolve IP addresses to their
domain names
-P --route-path Print traceroute hop path by ASN and
location
-r --report output using report mode
--dn42 DN42 Mode
-o --output Write trace result to file
(RealTimePrinter ONLY)
-t --table Output trace results as table
--raw An Output Easy to Parse
-j --json Output trace results as JSON
-c --classic Classic Output trace results like
BestTrace
-f --first Start from the first_ttl hop (instead from
1). Default: 1
-M --map Disable Print Trace Map
-e --disable-mpls Disable MPLS
-v --version Print version info and exit
-s --source Use source src_addr for outgoing packets
-D --dev Use the following Network Devices as the
source address in outgoing packets
-z --send-time Set how many [milliseconds] between
sending each packet. Useful when some
routers use rate-limit for ICMP messages.
Default: 50
-i --ttl-time Set how many [milliseconds] between
sending packets groups by TTL. Useful when
some routers use rate-limit for ICMP
messages. Default: 50
--timeout The number of [milliseconds] to keep probe
sockets open before giving up on the
connection. Default: 1000
--psize Set the payload size. Default: 52
--_positionalArg_nexttrace_32 IP Address or domain name
--dot-server Use DoT Server for DNS Parse [dnssb,
aliyun, dnspod, google, cloudflare]
-g --language Choose the language for displaying [en,
cn]. Default: cn
--file Read IP Address or domain name from file
-C --nocolor Disable Colorful Output
--dont-fragment Set the Don't Fragment bit (IPv4 TCP
only). Default: false
``` Project screenshot OpenTrace OpenTrace is the cross-platform GUI version of NextTrace developed by @Archeb, bringing a familiar but more powerful user experience. This software is still in the early stages of development and may have many flaws and errors. We value your feedback. https://github.com/Archeb/opentrace NEXTTRACE WEB API NextTraceWebApi is a web-based server implementation of NextTrace in the MTR style, offering various deployment options including Docker . https://github.com/nxtrace/nexttracewebapi NextTraceroute NextTraceroute is a root-free Android route tracing application that defaults to using the NextTrace API , developed by @surfaceocean. Thank you to all the test users for your enthusiastic support. This app has successfully passed the closed testing phase and is now officially available on the Google Play Store. https://github.com/nxtrace/NextTraceroute LeoMoeAPI Credits NextTrace focuses on Golang traceroute implementations, and its LeoMoeAPI geolocation data is not backed by raw data sources, so a commercial version is not possible. The LeoMoeAPI data is subject to copyright restrictions from multiple data sources, and is only used for the purpose of displaying the geolocation of route tracing. We would like to credit samleong123 for providing nodes in Malaysia, TOHUNET Looking Glass for global nodes, and Ping.sx from Misaka, where more than 80% of reliable calibration data comes from ping/mtr reports. At the same time, we would like to credit isyekong for their contribution to rDNS-based calibration ideas and data. LeoMoeAPI is accelerating the development of its rDNS resolution function, and has already achieved automated geolocation resolution for some backbone networks, though with some misjudgments. We hope that NextTrace will become a One-Man-ISP-friendly traceroute tool in the future, and we are working on improving the calibration of these ASN micro-backbones as much as possible. In terms of development, I would like to credit missuo and zhshch for their help with Go cross-compilation, design concepts and TCP/UDP traceroute refactoring, and tsosunchia for their support on TraceMap. I would also like to credit FFEE_CO, TheresaQWQ, stydxm and others for their help. LeoMoeAPI has received a lot of support since its first release, so I would like to credit them all! We hope you can give us as much feedback as possible on IP geolocation errors (see issue) so that they can be calibrated promptly and others can benefit from it. JetBrains Support This project uses the JetBrains Open-Source Project License . We proudly develop with GoLand . Credits Gubo Reliable host recommendation website IPInfo Provided most of the data support for this project free of charge BGP.TOOLS Provided some data support for this project free of charge PeeringDB Provided some data support for this project free of charge sjlleo The perpetual leader, founder, and core contributor tsosunchia The project chair, infra maintainer, and core contributor Vincent Young zhshch2002 Sam Sam waiting4new FFEE_CO bobo liu YekongTAT Others Although other third-party APIs are integrated in this project, please refer to the official websites of the third-party APIs for their specific TOS and AUP. If you encounter IP data errors, please contact them directly to correct them.
For feedback related to IP information corrections, we currently have two channels: the IP Error Report Summary thread (IP 错误报告汇总帖) in the GitHub Issues section of this project (Recommended) This project's dedicated correction email: correction@nxtrace.org (Please note that this email is only for correcting IP-related information. For other feedback, please submit an issue) How to obtain the freshly baked binary executable of the latest commit? Please go to the most recent Build & Release workflow in GitHub Actions. Star History | NextTrace, an open source visual route tracking CLI tool | traceroute,api,as-path,asn-lookup,geoip,geolocation,ip-lookup,nexttrace | 75 | 12 | 99 | 831 | 3 | 3 | 3 |
Ehviewer-Overhauled/Ehviewer | English | 简体中文 | 正體中文 EhViewer Description | Download | Screenshot | Thanks | License Description An EhViewer fork dedicated to being lightweight and high-performance, with Material Design 3 and Dynamic Color support Download Screenshot Thanks Here are the libraries Arrow AOSP & AndroidX Kotlin & KotlinX FullDraggableDrawer MCA material-design-icons Okhttp RikkaX Libarchive Coil Jsoup Special thanks to Tachiyomi for its reader implementation! License Copyright 2014-2019 Hippo Seven
Copyright 2020-2022 NekoInverter
Copyright 2022-2023 Tarsin Norbin
EhViewer is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
EhViewer is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with EhViewer. If not, see <https://www.gnu.org/licenses/>. | EhViewer overhauled with Material Design 3, Jetpack Compose and more | android,e-hentai,ehviewer,exhentai,material-design,kotlin,libarchive,material-design-3,material-ui,c | 80 | 44 | 711 | 5,897 | 64 | 10 | 3 |
SunWeb3Sec/DeFiHackLabs | DeFi Hacks Reproduce - Foundry Reproduce DeFi hack incidents using Foundry. 431 incidents included. Let's make Web3 secure! Join Discord Notion: 101 root cause analysis of past DeFi hacked incidents Transaction debugging tools Disclaimer: This content serves solely as a proof of concept showcasing past DeFi hacking incidents. It is strictly intended for educational purposes and should not be interpreted as encouraging or endorsing any form of illegal activities or actual hacking attempts. The provided information is for informational and learning purposes only, and any actions taken based on this content are solely the responsibility of the individual. The usage of this information should adhere to applicable laws, regulations, and ethical standards. Getting Started Follow the instructions to install Foundry . Clone and install dependencies: git submodule update --init --recursive Contributing Guidelines Web3 Cybersecurity Academy All articles are also published on Substack . OnChain transaction debugging Lesson 1: Tools ( English | 中文 | Vietnamese | Korean | Spanish ) Lesson 2: Warm up ( English | 中文 | Korean ) Lesson 3: Write Your Own PoC (Price Oracle Manipulation) ( English | 中文 | Korean ) Lesson 4: Write Your Own PoC (MEV Bot) ( English | 中文 | Korean ) Lesson 5: Rugpull Analysis ( English | 中文 ) Lesson 6: Write Your Own PoC (Reentrancy) ( English | 中文 ) Lesson 7: Hack Analysis: Nomad Bridge, August 2022 ( English | 中文 ) Who Support Us? DeFiHackLabs Received Grant From Donate us If you appreciate our work, please consider donating. Even a small amount helps us continue developing and improving our projects, and promoting web3 security. EVM Chains - 0xD7d6215b4EF4b9B5f40baea48F41047Eb67a11D5 Giveth List of Past DeFi Incidents 20240616 WIFCOIN_ETH 20240611 Crb2 20240611 JokInTheBox 20240610 Bazaar 20240608 YYStoken 20240606 MineSTM 20240604 NCD 20240601 VeloCore 20240531 MixedSwapRouter 20240529 SCROLL 20240529 MetaDragon 20240528 EXcommunity 20240527 RedKeysCoin 20240526 NORMIE 20240522 Burner 20240516 TCH 20240514 Sonne Finance 20240514 PredyFinance 20240510 GFOX 20240510 TSURU 20240508 GPU 20240507 SATURN 20240506 OSN 20240430 Yield 20240430 PikeFinance 20240425 NGFS 20240424 XBridge 20240424 YIEDL 20240422 Z123 20240420 Rico 20240419 HedgeyFinance 20240416 SATX 20240416 MARS_DEFI 20240415 Chainge 20240412 FIL314 20240412 SumerMoney 20240412 GROKD 20240409 UPS 20240408 SQUID 20240404 WSM 20240401 ATM 20240401 OpenLeverage 20240329 PrismaFi 20240328 LavaLending 20240325 ZongZi 20240314 ARK 20240321 SSS 20240320 Paraswap 20240314 MO 20240313 IT 20240309 Juice 20240309 UnizenIO 20240307 GHT 20240306 ALP 20240306 TGBS 20240305 Woofi 20240228 Seneca 20240228 SMOOFSStaking 20240223 CompoundUni 20240223 BlueberryProtocol 20240221 DeezNutz404 20240221 GAIN 20240219 RuggedArt 20240216 ParticleTrade 20240215 DualPools 20240215 Miner 20240211 Game 20240210 FILX DN404 20240208 Pandora404 20240205 BurnsDefi 20240201 AffineDeFi 20240130 MIMSpell 20240128 BarleyFinance 20240127 CitadelFinance 20240125 NBLGAME 20240122 DAO_SoulMate 20240117 BmiZapper 20240117 SocketGateway 20240112 WiseLending 20240110 LQDX Alert 20240104 Gamma 20240102 MIC 20240102 RadiantCapital 20240101 OrbitChain 2023 [20231230 ChannelsFinance](past/2023/README.md#20231230-channelsfinance---compoundv2-inflation-attack)
[20231225 Telcoin](past/2023/README.md#20231225-telcoin---storage-collision)
[20231222 PineProtocol](past/2023/README.md#20231222-pineprotocol---business-logic-flaw)
[20231220 TransitFinance](past/2023/README.md#20231220-transitfinance---lack-of-validation-pool)
[20231217 FloorProtocol](past/2023/README.md#20231217-floorprotocol---business-logic-flaw)
[20231216 GoodDollar](past/2023/README.md#20231216-gooddollar---lack-of-input-validation--reentrancy)
[20231216 NFTTrader](past/2023/README.md#20231216-nfttrader---reentrancy)
[20231213 HYPR](past/2023/README.md#20231213-hypr---business-logic-flaw)
[20231206 TIME](past/2023/README.md#20231206-time---arbitrary-address-spoofing-attack)
[20231206 ElephantStatus](past/2023/README.md#20231206-elephantstatus---price-manipulation)
[20231205 BEARNDAO](past/2023/README.md#20231205-bearndao---business-logic-flaw)
[20231202 bZxProtocol](past/2023/README.md#20231202-bzxprotocol---inflation-attack)
[20231201 UnverifiedContr_0x431abb](past/2023/README.md#20231201-unverifiedcontr_0x431abb---business-logic-flaw)
[20231130 CAROLProtocol](past/2023/README.md#20231130-carolprotocol---price-manipulation-via-reentrancy)
[20231129 AIS](past/2023/README.md#20231129-ais---access-control)
[20231125 TheNFTV2](past/2023/README.md#20231125-thenftv2---logic-flaw)
[20231122 KyberSwap](past/2023/README.md#20231122-kyberswap---precision-loss)
[20231117 Token8633_9419](past/2023/README.md#20231117-token8633_9419---price-manipulation)
[20231117 ShibaToken](past/2023/README.md#20231117-shibatoken---business-logic-flaw)
[20231115 LinkDAO](past/2023/README.md#20231115-linkdao---bad-k-value-verification)
[20231114 OKC Project](past/2023/README.md#20231114-OKC-Project---Instant-Rewards-Unlocked)
[20231112 MEV_0x8c2d](past/2023/README.md#20231112-mevbot_0x8c2d---lack-of-access-control)
[20231112 MEV_0xa247](past/2023/README.md#20231112-mevbot_0xa247---incorrect-access-control)
[20231111 Mahalend](past/2023/README.md#20231111-mahalend---donate-inflation-exchangerate--rounding-error)
[20231110 Raft_fi](past/2023/README.md#20231110-raft_fi---donate-inflation-exchangerate--rounding-error)
[20231110 GrokToken](past/2023/README.md#20231110-grok---lack-of-slippage-protection)
[20231107 MEVbot](past/2023/README.md#20231107-mevbot---lack-of-access-control)
[20231106 TrustPad](past/2023/README.md#20231106-trustpad---lack-of-msgsender-address-verification)
[20231106 TheStandard_io](past/2023/README.md#20231106-thestandard_io---lack-of-slippage-protection)
[20231102 3913Token](past/2023/README.md#20231102-3913token---deflationary-token-attack)
[20231101 OnyxProtocol](past/2023/README.md#20231101-onyxprotocol---precission-loss-vulnerability)
[20231031 UniBotRouter](past/2023/README.md#20231031-UniBotRouter---arbitrary-external-call)
[20231028 AstridProtocol](past/2023/README.md#20231028-AstridProtocol---business-logic-flaw)
[20231024 MaestroRouter2](past/2023/README.md#20231024-maestrorouter2---arbitrary-external-call)
[20231022 OpenLeverage](past/2023/README.md#20231022-openleverage---business-logic-flaw)
[20231019 kTAF](past/2023/README.md#20231019-ktaf---compoundv2-inflation-attack)
[20231018 HopeLend](past/2023/README.md#20231018-hopelend---div-precision-loss)
[20231018 MicDao](past/2023/README.md#20231018-micdao---price-manipulation)
[20231013 BelugaDex](past/2023/README.md#20231013-belugadex---price-manipulation)
[20231013 WiseLending](past/2023/README.md#20231013-wiselending---donate-inflation-exchangerate--rounding-error)
[20231012 Platypus](past/2023/README.md#20231012-platypus---business-logic-flaw)
[20231011 BH](past/2023/README.md#20231011-bh---price-manipulation)
[20231008 pSeudoEth](past/2023/README.md#20231008-pseudoeth---pool-manipulation)
[20231007 StarsArena](past/2023/README.md#20231007-starsarena---reentrancy)
[20231005 DePayRouter](past/2023/README.md#20231005-depayrouter---business-logic-flaw)
[20230930 FireBirdPair](past/2023/README.md#20230930-FireBirdPair---lack-slippage-protection)
[20230929 DEXRouter](past/2023/README.md#20230929-dexrouter---arbitrary-external-call)
[20230926 XSDWETHpool](past/2023/README.md#20230926-XSDWETHpool---reentrancy)
[20230924 KubSplit](past/2023/README.md#20230924-kubsplit---pool-manipulation)
[20230921 CEXISWAP](past/2023/README.md#20230921-cexiswap---incorrect-access-control)
[20230916 uniclyNFT](past/2023/README.md#20230916-uniclynft---reentrancy)
[20230911 0x0DEX](past/2023/README.md#20230911-0x0dex---parameter-manipulation)
[20230909 BFCToken](past/2023/README.md#20230909-bfctoken---business-logic-flaw)
[20230908 APIG](past/2023/README.md#20230908-apig---business-logic-flaw)
[20230907 HCT](past/2023/README.md#20230907-hct---price-manipulation)
[20230905 JumpFarm](past/2023/README.md#20230905-JumpFarm---rebasing-logic-issue)
[20230905 HeavensGate](past/2023/README.md#20230905-HeavensGate---rebasing-logic-issue)
[20230905 FloorDAO](past/2023/README.md#20230905-floordao---rebasing-logic-issue)
[20230902 DAppSocial](past/2023/README.md#20230902-dappsocial---business-logic-flaw)
[20230829 EAC](past/2023/README.md#20230829-eac---price-manipulation)
[20230827 Balancer](past/2023/README.md#20230827-balancer---rounding-error--business-logic-flaw)
[20230826 SVT](past/2023/README.md#20230826-svt---flawed-price-calculation)
[20230824 GSS](past/2023/README.md#20230824-gss---skim-token-balance)
[20230821 EHIVE](past/2023/README.md#20230821-ehive---business-logic-flaw)
[20230819 BTC20](past/2023/README.md#20230819-btc20---price-manipulation)
[20230818 ExactlyProtocol](past/2023/README.md#20230818-exactlyprotocol---insufficient-validation)
[20230814 ZunamiProtocol](past/2023/README.md#20230814-zunamiprotocol---price-manipulation)
[20230809 EarningFram](past/2023/README.md#20230809-earningfram---reentrancy)
[20230802 CurveBurner](past/2023/README.md#20230802-curveburner---lack-slippage-protection)
[20230802 Uwerx](past/2023/README.md#20230802-uwerx---fault-logic)
[20230801 NeutraFinance](past/2023/README.md#20230801-neutrafinance---price-manipulation)
[20230801 LeetSwap](past/2023/README.md#20230801-leetswap---access-control)
[20230731 GYMNET](past/2023/README.md#20230731-gymnet---insufficient-validation)
[20230730 Curve](past/2023/README.md#20230730-curve---vyper-compiler-bug--reentrancy)
[20230726 Carson](past/2023/README.md#20230726-carson---price-manipulation)
[20230724 Palmswap](past/2023/README.md#20230724-palmswap---business-logic-flaw)
[20230723 MintoFinance](past/2023/README.md#20230723-mintofinance---signature-replay)
[20230722 ConicFinance02](past/2023/README.md#20230722-conic-finance-02---price-manipulation)
[20230721 ConicFinance](past/2023/README.md#20230721-conic-finance---read-only-reentrancy--misconfiguration)
[20230721 SUT](past/2023/README.md#20230721-sut---business-logic-flaw)
[20230720 Utopia](past/2023/README.md#20230720-utopia---business-logic-flaw)
[20230720 FFIST](past/2023/README.md#20230720-ffist---business-logic-flaw)
[20230718 APEDAO](past/2023/README.md#20230718-apedao---business-logic-flaw)
[20230718 BNO](past/2023/README.md#20230718-bno---invalid-emergency-withdraw-mechanism)
[20230717 NewFi](past/2023/README.md#20230717-newfi---lack-slippage-protection)
[20230712 Platypus](past/2023/README.md#20230712-platypus---bussiness-logic-flaw)
[20230712 WGPT](past/2023/README.md#20230712-wgpt---business-logic-flaw)
[20230711 RodeoFinance](past/2023/README.md#20230711-rodeofinance---twap-oracle-manipulation)
[20230711 Libertify](past/2023/README.md#20230711-libertify---reentrancy)
[20230710 ArcadiaFi](past/2023/README.md#20230710-arcadiafi---reentrancy)
[20230708 CIVNFT](past/2023/README.md#20230708-civnft---lack-of-access-control)
[20230708 Civfund](past/2023/README.md#20230708-civfund---lack-of-access-control)
[20230707 LUSD](past/2023/README.md#20230707-LUSD---price-manipulation-attack)
[20230704 BambooIA](past/2023/README.md#20230704-bambooia---price-manipulation-attack)
[20230704 BaoCommunity](past/2023/README.md#20230704-baocommunity---donate-inflation-exchangerate--rounding-error)
[20230703 AzukiDAO](past/2023/README.md#20230703-azukidao---invalid-signature-verification)
[20230630 Biswap](past/2023/README.md#20230630-biswap---v3migrator-exploit)
[20230628 Themis](past/2023/README.md#20230628-themis---manipulation-of-prices-using-flashloan)
[20230623 SHIDO](past/2023/README.md#20230623-shido---business-loigc)
[20230621 BabyDogeCoin02](past/2023/README.md#20230621-babydogecoin02---lack-slippage-protection)
[20230621 BUNN](past/2023/README.md#20230621-bunn---reflection-tokens)
[20230620 MIM](past/2023/README.md#20230620-mimspell---arbitrary-external-call-vulnerability)
[20230618 ARA](past/2023/README.md#20230618-ara---incorrect-handling-of-permissions)
[20230617 Pawnfi](past/2023/README.md#20230617-pawnfi---business-logic-flaw)
[20230615 CFC](past/2023/README.md#20230615-cfc---uniswap-skim-token-balance-attack)
[20230615 DEPUSDT_LEVUSDC](past/2023/README.md#20230615-depusdt_levusdc---incorrect-access-control)
[20230612 Sturdy Finance](past/2023/README.md#20230612-sturdy-finance---read-only-reentrancy)
[20230611 SellToken04](past/2023/README.md#20230611-sellToken04---Price-Manipulation)
[20230607 CompounderFinance](past/2023/README.md#20230607-compounderfinance---manipulation-of-funds-through-fluctuations-in-the-amount-of-exchangeable-assets)
[20230606 VINU](past/2023/README.md#20230606-vinu---price-manipulation)
[20230606 UN](past/2023/README.md#20230606-un---price-manipulation)
[20230602 NST SimpleSwap](past/2023/README.md#20230602-nst-simple-swap---unverified-contract-wrong-approval)
[20230601 DDCoin](past/2023/README.md#20230601-ddcoin---flashloan-attack-and-smart-contract-vulnerability)
[20230601 Cellframenet](past/2023/README.md#20230601-cellframenet---calculation-issues-during-liquidity-migration)
[20230531 ERC20TokenBank](past/2023/README.md#20230531-erc20tokenbank---price-manipulation)
[20230529 Jimbo](past/2023/README.md#20230529-jimbo---protocol-specific-price-manipulation)
[20230529 BabyDogeCoin](past/2023/README.md#20230529-babydogecoin---lack-slippage-protection)
[20230529 FAPEN](past/2023/README.md#20230529-fapen---wrong-balance-check)
[20230529 NOON_NO](past/2023/README.md#20230529-noon-no---wrong-visibility-in-function)
[20230525 GPT](past/2023/README.md#20230525-gpt-token---fee-machenism-exploitation)
[20230524 LocalTrade](past/2023/README.md#20230524-local-trade-lct---improper-access-control-of-close-source-contract)
[20230524 CS](past/2023/README.md#20230524-cs-token---outdated-global-variable)
[20230523 LFI](past/2023/README.md#20230523-lfi-token---business-logic-flaw)
[20230514 landNFT](past/2023/README.md#20230514-landNFT---lack-of-permission-control)
[20230514 SellToken03](past/2023/README.md#20230514-selltoken03---unchecked-user-input)
[20230513 Bitpaidio](past/2023/README.md#20230513-bitpaidio---business-logic-flaw)
[20230513 SellToken02](past/2023/README.md#20230513-selltoken02---price-manipulation)
[20230512 LW](past/2023/README.md#20230512-lw---flashloan-price-manipulation)
[20230511 SellToken01](past/2023/README.md#20230511-selltoken01---business-logic-flaw)
[20230510 SNK](past/2023/README.md#20230510-snk---reward-calculation-error)
[20230509 MCC](past/2023/README.md#20230509-mcc---reflection-token)
[20230509 HODL](past/2023/README.md#20230509-hodl---reflection-token)
[20230506 Melo](past/2023/README.md#20230506-melo---access-control)
[20230505 DEI](past/2023/README.md#20230505-dei---wrong-implemention)
[20230503 NeverFall](past/2023/README.md#20230503-NeverFall---price-manipulation)
[20230502 Level](past/2023/README.md#20230502-level---business-logic-flaw)
[20230428 0vix](past/2023/README.md#20230428-0vix---flashloan-price-manipulation)
[20230427 SiloFinance](past/2023/README.md#20230427-Silo-finance---Business-Logic-Flaw)
[20230424 Axioma](past/2023/README.md#20230424-Axioma---business-logic-flaw)
[20230419 OLIFE](past/2023/README.md#20230419-OLIFE---Reflection-token)
[20230416 Swapos V2](past/2023/README.md#20230416-swapos-v2---error-k-value-attack)
[20230415 HundredFinance](past/2023/README.md#20230415-hundredfinance---donate-inflation-exchangerate--rounding-error)
[20230413 yearnFinance](past/2023/README.md#20230413-yearnFinance---misconfiguration)
[20230412 MetaPoint](past/2023/README.md#20230412-metapoint---Unrestricted-Approval)
[20230411 Paribus](past/2023/README.md#20230411-paribus---reentrancy)
[20230409 SushiSwap](past/2023/README.md#20230409-SushiSwap---Unchecked-User-Input)
[20230405 Sentiment](past/2023/README.md#20230405-sentiment---read-only-reentrancy)
[20230402 Allbridge](past/2023/README.md#20230402-allbridge---flashloan-price-manipulation)
[20230328 SafeMoon Hack](past/2023/README.md#20230328-safemoon-hack)
[20230328 THENA](past/2023/README.md#20230328---thena---yield-protocol-flaw)
[20230325 DBW](past/2023/README.md#20230325---dbw--business-logic-flaw)
[20230322 BIGFI](past/2023/README.md#20230322---bigfi---reflection-token)
[20230317 ParaSpace NFT](past/2023/README.md#20230317---paraspace-nft---flashloan--scaledbalanceof-manipulation)
[20230315 Poolz](past/2023/README.md#20230315---poolz---integer-overflow)
[20230313 EulerFinance](past/2023/README.md#20230313---eulerfinance---business-logic-flaw)
[20230308 DKP](past/2023/README.md#20230308---dkp---flashloan-price-manipulation)
[20230307 Phoenix](past/2023/README.md#20230307---phoenix---access-control--arbitrary-external-call)
[20230227 LaunchZone](past/2023/README.md#20230227---launchzone---access-control)
[20230227 SwapX](past/2023/README.md#20230227---swapx---access-control)
[20230224 EFVault](past/2023/README.md#20230224---efvault---storage-collision)
[20230222 DYNA](past/2023/README.md#20230222---dyna---business-logic-flaw)
[20230218 RevertFinance](past/2023/README.md#20230218---revertfinance---arbitrary-external-call-vulnerability)
[20230217 Starlink](past/2023/README.md#20230217---starlink---business-logic-flaw)
[20230217 Dexible](past/2023/README.md#20230217---dexible---arbitrary-external-call-vulnerability)
[20230217 Platypusdefi](past/2023/README.md#20230217---platypusdefi---business-logic-flaw)
[20230210 Sheep Token](past/2023/README.md#20230210---sheep---reflection-token)
[20230210 dForce](past/2023/README.md#20230210---dforce---read-only-reentrancy)
[20230207 CowSwap](past/2023/README.md#20230207---cowswap---arbitrary-external-call-vulnerability)
[20230206 FDP Token](past/2023/README.md#20230206---fdp---reflection-token)
[20230203 Orion Protocol](past/2023/README.md#20230203---orion-protocol---reentrancy)
[20230203 Spherax USDs](past/2023/README.md#20230203---spherax-usds---balance-recalculation-bug)
[20230202 BonqDAO](past/2023/README.md#20230202---BonqDAO---price-oracle-manipulation)
[20230130 BEVO](past/2023/README.md#20230130---bevo---reflection-token)
[20230126 TomInu Token](past/2023/README.md#20230126---tinu---reflection-token)
[20230119 SHOCO Token](past/2023/README.md#20230119---shoco---reflection-token)
[20230119 ThoreumFinance](past/2023/README.md#20230119---thoreumfinance-business-logic-flaw)
[20230118 QTN Token](past/2023/README.md#20230118---qtntoken---business-logic-flaw)
[20230118 UPS Token](past/2023/README.md#20230118---upstoken---business-logic-flaw)
[20230117 OmniEstate](past/2023/README.md#20230117---OmniEstate---no-input-parameter-check)
[20230116 MidasCapital](past/2023/README.md#20230116---midascapital---read-only-reentrancy)
[20230111 UFDao](past/2023/README.md#20230111---ufdao---incorrect-parameter-setting)
[20230111 ROE](past/2023/README.md#20230111---roefinance---flashloan-price-manipulation)
[20230110 BRA](past/2023/README.md#20230110---bra---business-logic-flaw)
[20230103 GDS](past/2023/README.md#20230103---gds---business-logic-flaw) 2022 [20221230 DFS](past/2022/README.md#20221230---dfs---insufficient-validation--flashloan)
[20221229 JAY](past/2022/README.md#20221229---jay---insufficient-validation--reentrancy)
[20221225 Rubic](past/2022/README.md#20221225---rubic---arbitrary-external-call-vulnerability)
[20221223 Defrost](past/2022/README.md#20221223---defrost---reentrancy)
[20221214 Nmbplatform](past/2022/README.md#20221214---nmbplatform---flashloan-price-manipulation)
[20221214 FPR](past/2022/README.md#20221214---fpr---access-control)
[20221213 ElasticSwap](past/2022/README.md#20221213---elasticswap---business-logic-flaw)
[20221212 BGLD](past/2022/README.md#20221212---bgld-deflationary-token---flashloan-price-manipulation)
[20221211 Lodestar](past/2022/README.md#20221211---lodestar---flashloan-price-manipulation)
[20221210 MUMUG](past/2022/README.md#20221210---mumug---flashloan-price-manipulation)
[20221210 TIFIToken](past/2022/README.md#20221210---tifitoken---flashloan-price-manipulation)
[20221209 NOVAToken](past/2022/README.md#20221209---novatoken---malicious-unlimted-minting-rugged)
[20221207 AES](past/2022/README.md#20221207---aes-deflationary-token---business-logic-flaw--flashloan-price-manipulation)
[20221205 RFB](past/2022/README.md#20221205---rfb---predicting-random-numbers)
[20221205 BBOX](past/2022/README.md#20221205---bbox---flashloan-price-manipulation)
[20221202 OverNight](past/2022/README.md#20221202---overnight---flashloan-attack)
[20221201 APC](past/2022/README.md#20221201---apc---flashloan--price-manipulation)
[20221129 MBC & ZZSH](past/2022/README.md#20221129---mbc--zzsh---business-logic-flaw--access-control)
[20221129 SEAMAN](past/2022/README.md#20221129---seaman---business-logic-flaw)
[20221123 NUM](past/2022/README.md#20221123---num---protocol-token-incompatible)
[20221122 AUR](past/2022/README.md#20221122---aur---lack-of-permission-check)
[20221121 SDAO](past/2022/README.md#20221121---sdao---business-logic-flaw)
[20221119 AnnexFinance](past/2022/README.md#20221119---annexfinance---verify-flashloan-callback)
[20221117 UEarnPool](past/2022/README.md#20221117---uearnpool---flashloan-attack)
[20221116 SheepFarm](past/2022/README.md#20221116---sheepfarm---no-input-validation)
[20221110 DFXFinance](past/2022/README.md#20221110---dfxfinance---reentrancy)
[20221109 brahTOPG](past/2022/README.md#20221109-brahtopg---arbitrary-external-call-vulnerability)
[20221108 MEV_0ad8](past/2022/README.md#20221108-mev_0ad8---arbitrary-call)
[20221108 Kashi](past/2022/README.md#20221108-kashi---price-caching-design-defect)
[20221107 MooCAKECTX](past/2022/README.md#20221107-moocakectx---flashloan-attack)
[20221105 BDEX](past/2022/README.md#20221105-bdex---business-logic-flaw)
[20221027 VTF](past/2022/README.md#20221027-vtf-token---incorrect-reward-calculation)
[20221027 Team Finance](past/2022/README.md#20221027-team-finance---liquidity-migration-exploit)
[20221026 N00d Token](past/2022/README.md#20221026-n00d-token---reentrancy)
[20221025 ULME](past/2022/README.md#20221025-ulme---access-control)
[20221024 Market](past/2022/README.md#20221024-market---read-only-reentrancy)
[20221024 MulticallWithoutCheck](past/2022/README.md#20221024-multicallwithoutcheck---arbitrary-external-call-vulnerability)
[20221021 OlympusDAO](past/2022/README.md#20221021-olympusdao---no-input-validation)
[20221020 HEALTH Token](past/2022/README.md#20221020-health---transfer-logic-flaw)
[20221019 BEGO Token](past/2022/README.md#20221020-bego---incorrect-signature-verification)
[20221018 HPAY](past/2022/README.md#20221018-hpay---access-control)
[20221018 PLTD Token](past/2022/README.md#20221018-pltd---transfer-logic-flaw)
[20221017 Uerii Token](past/2022/README.md#20221017-uerii-token---access-control)
[20221014 INUKO Token](past/2022/README.md#20221014-inuko---flashloan-price-manipulation)
[20221014 EFLeverVault](past/2022/README.md#20221014-eflevervault---verify-flashloan-callback)
[20221014 MEVBOT a47b](past/2022/README.md#20221014-mevbota47b---mevbot-a47b)
[20221012 ATK](past/2022/README.md#20221012-atk---flashloan-manipulate-price)
[20221011 Rabby Wallet SwapRouter](past/2022/README.md#20221011-rabby-wallet-swaprouter---arbitrary-external-call-vulnerability)
[20221011 Templedao](past/2022/README.md#20221011-templedao---insufficient-access-control)
[20221010 Carrot](past/2022/README.md#20221010-carrot---public-functioncall)
[20221009 Xave Finance](past/2022/README.md#20221009-xave-finance---malicious-proposal-mint--transfer-ownership)
[20221006 RES-Token](past/2022/README.md#20221006-RES-Token---pair-manipulate)
[20221002 Transit Swap](past/2022/README.md#20221002-transit-swap---incorrect-owner-address-validation)
[20221001 BabySwap](past/2022/README.md#20221001-babyswap---parameter-access-control)
[20221001 RL](past/2022/README.md#20221001-RL-Token---Incorrect-Reward-calculation)
[20221001 Thunder Brawl](past/2022/README.md#20221001-thunder-brawl---reentrancy)
[20220929 BXH](past/2022/README.md#20220928-bxh---flashloan--price-oracle-manipulation)
[20220928 MEVBOT Badc0de](past/2022/README.md#20220928-MEVBOT---Badc0de)
[20220923 RADT-DAO](past/2022/README.md#20220923-RADT-DAO---pair-manipulate)
[20220913 MevBot Private TX](past/2022/README.md#20220913-mevbot-private-tx)
[20220910 DPC](past/2022/README.md#20220910-dpc---Incorrect-Reward-calculation)
[20220909 YYDS](past/2022/README.md#20220909-YYDS---pair-manipulate)
[20220908 NewFreeDAO](past/2022/README.md#20220908-newfreedao---flashloans-attack)
[20220908 Ragnarok Online Invasion](past/2022/README.md#20220908-ragnarok-online-invasion---broken-access-control)
[20220906 NXUSD](past/2022/README.md#20220906-NXUSD---flashloan-price-oracle-manipulation)
[20220905 ZoomproFinance](past/2022/README.md#20220905-zoomprofinance---flashloans--price-manipulation)
[20220902 ShadowFi](past/2022/README.md#20220902-shadowfi---access-control)
[20220902 Bad Guys by RPF](past/2022/README.md#20220902-bad-guys-by-rpf---business-logic-flaw--missing-check-for-number-of-nft-to-mint)
[20220828 DDC](past/2022/README.md#20220828-ddc)
[20220824 LuckyTiger NFT](past/2022/README.md#20220824-luckytiger-nft---predicting-random-numbers)
[20220810 XSTABLE Protocol](past/2022/README.md#20220810-xstable-protocol---incorrect-logic-check)
[20220809 ANCH](past/2022/README.md#20220809-anch---skim-token-balance)
[20220807 EGD Finance](past/2022/README.md#20220807-egd-finance---flashloans--price-manipulation)
[20220802 Nomad Bridge](past/2022/README.md#20220802-nomad-bridge---business-logic-flaw--incorrect-acceptable-merkle-root-checks)
[20220801 Reaper Farm](past/2022/README.md#20220801-reaper-farm---business-logic-flaw--lack-of-access-control-mechanism)
[20220725 LPC](past/2022/README.md#20220725-lpc---business-logic-flaw--incorrect-recipient-balance-check-did-not-check-senderrecipient-in-transfer)
[20220723 Audius](past/2022/README.md#20220723-audius---storage-collision--malicious-proposal)
[20220713 SpaceGodzilla](past/2022/README.md#20220713-spacegodzilla---flashloans--price-manipulation)
[20220710 Omni NFT](past/2022/README.md#20220710-omni-nft---reentrancy)
[20220706 FlippazOne NFT](past/2022/README.md#20220706-flippazone-nft---accesscontrol)
[20220701 Quixotic - Optimism NFT Marketplace](past/2022/README.md#20220701-quixotic---optimism-nft-marketplace)
[20220626 XCarnival](past/2022/README.md#20220626-xcarnival---infinite-number-of-loans)
[20220624 Harmony's Horizon Bridge](past/2022/README.md#20220624-harmonys-horizon-bridge---private-key-compromised)
[20220618 SNOOD](past/2022/README.md#20220618-snood---miscalculation-on-_spendallowance)
[20220616 InverseFinance](past/2022/README.md#20220616-inversefinance---flashloan--price-oracle-manipulation)
[20220608 GYMNetwork](past/2022/README.md#20220608-gymnetwork---accesscontrol)
[20220608 Optimism - Wintermute](past/2022/README.md#20220608-optimism---wintermute)
[20220606 Discover](past/2022/README.md#20220606-discover---flashloan--price-oracle-manipulation)
[20220529 NOVO Protocol](past/2022/README.md#20220529-novo-protocol---flashloan--price-oracle-manipulation)
[20220524 HackDao](past/2022/README.md#20220524-HackDao---Skim-token-balance)
[20220517 ApeCoin](past/2022/README.md#20220517-apecoin-ape---flashloan)
[20220508 Fortress Loans](past/2022/README.md#20220508-fortress-loans---malicious-proposal--price-oracle-manipulation)
[20220430 Saddle Finance](past/2022/README.md#20220430-saddle-finance---swap-metapool-attack)
[20220430 Rari Capital/Fei Protocol](past/2022/README.md#20220430-rari-capitalfei-protocol---flashloan-attack--reentrancy)
[20220428 DEUS DAO](past/2022/README.md#20220428-deus-dao---flashloan--price-oracle-manipulation)
[20220424 Wiener DOGE](past/2022/README.md#20220424-wiener-doge---flashloan)
[20220423 Akutar NFT](past/2022/README.md#20220423-akutar-nft---denial-of-service)
[20220421 Zeed Finance](past/2022/README.md#20220421-zeed-finance)
[20220416 BeanstalkFarms](past/2022/README.md#20220416-beanstalkfarms---dao--flashloan)
[20220415 Rikkei Finance](past/2022/README.md#20220415-rikkei-finance---access-control--price-oracle-manipulation)
[20220412 ElephantMoney](past/2022/README.md#20220412-elephantmoney---flashloan--price-oracle-manipulation)
[20220411 Creat Future](past/2022/README.md#20220411-creat-future)
[20220409 GYMNetwork](past/2022/README.md#20220409-gymnetwork---flashloan--token-migrate-flaw)
[20220329 Ronin Network](past/2022/README.md#20220329-ronin-network---Bridge)
[20220329 Redacted Cartel](past/2022/README.md#20220329-redacted-cartel---custom-approval-logic)
[20220327 Revest Finance](past/2022/README.md#20220327-revest-finance---reentrancy)
[20220326 Auctus](past/2022/README.md#20220326-auctus)
[20220322 CompoundTUSDSweepTokenBypass](past/2022/README.md#20220322-compoundtusdsweeptokenbypass)
[20220321 OneRing Finance](past/2022/README.md#20220321-onering-finance---flashloan--price-oracle-manipulation)
[20220320 LI.FI](past/2022/README.md#20220320-LiFi---bridges)
[20220320 Umbrella Network](past/2022/README.md#20220320-umbrella-network---underflow)
[20220315 Agave Finance](past/2022/README.md#20220313-agave-finance---erc667-reentrancy)
[20220315 Hundred Finance](past/2022/README.md#20220313-hundred-finance---erc667-reentrancy)
[20220313 Paraluni](past/2022/README.md#20220313-paraluni---flashloan--reentrancy)
[20220309 Fantasm Finance](past/2022/README.md#20220309-fantasm-finance---business-logic-in-mint)
[20220305 Bacon Protocol](past/2022/README.md#20220305-bacon-protocol---reentrancy)
[20220303 TreasureDAO](past/2022/README.md#20220303-treasuredao---zero-fee)
[20220214 BuildFinance - DAO](past/2022/README.md#20220214-buildfinance---dao)
[20220208 Sandbox LAND](past/2022/README.md#20220208-sandbox-land---access-control)
[20220205 Meter](past/2022/README.md#20220205-Meter---bridge)
[20220204 TecraSpace](past/2022/README.md#20220204-TecraSpace---Any-token-is-destroyed)
[20220128 Qubit Finance](past/2022/README.md#20220128-qubit-finance---bridge-address0safetransferfrom-does-not-revert)
[20220118 Multichain (Anyswap)](past/2022/README.md#20220118-multichain-anyswap---insufficient-token-validation) 2021 [20211221 Visor Finance](past/2021/README.md#20211221-visor-finance---reentrancy)
[20211218 Grim Finance](past/2021/README.md#20211218-grim-finance---flashloan--reentrancy)
[20211214 Nerve Bridge](past/2021/README.md#20211214-nerve-bridge---swap-metapool-attack)
[20211130 MonoX Finance](past/2021/README.md#20211130-monox-finance---price-manipulation)
[20211027 Cream Finance](past/2021/README.md#20211027-creamfinance---price-manipulation)
[20211015 Indexed Finance](past/2021/README.md#20211015-indexed-finance---price-manipulation)
[20210916 SushiSwap Miso](past/2021/README.md#20210916-sushiswap-miso)
[20210915 Nimbus Platform](past/2021/README.md#20210915-nimbus-platform)
[20210915 NowSwap Platform](past/2021/README.md#20210915-nowswap-platform)
[20210912 ZABU Finance](past/2021/README.md#20210912-ZABU-Finance---Deflationary-token-uncompatible)
[20210903 DAO Maker](past/2021/README.md#20210903-dao-maker---bad-access-controal)
[20210830 Cream Finance](past/2021/README.md#20210830-cream-finance---flashloan-attack--reentrancy)
[20210817 XSURGE](past/2021/README.md#20210817-xsurge---flashloan-attack--reentrancy)
[20210811 Poly Network](past/2021/README.md#20210811-poly-network---bridge-getting-around-modifier-through-cross-chain-message)
[20210804 WaultFinance](past/2021/README.md#20210804-waultfinace---flashloan-price-manipulation)
[20210728 Levyathan Finance](past/2021/README.md#20210728-levyathan-finance---i-lost-keys-and-minting-ii-vulnerable-emergencywithdraw)
[20210710 Chainswap](past/2021/README.md#20210710-chainswap---bridge-logic-flaw)
[20210702 Chainswap](past/2021/README.md#20210702-chainswap---bridge-logic-flaw)
[20210628 SafeDollar](past/2021/README.md#20210628-safedollar---deflationary-token-uncompatible)
[20210625 xWin Finance](past/2021/README.md#20210625-xwin-finance---subscription-incentive-mechanism)
[20210622 Eleven Finance](past/2021/README.md#20210622-eleven-finance---doesnt-burn-shares)
[20210607 88mph NFT](past/2021/README.md#20210607-88mph-nft---access-control)
[20210603 PancakeHunny](past/2021/README.md#20210603-pancakehunny---incorrect-calculation)
[20210527 BurgerSwap](past/2021/README.md#20210527-burgerswap---mathematical-flaw--reentrancy)
[20210519 PancakeBunny](past/2021/README.md#20210519-pancakebunny---price-oracle-manipulation)
[20210508 Rari Capital](past/2021/README.md#20210509-raricapital---cross-contract-reentrancy)
[20210508 Value Defi](past/2021/README.md#20210508-value-defi---cross-contract-reentrancy)
[20210502 Spartan](past/2021/README.md#20210502-spartan---logic-flaw)
[20210428 Uranium](past/2021/README.md#20210428-uranium---miscalculation)
[20210308 DODO](past/2021/README.md#20210308-dodo---flashloan-attack)
[20210305 Paid Network](past/2021/README.md#20210305-paid-network---private-key-compromised)
[20210204 Yearn YDai](past/2021/README.md#20210204-yearn-ydai---Slippage-proection-absent)
[20210125 Sushi Badger Digg](past/2021/README.md#20210125-sushi-badger-digg---sandwich-attack) Before 2020 [20201229 Cover Protocol](past/2021/README.md#20201229-cover-protocol)
[20201121 Pickle Finance](past/2021/README.md#20201121-pickle-finance)
[20201026 Harvest Finance](past/2021/README.md#20201026-harvest-finance---flashloan-attack)
[20200804 Opyn Protocol](past/2021/README.md#20200804-opyn-protocol---msgValue-in-loop)
[20200628 Balancer Protocol](past/2021/README.md#20200628-balancer-protocol---token-incompatible)
[20200618 Bancor Protocol](past/2021/README.md#20200618-bancor-protocol---access-control)
[20200419 LendfMe](past/2021/README.md#20200419-lendfme---erc777-reentrancy)
[20200418 UniSwapV1](past/2021/README.md#20200418-uniswapv1---erc777-reentrancy)
[20180422 Beauty Chain](past/2021/README.md#20180422-beauty-chain---integer-overflow)
[20171106 Parity - 'Accidentally Killed It'](past/2021/README.md#20171106-parity---accidentally-killed-it) Transaction debugging tools Phalcon | Tx tracer | Cruise | Ethtx | Tenderly | eigenphi Ethereum Signature Database 4byte | sig db | etherface Useful tools ABI to interface | Get ABI for unverified contracts | ETH Calldata Decoder | ETHCMD - Guess ABI | Abi tools Hacks Dashboard Slowmist | Defillama | De.Fi | Rekt | Cryptosec List of DeFi Hacks & POCs 20240616 WIFCOIN_ETH - business logic flaw Lost: 13,189.92 USD (WIF token) ```sh
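# A minimal end-to-end sketch (assumption: Foundry is installed per "Getting
# Started" above, and the clone URL follows from the repository name):
#   git clone https://github.com/SunWeb3Sec/DeFiHackLabs.git
#   cd DeFiHackLabs && git submodule update --init --recursive
# After that, each PoC below is run with a single forge test command: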
forge test --contracts ./src/test/2024-06/WIFCOIN_ETH_exp.sol -vv --evm-version "shanghai" ``` Contract WIFCOIN_ETH_exp.sol Link reference https://x.com/ChainAegis/status/1802550962977964139 20240616 Crb2 - business logic flaw Lost: ~15K ```sh
forge test --contracts ./src/test/2024-06/Crb2_exp.sol -vv --evm-version shanghai ``` Contract Crb2_exp.sol Link reference 20240611 JokInTheBox - business logic flaw Lost: 9.2 ETH ```sh
forge test --contracts ./src/test/2024-06/JokInTheBox_exp.sol -vv --evm-version cancun ``` Contract JokInTheBox_exp.sol Link reference https://x.com/0xNickLFranklin/status/1800355604692910571 20240610 Bazaar - Insufficient Permission Check Lost: 1.4M sh
forge test --contracts ./src/test/2024-06/Bazaar_exp.sol -vvv Contract Bazaar_exp.sol Link reference https://x.com/shoucccc/status/1800353122159833195 20240608 YYStoken - Business Logic Flaw Lost: $28K sh
forge test --contracts src/test/2024-06/YYS_exp.sol -vv Contract YYS_exp.sol Link reference https://x.com/0xNickLFranklin/status/1799610045589831833 20240606 MineSTM - Business Logic Flaw Lost: $13.8K sh
forge test --contracts src/test/2024-06/MineSTM_exp.sol -vv Contract MineSTM_exp.sol Link reference https://x.com/0xNickLFranklin/status/1798920774511898862 20240604 NCD - Business Logic Flaw Lost: $6.4K sh
forge test --contracts src/test/2024-06/NCD_exp.sol -vv Contract NCD_exp.sol Link reference https://x.com/SlowMist_Team/status/1797821034319765604 20240601 VeloCore - lack-of-access-control Lost: $6.88M sh
forge test --contracts src/test/2024-06/Velocore_exp.sol -vv Contract Velocore_exp.sol Link reference https://x.com/BeosinAlert/status/1797247874528645333 20240531 MixedSwapRouter - Arbitrary Call Lost: >10,700 USD (WINR token) sh
forge test --contracts ./src/test/2024-05/MixedSwapRouter_exp.sol -vvv Contract MixedSwapRouter_exp.sol Link reference https://x.com/ChainAegis/status/1796484286738227579 20240529 SCROLL - Integer Underflow Lost: 76 ETH sh
forge test --contracts ./src/test/2024-05/SCROLL_exp.sol -vvv Contract SCROLL_exp.sol Link reference https://x.com/0xNickLFranklin/status/1795650745448169741 20240529 MetaDragon - Lack of Access Control Lost: ~$180K sh
forge test --contracts src/test/2024-05/MetaDragon_exp.sol -vvvvv --evm-version shanghai Contract MetaDragon_exp.sol Link reference https://x.com/Phalcon_xyz/status/1795746828064854497 20240528 EXcommunity - Business Logic Flaw Lost: 33 BNB sh
forge test --contracts ./src/test/2024-05/EXcommunity_exp.sol -vvv Contract EXcommunity_exp.sol Link reference https://x.com/SlowMist_Team/status/1795648617530995130 20240527 RedKeysCoin - Weak RNG Lost: $12K sh
forge test --contracts ./src/test/2024-05/RedKeysCoin_exp.sol -vvv --evm-version shanghai Contract RedKeysCoin_exp.sol Link reference 20240526 NORMIE - Business Logic Flaw Lost: $490K sh
forge test --contracts ./src/test/2024-05/NORMIE_exp.sol -vv Contract NORMIE_exp.sol Link reference https://x.com/lookonchain/status/1794680612399542672 20240522 Burner - sandwich attack Lost: 1.7 ETH sh
forge test --contracts ./src/test/2024-05/Burner_exp.sol -vv Contract Burner_exp.sol Link reference https://x.com/0xNickLFranklin/status/1792925754243625311 20240516 TCH - Signature Malleability Vulnerability Lost: $18K sh
forge test --contracts ./src/test/2024-05/TCH_exp.sol -vvv Contract TCH_exp.sol Link reference https://x.com/DecurityHQ/status/1791180322882629713 20240514 Sonne Finance - Precision loss Lost: $20M sh
forge test --contracts ./src/test/2024-05/Sonne_exp.sol -vvv Contract Sonne_exp.sol Link reference https://neptunemutual.com/blog/taking-a-closer-look-at-sonne-finance-exploit/ 20240514 PredyFinance - Reentrancy Lost: $464K sh
forge test --contracts ./src/test/2024-05/PredyFinance_exp.sol -vvv Contract PredyFinance_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1790307019590680851 20240510 GFOX - lack of access control Lost: 330K USD sh
forge test --contracts ./src/test/2024-05/GFOX_exp.sol -vvv --evm-version shanghai Contract GFOX_exp.sol Link reference https://twitter.com/CertiKAlert/status/1788751142144401886 20240510 TSURU - Insufficient Validation Lost: 140K sh
forge test --contracts ./src/test/2024-05/TSURU_exp.sol -vvv --evm-version shanghai Contract TSURU_exp.sol Link reference https://base.tsuru.wtf/usdtsuru-exploit-incident-report 20240508 GPU - self transfer Lost: ~32K USD sh
forge test --contracts src/test/2024-05/GPU_exp.sol -vvv Contract GPU_exp.sol Link reference https://twitter.com/PeckShieldAlert/status/1788153869987611113 20240507 SATURN - Price Manipulation Lost: ~15 BNB sh
forge test --contracts src/test/2024-05/SATURN_exp.sol -vvv Contract SATURN_exp.sol Link reference https://twitter.com/ChainAegis/status/1787667253435195841 20240506 OSN - Reward Distribution Problem Lost: ~109K USD sh
forge test --contracts src/test/2024-05/OSN_exp.sol -vvv --evm-version shanghai Contract OSN_exp.sol Link reference https://twitter.com/SlowMist_Team/status/1787330586857861564 20240430 Yield - Business Logic Flaw Lost: 181K sh
forge test --contracts ./src/test/2024-04/Yield_exp.sol -vvv Contract Yield_exp.sol Link reference https://twitter.com/peckshield/status/1785121607192817692 https://medium.com/immunefi/yield-protocol-logic-error-bugfix-review-7b86741e6f50 20240430 PikeFinance - Uninitialized Proxy Lost: 1.4M sh
forge test --contracts ./src/test/2024-04/PikeFinance_exp.sol -vvv Contract PikeFinance_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1785508900093194591 20240425 NGFS - Bad Access Control Lost: ~190K sh
forge test --contracts ./src/test/2024-04/NGFS_exp.sol -vvv --evm-version shanghai Contract NGFS_exp.sol Link reference https://twitter.com/CertiKAlert/status/1783476515331616847 20240424 XBridge - Logic Flaw Lost: >200k USD (plus a lot of STC, SRLTY, Mazi tokens) sh
forge test --contracts ./src/test/2024-04/XBridge_exp.sol -vvv Contract XBridge_exp.sol 20240424 YIEDL - Input Validation Lost: 150k USD sh
forge test --contracts ./src/test/2024-04/YIEDL_exp.sol -vvv Contract YIEDL_exp.sol 20240422 Z123 - price manipulation Lost: 136k USD sh
forge test --contracts ./src/test/2024-04/Z123_exp.sol -vvv Contract Z123_exp.sol Link reference https://twitter.com/PeckShieldAlert/status/1782322484911784385 20240420 Rico - Arbitrary Call Lost: 36K sh
forge test --contracts ./src/test/2024-04/Rico_exp.sol -vvv Contract Rico_exp.sol Link reference https://twitter.com/ricocreditsys/status/1781803698940781009 20240419 HedgeyFinance - Logic Flaw Lost: 48M USD sh
forge test --contracts ./src/test/2024-04/HedgeyFinance_exp.sol -vvv Contract HedgeyFinance_exp.sol Link reference https://twitter.com/Cube3AI/status/1781294512716820918 20240416 SATX - Logic Flaw Lost: ~50 BNB sh
forge test --contracts src/test/2024-04/SATX_exp.sol -vvv Contract SATX_exp.sol Link reference https://x.com/bbbb/status/1780341239801393479 20240416 MARS - Bad Reflection Lost: >100K sh
forge test --contracts src/test/2024-04/MARS_exp.sol -vv Contract MARS_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1780150315603701933 20240415 Chainge - Input Validation Lost: ~200K sh
forge test --contracts ./src/test/2024-04/Chainge_exp.sol -vvv Contract Chainge_exp.sol Link reference https://twitter.com/CyversAlerts/status/1779875922381860920 20240412 FIL314 - Insufficient Validation And Price Manipulation Lost: ~14 BNB sh
forge test --contracts ./src/test/2024-04/FIL314_exp.sol -vvv Contract FIL314_exp.sol Link reference 20240412 SumerMoney - Reentrancy Lost: 350K sh
forge test --contracts ./src/test/2024-04/SumerMoney_exp.sol -vvv Contract SumerMoney_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1778986926705672698 20240412 GROKD - lack of access control Lost: ~150 BNB forge test --contracts ./src/test/2024-04/GROKD_exp.sol -vvv Contract GROKD_exp.sol Link reference https://x.com/hipalex921/status/1778482890705416323?t=KvvG83s7SXr9I55aftOc6w&s=05 20240409 UPS - business logic flaw Lost: ~$28K forge test --contracts ./src/test/2024-04/UPS_exp.sol -vvv Contract UPS_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1777589021058728214 20240408 SQUID - sandwich attack Lost: ~$87K forge test --contracts ./src/test/2024-04/SQUID_exp.sol -vvv Contract SQUID_exp.sol Link reference https://twitter.com/bbbb/status/1777228277415039304 20240404 WSM - price manipulation Lost: ~$18K forge test --contracts ./src/test/2024-04/WSM_exp.sol -vvv Contract WSM_exp.sol Link reference https://hacked.slowmist.io/#:~:text=Hacked%20target%3A%20Wall%20Street%20Memes 20240401 ATM - business logic flaw Lost: ~$182K forge test --contracts ./src/test/2024-04/ATM_exp.sol -vvv Contract ATM_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1775008489569718508 20240401 OpenLeverage - Reentrancy Lost: ~234K forge test --contracts src/test/2024-04/OpenLeverage2_exp.sol -vvv Contract OpenLeverage2_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1774727539975672136 20240329 PrismaFi - Insufficient Validation Lost: ~$11M sh
forge test --contracts ./src/test/2024-03/Prisma_exp.sol -vvv Contract Prisma_exp.sol Link reference https://twitter.com/EXVULSEC/status/1773371049951797485 20240328 LavaLending - Business Logic Flaw Lost: ~340K forge test --contracts src/test/2024-03/LavaLending_exp.sol -vvv Contract LavaLending_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1774727539975672136 https://twitter.com/Phalcon_xyz/status/1773546399713345965 https://hackmd.io/@LavaSecurity/03282024 20240325 ZongZi - Price Manipulation Lost: ~223K forge test --contracts src/test/2024-03/ZongZi_exp.sol -vvv Contract ZongZi_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1772195949638775262 20240321 SSS - Token Balance Doubles on Transfer to self Lost: 4.8M sh
forge test --contracts ./src/test/2024-03/SSS_exp.sol -vvv Contract SSS_exp.sol Link reference https://twitter.com/dot_pengun/status/1770989208125272481 20240324 ARK - business logic flaw Lost: ~348 BNB forge test --contracts src/test/2024-03/ARK_exp.sol -vvv Contract ARK_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1771728823534375249 20240320 Paraswap - Incorrect Access Control Lost: ~24K forge test --contracts src/test/2024-03/Paraswap_exp.sol -vvv --evm-version shanghai Contract Paraswap_exp.sol Link reference https://medium.com/neptune-mutual/analysis-of-the-paraswap-exploit-1f97c604b4fe 20240314 MO - business logic flaw Lost: ~413k USDT forge test --contracts src/test/2024-03/MO_exp.sol -vvv Contract MO_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1768184024483430523 20240313 IT - business logic flaw Lost: ~13k USDT forge test --via-ir --contracts src/test/2024-03/IT_exp.sol -vvv Contract IT_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1768171595561046489 20240309 Juice - Business Logic Flaw Lost: ~54 ETH sh
forge test --contracts ./src/test/2024-03/Juice_exp.sol -vvv Contract Juice_exp.sol Link reference https://medium.com/@juicebotapp/juice-staking-exploit-next-steps-95e218b3ec71 20240309 UnizenIO - unverified external call Lost: ~2M forge test --contracts src/test/2024-03/UnizenIO_exp.sol -vvvv Contract UnizenIO_exp.sol | UnizenIO2_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1766274000534004187 https://twitter.com/AnciliaInc/status/1766261463025684707 20240307 GHT - Business Logic Flaw Lost: ~57K forge test --contracts ./src/test/2024-03/GHT_exp.sol -vvv Contract GHT_exp.sol Link reference 20240306 ALP - Public internal function Lost: ~10K Testing forge test --contracts ./src/test/2024-03/ALP_exp.sol -vvv Contract ALP_exp.sol Link Reference https://twitter.com/0xNickLFranklin/status/1765296663667875880 20240306 TGBS - Business Logic Flaw Lost: ~150K forge test --contracts ./src/test/2024-03/TGBS_exp.sol -vvv Contract TGBS_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1765290290083144095 https://twitter.com/Phalcon_xyz/status/1765285257949974747 20240305 Woofi - Price Manipulation Lost: ~8M forge test --contracts ./src/test/2024-03/Woofi_exp.sol -vvv Contract Woofi_exp.sol Link reference https://twitter.com/spreekaway/status/1765046559832764886
https://twitter.com/PeckShieldAlert/status/1765054155478175943 20240228 Seneca - Arbitrary External Call Vulnerability Lost: ~6M forge test --contracts ./src/test/2024-02/Seneca_exp.sol -vvv Contract Seneca_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1763045563040411876 20240228 SMOOFSStaking - Reentrancy Lost: Unclear forge test --contracts ./src/test/2024-02/SMOOFSStaking_exp.sol -vvv Contract SMOOFSStaking_exp.sol Link reference https://twitter.com/AnciliaInc/status/1762893563103428783 https://twitter.com/0xNickLFranklin/status/1762895774311178251 20240223 CompoundUni - Oracle bad price Lost: ~439,537 USD forge test --contracts ./src/test/2024-02/CompoundUni_exp.sol -vvv Contract CompoundUni_exp.sol Link reference https://twitter.com/0xLEVI104/status/1762092203894276481 20240223 BlueberryProtocol - logic flaw Lost: ~1,400,000 USD forge test --contracts ./src/test/2024-02/BlueberryProtocol_exp.sol -vvv Contract BlueberryProtocol_exp.sol Link reference https://twitter.com/blueberryFDN/status/1760865357236211964 20240221 DeezNutz 404 - lack of validation Lost: ~170k forge test --contracts ./src/test/2024-02/DeezNutz404_exp.sol -vvv Contract DeezNutz404_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1760481343161700523 20240221 GAIN - bad function implementation Lost: ~6.4 ETH forge test --contracts ./src/test/2024-02/GAIN_exp.sol -vvv Contract GAIN_exp.sol Link reference https://twitter.com/0xNickLFranklin/status/1760559768241160679 20240219 RuggedArt - reentrancy Lost: ~10k forge test --contracts ./src/test/2024-02/RuggedArt_exp.sol -vvv Contract RuggedArt_exp.sol Link reference https://twitter.com/EXVULSEC/status/1759822545875025953 20240216 ParticleTrade - lack of validation data Lost: ~50k forge test --contracts ./src/test/2024-02/ParticleTrade_exp.sol -vvv Contract ParticleTrade_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1758028270770250134 20240215 DualPools - precision truncation Lost: ~42k forge test --contracts ./src/test/2024-02/DualPools_exp.sol -vvvv Contract DualPools_exp.sol Link reference https://medium.com/@lunaray/dualpools-hack-analysis-5209233801fa 20240215 Miner - lack of validation dst address Lost: ~150k forge test --contracts ./src/test/2024-02/Miner_exp.sol -vvv --evm-version shanghai Contract Miner_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1757777340002681326 20240211 Game - Reentrancy && Business Logic Flaw Lost: ~20 ETH forge test --contracts ./src/test/2024-02/Game_exp.sol -vvv Contract Game_exp.sol Link reference https://twitter.com/AnciliaInc/status/1757533144033739116 20240210 FILX DN404 - Access Control Lost: 200K sh
forge test --contracts ./src/test/2024-02/DN404_exp.sol -vvv Contract DN404_exp.sol 20240208 Pandora - integer underflow Lost: ~17K USD forge test --contracts ./src/test/2024-02/PANDORA_exp.sol -vvv Contract PANDORA_exp.sol Link reference https://twitter.com/pennysplayer/status/1766479470058406174 20240205 BurnsDefi - Price Manipulation Lost: ~67K forge test --contracts ./src/test/2024-02/BurnsDefi_exp.sol -vvv Contract BurnsDefi_exp.sol Link reference https://twitter.com/pennysplayer/status/1754342573815238946 https://medium.com/neptune-mutual/how-was-citadel-finance-exploited-a5f9acd0b408 (similar incident) 20240201 AffineDeFi - lack of userData validation Lost: ~88K forge test --contracts ./src/test/2024-02/AffineDeFi_exp.sol -vvv Contract AffineDeFi_exp.sol Link reference https://twitter.com/Phalcon_xyz/status/1753020812284809440 https://twitter.com/CyversAlerts/status/1753040754287513655 20240130 MIMSpell - Precision Loss Lost: ~6.5M forge test --contracts ./src/test/2024-01/MIMSpell2_exp.sol -vvv Contract MIMSpell2_exp.sol Link reference https://twitter.com/kankodu/status/1752581744803680680 https://twitter.com/Phalcon_xyz/status/1752278614551216494 https://twitter.com/peckshield/status/1752279373779194011 https://phalcon.blocksec.com/explorer/security-incidents 20240128 BarleyFinance - Reentrancy Lost: ~130K forge test --contracts ./src/test/2024-01/BarleyFinance_exp.sol -vvv Contract BarleyFinance_exp.sol Link reference https://phalcon.blocksec.com/explorer/security-incidents https://www.bitget.com/news/detail/12560603890246 https://twitter.com/Phalcon_xyz/status/1751788389139992824 20240127 CitadelFinance - Price Manipulation Lost: ~93K forge test --contracts ./src/test/2024-01/CitadelFinance_exp.sol -vvv Contract CitadelFinance_exp.sol Link reference https://medium.com/neptune-mutual/how-was-citadel-finance-exploited-a5f9acd0b408 20240125 NBLGAME - Reentrancy Lost: ~180K forge test --contracts ./src/test/2024-01/NBLGAME_exp.sol -vvv Contract NBLGAME_exp.sol Link reference https://twitter.com/SlowMist_Team/status/1750526097106915453 https://twitter.com/AnciliaInc/status/1750558426382635036 20240122 DAO_SoulMate - Incorrect Access Control Lost: ~319K forge test --contracts ./src/test/2024-01/DAO_SoulMate_exp.sol -vvv --evm-version 'shanghai' Contract DAO_SoulMate_exp.sol Link reference https://twitter.com/MetaSec_xyz/status/1749743245599617282 20240117 BmiZapper - Arbitrary external call vulnerability Lost: ~114K forge test --contracts ./src/test/2024-01/Bmizapper_exp.sol -vvv Contract BmiZapper_exp.sol Link reference https://x.com/0xmstore/status/1747756898172952725 20240112 SocketGateway - Lack of calldata validation Lost: ~$3.3M forge test --contracts ./src/test/2024-01/SocketGateway_exp.sol -vvv --evm-version shanghai Contract SocketGateway_exp.sol Link reference https://twitter.com/BeosinAlert/status/1747450173675196674 https://twitter.com/peckshield/status/1747353782004900274 20240112 WiseLending - Bad HealthFactor Check Lost: ~464K forge test --contracts ./src/test/2024-01/WiseLending02_exp.sol -vvv --evm-version shanghai Contract WiseLending02_exp.sol Link reference https://twitter.com/danielvf/status/1746303616778981402 20240110 LQDX - Unauthorized TransferFrom Lost: unknown forge test --contracts src/test/2024-01/LQDX_alert_exp.sol -vvv Contract LQDX_alert_exp.sol Link reference https://twitter.com/SlowMist_Team/status/1744972012865671452 20240104 Gamma - Price manipulation Lost: ~6.3M forge test --contracts ./src/test/2024-01/Gamma_exp.sol -vvv Contract
Gamma_exp.sol Link reference https://twitter.com/officer_cia/status/1742772207997050899 https://twitter.com/shoucccc/status/1742765618984829326 20240102 MIC - Business Logic Flaw Lost: ~500K forge test --contracts ./src/test/2024-01/MIC_exp.sol -vvv Contract MIC_exp.sol Link reference https://x.com/MetaSec_xyz/status/1742484748239536173 20240102 RadiantCapital - Loss of Precision Lost: ~4.5M forge test --contracts ./src/test/2024-01/RadiantCapital_exp.sol -vvv Contract RadiantCapital_exp.sol Link reference https://neptunemutual.com/blog/how-was-radiant-capital-exploited/ https://twitter.com/BeosinAlert/status/1742389285926678784 20240101 OrbitChain - Incorrect input validation Lost: ~81M forge test --contracts ./src/test/2024-01/OrbitChain_exp.sol -vvv Contract OrbitChain_exp.sol Link reference https://blog.solidityscan.com/orbit-chain-hack-analysis-b71c36a54a69 View Gas Reports Foundry also has the ability to report the gas used per function call, which mimics the behavior of hardhat-gas-reporter . Generally speaking, if the gas cost per function call is very high, then the likelihood of success is reduced. Gas optimization is an important activity done by smart contract developers. Every PoC in this repository can produce a gas report like this: bash
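# The --gas-report flag makes forge print a per-function gas table after the
# tests run (deployment cost plus min/avg/median/max gas and call counts).
# Illustrative note; see the Foundry Book for the authoritative flag reference.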
forge test --gas-report --contracts <contract> -vvv For Example:
Let us find out the gas used in the Audius PoC Execution bash
forge test --gas-report --contracts ./src/test/Audius.exp.sol -vvv Demo Bug Reproduce Moved to DeFiVulnLabs FlashLoan Testing Moved to DeFiLabs | Reproduce DeFi hacked incidents using Foundry. | defi,ethereum,foundry,solidity,web3 | 0 | 92 | 663 | 2,270 | 3 | 2 | 1 |
alufers/mitmproxy2swagger | mitmproxy2swagger https://user-images.githubusercontent.com/5400940/168086818-c48f60ab-3f95-42eb-b435-c8b1a6326b81.mp4 A tool for automatically converting mitmproxy captures to OpenAPI 3.0 specifications. This means that you can automatically reverse-engineer REST APIs by just running the apps and capturing the traffic. 🆕 NEW! Added support for processing HAR exported from the browser DevTools. See Usage - HAR for more details. Installation First you will need python3 and pip3. ```bash
$ pip install mitmproxy2swagger ... or ... $ pip3 install mitmproxy2swagger ... or ... $ git clone git@github.com:alufers/mitmproxy2swagger.git
$ cd mitmproxy2swagger
$ docker build -t mitmproxy2swagger .
``` Then clone the repo and run mitmproxy2swagger as per the examples below. Usage Mitmproxy To create a specification by inspecting HTTP traffic you will need to: Capture the traffic by using the mitmproxy tool. I personally recommend using mitmweb, which is a web interface built into mitmproxy. bash
$ mitmweb
Web server listening at http://127.0.0.1:8081/
Proxy server listening at http://*:9999
... IMPORTANT To configure your client to use the proxy exposed by mitmproxy, please consult the mitmproxy documentation for more information. Save the traffic to a flow file. In mitmweb you can do this by using the "File" menu and selecting "Save": Run the first pass of mitmproxy2swagger: bash
$ mitmproxy2swagger -i <path_to_mitmproxy_flow> -o <path_to_output_schema> -p <api_prefix>
# ... or ...
$ docker run -it -v $PWD:/app mitmproxy2swagger mitmproxy2swagger -i <path_to_mitmproxy_flow> -o <path_to_output_schema> -p <api_prefix> Please note that you can use an existing schema, in which case the existing schema will be extended with the new data. You can also run it a few times with different flow captures; the captured data will be safely merged. <api_prefix> is the base URL of the API you wish to reverse-engineer. You will need to obtain it by observing the requests being made in mitmproxy. For example, if an app has made requests like these: http
https://api.example.com/v1/login
https://api.example.com/v1/users/2
https://api.example.com/v1/users/2/profile The likely prefix is https://api.example.com/v1 . Running the first pass should have created a section in the schema file like this: yaml
x-path-templates:
# Remove the ignore: prefix to generate an endpoint with its URL
# Lines that are closer to the top take precedence; the matching is greedy
- ignore:/addresses
- ignore:/basket
- ignore:/basket/add
- ignore:/basket/checkouts
- ignore:/basket/coupons/attach/{id}
- ignore:/basket/coupons/attach/104754 You should edit the schema file with a text editor and remove the ignore: prefix from the paths you wish to be generated. You can also adjust the parameters appearing in the paths (an edited example is shown below). Run the second pass of mitmproxy2swagger: bash
$ mitmproxy2swagger -i <path_to_mitmproxy_flow> -o <path_to_output_schema> -p <api_prefix> [--examples]
# ... or ...
$ docker run -it -v $PWD:/app mitmproxy2swagger mitmproxy2swagger -i <path_to_mitmproxy_flow> -o <path_to_output_schema> -p <api_prefix> [--examples] Run the command a second time (with the same schema file). It will pick up the edited lines and generate endpoint descriptions. Please note that mitmproxy2swagger will not overwrite existing endpoint descriptions; if you want to overwrite them, you can delete them before running the second pass. Passing --examples will add example data to requests and responses. Take caution when using this option, as it may add sensitive data (tokens, passwords, personal information, etc.) to the schema.
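For illustration, an edited version of the x-path-templates section from the first pass might look like the sketch below (the {coupon_id} parameter name is a made-up example, and the redundant concrete coupon path has been deleted because the template above it already matches it):

```yaml
x-path-templates:
  - /addresses
  - /basket
  - /basket/add
  - /basket/checkouts
  - /basket/coupons/attach/{coupon_id}
```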
Passing --headers will add header data to requests and responses. Take caution when using this option, as it may add sensitive data (tokens, passwords, personal information, etc.) to the schema. HAR Capture and export the traffic from the browser DevTools. In the browser DevTools, go to the Network tab and click the "Export HAR" button. Continue the same way you would with the mitmproxy dump. mitmproxy2swagger will automatically detect the HAR file and process it. Example output See the examples . You will find a generated schema there and an HTML file with the generated documentation (via redoc-cli ). See the generated HTML file here . Development and contributing This project uses: poetry for dependency management pre-commit for code formatting and linting pytest for unit testing To install the dependencies: bash
poetry install Run linters: bash
pre-commit run --all-files Install pre-commit hooks: bash
pre-commit install Run tests: bash
poetry run pytest Run tests with coverage: bash
poetry run pytest --cov=mitmproxy2swagger License MIT | Automagically reverse-engineer REST APIs via capturing traffic | mitmproxy,openapi,reverse-engineering,swagger | 18 | 16 | 135 | 303 | 11 | 13 | 3 |
windingwind/zotero-better-notes | Better Notes for Zotero Everything about note management. All in Zotero. Better Notes Handbook (outdated, for version<=0.8.9): 中文 (provide translation) 🧩 Outline 🧐 What is this? 🤔 What can it do? 👋 Install 😎 Quick start More [Getting Started with the _Workspace_](#getting-started-with-the-workspace)
[Note Editor](#note-editor)
[Note Link](#note-link)
[Note Template](#note-template)
[Syncing: Note 🔄️ Markdown](#syncing-note-%EF%B8%8F-markdown)
[Note Export](#note-export)
[GPT Integration](#gpt-integration)
[Action Workflow](#action-workflow)
[Other Features](#other-features) 🧲 API 🔧 Development 🔔 Disclaimer 🔎 My Zotero Plugins 🫶 Sponsors 🤗 Contributors 🧐 What is this? Better Notes (BN) is a plugin for Zotero . BN streamlines your workflows of: paper reading annotating note taking metadata analyzing knowledge exporting AI writing assistant and: works out of the box highly customizable all in Zotero 🤔 What can it do? 🖇️ Connect knowledge fragments with note link . With one click. Learn more → 🗂️ Simplify and automate knowledge analysis with extensive note templates . With one click. Learn more → 🔄️ Keep in sync with your Markdown files. Two-way, automatically. Learn more → 🖨️ Export notes to different formats: Markdown, Docx, PDF, and mind map. Learn more → 📝 Enhancements for Zotero's note editor with outline, link relation, view images.... Open as much note tabs/windows as you like! 👋 Install Download the plugin (.xpi file) from below. For Zotero 7 beta, please always use the latest beta version. Latest Version: 1.1.4-beta.86 Latest Stable v1.0.4 (last for Zotero 6) v0.8.9 (last with auto-insert, tag-insert, math-ocr, for Zotero 6) All Releases (including beta plugin for Zotero 7 beta) Note : If you're using Firefox as your browser, right-click the .xpi and select "Save As.." In Zotero click Tools in the top menu bar and then click Plugins Go to the Extensions page and then click the gear icon in the top right. Select Install Add-on from file . Browse to where you downloaded the .xpi file and select it. Finish! 😎 Quick start BN offers a range of features that can be combined like Lego blocks to build your own note-taking workflow. Start taking notes in Zotero with BN in 5 minutes ! Getting Started with the Workspace 💡 This section is outdated and will be removed. For the latest beta version, the workspace is no longer a thing. You can open unlimited number of note tab/window, which is what we call workspace in the past. The workspace serves as the central hub where input flows (papers and annotations) converge with output flows (summaries and comparisons). To open the workspace , click the button in the tabs bar. The workspace contains a default note called the workspace note . You can create a new note as the workspace note if prompted on opening workspace . 💡 How to set an existing note as the workspace note ? In the library: select a note item and right-click In the note editor: click on the Tools button You can change the workspace note at any time. The workspace allows you to take notes and write, just like you would in MS Word or a markdown editor (e.g., Obsidian). Explore the Workspace ! 💡 The layout from left to right is: Outline Workspace note editor (main editor) Note link preview (hidden by default) Reader notes pane (hidden by default) 💡 To toggle these panes, hover the workspace tab and click corresponding buttons. 💡 To open the workspace in a new window, drag the workspace tab. Note Editor The workspace includes the note editor for the workspace note . You can use it to take notes and write summaries. 💡 How to open note editor? In the library: click to open a note editor and double-click to open note editor in a standalone window. In the PDF reader: right-side bar 💡 How to create a new note? Click the note icon in the library tools bar (the row under the tabs bar). Note Link To create a note link between current note and the workspace note , simply click the button in the title bar of current note editor. Note Template Still spending a lot of time writing summaries or doing copy-pasting while taking notes? 
Say hello to Note Template ! Note Template is designed for tasks like: Summarize metadata and annotations from multiple papers, with customized filters Compare papers across sections Generate content programmatically 💡 Need help or looking for community templates? See here → 💡 Want to write/share your own templates? How to write → How to share → Syncing: Note 🔄️ Markdown With BN, you can integrate your note-taking into your existing workflow seamlessly. If you use markdown editors like Obsidian, you can keep your notes in sync with external Markdown files easily. To set up auto-sync, click Set Auto-Sync the first time you export your note. There is no need for third-party tools or complicated setups! Any changes made to your note or its corresponding Markdown file will be automatically synced. This feature makes it easy to keep all of your notes up to date and in one place. 💡 Note: The note being edited will be synced after the editor is closed. Note Export BN offers various options to export your notes, giving you the flexibility to choose the format that suits your needs. You can export your note to the following formats: A new note in Zotero Markdown file (embedded or linked, with images) MS Word document (.docx) PDF document (.pdf) FreeMind file (.mm) Simply click on the corresponding export button in the toolbar and follow the prompts. GPT Integration The Zotero-GPT plugin provides GPT integration. If you also have Better Notes installed, you can wake up the GPT pane in the workspace note editor with the space key. You can: Ask GPT questions about the current note Summarize/fix spelling and grammar/translate/polish the selection Accept suggestions/modifications from GPT with the enter key. Action Workflow The Actions & Tags plugin provides a powerful workflow engine for Zotero. If you also have Better Notes installed, you can use the following actions to automate note generation/editing/syncing/etc.: Auto-generate note from template when opening an item Auto-sync note when opening/creating an item More... Other Features Quick Note: convert annotations to notes with one click. Resize images with the right-click menu. Preview images with double-click/ctrl-click. 🧲 API BN provides APIs for other plugin developers in Zotero.BetterNotes.api.${API_MODULE} . See api.ts . workspace : Workspace APIs sync : Syncing APIs convert : Lossless conversion between note, HTML, Markdown, note link, and annotation template : Manipulate note templates $export : Export note $import : Import note editor : Note editor APIs. Give your script full control of the contents in the note editor. 🔧 Development This plugin is built based on the Zotero Plugin Template . See the setup and debug details there. To get started, run bash
git clone https://github.com/windingwind/zotero-better-notes.git
cd zotero-better-notes
npm install
npm run build The plugin is built to ./builds/*.xpi . 🔔 Disclaimer Use this code under AGPL. No warranties are provided. Keep the laws of your locality in mind! 🔎 My Zotero Plugins Translate for Zotero : PDF translation for Zotero zotero-pdf-preview : PDF preview for Zotero zotero-tag : Automatically tag items/Batch tagging 🙌 Sponsors Thanks peachgirl100 , Juan Gimenez ,
and other anonymous sponsors! If you want to leave your name here, please email me or leave a message with the donation. 🤗 Contributors | Everything about note management. All in Zotero. | knowledge,markdown,mindmap,note,notes,plugin,zotero,zotero-addon,zotero-plugin,addon | 232 | 10 | 84 | 916 | 6 | 6 | 2 |
chakra-ui/panda | Panda is a universal styling solution for the modern web — build time, type safe, and scalable CSS-in-JS Features ⚡️ Write style objects or style props, extract them at build time ✨ Modern CSS output — cascade layers @layer , css variables and more 🦄 Works with most JavaScript frameworks 🚀 Recipes and Variants - Just like Stitches™️ ✨ 🎨 High-level design tokens support for simultaneous themes 💪 Type-safe styles and autocomplete (via codegen) 🐼 Get a taste of Panda. Try it out for yourself in StackBlitz Documentation Visit our official documentation . Install The recommended way to install the latest version of Panda is by running the command below: bash
npm i -D @pandacss/dev To scaffold the panda config and postcss bash
npx panda init -p Setup and import the entry CSS file css
@layer reset, base, tokens, recipes, utilities; jsx
import 'path/to/entry.css' Start the dev server of your project bash
npm run dev Start using panda ```jsx
import { css } from '../styled-system/css'
import { stack, vstack, hstack } from '../styled-system/patterns' function Example() {
return ( Box 1 Box 2 )
}
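
// For illustration, recipes with variants (see "Recipes and Variants" above)
// can be defined with cva. A hedged sketch: the token values below are
// assumptions for demo purposes, not part of the original example.
import { cva } from '../styled-system/css'

const button = cva({
  base: { px: '4', py: '2', borderRadius: 'md' },
  variants: {
    tone: {
      primary: { bg: 'blue.500', color: 'white' },
      neutral: { bg: 'gray.200', color: 'gray.800' },
    },
  },
  defaultVariants: { tone: 'primary' },
})

function ExampleButton() {
  // cva returns a function that resolves variants to the generated class names
  return <button className={button({ tone: 'neutral' })}>Click me</button>
}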
``` Directory Structure | Package | Description |
| --------------------------------------------- | ----------------------------------------------------------- |
| cli | CLI package installed by the end user |
| core | Contains core features of Panda (utility, recipes, etc) |
| config | Contains functions for reading and merging the panda config |
| extractor | Contains code for fast AST parsing and scanning |
| generator | Contains codegen artifacts (js, css, jsx) |
| parser | Contains code for parsing source code |
| is-valid-prop | Contains code for checking if a prop is a valid css prop |
| node | Contains the Node.js API of Panda's features |
| token-dictionary | Contains code used to process tokens and semantic tokens |
| shared | Contains shared TS functions | Contributing Feel like contributing? That's awesome! We have a contributing guide to help guide you. Want to help improve the docs? Our docsite lives in the monorepo . If you're interested in contributing to the documentation, check out the contributing guide . Support Having trouble? Get help in the official Panda Discord . Acknowledgement The development of Panda was only possible due to the inspiration and ideas from these amazing projects. Chakra UI - where it all started Vanilla Extract - for inspiring the utilities API Stitches - for inspiring the recipes and variants API Tailwind CSS - for inspiring the JIT compiler and strategy Class Variance Authority - for inspiring the cva name Styled System - for the initial idea of Styled Props Linaria - for inspiring the initial atomic css strategy Uno CSS - for inspiring the studio and astro integration Goober - for tiny and performant js functions in template literal styles License MIT License © 2023-Present Segun Adebayo | 🐼 Universal, Type-Safe, CSS-in-JS Framework for Product Teams ⚡️ | css,styled-system,typescript,utility-classes,design-system,framework-agnostic,postcss,atomic-css,css-in-js,engine | 1,000 | 123 | 1,741 | 2,883 | 0 | 37 | 2 |
cloudwego/hertz | Hertz English | 中文 Hertz [həːts] is a high-usability, high-performance and high-extensibility Golang HTTP framework that helps developers build microservices. It was designed with reference to other open-source frameworks like fasthttp , gin and echo , combined with ByteDance's internal requirements. At present, it has been widely used inside ByteDance. Nowadays, more and more microservices use Golang. If you have performance requirements for your microservices and want a framework that can be extensively customized, Hertz will be a good choice. Basic Features High usability During the development process, it is often more important to write correct code quickly. Therefore, in the iterative process of Hertz, we actively listen to users' opinions and continue to polish the framework, hoping to provide users with a better experience and help them write correct code faster.
- High performance Hertz uses the self-developed high-performance network library Netpoll by default. In some special scenarios, compared to Go Net, Hertz has certain advantages in QPS and latency. For performance data, please refer to the Echo data in the figure below. Comparison of four frameworks: Comparison of three frameworks: For detailed performance data, please refer to hertz-benchmark .
- High extensibility Hertz adopts a layered design, providing more interfaces and default extension implementations. Users can also extend it by themselves. At the same time, thanks to the layered design, the framework offers much greater extensibility. At present, only stable capabilities have been open-sourced to the community. For more plans, refer to the RoadMap .
- Multi-protocol support The Hertz framework provides HTTP/1.1, HTTP/2, HTTP/3, ALPN protocol support natively. In addition, due to the layered design, Hertz even supports customizing the protocol resolution logic to meet any protocol-layer extension needs.
- Network layer switching capability Hertz can switch between Netpoll and Go Net on demand. Users can choose the appropriate network library for different scenarios, and Hertz also supports network library extensions in the form of plug-ins. Documentation Getting Started Example The Hertz-Examples repository provides code out of the box. more Basic Features Contains introduction and use of general middleware, context selection, data binding, data rendering, direct access, logging, error handling. more Observability Contains instrumentation, logging, tracing, monitoring, OpenTelemetry integration. more Service Governance Contains service registration and discovery extensions, Sentinel integration. more Framework Extension Contains network library extensions. more Reference Apidoc, framework configurable items list. more FAQ Frequently Asked Questions. more Performance Performance testing can only provide a relative reference. In production, there are many factors that can affect actual performance.
We provide the hertz-benchmark project to track and compare the performance of Hertz and other frameworks in different situations for reference. Related Projects Netpoll : A high-performance network library. Hertz integrated by default. Hertz-contrib : A partial extension library of Hertz, which users can integrate into Hertz through options according to their needs. Example : Use examples of Hertz. Extensions | Extensions | Description |
|----------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Autotls | Make Hertz support Let's Encrypt. |
| Http2 | HTTP2 support for Hertz. |
| Websocket | Enable Hertz to support the Websocket protocol. |
| Etag | Support ETag (or entity tag) HTTP response header for Hertz. |
| Limiter | Provides a current limiter based on the bbr algorithm. |
| Monitor-prometheus | Provides service monitoring based on Prometheus. |
| Obs-opentelemetry | Hertz's Opentelemetry extension that supports Metric, Logger, Tracing and works out of the box. |
| Opensergo | The Opensergo extension. |
| Pprof | Extension for Hertz integration with Pprof. |
| Registry | Provides service registry and discovery functions. So far, the supported service discovery extensions are nacos, consul, etcd, eureka, polaris, servicecomb, zookeeper, redis. |
| Sentry | Sentry extension provides some unified interfaces to help users perform real-time error monitoring. |
| Tracer | Link tracing based on Opentracing. |
| Basicauth | Basicauth middleware can provide HTTP basic authentication. |
| Jwt | Jwt extension. |
| Keyauth | Provides token-based authentication. |
| Requestid | Add request id in response. |
| Sessions | Session middleware with multi-state store support. |
| Casbin | Supports various access control models by Casbin. |
| Cors | Provides cross-domain resource sharing support. |
| Csrf | Csrf middleware is used to prevent cross-site request forgery attacks. |
| Secure | Secure middleware with multiple configuration items. |
| Gzip | A Gzip extension with multiple options. |
| I18n | Helps translate Hertz programs into multi programming languages. |
| Lark | Use hertz handle Lark/Feishu card message and event callback. |
| Loadbalance | Provides load balancing algorithms for Hertz. |
| Logger | Logger extension for Hertz, which provides support for zap, logrus, zerologs logging frameworks. |
| Recovery | Recovery middleware for Hertz. |
| Reverseproxy | Implement a reverse proxy. |
| Swagger | Automatically generate RESTful API documentation with Swagger 2.0. |
| Cache | Hertz middleware for caching HTTP responses with multi-backend support | Blogs ByteDance Practice on Go Network Library Ultra-large-scale Enterprise-level Microservice HTTP Framework — Hertz is Officially Open Source! ByteDance Open Source Go HTTP Framework Hertz Design Practice Help ByteDance Reduce Costs and Increase Efficiency, the Design Practice for Large-scale Enterprise-level HTTP Framework Hertz A Practical Introduction to the HTTP Framework Hertz: A Guide to Performance Testing Contributing Contributing RoadMap Hertz RoadMap License Hertz is distributed under the Apache License, version 2.0 . The licenses of third-party dependencies of Hertz are explained here . Community Email: conduct@cloudwego.io How to become a member: COMMUNITY MEMBERSHIP Issues: Issues Slack: Join our CloudWeGo community Slack Channel . Lark: Scan the QR code below with Lark to join our CloudWeGo/hertz user group. Contributors Thank you for your contribution to Hertz! Landscapes CloudWeGo enriches the CNCF CLOUD NATIVE Landscape . | Go HTTP framework with high-performance and strong-extensibility for building micro-services. | go,http,microservices | 49 | 71 | 577 | 539 | 29 | 26 | 6
winglang/wing | Welcome to the Wing Language! :wave: Take a Tour ▪︎ Getting Started ▪︎ Join Discord ▪︎ FAQ ▪︎ Roadmap ▪︎ Issues ▪︎ Discussions ▪︎ Contribute Winglang is a new open-source programming language designed for the cloud (aka " cloud-oriented ").
Wing enables developers to build distributed systems that leverage cloud services as first-class citizens by combining infrastructure and application code in a safe and unified programming model.
Wing programs can be executed locally ( yes, no internet required ) using a fully-functional simulator, or deployed to any cloud provider ( yes, Wing programs are portable across providers ). The mission of Winglang is to bring back your creative flow and close the gap between imagination and creation. Developing for the cloud today requires mastering various layers of the cloud stack, IAM roles, networking, and numerous tools, along with finding creative ways to test and debug code. In addition, long deployment times hinder iteration cycles and take developers out of their creative flow. Winglang addresses these pains by letting you work at a higher level of abstraction and allowing you to focus on business logic instead of cloud mechanics, only surfacing low-level details when it's needed.
We also provide you with a set of tools that let you test your code locally, significantly faster than before. Wing is built by Elad Ben-Israel , the guy behind the AWS CDK , the gang at the Wing Cloud team and an amazing community of contributors (also known as Wingnuts). Click here to watch a short video introduction to the Wing language. Why do we think the cloud needs a programming language? 🤔 Cloud applications are fundamentally different from applications that run on a single machine -
they are distributed systems that rely on cloud infrastructure to achieve their goals. In order to be able to express both infrastructure and application logic in a safe and unified programming model,
Winglang has two execution phases: preflight for infrastructure definitions and inflight for runtime code. Preflight code is executed during compilation and produces the infrastructure configuration for your app (e.g. Terraform , CloudFormation , etc).
Inflight code is compiled into JavaScript and executed within cloud compute platforms in Node.js environments. Let's look at a simple example: ```js
bring cloud; let queue = new cloud.Queue();
let counter = new cloud.Counter();
let bucket = new cloud.Bucket(); queue.setConsumer(inflight (message) => {
let i = counter.inc();
bucket.put("file-{i}.txt", message);
});
``` cloud.Queue , cloud.Counter and cloud.Bucket are preflight objects .
They represent cloud infrastructure resources.
When compiled to a specific cloud provider, such as AWS, a Terraform file will be produced with the provider's implementation
of these resources. The queue.setConsumer() method is a preflight method that configures the infrastructure to
invoke a particular inflight function for each message in the queue. Now comes the cool part: the code that runs inside the inflight function interacts with the counter and the bucket objects
through their inflight methods ( counter.inc() and bucket.put() ). These methods can only be
called from inflight scopes. Very cool, but what here cannot be done by a library or compiler extension? In existing languages, where there is no way to distinguish between multiple execution phases, it is impossible to naturally represent this idea that an object has methods that can only be executed from within a specific execution phase (or within certain scopes of the program).
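To make the phase distinction concrete, here is a hedged sketch (the preflight addObject call is an assumption based on Wing's bucket API, so treat it as illustrative rather than authoritative):

```js
bring cloud;

let bucket = new cloud.Bucket();

// Preflight scope: calling an inflight method here is a compile-time error.
// bucket.put("greeting.txt", "hi");

// Preflight methods are fine, e.g. seeding the bucket at deploy time (assumed API):
bucket.addObject("greeting.txt", "hi");

new cloud.Function(inflight () => {
  bucket.put("greeting.txt", "hi"); // OK: we are in an inflight scope
});
```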
You are welcome to read more about it here (including code samples that show the same app built in Wing vs. other solutions). What makes Wing a good fit for cloud development? 🌟 Wing was built from scratch to make it easy to build applications on any cloud.
It includes an assembly of different features that serve that purpose: Cloud services as first-class citizens, with phase modifiers for describing infrastructure and runtime code ( preflight and inflight ). Wing Cloud Library provides a standard set of resources that lets you write cloud-portable code. Custom platforms that keep you in control by allowing you to customize the infrastructure definitions and run policy checks. Use any resource in the Terraform ecosystem as a first-class citizen in your app. JavaScript interoperability . Automatic generation of IAM policies and other cloud mechanics based on source code. Wing Console - a visual application-centric operations and management console that lets you interact with... A simulator that can be used for testing and debugging in milliseconds. JSON as a primitive data type with schema validation support for easy conversion to and from structs. Immutability by default , implicit async code , and safety from nulls and undefined . For a more in-depth look at Wing's features and benefits, check out our documentation . Getting started 🛠️ 🚧 This is a pre-release; please see our project status for more details. If you'd just like to dip your feet in the water and see what Wing is all about, you can try it out in our online playground or walk through the interactive tour . When you're ready to start building your own Wing apps, you'll need to: Install the Wing CLI . Get the Wing IDE Extension for your favorite editor. Launch the Wing Console and take it for a spin! For a step-by-step guide, head over to our Getting Started guide.
It's a once-in-a-lifetime adventure into the Wing rabbit hole! FAQs ❓ Here are some questions we're commonly asked that are covered by our FAQ : Who is behind this project? Which clouds are supported by Wing? Which provisioning engines are supported by Wing? Community 💬 Join our flock in the Wing Discord community.
We're here to help each other, answer questions, and share our cloud adventures.
Alternatively, post any questions on GitHub Discussions . Contributing 🤝 Want to help Wing take flight?
Check out our contribution guide to learn how to set up a development environment and contribute to the project.
You can also get started by opening the project in GitHub Codespaces. We are incredibly grateful to our entire community for contributing bug fixes and improvements: License 📜 Wing is licensed under the MIT License .
Contributions are made under our contribution license . Happy coding, and remember: the sky's the limit with Wing (yes, another pun)! 🌤️🚀 | A programming language for the cloud ☁️ A unified programming model, combining infrastructure and runtime code into one language ⚡ | programming-language,cloud,compiler,sdk,serverless,language,winglang,devops-tools,devtool,rust | 1,000 | 947 | 3,359 | 3,114 | 883 | 280 | 13 |
I-Am-Jakoby/Flipper-Zero-BadUSB | # 💀 BadUSB 💀 Subscribing to my YouTube would also be greatly appreciated.
[ ](https://jakoby.lol/yno) Table of Contents Description The Payloads Contact Acknowledgments Unleash the power of your Flipper 🤓💻 Description 🥇 I am in 1st place for most payloads submitted to Hak5❗ 🔓 I have taken my collection of payloads and formatted them to work for the Flipper for all of you to use❗ ⚠️ Please ENJOY and use RESPONSIBLY❗ The Payloads This repository has been optimized to facilitate plug-and-play functionality. I purchased the domain jakoby.lol for the sole purpose of creating my own short URLs. I did this with the intention of making room for Discord webhooks and Dropbox tokens to fit in my one-liners. This, in turn, makes it so the user no longer needs to host their own version of the script. | Payloads | Description | Plug'n'Play | Author |
| :-------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------ | :-----------| :-----------|
| VoiceLogger | Activates your target's microphone, converts their speech to text, and exfils it to Discord. |✅ | Jakoby |
| Evil-Goose | A payload that hires a goose to hack your target in real time. |✅ | Jakoby | | ADV-Recon | A script used to do an advanced level of recon on the target's computer. |✅ | Jakoby | | AcidBurn | A script I put together to be used on your friends or foes. Prepare to be roasted. |✅ | Jakoby |
| Jump-Scare | Just a little jumpscare that changes the target's wallpaper. |✅ | Jakoby |
| Jump-Scare V2 | Just a little jumpscare that plays a video in the target's PowerShell console. |✅ | Jakoby |
| ADV-RickRoll | RickRoll that plays in the PowerShell console after a mouse movement is detected. |✅ | Jakoby |
| PineApple | Connect a target's PC to your WiFi PineApple. |⛔ | Jakoby |
| Play-WAV | Download a WAV file and play it after a mouse movement is detected. |✅ | Jakoby |
| Rage-Pop-Ups | Generates an infinite loop of insulting pop-ups. |⛔ | Jakoby |
| Subscribe | Used to make your target subscribe to your YouTube channel. |✅ | Jakoby | | Must Sub | A script used to make your target subscribe to 15 of Jakoby's favorite YouTube channels. |✅ | Jakoby |
| PS-Draw | A script used to generate and draw images in the PowerShell window. |⛔ | Jakoby |
| WallPaper-Troll | Collects sensitive info from your target and displays it as their wallpaper to taunt them. |✅ | Jakoby |
| WallPaper-URL | Sets the target's wallpaper to an image you provide via a URL after a mouse movement is detected. |✅ | Jakoby |
| We-Found-You | Opens a map with your target's current location on it. |✅ | Jakoby |
| YT-Tripwire | Opens any YouTube video after a mouse movement is detected. |✅ | Jakoby |
| Credz-Plz | A script used to prompt the target to enter their credentials to later be exfiltrated. |✅ | Jakoby |
| Shortcut Jacker | A script used to embed malware in the shortcut on your target's desktop. |⛔ | Jakoby |
| Wifi Grabber | Grabs your target's WiFi passwords and uploads them to either Dropbox, Discord, or both. |✅ | Jakoby |
| IP Grabber | Grabs your target's IP addresses and uploads them to either Dropbox, Discord, or both. |✅ | Jakoby |
| Browser Data | This payload can be used to retrieve the browsing history and bookmarks of your target. |✅ | Jakoby | Contact 📱 My Socials 📱 YouTube Twitter Instagram Discord TikTok Acknowledgments Hak5 Darren UberGuidoZ ( back to top ) | Repository for my flipper zero badUSB payloads. Now almost entirely plug and play. | badusb,badusb-payloads,flipper-zero,flipperzero,hak5 | 0 | 5 | 40 | 619 | 41 | 2 | 0 |
Cveinnt/LiveTerm | 💻 LiveTerm - build terminal styled websites in minutes! Highly customizable, easy-to-use, and minimal terminal styled website template, powered by Next.js. Building a simple website with LiveTerm only takes minutes , and you only need to work with one file: config.json . After you have cloned this repository, simply run yarn install && yarn dev and start editing config.json to build your website! LiveTerm can be used to build a variety of websites: personal website browser startpage project page or maybe just a cool browser music player...be creative! Feel free to play with the web demo above! 📸 Showcase LiveTerm with different themes my personal website 🚀 Ship your LiveTerm site in less than 5 minutes LiveTerm requires the yarn package manager. You can install yarn here . Simply run the following command in your terminal: bash
sh -c "$(curl -fsSL https://raw.github.com/Cveinnt/LiveTerm/main/install/install.sh)" This will install LiveTerm to the current directory. You can start building your website with: bash
cd LiveTerm && yarn dev Start editing config.json , then save and see the updated changes! Alternatively, you can clone this repository to a location of your choosing bash
git clone https://github.com/Cveinnt/LiveTerm.git && cd LiveTerm Then install dependencies and start developing there: bash
yarn install && yarn dev Docker Usage First, clone the project and edit config.json to your liking. Then run the following to start the container in the background: shell
docker-compose up -d If you know what you're doing, you can also try changing Dockerfile & docker-compose.yml !
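For reference, a minimal compose file for this setup might look like the sketch below (the service name and the port mapping are assumptions; the repository ships its own docker-compose.yml, which is what the command above actually uses):

```yaml
version: "3"
services:
  liveterm:
    build: .            # build from the repo's Dockerfile
    ports:
      - "3000:3000"     # assumed Next.js default port; adjust if needed
    restart: unless-stopped
```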
Learn more about Docker here . 📄 Configuration Basic Configuration 90% of LiveTerm's configurations are done through the config.json file. javascript
{
"readmeUrl": // create a Github README and link it here!
"title": // title of the website
"name": // your name, included in 'about' command
"ascii": // ascii art to display
"social": {
"github": // your handle
"linkedin": // your handle
},
"email": // your email
"ps1_hostname": "liveterm" // hostname in prompt
"ps1_username": "visitor", // username in prompt
"resume_url": "../resume.pdf", // path to your resume
"non_terminal_url": "W",
"colors": {
"light": {
...
},
"dark": {
... // you can use existing templates in themes.json or use your own!
}
}
} Feel free to change it as you see fit! Themes You can find several pre-configured themes in themes.json , and you can replace the colors in config.json with the theme colors you like! The themes are based on the themes on this website . For a better preview of the themes, check out the images in the demo folder. Favicons Favicons are located in public/ , along with the other files you may want to upload to your website. I used this website to generate favicons. Banner You may also want to change the output of the banner command. To do that, simply paste your generated banner in src/utils/bin/commands.ts . I used this website to generate my banner. Advanced Configuration If you want to further customize your page, feel free to change the source code to your liking! 🌐 Deploy on Vercel The easiest way to deploy a Next.js app is to use the Vercel Platform from the creators of Next.js. You can install vercel cli and follow the instructions here . You can also connect your GitHub account to Vercel and have Vercel automatically deploy the GitHub repository for you. Credit Based on M4TT72's awesome Terminal . | 💻 Build terminal styled websites in minutes! | nextjs,personal-website,terminal,typescript,vercel,website-template | 0 | 3 | 23 | 23 | 7 | 1 | 0
hagezi/dns-blocklists | :zap: DNS Blocklists - For a better internet! Made with :heartbeat: for a safer and cleaner internet! It always seems impossible until it’s done. Privacy is not a crime, protect yourself. Privacy matters. Privacy is what allows us to determine who we are and who we want to be :bangbang: If you like the project and you can benefit from it, leave a :star: (top right) and become a stargazer ! Thanks for your support! :bookmark_tabs: Table of Contents Overview Multi light - Hand brush: Light protection Multi normal - Broom: All-round protection Multi pro - Big broom: Extended protection (Recommended) : Full - Mini Multi pro++ - Sweeper: Maximum protection (more aggressive) : Full - Mini Multi ultimate - Ultimate Sweeper: Aggressive protection : Full - Mini Fake - Protects against internet scams, traps & fakes! Pop-Up Ads - Protects against annoying and malicious pop-up ads! Threat Intelligence Feeds - Increases security significantly! (Recommended) : Full - Medium - IPs Newly Registered Domains - Favoured by threat actors to launch malicious campaigns! : 14 days - 30 days DoH/VPN/TOR/Proxy Bypass - Prevent methods to bypass your DNS! : Full - DoH only - DoH IPs Safesearch not supported - Prevent the use of search engines that do not support Safesearch! Dynamic DNS - Protects against the malicious use of dynamic DNS services! Badware Hoster - Protects against the malicious use of free host services! Most Abused TLDs - Protects against known malicious Top Level Domains! Anti Piracy - Protects against piracy! Gambling - Protects against gambling content! NSFW (external) - oisd NSFW - Protects against adult content! Native Tracker - Broadband tracker of devices, services and operating systems Supporter - Leave a star (top right)! Recommendation - Which version of the lists should I use? Online DNS Services About : Contact - Groups - Repository - Referral Domains - Support Me Statistics - Sources FAQ - Frequently Asked Questions :books: Multi - Cleans the Internet and protects your privacy! An all-in-one DNS blocklist in various versions (light, normal, pro, pro++ and ultimate) . It can be used as a standalone blocklist. For every region. Blocks ads, affiliate, tracking, metrics, telemetry, fake, phishing, malware, scam, coins and other "crap". Based on various blocklists . No, they are not just block lists cobbled together from different sources. They have been optimized and extended to efficiently "clean the Internet" in all areas. See also: Which sources are used for the lists and how are the lists compiled on the basis of these sources? Blocklist version and size overview: | Version | Entries | Pro++ | Pro | Normal | Light | Fake | TIF | Native | PopUp Ads | Bug Tracker |
|:--------|---:|:------:|:-----:|:----:|:----:|:---:|:------:|:----------:|:----:|:----:|
| :green_book: Light | 136656 66876 | | | | :green_circle: | | | :yellow_square: | | |
| :blue_book: Normal | 421516 139117 | | | :green_circle: | :green_circle: | | :yellow_square: | :yellow_square: | :yellow_square: | |
| :ledger: Pro | 485618 159996 | | :green_circle: | :green_circle: | :green_circle: | | :yellow_square: | :yellow_square: | :yellow_square: | :green_circle: |
| :orange_book: Pro++ | 586254 181993 | :green_circle: | :green_circle: | :green_circle: | :green_circle: | |:yellow_square: | :yellow_square: | :yellow_square: | :green_circle: |
| :closed_book: Ultimate | 627491 196408 | :green_circle: | :green_circle: | :green_circle: | :green_circle: | | :yellow_square: | :green_circle: | :yellow_square: | :green_circle: | :green_circle: contains the list named in the column caption
:yellow_square: partially contains the list named in the column caption Blocking level: | Version | Blocking level | Blocking type |
|:--------|:---------------|:--------------|
| :green_book: Light | :green_book::green_book: | Relaxed |
| :blue_book: Normal | :blue_book::blue_book::blue_book: | Relaxed/Balanced |
| :ledger: Pro | :ledger::ledger::ledger::ledger: | Balanced |
| :orange_book: Pro++ | :orange_book::orange_book::orange_book::orange_book::orange_book::orange_book: | Balanced/Aggressive |
| :closed_book: Ultimate | :closed_book::closed_book::closed_book::closed_book::closed_book::closed_book::closed_book: | Aggressive | [!TIP] For a recommendation, see: Which version of the lists should I use? :green_book: Multi LIGHT - Basic protection Hand brush - Cleans the Internet and protects your privacy! Blocks Ads, Tracking, Metrics and some Badware. [!NOTE] Does not block error trackers such as Bugsnag, Crashlytics, Firebase, Instabug, Sentry, ... and other app-specific crash trackers. These are only blocked from the Pro version onwards. Entries: 136656 domains/hosts - 66876 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile, Windows |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ |
| PAC | Link Mirror Mirror | Proxy Auto Configuration | :blue_book: Multi NORMAL - All-round protection Broom - Cleans the Internet and protects your privacy! Blocks Ads, Affiliate, Tracking, Metrics, Telemetry, Phishing, Malware, Scam, Fake, Coins and other "Crap". [!NOTE] Does not block error trackers such as Bugsnag, Crashlytics, Firebase, Instabug, Sentry, ... and other app-specific crash trackers. These are only blocked from the Pro version onwards. Entries: 421516 domains/hosts - 139117 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :ledger: Multi PRO - Extended protection (Recommended) Big broom - Cleans the Internet and protects your privacy! Blocks Ads, Affiliate, Tracking, Metrics, Telemetry, Phishing, Malware, Scam, Fake, Coins and other "Crap". Entries: 485618 domains/hosts - 159996 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :ledger: Multi PRO mini Size-optimised version for DNS/Browser adblockers. This list only contains domains from the Pro full that have been found on Top 1M lists (Umbrella, Cloudflare, Tranco, Chrome, ...) in the last 12 months. Entries: 80167 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :orange_book: Multi PRO++ - Maximum protection Sweeper - Aggressive cleans the Internet and protects your privacy! Blocks Ads, Affiliate, Tracking, Metrics, Telemetry, Phishing, Malware, Scam, Fake, Coins and other "Crap". [!NOTE] More aggressive version of the Multi PRO blocklist. It may contain a few false positive domains that limit functionality. Therefore it should only be used by experienced users. Furthermore, an admin should be available to unblock incorrectly blocked domains. Reported false positive domains will be removed from the list! Entries: 586254 domains/hosts - 181993 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :orange_book: Multi PRO++ mini Size-optimised version for DNS/Browser adblockers. This list only contains domains from the Pro++ full that have been found on Top 1M lists (Umbrella, Cloudflare, Tranco, Chrome, ...) in the last 12 months. Entries: 91695 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :closed_book: Multi ULTIMATE - Aggressive protection Ultimate Sweeper - Strictly cleans the Internet and protects your privacy! Blocks Ads, Affiliate, Tracking, Metrics, Telemetry, Phishing, Malware, Scam, Free Hoster, Fake, Coins and other "Crap". [!NOTE] Stricter version of the Multi PRO++ blocklist. It may contain false positive domains that limit functionality. Therefore it should only be used by experienced users. Furthermore, an admin should be available to unblock incorrectly blocked domains. Reported false positive domains will be removed from the list! [!WARNING] META trackers are blocked in Ultimate. This restricts the use of Facebook/Instagram and Facebook Messenger apps. To use Facebook/Instagram apps with Ultimate, unblock the following domains: META Tracker Entries: 627491 domains/hosts - 196408 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :closed_book: Multi ULTIMATE mini Size-optimised version for DNS/Browser adblockers. This list only contains domains from the Ultimate full that have been found on Top 1M lists (Umbrella, Cloudflare, Tranco, Chrome, ...) in the last 12 months. Entries: 102755 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: 24 hours (update frequency) :trollface: Fake - Protects against internet scams, traps & fakes! A blocklist for blocking fake stores, fake streaming sites, rip-offs, cost traps and the like. | | Light | Normal | Pro | Pro++ | Ultimate | TIF / TIF medium |
|:-----------:|:-----:|:---------------:|:--------------:|:--------------:|:--------------:|:--------------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: | :green_circle: yes :yellow_square: partially :x: no Entries: 31589 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :tada: Pop-Up Ads - Protects against annoying and malicious pop-up ads! A blocklist for blocking annoying and malicious pop-up ads. | | Light | Normal | Pro | Pro++ | Ultimate | TIF |
|:-----------:|:--------------:|:--------------:|:--------------:|:--------------:|:--------------:|:--------:|
| Included in | :x: | :yellow_square: | :yellow_square: | :yellow_square: | :yellow_square: | :yellow_square: | :green_circle: yes :yellow_square: partially :x: no [!NOTE] In the combination of the Pro or higher and additionally the Threat Intelligence Feeds (TIF), all domains from the Pop-Up Ads list are included. This means that if you use the Pro or higher and also the TIF full, you no longer need to add this list separately. Entries: 82532 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: 24 hours (update frequency) :closed_lock_with_key: Threat Intelligence Feeds - Increases security significantly! (Recommended) A blocklist for blocking malware, cryptojacking, scam, spam and phishing. Blocks domains known to spread malware, launch phishing attacks and host command-and-control servers. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| Included in | :x: | :yellow_square: | :yellow_square: | :yellow_square: | :yellow_square: | :green_circle: yes :yellow_square: partially :x: no Entries: 1278539 domains/hosts - 728405 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, ~~AdGuard~~ (too big!), AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :closed_lock_with_key: Threat Intelligence Feeds - Medium version A medium version of the Threat Intelligence Feeds list. Designed for Adblockers that have problems with the size of the full TIF list. Contains only important feeds. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| Included in | :x: | :yellow_square: | :yellow_square: | :yellow_square: | :yellow_square: | :green_circle: yes :yellow_square: partially :x: no Entries: 184171 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | :closed_lock_with_key: Threat Intelligence Feeds - IPs IPv4 lists in plain IP format for firewalls and AdGuard Home format are also available as an extension to the TIF list. [!TIP] If the IP list is used in AdGuard Home, all domains that would resolve to the blocked IP are blocked. To prevent the blocked domains from being resolved via IPv6, it is necessary to deactivate resolving via IPv6 in AdGuard Home: Settings > DNS settings > DNS server configuration > Disable resolving of IPv6 addresses Expires: 24 hours (update frequency) :new: Newly Registered Domains (NRDs) A blocklist for blocking domains registered in the last 14 or 30 days. These domains are known to be favoured by threat actors to launch malicious campaigns. [!IMPORTANT] This is an external list that is created and maintained by @xRuffKez . Please address requests directly to the maintainer in the corresponding repository . [!NOTE] It may contain a few false positive domains that limit functionality. Therefore it should only be used by experienced users. Furthermore, an admin should be available to unblock incorrectly blocked domains. | | Light | Normal | Pro | Pro++ | Ultimate | TIF |
|:-----------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:| :---: |
| Included in | :x: | :x: | :x: | :x: | :x: | :yellow_square: | :green_circle: yes :yellow_square: partially :x: no :new: Domains registered in the last 14 days | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link | Pi-hole, ~~AdGuard~~ (too big!), AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam |
| Wildcard Asterisk | Link | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro | :new: Domains registered in the last 30 days [!IMPORTANT] The 30-day list has been divided into two parts in order not to exceed the maximum size for files on Github. Both parts must be used. [!WARNING] The total size of the list can lead to problems in some Adblockers. If this is the case, use the 14-day version of the list. | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Part1 Part2 | Pi-hole, ~~AdGuard~~ (too big!), AdGuard Home (can lead to problems due to the size!), eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam |
| Wildcard Asterisk | Part1 Part2 | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Part1 Part2 | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro | Expires: 24 hours (update frequency) :outbox_tray: DoH/VPN/TOR/Proxy Bypass - Prevent methods to bypass your DNS! [!NOTE] To ensure the bootstrap is your DNS server, you must redirect or block standard DNS outbound (TCP/UDP 53) and block all DNS over TLS (TCP 853) outbound, as in the sketch below.
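A minimal sketch of such firewall rules, assuming a Linux router using iptables; the LAN interface name (`br-lan`) and the resolver address (`192.168.1.2`) are placeholders for your own setup:

```sh
# Redirect plain DNS (TCP/UDP 53) from LAN clients to the local resolver (placeholder address).
iptables -t nat -A PREROUTING -i br-lan -p udp --dport 53 -j DNAT --to-destination 192.168.1.2
iptables -t nat -A PREROUTING -i br-lan -p tcp --dport 53 -j DNAT --to-destination 192.168.1.2

# Block DNS over TLS (TCP 853) and DNS over QUIC (UDP 853) outbound.
iptables -A FORWARD -p tcp --dport 853 -j REJECT
iptables -A FORWARD -p udp --dport 853 -j REJECT
```

DoH (TCP 443) cannot be blocked by port alone, which is exactly what this blocklist is for.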
The block list exists in two versions: Complete Edition - Encrypted DNS Servers, VPN, TOR, Proxies | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 3141 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :outbox_tray: Encrypted DNS Servers only | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 1229 domains/hosts - 1072 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Domains Subdomains | Link Mirror Mirror | Blocky (older than v0.23), Diversion (older than v5), OpenSnitch, PersonalBlocklist, pfBlockerNG |
| Hosts | Link Mirror Mirror | AdAway, uMatrix, DNS66, GasMask, NetGuard, Hostfile |
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :outbox_tray: Encrypted DNS Servers IPs IPv4 lists in plain IP format for firewalls and AdGuard Home format are also available. [!TIP] If the IP list is used in AdGuard Home, all domains that would resolve to the blocked IP are blocked. To prevent the blocked domains from being resolved via IPv6, it is necessary to deactivate resolving via IPv6 in AdGuard Home: Settings > DNS settings > DNS server configuration > Disable resolving of IPv6 addresses Expires: Updated regularly
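The same switch can also be set directly in `AdGuardHome.yaml`. A minimal sketch, assuming the default install path and the `aaaa_disabled` key used by current AdGuard Home releases (verify both against your installation):

```sh
# Stop AdGuard Home before editing its configuration (placeholder install path).
systemctl stop AdGuardHome

# In the "dns:" section of AdGuardHome.yaml, disable AAAA (IPv6) resolution.
sed -i 's/aaaa_disabled: false/aaaa_disabled: true/' /opt/AdGuardHome/AdGuardHome.yaml

systemctl start AdGuardHome
```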
:mag: Safesearch not supported - Prevent the use of search engines that do not support Safesearch! A blocklist for blocking search engines that do not support Safesearch. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 228 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :lock_with_ink_pen: Dynamic DNS blocking - Protects against the malicious use of dynamic DNS services! A blocklist for blocking dynamic DNS services to protect against malicious use in phishing campaigns and others. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 1455 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :computer: Badware Hoster blocking - Protects against the malicious use of free host services! A blocklist for blocking known free hosters that also host badware via user content to prevent the use of these hosters for malicious purposes. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 1824 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :crystal_ball: Most Abused TLDs - Protects against known malicious Top Level Domains! A blocklist for blocking the most abused top-level domains, merged from @Yokoffing , @DandelionSprout , @LennyFox , Cloudflare Radar and SpamHaus. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| AdGuard | Link Mirror Mirror | AdGuard, AdGuard Home |
| uBlock | Link Mirror Mirror | uBlock, AdBlock Plus |
| AdBlock | Link Mirror Mirror | Pi-hole, AdBlock, TechnitiumDNS Contains only spam TLDs that do not have any exclusions. |
| AdBlock (Aggressive) Allowlist | Link Mirror Mirror Link Mirror Mirror | Pi-hole, AdBlock, TechnitiumDNS |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ Contains only spam TLDs that do not have any exclusions. |
| RPZ (Aggressive) | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ Contains all spam TLDs, corresponds to the AdGuard and uBlock version without exclusions. | Expires: Updated regularly
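For the RPZ variants offered in these tables, a minimal Unbound sketch; the zone name, URL and file paths are placeholders (take the real link from the RPZ rows above), and Unbound 1.10+ built with RPZ support is assumed:

```sh
# Append an RPZ clause to unbound.conf; the respip module is required for RPZ.
cat >> /etc/unbound/unbound.conf <<'EOF'
server:
    module-config: "respip validator iterator"

rpz:
    name: hagezi.rpz
    url: https://example.com/spam-tlds.rpz
    zonefile: /var/lib/unbound/hagezi.rpz
EOF

# Validate the configuration and reload.
unbound-checkconf && systemctl restart unbound
```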
:skull: Anti Piracy - Protects against piracy! Blocks websites and services that are mainly used for the illegal distribution of copyrighted content. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 9757 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam, Little Snitch Mini |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :slot_machine: Gambling - Protects against gambling content! Blocks gambling content. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :x: | :x: | :x: | :x: | :x: | :green_circle: yes :yellow_square: partially :x: no Entries: 308539 compressed domains | Format | Links | Should be used for |
|:-------|:-----|:----------------|
| Adblock | Link Mirror Mirror | Pi-hole, AdGuard, AdGuard Home, eBlocker, uBlock, AdBlock, AdBlock Plus, Opera, Vivaldi, Brave, AdNauseam |
| Unbound | Link Mirror Mirror | Unbound |
| DNSMasq v2.85- | Link Mirror Mirror | DNSMasq (v2.85 or older) |
| DNSMasq v2.86+ | Link Mirror Mirror | DNSMasq (v2.86 or newer), adblock-lean, Diversion (v5 or newer) |
| Wildcard Asterisk | Link Mirror Mirror | Blocky (v0.23 or newer), Nebulo, NetDuma, OPNsense, YogaDNS |
| Wildcard Domains | Link Mirror Mirror | DNSCloak, DNSCrypt, TechnitiumDNS, PersonalDNSfilter, InviZible Pro |
| RPZ | Link Mirror Mirror | Response Policy Zone, Bind, Knot, PowerDNS, Unbound RPZ | Expires: Updated regularly :calling: Native Tracker - Broadband trackers of devices, services and operating systems Blocks native broadband trackers from devices, services and operating systems that track your activity. | | Light | Normal | Pro | Pro++ | Ultimate |
|:-----------:|:-----:|:------:|:---:|:-----:|:--------:|
| Included in | :yellow_square: | :yellow_square: | :yellow_square: | :yellow_square: | :green_circle: | :green_circle: yes :yellow_square: partially :x: no | Device/Service | Domains | Hosts | Adblock | Unbound | DNSMasq v2.86+ | DNSMasq v2.85- | Wildcard Asterisk | Wildcard Domains | RPZ |
|:-------|:--------:|:------:|:--------:|:--------:|:--------:|:---------:|:--------:|:--------:|:--------:|
| Amazon (Devices, Shopping, Video) | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| Apple (iOS, macOS, tvOS) | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| Huawei (Devices) | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| Microsoft (Windows, Office, MSN) | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| TikTok (Fingerprinting) | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| TikTok (Fingerprinting) Aggressive | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| LG webOS | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| Vivo | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| OPPO/Realme | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror |
| Xiaomi | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Link Mirror Mirror | Expires: Updated regularly :bulb: Recommendation As a network-wide DNS blocker, I recommend using AdGuard Home , Pi-hole , TechnitiumDNS , Blocky (advanced users), adblock-lean (OpenWrt) or eBlocker . DNS blockers offer good privacy protection by blocking tracking, metrics and telemetry. They can be used to block the vast majority of ads, malware, scams, fakes and the like, but not everything can be blocked at the DNS level!
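As a deployment sketch for one of these blockers, wiring a DNSMasq v2.86+ format list into dnsmasq could look like the following; the list URL is illustrative only, so copy the real link from the tables above:

```sh
# Download a DNSMasq-format list into dnsmasq's default include directory (placeholder URL).
wget -O /etc/dnsmasq.d/hagezi-pro.conf 'https://example.com/dnsmasq/pro.txt'

# Reload dnsmasq and verify that a known tracking domain is now blocked.
systemctl restart dnsmasq
dig @127.0.0.1 doubleclick.net A +short   # expect NXDOMAIN or 0.0.0.0, depending on the list format
```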
Therefore, I additionally recommend the use of a browser content blocker such as AdGuard , uBlock or Ghostery . Check out Yokoffing's Recommended Filters for uBlock Origin for content blocker filter lists. For a browser recommendation see also Yokoffing's I need a browser with ad blocking. Which one should I choose? :information_desk_person: Which version of the lists should I use? Use Light if you have to pay attention to the size of the list because the AdBlocker does not support large lists, or light protection is sufficient for you. Use Normal if there is no admin nearby who can unblock something from time to time. E.g. for grandma and grandpa or the whole home or family network. Use Pro if an admin is available who could unblock something if necessary. My personal recommendation for almost problem-free adblocking. Use Pro++ if you are an experienced user, know what you are doing and privacy is important to you. This is an aggressive list and you may need to unblock things more often. Use Ultimate if Pro++ is not enough for you. [!IMPORTANT] Another recommendation is to combine the main lists with the Threat Intelligence Feeds list if possible. For Adblockers that have problems with the size of the full TIF list, there is a smaller medium version. If you use AdGuard Home or AdGuard DNS, I also recommend using the Dandelion Sprout's Anti-Malware List . There is also an IPv4 list that can be used in addition to the TIF or TIF medium list. [!TIP] NextDNS users: The Threat Intelligence Feeds list is not available in NextDNS, the security features should be used instead. Furthermore, I recommend that NextDNS users also use the OISD list, which also contains some TIF sources that are not covered by the security features. Further additional options to the main lists depending on the use case are: Security: In addition to the Threat Intelligence Feeds list, use the Dynamic DNS , Badware Hoster , Most Abused TLDs and Newly Registered Domains (NRDs) list to further protect yourself from malicious things. Protection of children: Use the Gambling , Anti Piracy , Safesearch , DoH/VPN/TOR/Proxy Bypass and oisd NSFW lists in addition to blocking gambling, piracy, no Safesearch engines, DNS bypassing, porn, shock and adult sites. :department_store: Online DNS Services If you don't run your own DNS server on your home network or if you are looking for additional protection for your mobile devices when they are not connected to the home network, then you can use one of the following DNS services: :department_store: AdGuardDNS - limited free/paid In AdGuardDNS you can use my Multi Normal, Pro, Pro++, Ultimate, TIF, Gambling, Anti Piracy, DoH/VPN/TOR/Proxy Bypass, DynDNS, Badware Hoster, Most Abused TLDs list and the Allowlist Referral. :department_store: ControlD - free/paid In ControlD you can use my Light, Normal, Pro, Pro++, Ultimate and TIF lists. Free: [!TIP] For Apple devices, you can use my pre-configured mobileconfigs or create your own mobileconfig under https://dns.notjakob.com/tool.html | Blocklists | DNS-over-HTTPS | DNS-over-TLS/QUIC | Legacy DNS |
|:-----------|:---------------|:-------------|:-------------|
| Light | https://freedns.controld.com/x-hagezi-light | x-hagezi-light.freedns.controld.com | 76.76.2.37 76.76.10.37 2606:1a40::37 2606:1a40:1::37 |
| Normal | https://freedns.controld.com/x-hagezi-normal | x-hagezi-normal.freedns.controld.com | 76.76.2.40 76.76.10.40 2606:1a40::40 2606:1a40:1::40 |
| Pro | https://freedns.controld.com/x-hagezi-pro | x-hagezi-pro.freedns.controld.com | 76.76.2.41 76.76.10.41 2606:1a40::41 2606:1a40:1::41 |
| Pro Plus | https://freedns.controld.com/x-hagezi-proplus | x-hagezi-proplus.freedns.controld.com | 76.76.2.42 76.76.10.42 2606:1a40::42 2606:1a40:1::42 |
| Ultimate | https://freedns.controld.com/x-hagezi-ultimate | x-hagezi-ultimate.freedns.controld.com | 76.76.2.45 76.76.10.45 2606:1a40::45 2606:1a40:1::45 |
| TIF | https://freedns.controld.com/x-hagezi-tif | x-hagezi-tif.freedns.controld.com | 76.76.2.46 76.76.10.46 2606:1a40::46 2606:1a40:1::46 |
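To check that one of these resolvers is reachable and actually filtering, a quick probe with `dig` against the Legacy DNS addresses from the table works; the assumption that blocked names are answered with a 0.0.0.0 sinkhole should be verified against ControlD's documentation:

```sh
# Query the ControlD "Pro" resolver directly (IP taken from the table above).
dig @76.76.2.41 example.com A +short        # unblocked domain: a real address is expected
dig @76.76.2.41 doubleclick.net A +short    # tracking domain: a sinkhole answer is expected
```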
Paid: Check out Yokoffing's ControlD Config Guide for recommended ControlD configuration settings. :department_store: NextDNS - limited free/paid In NextDNS you can use my Light, Normal, Pro, Pro++ and Ultimate lists. Check out Yokoffing's NextDNS Config Guide and the Techlore Video The ULTIMATE Guide to Mastering NextDNS! for recommended NextDNS configuration settings. :department_store: RethinkDNS - free In RethinkDNS you can use my Light, Normal, Pro, Pro++, Ultimate, TIF, DynDNS and Badware Hoster lists. [!NOTE] The lists in RethinkDNS are only updated once a week. | Blocklists | DNS-over-HTTPS | DNS-over-TLS/QUIC |
|:-----------|:---------------|:-------------|
| Light + TIF | https://sky.rethinkdns.com/1:AAkACAQA | 1-aaeqacaeaa.max.rethinkdns.com |
| Normal + TIF | https://sky.rethinkdns.com/1:AAkACAgA | 1-aaeqacaiaa.max.rethinkdns.com |
| Pro + TIF | https://sky.rethinkdns.com/1:AAoACBAA | 1-aafaacaqaa.max.rethinkdns.com |
| Pro plus + TIF | https://sky.rethinkdns.com/1:AAoACAgA | 1-aafaacaiaa.max.rethinkdns.com |
| Ultimate + TIF | https://sky.rethinkdns.com/1:gAgACABA | 1-qaeaacaaia.max.rethinkdns.com | :department_store: DNSwarden - free In DNSwarden you can use my Light, Normal, Pro, Pro++, Ultimate and TIF lists. | Blocklists | DNS-over-HTTPS | DNS-over-TLS/QUIC |
|:-----------|:---------------|:------------------|
| Light + TIF | https://dns.dnswarden.com/00000000000000000000048 | 00000000000000000000048.dns.dnswarden.com |
| Normal + TIF | https://dns.dnswarden.com/00000000000000000000028 | 00000000000000000000028.dns.dnswarden.com |
| Pro + TIF | https://dns.dnswarden.com/00000000000000000000018 | 00000000000000000000018.dns.dnswarden.com |
| Pro plus + TIF | https://dns.dnswarden.com/0000000000000000000000o | 0000000000000000000000o.dns.dnswarden.com |
| Ultimate + TIF | https://dns.dnswarden.com/0000000000000000000000804 | 0000000000000000000000804.dns.dnswarden.com | :department_store: DNSforge (Germany) - free DNSforge uses my Multi Light blocklist in addition to other blocklists. :department_store: OpenBLD.net - free OpenBLD.net uses my Multi Pro blocklist in addition to other blocklists. :department_store: RobinGroppe.de (Germany) - free RobinGroppe.de DNS offers a free German DNS server without logging to block malware, phishing and other threats. It uses my TIF list. :loudspeaker: About "If the plan doesn't work, change the plan but never the goal." There's no place like 127.0.0.1! The blocklists are based on various sources and my own denylists/extensions. They were designed to avoid false positive domains as much as possible without losing effectiveness and efficiency. Dead hosts are regularly removed from the lists to keep them as small as possible.
Made with :heartbeat: for a safer and cleaner internet. All lists were tested against 10000 websites from the Cisco Umbrella Top 1 million list. It was checked whether the pages load, the page content is displayed correctly, navigation links work, images load, videos start and much more. They are updated and maintained daily. No, they are not just block lists cobbled together from different sources. They have been optimized and extended to efficiently "clean the Internet" in all areas. See also: Which sources are used for the lists and how are the lists compiled on the basis of these sources? The results of a test against the 10000 whotracks.me pages. All pages were opened and fully loaded via batch in Edge with privacy features turned off. Cookies were all accepted. | List | Total queries | Blocked queries | % blocked | % gap to light |
|-------------:|--------------:|----------------:|----------:|---------------:|
| Ultimate | 299646 | 131093 | 43.75 | 12.85 |
| Pro++ | 299646 | 119681 | 39.94 | 9.05 |
| Pro | 299646 | 97508 | 32.54 | 1.65 |
| Normal | 299646 | 93258 | 31.12 | 0.23 |
| Light | 299646 | 92576 | 30.90 | |
| ---- | 299646 | 67888 | 22.66 | -8.24 | Test them, give feedback and report blockable or incorrectly blocked domains. :email: Contact | Mail |
|:----:|
| hagezi@protonmail.com | :family: Groups | Telegram | Discord |
|:---------:|:------:|
| Link | CipherOps' Pi-hole & AdGuard Home | :octocat: Repository The repository is occasionally compressed (reinitialised) to reduce the overall size. Among other things, this invalidates forks and cleans up the commit history. :cyclone: Referral Domains Affiliate and tracking links (referral domains) that appear frequently on offer web pages like Slickdeals, in emails or in search results are allowed in my lists. These are usually only requested after a link is clicked manually and are not used to display advertising.
If they were blocked, the first links in search results, for example, would no longer work. Furthermore, some of these domains are also used to unsubscribe from newsletters. See also: Why are referral domains (affiliate and tracking links) not blocked in the lists? :dizzy: Support Me If you like the project and you can benefit from it, leave a :star: (top right) and become a stargazer ! Give feedback, show me your ideas, report domains to be blocked, report false positive domains and help to keep the internet safe and clean. Help and cooperation of any kind are welcome! Thanks for your support! :stars: Stargazers Keep the internet clean! | DNS-Blocklists: For a better internet - keep the internet clean! | dns,ads,blacklist,blocklist,coins,domains,fake,filterlist,hosts,malware | 2 | 11 | 205 | 200 | 1 | 1 | 7 |
taikoxyz/taiko-mono | Taiko A based rollup. [![Twitter Follow](https://img.shields.io/twitter/follow/taikoxyz?style=social)](https://twitter.com/taikoxyz)
[![Discord](https://img.shields.io/discord/984015101017346058?color=%235865F2&label=Discord&logo=discord&logoColor=%23fff)](https://discord.gg/taikoxyz)
[![YouTube](https://img.shields.io/youtube/channel/subscribers/UCxd_ARE9LtAEdnRQA6g1TaQ)](https://www.youtube.com/@taikoxyz)
[![GitPOAP Badge](https://public-api.gitpoap.io/v1/repo/taikoxyz/taiko-mono/badge)](https://www.gitpoap.io/gh/taikoxyz/taiko-mono)
[![License](https://img.shields.io/github/license/taikoxyz/taiko-mono)](https://github.com/taikoxyz/taiko-mono/blob/main/LICENSE.md) Documentation End user documentation can be found at docs.taiko.xyz . Protocol specs can be found here . Each package of the monorepo is well documented and includes a README. Project structure taiko-mono/
├── CHANGELOG.md ├── CONTRIBUTING.md ├── LICENSE.md ├── README.md ├── packages │ ├── branding : Taiko branding materials.
│ ├── bridge-ui : Bridge UI.
│ ├── docs-site : End user documentation site.
│ ├── eventindexer : Event indexer.
│ ├── fork-diff : Fork diff page.
│ ├── guardian-prover-health-check : Guardian prover health check service.
│ ├── guardian-prover-health-check-ui : Guardian prover health check UI.
│ ├── protocol : Taiko protocol smart contracts.
│ ├── relayer : Bridge backend relayer.
│ ├── supplementary-contracts : Supplementary smart contracts that are not part of the Taiko rollup protocol.
│ ├── taiko-client : Taiko client implementation in Go.
│ ├── nfts : Taiko NFTs.
│ └── taikoon-ui : Taikoon NFT UI.
... Issues If you find a bug or have a feature request, please open an issue . Contributing Check out CONTRIBUTING.md for details on how to contribute. You can also check out our grants cycle at grants.taiko.xyz . Getting support Reach out to the community on Discord if you need any help! | A based rollup. 🥁 | based,ethereum,rollup,taiko,layer2,zk | 182 | 203 | 4,008 | 2,267 | 12 | 215 | 27 |
facebookincubator/AITemplate | AITemplate | | AITemplate (AIT) is a Python framework that transforms deep neural networks into CUDA (NVIDIA GPU) / HIP (AMD GPU) C++ code for lightning-fast inference serving. AITemplate highlights include: High performance: close to roofline fp16 TensorCore (NVIDIA GPU) / MatrixCore (AMD GPU) performance on major models, including ResNet, MaskRCNN, BERT, VisionTransformer, Stable Diffusion, etc. Unified, open, and flexible. Seamless fp16 deep neural network models for NVIDIA GPU or AMD GPU. Fully open source, Lego-style easily extendable high-performance primitives for new model support. Supports a significantly more comprehensive range of fusions than existing solutions for both GPU platforms. More about AITemplate Excellent Backward Capability AITemplate doesn't depend on third-party libraries or runtimes, such as cuBLAS, cuDNN, rocBLAS, MIOpen, TensorRT, MIGraphX, etc. Each model is compiled into a self-contained portable binary, which can be used on any software environment with the same hardware. Horizontal Fusion AITemplate provides unique advanced horizontal fusion. AITemplate can fuse parallel GEMM, LayerNorm, and other operators with different input shapes into a single GPU kernel. Vertical Fusion AITemplate provides strong vertical fusion. AITemplate can fuse a large range of operations into TensorCore/MatrixCore operations, such as elementwise operations, reductions, and layout permutations. AITemplate also provides back-to-back style TensorCore / MatrixCore operation fusion. Memory Fusion AITemplate provides innovative memory fusions. AITemplate can fuse GEMM, LayerNorm, and other operators, followed by memory operations such as concatenation, split, and slice into a single operator. Working w/wo PyTorch The AITemplate-generated Python runtime can take PyTorch tensors as inputs and outputs without an extra copy. For environments without PyTorch, the AITemplate Python/C++ runtime is self-contained. Extensions without suffering AITemplate provides a straightforward approach for making an extension in codegen. To add a new operator or a new fused kernel into AITemplate, most of the time one only needs to add two Python files: one for a graph node definition and another for the backend codegen. The CUDA/HIP kernel in a text header file can be directly utilized in the codegen. FX2AIT FX2AIT is a Python-based tool that converts PyTorch models into AITemplate (AIT) engine for lightning-fast inference serving. Using FX2AIT's built-in AITLowerer, partial AIT acceleration can be achieved for models with unsupported operators in AITemplate. Key features of FX2AIT include: Easy Conversion: FX2AIT requires only a PyTorch model and input for conversion, generating an "AITModule" output for inference serving. Expanded Support: AITemplate does not support all PyTorch operators. FX2AIT's AITLowerer offers a solution for partial AIT conversion for models with unsupported operators. Check the fx2ait/fx2ait/example/03_lowering_split for more information. More info can be found from https://github.com/facebookincubator/AITemplate/tree/main/fx2ait. Installation Hardware requirements: NVIDIA : AIT is only tested on SM80+ GPUs (Ampere etc). Not all kernels work with old SM75/SM70 (T4/V100) GPUs. AMD : AIT is only tested on CDNA2 (MI-210/250) GPUs. There may be compiler issues for old CDNA1 (MI-100) GPUs. 
Clone the code When cloning the code, please use the following command to also clone the submodules: git clone --recursive https://github.com/facebookincubator/AITemplate Docker Image We highly recommend using AITemplate with Docker to avoid accidentally using a wrong version of NVCC or HIPCC. CUDA: ./docker/build.sh cuda ROCM: DOCKER_BUILDKIT=1 ./docker/build.sh rocm This will build a docker image with tag ait:latest . From Source The following command will create a Python wheel for AITemplate. Please ensure you have the correct CUDA/ROCm compiler installed. CUDA: CUDA 11.6 ROCm: We tested on ROCm 5.2.3 with a customized build of HIPCC with the command in docker/Dockerfile.rocm#L87-L96 An incorrect compiler will lead to performance regression. Please check that all submodules are cloned correctly before going to the next step. cd python
python setup.py bdist_wheel
pip install dist/*.whl --force-reinstall Getting Started Check out the AITemplate Documentation for API reference. There are a few tutorials for onboarding: 01: How to inference a PyTorch model with AIT 02: How to add an op to AIT codegen 03: How to visualize AIT's optimization Examples & Performance AITemplate provides the following model templates & reference performance data on A100/MI-250: 01_ResNet-50 with PyTorch Image Models (TIMM) 02_MaskRCNN-FPN with Detectron2 03_BERT with Hugging Face Transformer 04_Vision Transformer with PyTorch Image Models (TIMM) 05_Stable Diffusion with Hugging Face Diffusers Release All current development updates can be seen in the AITemplate repository. Releases are not on a set schedule and will only be tagged for significant feature releases. Mid-term plan: Better dynamic shape support: Focus on the dynamic sequence in Transformers. Add symbolic shape support. More automatic graph passes: Relief manual rewrite models to obtain the best performance. Quantization: fp8/int8/int4. Sparsity pruning for Gemm. PT2 integration: Aten2AIT is under active development. Long-term plan: Automatic ONNX, Open-XLA and other format model conversion. Composable Kernel CPU extension on AVX2/AVX-512 for AMD Epyc CPU. Contributing Check our contributing guide to learn about how to contribute to the project. The Team AITemplate is currently maintained by Meta engineers: Ying Zhang , Yang Chen , Terry Chen , Mu-Chu Lee , Max Podkorytov , Adnan Akhundov . AITemplate is co-created by Meta engineers: Bing Xu , Ying Zhang , Hao Lu , Yang Chen , and Terry Chen , with major contributions coming from other talented engineers. A non-exhaustive list to mention is Mike Iovine, Mu-Chu Lee, Scott Wolchok, Oleg Khabinov, Shirong Wu, Huamin Li, Hui Guo, Zhijing Li, Max Podkorytov. We also want to thank Andrew Tulloch, Yinghai Lu, Lu Fang for the valuable discussions. FX2AIT and Aten2AIT are co-created and maintained by Meta engineers: Wei Wei , Shirong Wu and Zhijing Li . Acknowledgements AITemplate team works closely with NVIDIA CUTLASS Team (led by Andrew Kerr, Haicheng Wu) and AMD Composable Kernel Team (led by Chao Liu, Jing Zhang). We co-designed many advanced GPU optimizations specialized for each platform, and nothing is possible without our close collaboration. License AITemplate is licensed under the Apache 2.0 License . | AITemplate is a Python framework which renders neural network into high performance CUDA/HIP C++ code. Specialized for FP16 TensorCore (NVIDIA GPU) and MatrixCore (AMD GPU) inference. | [] | 2 | 95 | 767 | 700 | 105 | 4 | 4 |
dair-ai/Mathematics-for-ML | Mathematics for Machine Learning A collection of resources to learn and review mathematics for machine learning. :book: Books Algebra, Topology, Differential Calculus, and Optimization Theory For Computer Science and Machine Learning by Jean Gallier and Jocelyn Quaintance Includes mathematical concepts for machine learning and computer science. Book: https://www.cis.upenn.edu/~jean/math-deep.pdf Applied Math and Machine Learning Basics by Ian Goodfellow and Yoshua Bengio and Aaron Courville This includes the math basics for deep learning from the Deep Learning book. Chapter: https://www.deeplearningbook.org/contents/part_basics.html Mathematics for Machine Learning by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong This is probably the place you want to start. Start slowly and work on some examples. Pay close attention to the notation and get comfortable with it. Book: https://mml-book.github.io Probabilistic Machine Learning: An Introduction by Kevin Patrick Murphy This book contains a comprehensive overview of classical machine learning methods and the principles explaining them. Book: https://probml.github.io/pml-book/book1.html Mathematics for Deep Learning by Brent Werness, Rachel Hu et al. This reference contains some mathematical concepts to help build a better understanding of deep learning. Chapter: https://d2l.ai/chapter_appendix-mathematics-for-deep-learning/index.html The Mathematical Engineering of Deep Learning by Benoit Liquet, Sarat Moka and Yoni Nazarathy This book provides a complete and concise overview of the mathematical engineering of deep learning. In addition to overviewing deep learning foundations, the treatment includes convolutional neural networks, recurrent neural networks, transformers, generative adversarial networks, reinforcement learning, and multiple tricks of the trade. The focus is on the basic mathematical description of deep learning models, algorithms and methods. Book: https://deeplearningmath.org Bayes Rules! An Introduction to Applied Bayesian Modeling by Alicia A. Johnson, Miles Q. Ott, Mine Dogucu Great online book covering Bayesian approaches. Book: https://www.bayesrulesbook.com/index.html 📄 Papers The Matrix Calculus You Need For Deep Learning by Terence Parr & Jeremy Howard In deep learning, you need to understand a bunch of fundamental matrix operations. If you want to dive deep into the math of matrix calculus this is your guide. Paper: https://arxiv.org/abs/1802.01528 The Mathematics of AI by Gitta Kutyniok An article summarising the importance of mathematics in deep learning research and how it’s helping to advance the field. Paper: https://arxiv.org/pdf/2203.08890.pdf 🎥 Video Lectures Multivariate Calculus by Imperial College London by Dr. Sam Cooper & Dr. David Dye Backpropagation is a key algorithm for training deep neural nets that rely on Calculus. Get familiar with concepts like chain rule, Jacobian, gradient descent. Video Playlist: https://www.youtube.com/playlist?list=PLiiljHvN6z193BBzS0Ln8NnqQmzimTW23 Mathematics for Machine Learning - Linear Algebra by Dr. Sam Cooper & Dr. David Dye A great companion to the previous video lectures. Neural networks perform transformations on data and you need linear algebra to get better intuitions of how that is done. Video Playlist: https://www.youtube.com/playlist?list=PLiiljHvN6z1_o1ztXTKWPrShrMrBLo5P3 CS229: Machine Learning by Anand Avati Lectures containing mathematical explanations to many concepts in machine learning. 
Course: https://www.youtube.com/playlist?list=PLoROMvodv4rNH7qL6-efu_q2_bPuy0adh 🧮 Math Basics The Elements of Statistical Learning by Jerome H. Friedman, Robert Tibshirani, and Trevor Hastie Machine learning deals with data and in turn uncertainty which is what statistics aims to teach. Get comfortable with topics like estimators, statistical significance, etc. Book: https://hastie.su.domains/ElemStatLearn/ If you are interested in an introduction to statistical learning, then you might want to check out "An Introduction to Statistical Learning" . Probability Theory: The Logic of Science by E. T. Jaynes In machine learning, we are interested in building probabilistic models and thus you will come across concepts from probability theory like conditional probability and different probability distributions. Source: https://bayes.wustl.edu/etj/prob/book.pdf Information Theory, Inference and Learning Algorithms by David J. C. MacKay When you are applying machine learning you are dealing with information processing which in essence relies on ideas from information theory such as entropy and KL Divergence,... Book: https://www.inference.org.uk/itprnn/book.html Statistics and probability by Khan Academy A complete overview of statistics and probability required for machine learning. Course: https://www.khanacademy.org/math/statistics-probability Linear Algebra Done Right by Sheldon Axler Slides and video lectures on the popular linear algebra book Linear Algebra Done Right. Lecture and Slides: https://linear.axler.net/LADRvideos.html Linear Algebra by Khan Academy Vectors, matrices, operations on them, dot & cross product, matrix multiplication etc. is essential for the most basic understanding of ML maths. Course: https://www.khanacademy.org/math/linear-algebra Calculus by Khan Academy Precalculus, Differential Calculus, Integral Calculus, Multivariate Calculus Course: https://www.khanacademy.org/math/calculus-home This collection is far from exhaustive but it should provide a good foundation to start learning some of the mathematical concepts used in machine learning. Reach out on Twitter if you have any questions. | 🧮 A collection of resources to learn mathematics for machine learning | ai,deep-learning,machine-learning,mathematics,ml | 0 | 4 | 5 | 22 | 10 | 1 | 0 |
hxu296/leetcode-company-wise-problems-2022 | Leetcode Company-wise Problem Lists Curated lists of Leetcode questions grouped by companies, updated as of May 2022. Shout out to fishercoder1534 for the awesome Leetcode repo for solutions. Company Index APT Portfolio Accenture Activision Adobe Affirm Airbnb Akamai Akuna Capital Alation Alibaba AllinCall Amazon American Express Apple Arcesium Arista Networks Asana Athenahealth Atlassian Baidu Barclays BlackRock Bloomberg Bolt Booking Box ByteDance C3 IoT Canonical Capital One Cashfree Cisco Citadel Citrix Cohesity Commvault Coursera Cruise Automation DE Shaw DJI DRW Databricks Dataminr Dell Deutsche Bank Directi Docusign DoorDash Drawbridge Dropbox Druva Dunzo Duolingo Epic Systems Expedia FPT Facebook FactSet Flipkart Gilt Groupe GoDaddy Goldman Sachs Google Grab HBO HRT Honeywell Hotstar Huawei Hulu IBM IIT Bombay IMC IXL Indeed Info Edge Infosys Intel Intuit JPMorgan Jane Street Jeavio Karat Leap Motion LinkedIn LiveRamp Lyft MAQ Software MakeMyTrip Mathworks Mercari Microsoft MindTickle MindTree Moengage Morgan Stanley National Instruments Netflix Netsuite Nuro Nutanix Nvidia OT Opendoor Optum Oracle Palantir Technologies PayTM Paypal PhonePe Pinterest Pocket Gems Postmates Pure Storage Qualcomm Qualtrics Quora Rakuten Reddit Redfin Riot Games Robinhood Roblox Rubrik Rupeek SAP Salesforce Samsung Sapient ServiceNow Shopee Snapchat Softwire Sony Splunk Spotify Sprinklr Square Sumologic Swiggy T System TIAA Tencent Tesla Thumbtack Tiger Analytics Toptal TripleByte TuSimple Twilio Twitch Twitter Two Sigma Uber United Health Group VMware Valve Virtu Financial Visa Walmart Global Tech Wayfair Wealthfront Wish Works Applications Yahoo Yandex Yelp ZScaler Zenefits Zillow Zoho Zomato Zoom Zopsmart eBay edabit instacart payu peak6 persistent systems razorpay tcs tiktok zeta suite APT Portfolio [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Stone Game VI | Medium | Solution | Accenture [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Count Odd Numbers in an Interval Range | Easy | Solution |
| 2 | Merge Two Sorted Lists | Easy | Solution |
| 2 | Palindrome Number | Easy | Java , C++ |
| 1 | Find Subsequence of Length K With the Largest Sum | Easy | Java | Activision [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Minimum ASCII Delete Sum for Two Strings | Medium | Solution | Adobe [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 52 | Two Sum | Easy | Java , C++ , Javascript |
| 19 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 16 | Roman to Integer | Easy | Solution |
| 13 | Reverse Integer | Easy | Solution , C++ |
| 12 | Maximum Subarray | Easy | Solution |
| 12 | Contains Duplicate | Easy | Solution |
| 12 | Longest Palindromic Substring | Medium | Solution |
| 11 | Merge Two Sorted Lists | Easy | Solution |
| 11 | Longest Common Prefix | Easy | Solution |
| 10 | Add Two Numbers | Medium | Solution |
| 9 | Container With Most Water | Medium | Solution |
| 9 | 3Sum | Medium | Solution , C++ |
| 8 | Valid Parentheses | Easy | Solution |
| 8 | Merge Intervals | Medium | Solution |
| 8 | First Missing Positive | Hard | Solution |
| 8 | Best Time to Buy and Sell Stock | Easy | Solution |
| 7 | Search Insert Position | Easy | Solution |
| 7 | Search in Rotated Sorted Array | Medium | Solution |
| 7 | Trapping Rain Water | Hard | Solution |
| 7 | Product of Array Except Self | Medium | Solution | Affirm [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 10 | Insert Delete GetRandom O(1) | Medium | Solution |
| 6 | Design Hit Counter | Medium | Solution |
| 4 | Group Anagrams | Medium | Solution |
| 3 | Insert Delete GetRandom O(1) - Duplicates allowed | Hard | Solution |
| 3 | Valid Anagram | Easy | Solution |
| 2 | Optimal Account Balancing | Hard | Solution | Airbnb [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 14 | Palindrome Pairs | Hard | Solution |
| 14 | Alien Dictionary | nan | Solution |
| 9 | Flatten 2D Vector | Medium | Solution |
| 7 | Smallest Common Region | Medium | Solution |
| 6 | Pour Water | Medium | Solution |
| 6 | Minimum Window Substring | Hard | Solution |
| 5 | Combination Sum | Medium | Solution |
| 4 | Flatten Nested List Iterator | Medium | Solution |
| 3 | Design Circular Queue | Medium | Solution |
| 3 | Fraction to Recurring Decimal | Medium | Solution |
| 2 | Find the Smallest Divisor Given a Threshold | nan | Solution |
| 2 | Tag Validator | Hard | Solution |
| 2 | Intersection of Two Linked Lists | Easy | Solution |
| 1 | Pyramid Transition Matrix | Medium | Solution |
| 1 | Mini Parser | Medium | Solution | Akamai [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | N-Repeated Element in Size 2N Array | Easy | Solution | Akuna Capital [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 8 | Increasing Decreasing String | Easy | Solution | Alation [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | H-Index | Medium | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Group Anagrams | Medium | Solution | Alibaba [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Path Sum IV | Medium | Solution |
| 1 | Split Concatenated Strings | Medium | Solution | AllinCall [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Number Of Rectangles That Can Form The Largest Square | Easy | Solution | Amazon [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 117 | Two Sum | Easy | Java , C++ , Javascript |
| 117 | LRU Cache | Hard | Solution |
| 103 | Number of Islands | Medium | Solution |
| 87 | Merge Intervals | Medium | Solution |
| 68 | Search Suggestions System | Medium | Solution |
| 56 | Best Time to Buy and Sell Stock | Easy | Solution |
| 51 | Group Anagrams | Medium | Solution |
| 50 | Analyze User Website Visit Pattern | Medium | Solution |
| 49 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 49 | K Closest Points to Origin | Easy | Solution |
| 48 | Meeting Rooms II | Medium | Solution |
| 47 | Merge k Sorted Lists | Hard | Solution |
| 45 | Trapping Rain Water | Hard | Solution |
| 44 | 3Sum | Medium | Solution , C++ |
| 43 | Valid Parentheses | Easy | Solution |
| 41 | Word Ladder | Hard | Solution |
| 40 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 36 | Add Two Numbers | Medium | Solution |
| 36 | Word Search | Medium | Solution |
| 35 | Maximum Subarray | Easy | Solution | American Express [back to top] | Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Reorder Routes to Make All Paths Lead to the City Zero | Medium | Solution |
| 3 | Valid Palindrome | Easy | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Reducing Dishes | Hard | Solution |
| 2 | 3Sum | Medium | Solution , C++ |

#### Apple
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 40 | Two Sum | Easy | Java , C++ , Javascript |
| 20 | LRU Cache | Hard | Solution |
| 19 | Add Two Numbers | Medium | Solution |
| 18 | Merge Intervals | Medium | Solution |
| 17 | Maximum Subarray | Easy | Solution |
| 16 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 15 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 14 | Longest Common Prefix | Easy | Solution |
| 13 | Roman to Integer | Easy | Solution |
| 11 | Spiral Matrix | Medium | Solution |
| 11 | Group Anagrams | Medium | Solution |
| 11 | Number of Islands | Medium | Solution |
| 10 | Word Break | Medium | Solution |
| 10 | Product of Array Except Self | Medium | Solution |
| 10 | 3Sum | Medium | Solution , C++ |
| 9 | Generate Parentheses | Medium | Solution |
| 8 | Rotate Image | Medium | Solution |
| 8 | Move Zeroes | Easy | Solution |
| 8 | Best Time to Buy and Sell Stock | Easy | Solution |
| 8 | Merge k Sorted Lists | Hard | Solution |

#### Arcesium
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Maximum Length of Subarray With Positive Product | Medium | Solution |
| 2 | Broken Calculator | Medium | Solution |
| 2 | Minimum Size Subarray Sum | Medium | Solution |

#### Arista Networks
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | Valid Parentheses | Easy | Solution |
| 2 | Compare Version Numbers | Easy | Solution |
| 2 | Restore IP Addresses | Medium | Solution |
| 1 | Construct String With Repeat Limit | Medium | Java |

#### Asana
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 13 | Product of Array Except Self | Medium | Solution |
| 5 | K Closest Points to Origin | Easy | Solution |

#### Athenahealth
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 4 | Degree of an Array | Easy | Solution |

#### Atlassian
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 18 | Rank Teams by Votes | Medium | Solution |
| 10 | Logger Rate Limiter | Easy | Solution |
| 4 | Lemonade Change | Easy | Solution |
| 3 | Design Snake Game | Medium | Solution |
| 2 | Greatest Common Divisor of Strings | Easy | Solution |
| 2 | Single Number | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |

#### Baidu
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Out of Boundary Paths | Hard | Solution |
| 1 | Zuma Game | Hard | Solution |
| 1 | Arithmetic Slices II - Subsequence | Hard | Solution |

#### Barclays
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Valid Parentheses | Easy | Solution |

#### BlackRock
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 6 | Group Anagrams | Medium | Solution |
| 2 | Evaluate Division | Medium | Solution |

#### Bloomberg
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 64 | Design Underground System | Medium | Solution |
| 33 | Number of Islands | Medium | Solution |
| 33 | Decode String | Medium | Solution |
| 28 | Flatten a Multilevel Doubly Linked List | Medium | Solution |
| 26 | Remove All Adjacent Duplicates in String II | Medium | Solution |
| 23 | Insert Delete GetRandom O(1) | Medium | Solution |
| 23 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 20 | Merge Intervals | Medium | Solution |
| 20 | Meeting Rooms II | Medium | Solution |
| 20 | Design an Ordered Stream | Easy | Solution |
| 19 | LRU Cache | Hard | Solution |
| 16 | Two Sum | Easy | Java , C++ , Javascript |
| 16 | Add Two Numbers | Medium | Solution |
| 15 | Trapping Rain Water | Hard | Solution |
| 14 | Validate Binary Search Tree | Medium | Solution |
| 14 | Two City Scheduling | Easy | Solution |
| 14 | Valid Parentheses | Easy | Solution |
| 14 | Word Search | Medium | Solution |
| 13 | First Unique Character in a String | Easy | Solution |
| 12 | Best Time to Buy and Sell Stock | Easy | Solution |

#### Bolt
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 5 | Check if All Characters Have Equal Number of Occurrences | Easy | Solution |
| 3 | Reverse Words in a String III | Easy | Solution |
| 3 | Word Pattern | Easy | Solution |
| 2 | Subarray Sum Equals K | Medium | Solution |
| 2 | Word Pattern II | Hard | Solution |

#### Booking
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Least Number of Unique Integers after K Removals | Medium | Solution |
| 3 | Backspace String Compare | Easy | Solution |
| 3 | Additive Number | Medium | Solution |
| 2 | Integer to English Words | Hard | Solution |
| 2 | Sliding Window Maximum | Hard | Solution |
| 2 | Valid Parentheses | Easy | Solution |
| 1 | Two Out of Three | Easy | Java |

#### Box
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 12 | Number of 1 Bits | Easy | Solution |
| 4 | Word Ladder II | Hard | Solution |

#### ByteDance
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 8 | Merge k Sorted Lists | Hard | Solution |
| 6 | Number of Islands | Medium | Solution |
| 6 | Search in Rotated Sorted Array | Medium | Solution |
| 6 | Binary Tree Maximum Path Sum | Hard | Solution |
| 5 | LRU Cache | Hard | Solution |
| 4 | The Maze | Medium | Solution |
| 4 | Basic Calculator II | Medium | Solution |
| 4 | Sliding Window Maximum | Hard | Solution |
| 4 | The Number of Weak Characters in the Game | Medium | Solution |
| 3 | Best Time to Buy and Sell Stock II | Easy | Solution |
| 3 | Course Schedule II | Medium | Solution |
| 3 | Longest Valid Parentheses | Hard | Solution |
| 3 | Combination Sum | Medium | Solution |
| 3 | N-Queens | Hard | Solution |
| 3 | Maximum Subarray | Easy | Solution |
| 3 | Best Time to Buy and Sell Stock | Easy | Solution |
| 3 | Sort List | Medium | Solution |
| 3 | Closest Dessert Cost | Medium | Solution |
| 3 | 3Sum | Medium | Solution , C++ |
| 3 | Basic Calculator | Hard | Solution |

#### C3 IoT
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Pairs of Songs With Total Durations Divisible by 60 | Easy | Solution |
| 3 | Daily Temperatures | Medium | Solution |
| 3 | Generate Parentheses | Medium | Solution |
| 2 | Sort Array by Increasing Frequency | Easy | Solution |

#### Canonical
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Divide a String Into Groups of Size k | Easy | Java |

#### Capital One
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 18 | License Key Formatting | Easy | Solution |
| 14 | Count Primes | Easy | Solution |
| 13 | Reverse Nodes in k-Group | Hard | Solution |
| 4 | Best Time to Buy and Sell Stock | Easy | Solution |
| 3 | Candy Crush | Medium | Solution |
| 3 | Integer to Roman | Medium | Solution |
| 2 | Rotating the Box | Medium | Solution |
| 2 | Restore the Array From Adjacent Pairs | Medium | Solution |
| 2 | Add Two Numbers | Medium | Solution |
| 1 | Four Divisors | Medium | Solution |

#### Cashfree
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Maximum Erasure Value | Medium | Solution |

#### Cisco
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------|
| 16 | Word Search II | Hard | Solution |
| 16 | Rotate Image | Medium | Solution |
| 11 | Maximum Subarray | Easy | Solution |
| 11 | Expressive Words | Medium | Solution |
| 9 | Decode Ways | Medium | Solution |
| 8 | Validate IP Address | Medium | Solution |
| 8 | Decode String | Medium | Solution |
| 6 | House Robber | Easy | Solution |
| 6 | Maximum Difference Between Increasing Elements | Easy | Java |
| 5 | Valid Parentheses | Easy | Solution |
| 4 | Number of 1 Bits | Easy | Solution |
| 3 | Top K Frequent Elements | Medium | Solution |
| 3 | Beautiful Arrangement | Medium | Solution |
| 3 | Unique Paths II | Medium | Solution |
| 2 | Coin Change 2 | Medium | Solution |
| 2 | First Bad Version | Easy | Solution |
| 2 | Maximum Population Year | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Find Pivot Index | Easy | Solution |
| 2 | Merge Intervals | Medium | Solution |

#### Citadel
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 8 | Pairs of Songs With Total Durations Divisible by 60 | Easy | Solution |
| 5 | Range Addition | Medium | Solution |
| 4 | Sliding Window Maximum | Hard | Solution |
| 2 | Transpose Matrix | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock IV | Hard | Solution |
| 2 | Trapping Rain Water | Hard | Solution |

#### Citrix
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Find All Groups of Farmland | Medium | Solution |

#### Cohesity
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Largest BST Subtree | Medium | Solution |

#### Commvault
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Count Vowel Substrings of a String | Easy | Java |

#### Coursera
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Rank Teams by Votes | Medium | Solution |
| 2 | Wildcard Matching | Hard | Solution |

#### Cruise Automation
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 9 | Unique Paths II | Medium | Solution |
| 5 | Valid Sudoku | Medium | Solution , Javascript |
| 3 | Synonymous Sentences | Medium | Solution |
| 3 | The Skyline Problem | Hard | Solution |
| 2 | Product of the Last K Numbers | Medium | Solution |
| 2 | Car Pooling | Medium | Solution |
| 2 | Decode String | Medium | Solution |
| 2 | Palindrome Permutation II | Medium | Solution |
| 2 | Number of Islands | Medium | Solution |

#### DE Shaw
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Freedom Trail | Hard | Solution |
| 2 | Sliding Window Maximum | Hard | Solution |
| 1 | Number of Substrings Containing All Three Characters | Medium | Solution |

#### DJI
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Three Consecutive Odds | Easy | Solution |

#### DRW
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Reorder Routes to Make All Paths Lead to the City Zero | Medium | Solution |
| 1 | Counting Elements | Easy | Solution |

#### Databricks
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 9 | Design Hit Counter | Medium | Solution |
| 2 | Sparse Matrix Multiplication | Medium | Solution |
| 2 | First Missing Positive | Hard | Solution |

#### Dataminr
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Valid Parentheses | Easy | Solution |

#### Dell
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Two Sum | Easy | Java , C++ , Javascript |

#### Deutsche Bank
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | The Time When the Network Becomes Idle | Medium | Java |
| 1 | Minimum Operations to Make the Array Increasing | Easy | Solution |

#### Directi
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Find the Winner of an Array Game | Medium | Solution |

#### Docusign
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 5 | Number of Islands | Medium | Solution |
| 3 | Maximum Subarray | Easy | Solution |
| 2 | Implement Trie II (Prefix Tree) | Medium | Solution |
| 2 | Rectangle Overlap | Easy | Solution |
| 2 | Minesweeper | Medium | Solution |
| 2 | LRU Cache | Hard | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |

#### DoorDash
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 18 | Binary Tree Maximum Path Sum | Hard | Solution |
| 13 | Buddy Strings | Easy | Solution |
| 11 | Walls and Gates | Medium | Solution |
| 7 | Minimum Number of Steps to Make Two Strings Anagram | Easy | Solution |
| 7 | Asteroid Collision | Medium | Solution |
| 7 | Serialize and Deserialize Binary Tree | Hard | Solution |
| 6 | Basic Calculator | Hard | Solution |
| 5 | Shortest Distance from All Buildings | Hard | Solution |
| 5 | Longest Common Subsequence | Medium | Solution |
| 5 | Number of Islands | Medium | Solution |
| 4 | Integer to English Words | Hard | Solution |
| 4 | Sudoku Solver | Hard | Solution |
| 4 | Design In-Memory File System | Hard | Solution |
| 4 | Car Pooling | Medium | Solution |
| 3 | Path Sum III | Easy | Solution |
| 3 | Jump Game | Medium | Solution |
| 3 | Jump Game II | Hard | Solution |
| 3 | Find Nearest Point That Has the Same X or Y Coordinate | Easy | Solution |
| 2 | K-diff Pairs in an Array | Easy | Solution |
| 2 | Subarray Sum Equals K | Medium | Solution |

#### Drawbridge
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Set Intersection Size At Least Two | Hard | Solution |

#### Dropbox
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 6 | Find Duplicate File in System | Medium | Solution |
| 5 | Game of Life | Medium | Solution |
| 2 | Max Area of Island | Medium | Solution |
| 2 | Design Phone Directory | Medium | Solution |
| 2 | Number of Islands | Medium | Solution |
| 1 | Seat Reservation Manager | Medium | Solution |

#### Druva
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Maximize Sum Of Array After K Negations | Easy | Solution |

#### Dunzo
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Maximum Number of Coins You Can Get | Medium | Solution |
| 2 | Max Sum of Rectangle No Larger Than K | Hard | Solution |
| 1 | Stone Game VII | Medium | Solution |
| 1 | Ways to Make a Fair Array | Medium | Javascript |

#### Duolingo
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Minimum Number of People to Teach | Medium | Solution |

#### Epic Systems
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | Letter Combinations of a Phone Number | Medium | Solution |
| 2 | Additive Number | Medium | Solution |
| 1 | Self Dividing Numbers | Easy | Solution |

#### Expedia
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 14 | Reformat Date | Easy | Solution |
| 14 | Making File Names Unique | Medium | Solution |
| 8 | String Compression | Easy | Solution |
| 7 | Two Sum | Easy | Java , C++ , Javascript |
| 7 | Climbing Stairs | Easy | Solution |
| 6 | The kth Factor of n | Medium | Solution |
| 6 | Valid Parentheses | Easy | Solution |
| 5 | Degree of an Array | Easy | Solution |
| 5 | Best Time to Buy and Sell Stock | Easy | Solution |
| 4 | Best Meeting Point | Hard | Solution |
| 4 | Least Number of Unique Integers after K Removals | Medium | Solution |
| 3 | Integer to English Words | Hard | Solution |
| 3 | Maximum Difference Between Increasing Elements | Easy | Java |
| 2 | Move Zeroes | Easy | Solution |
| 2 | Number of Different Integers in a String | Medium | Solution |
| 2 | Palindromic Substrings | Medium | Solution |
| 2 | Find Pivot Index | Easy | Solution |
| 2 | Search in Rotated Sorted Array | Medium | Solution |
| 2 | Subarray Sum Equals K | Medium | Solution |
| 1 | Rearrange Words in a Sentence | Medium | Solution |

#### FPT
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Find Three Consecutive Integers That Sum to a Given Number | Medium | Java |

#### Facebook
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 282 | Minimum Remove to Make Valid Parentheses | Medium | Solution |
| 252 | Valid Palindrome II | Easy | Solution |
| 188 | Binary Tree Vertical Order Traversal | Medium | Solution |
| 180 | Lowest Common Ancestor of a Binary Tree | Medium | Solution |
| 162 | Pow(x, n) | Medium | Solution |
| 156 | Lowest Common Ancestor of a Binary Tree III | Medium | Solution |
| 149 | Range Sum of BST | Medium | Solution |
| 141 | Subarray Sum Equals K | Medium | Solution |
| 140 | Random Pick with Weight | Medium | Solution |
| 140 | Kth Largest Element in an Array | Medium | Solution |
| 127 | K Closest Points to Origin | Easy | Solution |
| 121 | Dot Product of Two Sparse Vectors | Easy | Solution |
| 118 | Basic Calculator II | Medium | Solution |
| 114 | Valid Word Abbreviation | Easy | Solution |
| 112 | Simplify Path | Medium | Solution |
| 110 | Merge Intervals | Medium | Solution |
| 108 | Binary Tree Right Side View | Medium | Solution |
| 100 | Minimum Add to Make Parentheses Valid | Medium | Solution |
| 100 | Nested List Weight Sum | Easy | Solution |
| 97 | Top K Frequent Elements | Medium | Solution |

#### FactSet
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Maximal Square | Medium | Solution |

#### Flipkart
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 7 | Number of Students Unable to Eat Lunch | Easy | Solution |
| 4 | Car Pooling | Medium | Solution |
| 3 | Maximum Number of Coins You Can Get | Medium | Solution |
| 3 | Jump Game | Medium | Solution |
| 2 | Shortest Subarray to be Removed to Make Array Sorted | Medium | Solution |
| 2 | Maximum Length of Pair Chain | Medium | Solution |

#### Gilt Groupe
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Add One Row to Tree | Medium | Solution |

#### GoDaddy
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Battleships in a Board | Medium | Solution |
| 2 | LRU Cache | Hard | Solution |

#### Goldman Sachs
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 38 | Trapping Rain Water | Hard | Solution |
| 20 | Height Checker | Easy | Solution |
| 18 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 12 | High Five | Easy | Solution |
| 11 | Best Time to Buy and Sell Stock | Easy | Solution |
| 11 | Delete and Earn | Medium | Solution |
| 9 | Find Pivot Index | Easy | Solution |
| 7 | Two Sum | Easy | Java , C++ , Javascript |
| 7 | Count Number of Teams | Medium | Solution |
| 6 | String Compression | Easy | Solution |
| 6 | LRU Cache | Hard | Solution |
| 5 | Longest Palindromic Substring | Medium | Solution |
| 5 | Fraction Addition and Subtraction | Medium | Solution |
| 4 | Elimination Game | Medium | Solution |
| 4 | Find the Winner of the Circular Game | Medium | Solution |
| 4 | 3Sum | Medium | Solution , C++ |
| 4 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 4 | Minimum Moves to Equal Array Elements | Easy | Solution |
| 4 | First Unique Character in a String | Easy | Solution |
| 3 | Minimum Path Sum | Medium | Solution |

#### Google
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 75 | Find Leaves of Binary Tree | Medium | Solution |
| 73 | Evaluate Reverse Polish Notation | Medium | Solution |
| 44 | Two Sum | Easy | Java , C++ , Javascript |
| 36 | Snapshot Array | Easy | Javascript |
| 30 | Stock Price Fluctuation | Medium | Java |
| 30 | Minimum Time Difference | Medium | Solution |
| 28 | Merge Intervals | Medium | Solution |
| 27 | Random Pick with Weight | Medium | Solution |
| 24 | Text Justification | Hard | Solution |
| 22 | Meeting Rooms II | Medium | Solution |
| 22 | Happy Number | Easy | Solution |
| 22 | Logger Rate Limiter | Easy | Solution |
| 21 | Number of Islands | Medium | Solution |
| 19 | First Bad Version | Easy | Solution |
| 19 | Decode String | Medium | Solution |
| 17 | Maximum Points You Can Obtain from Cards | Medium | Solution |
| 15 | Unique Paths | Medium | Solution |
| 15 | Number of Matching Subsequences | Medium | Solution |
| 15 | Subarray Sum Equals K | Medium | Solution |
| 14 | Student Attendance Record II | Hard | Solution |

#### Grab
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Bulb Switcher III | Medium | Solution |
| 3 | Sort Colors | Medium | Solution |
| 2 | Reorder Routes to Make All Paths Lead to the City Zero | Medium | Solution |
| 2 | Number of Steps to Reduce a Number to Zero | Easy | Solution |
| 2 | Brick Wall | Medium | Solution |
| 2 | First Missing Positive | Hard | Solution |
| 1 | Minimum Number of Buckets Required to Collect Rainwater from Houses | Medium | Java |

#### HBO
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Short Encoding of Words | Medium | Solution |
| 2 | Sliding Window Median | Hard | Solution |

#### HRT
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:--------------|:-----------------------------------------------------------------------------------------------------------------------|
| 7 | Equal Sum Arrays With Minimum Number of Operations | Medium | Solution |
| 3 | 24 Game | Hard | Solution |
| 3 | Find Peak Element | Medium | Solution |
| 1 | Detect Pattern of Length M Repeated K or More Times | Easy | Solution |
| 1 | Maximum 69 Number | Easy | Solution |
| 1 | Convert Integer to the Sum of Two No-Zero Integers | Easy | Solution |

#### Honeywell
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Minimum Distance to the Target Element | Easy | Solution |

#### Hotstar
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | Asteroid Collision | Medium | Solution |
| 2 | Keys and Rooms | Easy | Solution |
| 2 | Find K Pairs with Smallest Sums | Medium | Solution |

#### Huawei
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------|:-------------|:--------------------------------------------------------------------------------------------------------------------|
| 2 | Add Two Numbers | Medium | Solution |

#### Hulu
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | K-th Smallest in Lexicographical Order | Hard | Solution |

#### IBM
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 8 | Gas Station | Medium | Solution |
| 8 | Merge Intervals | Medium | Solution |
| 6 | Most Visited Sector in a Circular Track | Easy | Solution |
| 6 | Backspace String Compare | Easy | Solution |
| 4 | Water Bottles | Easy | Solution |
| 3 | Degree of an Array | Easy | Solution |
| 3 | Maximal Square | Medium | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Move Zeroes | Easy | Solution |
| 2 | Group Anagrams | Medium | Solution |

#### IIT Bombay
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Score After Flipping Matrix | Medium | Solution |

#### IMC
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Count Artifacts That Can Be Extracted | Medium | Java |

#### IXL
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Design Snake Game | Medium | Solution |
| 1 | Find the Derangement of An Array | Medium | Solution |

#### Indeed
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 25 | Subdomain Visit Count | Easy | Solution |
| 9 | Word Search | Medium | Solution |
| 8 | Maximum Length of Repeated Subarray | Medium | Solution |
| 6 | Find Words That Can Be Formed by Characters | Easy | Solution |
| 4 | Alert Using Same Key-Card Three or More Times in a One Hour Period | Medium | Solution |
| 3 | Text Justification | Hard | Solution |
| 3 | Merge k Sorted Lists | Hard | Solution |
| 2 | Merge Sorted Array | Easy | Solution |
| 1 | Sum of Even Numbers After Queries | Easy | Solution |
| 1 | Binary Tree Tilt | Easy | Solution |

#### Info Edge
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Find the Minimum and Maximum Number of Nodes Between Critical Points | Medium | Java |

#### Infosys
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Maximum Product Subarray | Medium | Solution |
| 2 | Sort Integers by The Number of 1 Bits | Easy | Solution |
| 2 | Score of Parentheses | Medium | Solution |
| 2 | Maximum Subarray | Easy | Solution |
| 2 | 4Sum | Medium | Solution |
| 2 | 3Sum | Medium | Solution , C++ |

#### Intel
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Valid Parentheses | Easy | Solution |
| 3 | Trapping Rain Water | Hard | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Reverse String | Easy | Solution |
| 2 | Sort Colors | Medium | Solution |

#### Intuit
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 8 | LRU Cache | Hard | Solution |
| 6 | Unique Email Addresses | Easy | Solution |
| 4 | Sudoku Solver | Hard | Solution |
| 3 | Boats to Save People | Medium | Solution |
| 2 | Delete Nodes And Return Forest | Medium | Solution |
| 2 | Subdomain Visit Count | Easy | Solution |
| 2 | Decode String | Medium | Solution |
| 2 | Palindrome Linked List | Easy | Solution |
| 2 | Reverse Linked List | Easy | Solution |
| 2 | Merge Intervals | Medium | Solution |
| 2 | Trapping Rain Water | Hard | Solution |
| 2 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 2 | Two Sum | Easy | Java , C++ , Javascript |

#### JPMorgan
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 69 | Reconstruct Original Digits from English | Medium | Solution |
| 8 | Group Anagrams | Medium | Solution |
| 5 | Maximum Subarray | Easy | Solution |
| 3 | Even Odd Tree | Medium | Solution |
| 3 | Best Time to Buy and Sell Stock | Easy | Solution |
| 3 | Add Two Numbers | Medium | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Maximum Units on a Truck | Easy | Solution |
| 2 | Minimum Value to Get Positive Step by Step Sum | Easy | Solution |
| 2 | Maximum Number of Events That Can Be Attended | Medium | Solution |
| 2 | Minimum Absolute Difference | Easy | Solution |
| 2 | Intersection of Two Arrays | Easy | Solution |
| 2 | Counting Bits | Medium | Solution |
| 2 | Paint Fence | Easy | Solution |
| 2 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 1 | Determine Color of a Chessboard Square | Easy | Solution |

#### Jane Street
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Count Common Words With One Occurrence | Easy | Java |

#### Jeavio
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Minimum Moves to Convert String | Easy | Java |

#### Karat
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 40 | Subdomain Visit Count | Easy | Solution |
| 17 | Word Search | Medium | Solution |
| 16 | Alert Using Same Key-Card Three or More Times in a One Hour Period | Medium | Solution |
| 16 | Maximum Length of Repeated Subarray | Medium | Solution |
| 12 | Find Words That Can Be Formed by Characters | Easy | Solution |
| 10 | Text Justification | Hard | Solution |
| 5 | Word Search II | Hard | Solution |
| 4 | Check if Every Row and Column Contains All Numbers | Easy | Java |
| 4 | Course Schedule II | Medium | Solution |
| 4 | Number of Islands | Medium | Solution |
| 3 | Lowest Common Ancestor of a Binary Tree | Medium | Solution |
| 3 | Valid Sudoku | Medium | Solution , Javascript |
| 2 | Longest Common Subsequence | Medium | Solution |

#### Leap Motion
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Generate Random Point in a Circle | Medium | Solution |

#### LinkedIn
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 71 | Shortest Word Distance II | Medium | Solution |
| 48 | Nested List Weight Sum II | Medium | Solution |
| 43 | Maximum Subarray | Easy | Solution |
| 34 | Nested List Weight Sum | Easy | Solution |
| 33 | Valid Parentheses | Easy | Solution |
| 33 | Max Stack | Hard | Solution |
| 29 | Maximum Product Subarray | Medium | Solution |
| 26 | Closest Binary Search Tree Value II | Hard | Solution |
| 21 | Can Place Flowers | Easy | Solution |
| 18 | Serialize and Deserialize Binary Tree | Hard | Solution |
| 18 | Text Justification | Hard | Solution |
| 18 | Lowest Common Ancestor of a Binary Search Tree | Easy | Solution |
| 17 | Search in Rotated Sorted Array | Medium | Solution |
| 17 | All O`one Data Structure | Hard | Solution |
| 16 | Number of Islands | Medium | Solution |
| 15 | Insert Delete GetRandom O(1) | Medium | Solution |
| 14 | Kth Largest Element in an Array | Medium | Solution |
| 13 | Find Leaves of Binary Tree | Medium | Solution |
| 11 | Shortest Word Distance | Easy | Solution |
| 11 | Second Minimum Node In a Binary Tree | Easy | Solution |

#### LiveRamp
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | Minesweeper | Medium | Solution |
| 3 | Spiral Matrix | Medium | Solution |
| 1 | Longest Harmonious Subsequence | Easy | Solution |
| 1 | Distribute Candies | Easy | Solution |

#### Lyft
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 9 | Minimum Window Substring | Hard | Solution |
| 8 | Read N Characters Given Read4 II - Call multiple times | Hard | Solution |
| 7 | Time Based Key-Value Store | Medium | Solution |
| 7 | Decode Ways | Medium | Solution |
| 4 | Asteroid Collision | Medium | Solution |
| 4 | Water and Jug Problem | Medium | Solution |
| 4 | Range Sum Query 2D - Immutable | Medium | Solution |
| 4 | Word Ladder | Hard | Solution |
| 2 | Max Stack | Hard | Solution |
| 2 | Read N Characters Given Read4 | Easy | Solution |

#### MAQ Software
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Unique Substrings in Wraparound String | Medium | Solution |

#### MakeMyTrip
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Remove All Occurrences of a Substring | Medium | Solution |
| 2 | Remove K Digits | Medium | Solution |

#### Mathworks
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 5 | Coin Change | Medium | Solution |
| 5 | Fraction to Recurring Decimal | Medium | Solution |
| 3 | Degree of an Array | Easy | Solution |
| 3 | Minimum Moves to Equal Array Elements | Easy | Solution |
| 3 | Distinct Subsequences | Hard | Solution |
| 2 | Keyboard Row | Easy | Solution |

#### Mercari
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Crawler Log Folder | Easy | Solution |
| 1 | Max Difference You Can Get From Changing an Integer | Medium | Solution |
| 1 | Minimum Subsequence in Non-Increasing Order | Easy | Solution |
| 1 | Count Largest Group | Easy | Solution |

#### Microsoft
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 52 | LRU Cache | Hard | Solution |
| 45 | Sign of the Product of an Array | Easy | Solution |
| 42 | Number of Islands | Medium | Solution |
| 39 | Two Sum | Easy | Java , C++ , Javascript |
| 34 | Reverse Words in a String | Medium | Solution |
| 29 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 29 | Group Anagrams | Medium | Solution |
| 26 | Spiral Matrix | Medium | Solution |
| 24 | Search in Rotated Sorted Array | Medium | Solution |
| 22 | Valid Parentheses | Easy | Solution |
| 22 | Find N Unique Integers Sum up to Zero | Easy | Solution |
| 20 | 3Sum | Medium | Solution , C++ |
| 20 | Letter Combinations of a Phone Number | Medium | Solution |
| 20 | Serialize and Deserialize Binary Tree | Hard | Solution |
| 18 | Longest Palindromic Substring | Medium | Solution |
| 17 | Add Two Numbers | Medium | Solution |
| 17 | Merge k Sorted Lists | Hard | Solution |
| 17 | Cinema Seat Allocation | Medium | Solution |
| 16 | First Missing Positive | Hard | Solution |
| 16 | String to Integer (atoi) | Medium | Solution |

#### MindTickle
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Maximum Frequency Stack | Hard | Solution |

#### MindTree
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Count Integers With Even Digit Sum | Easy | Java |

#### Moengage
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Redistribute Characters to Make All Strings Equal | Easy | Solution |

#### Morgan Stanley
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Longest Valid Parentheses | Hard | Solution |
| 3 | 3Sum | Medium | Solution , C++ |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | LRU Cache | Hard | Solution |
| 2 | Merge Intervals | Medium | Solution |
| 2 | Search in Rotated Sorted Array | Medium | Solution |
| 1 | Best Team With No Conflicts | Medium | Solution |

#### National Instruments
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Trapping Rain Water | Hard | Solution |

#### Netflix
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Rotating the Box | Medium | Solution |
| 3 | Logger Rate Limiter | Easy | Solution |
| 2 | Reconstruct Itinerary | Medium | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Valid Parentheses | Easy | Solution |

#### Netsuite
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Average Salary Excluding the Minimum and Maximum Salary | Easy | Solution |

#### Nuro
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Subrectangle Queries | Medium | Solution |

#### Nutanix
[back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Compare Version Numbers | Easy | Solution |
| 2 | Sudoku Solver | Hard | Solution |
| 1 | Check If Word Is Valid After Substitutions | Medium | Solution |

Nvidia [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 8 | Last Stone Weight | Easy | Solution |
| 4 | Serialize and Deserialize Binary Tree | Hard | Solution |
| 4 | Regular Expression Matching | Hard | Java , Javascript |
| 3 | Design Circular Queue | Medium | Solution |
| 3 | Number of Islands | Medium | Solution |
| 3 | Valid Sudoku | Medium | Solution , Javascript |
| 3 | Sort Colors | Medium | Solution |
| 3 | LRU Cache | Hard | Solution |
| 3 | Intersection of Two Linked Lists | Easy | Solution |
| 3 | Best Time to Buy and Sell Stock IV | Hard | Solution |
| 3 | Lonely Pixel I | Medium | Solution |
| 2 | Degree of an Array | Easy | Solution |
| 2 | Add and Search Word - Data structure design | Medium | Solution |
| 2 | Subarray Sum Equals K | Medium | Solution |
| 2 | Missing Number | Easy | Solution |
| 2 | K Empty Slots | Hard | Solution |
| 2 | Search a 2D Matrix | Medium | Solution |
| 2 | Merge Intervals | Medium | Solution |
| 2 | Single Element in a Sorted Array | Medium | Solution |
| 2 | Search in Rotated Sorted Array | Medium | Solution |

OT [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------|
| 1 | Reducing Dishes | Hard | Solution |

Opendoor [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 10 | Game of Life | Medium | Solution |
| 3 | Design Excel Sum Formula | Hard | Solution |

Optum [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Reverse Prefix of Word | Easy | Solution |

Oracle [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 13 | LRU Cache | Hard | Solution |
| 13 | Meeting Rooms II | Medium | Solution |
| 9 | Number of Islands | Medium | Solution |
| 7 | Subarray Sum Equals K | Medium | Solution |
| 7 | Longest Palindromic Substring | Medium | Solution |
| 7 | Top K Frequent Elements | Medium | Solution |
| 6 | Decode String | Medium | Solution |
| 6 | Two Sum | Easy | Java , C++ , Javascript |
| 6 | Valid Parentheses | Easy | Solution |
| 5 | Search in Rotated Sorted Array | Medium | Solution |
| 5 | First Unique Character in a String | Easy | Solution |
| 4 | Delete Node in a BST | Medium | Solution |
| 4 | Binary Tree Level Order Traversal | Medium | Solution |
| 4 | Add Two Numbers | Medium | Solution |
| 4 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 4 | Merge Two Sorted Lists | Easy | Solution |
| 4 | Add Strings | Easy | Solution |
| 4 | Best Time to Buy and Sell Stock | Easy | Solution |
| 4 | Product of Array Except Self | Medium | Solution |
| 4 | Merge Intervals | Medium | Solution |

Palantir Technologies [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | UTF-8 Validation | Medium | Solution |
| 1 | Check If It Is a Straight Line | Easy | Solution |

PayTM [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Distribute Candies to People | Easy | Solution |
| 2 | Reverse Words in a String III | Easy | Solution |
| 2 | Maximum Subarray | Easy | Solution |

Paypal [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Happy Number | Easy | Solution |
| 3 | Number of Islands | Medium | Solution |
| 3 | Trapping Rain Water | Hard | Solution |
| 3 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | Reverse Substrings Between Each Pair of Parentheses | Medium | Solution |
| 2 | Squares of a Sorted Array | Easy | Solution |
| 2 | LRU Cache | Hard | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | ZigZag Conversion | Easy | Solution |
| 2 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |

PhonePe [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Distribute Coins in Binary Tree | Medium | Solution |
| 1 | Simple Bank System | Medium | Java |

Pinterest [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 4 | Parallel Courses III | Hard | Java |
| 3 | Find the Celebrity | Medium | Solution |
| 2 | Accounts Merge | Medium | Solution |
| 2 | Number of Connected Components in an Undirected Graph | Medium | Solution |
| 2 | Word Pattern II | Hard | Solution |
| 2 | Alien Dictionary | Hard | Solution |
| 2 | Clone Graph | Medium | Solution |

Pocket Gems [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Non-negative Integers without Consecutive Ones | Hard | Solution |

Postmates [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Print Words Vertically | Medium | Solution |
| 1 | Determine if Two Strings Are Close | Medium | Solution |

Pure Storage [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Maximum Repeating Substring | Easy | Solution |
| 2 | Valid Square | Medium | Java , Javascript |

Qualcomm [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | Reverse Bits | Easy | Solution |
| 2 | Middle of the Linked List | Easy | Solution |
| 2 | Find the Duplicate Number | Medium | Solution |

Qualtrics [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 7 | Word Break | Medium | Solution |
| 6 | Word Ladder | Hard | Solution |
| 5 | 3Sum | Medium | Solution , C++ |
| 4 | Max Area of Island | Medium | Solution |
| 4 | Binary Tree Right Side View | Medium | Solution |
| 4 | Unique Paths II | Medium | Solution |
| 4 | Trapping Rain Water | Hard | Solution |
| 3 | First Unique Character in a String | Easy | Solution |
| 3 | Number of Islands | Medium | Solution |
| 2 | Sort Integers by The Power Value | Medium | Solution |
| 2 | Fixed Point | Easy | Solution |
| 2 | Summary Ranges | Medium | Solution |
| 2 | Longest Consecutive Sequence | Hard | Solution |
| 2 | Validate Binary Search Tree | Medium | Solution |
| 2 | Jump Game | Medium | Solution |

Quora [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Construct Target Array With Multiple Sums | Hard | Solution |
| 3 | Subarray Sum Equals K | Medium | Solution |
| 3 | Longest Common Prefix | Easy | Solution |
| 2 | Sliding Window Maximum | Hard | Solution |
| 1 | Range Frequency Queries | Medium | Java |
| 1 | Maximum Number of Words You Can Type | Easy | Solution |
| 1 | Substrings of Size Three with Distinct Characters | Easy | Solution |

Rakuten [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Sum of Floored Pairs | Hard | Solution |

Reddit [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 3 | Subsets | Medium | Solution |
| 3 | Text Justification | Hard | Solution |
| 3 | Combination Sum II | Medium | Solution |

Redfin [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 3 | String Compression | Easy | Solution |
| 3 | String to Integer (atoi) | Medium | Solution |

Riot Games [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Teemo Attacking | Medium | Solution |

Robinhood [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 8 | Alert Using Same Key-Card Three or More Times in a One Hour Period | Medium | Solution |
| 7 | Course Schedule II | Medium | Solution |
| 3 | Insert Interval | Hard | Solution |
| 1 | Count Good Meals | Medium | Solution |

Roblox [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:--------------|:-----------------------------------------------------------------------------------------------------------------------|
| 5 | Design Browser History | Medium | Solution |
| 5 | Course Schedule II | Medium | Solution |
| 4 | Minimum Falling Path Sum II | Hard | Solution |
| 2 | Number of Matching Subsequences | Medium | Solution |
| 2 | Find Peak Element | Medium | Solution |

Rubrik [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 5 | Snapshot Array | Easy | Javascript |
| 4 | Kth Smallest Number in Multiplication Table | Hard | Solution |
| 4 | Trapping Rain Water | Hard | Solution |
| 2 | Task Scheduler | Medium | Solution |
| 2 | Majority Element | Easy | Solution |
| 2 | Read N Characters Given Read4 | Easy | Solution |
| 2 | Edit Distance | Hard | Solution |
| 2 | Rotate Image | Medium | Solution |

Rupeek [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Find Kth Bit in Nth Binary String | Medium | Solution |

SAP [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 9 | Monotone Increasing Digits | Medium | Solution |
| 4 | Number of Islands | Medium | Solution |
| 3 | Longest Common Prefix | Easy | Solution |
| 2 | Prison Cells After N Days | Medium | Solution |
| 2 | Count Primes | Easy | Solution |
| 2 | Add Two Numbers | Medium | Solution |
| 1 | Maximum Students Taking Exam | Hard | Solution |

Salesforce [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 19 | Merge Intervals | Medium | Solution |
| 10 | LRU Cache | Hard | Solution |
| 7 | Maximum Product of Three Numbers | Easy | Solution |
| 6 | Largest Number | Medium | Solution |
| 6 | Number of Islands | Medium | Solution |
| 6 | Sliding Window Maximum | Hard | Solution |
| 6 | Zuma Game | Hard | Solution |
| 5 | Boats to Save People | Medium | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 3 | Min Stack | Easy | Solution |
| 3 | Course Schedule II | Medium | Solution |
| 3 | Maximal Square | Medium | Solution |
| 3 | Construct the Lexicographically Largest Valid Sequence | Medium | Solution |
| 3 | Integer to English Words | Hard | Solution |
| 3 | Sort Colors | Medium | Solution |
| 3 | Design In-Memory File System | Hard | Solution |
| 3 | Design HashMap | Easy | Solution |
| 3 | Maximum Subarray | Easy | Solution |
| 3 | Group Anagrams | Medium | Solution |
| 3 | Minimum Absolute Difference | Easy | Solution |

Samsung [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Maximum Subarray | Easy | Solution |
| 2 | Range Sum Query 2D - Immutable | Medium | Solution |
| 2 | Reorder List | Medium | Solution |
| 2 | Reverse Integer | Easy | Solution , C++ |
| 2 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 2 | Two Sum | Easy | Java , C++ , Javascript |
| 1 | Maximum Product of Two Elements in an Array | Easy | Solution |

Sapient [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Trapping Rain Water | Hard | Solution |

ServiceNow [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 3 | Number of Islands | Medium | Solution |
| 3 | Binary Tree Level Order Traversal | Medium | Solution |
| 3 | Group Anagrams | Medium | Solution |
| 3 | Search in Rotated Sorted Array | Medium | Solution |
| 2 | Pairs of Songs With Total Durations Divisible by 60 | Easy | Solution |
| 2 | Rectangle Overlap | Easy | Solution |
| 2 | Subarray Sum Equals K | Medium | Solution |
| 2 | Palindrome Linked List | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Maximum Subarray | Easy | Solution |
| 2 | Trapping Rain Water | Hard | Solution |
| 2 | Valid Parentheses | Easy | Solution |

Shopee [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 5 | Maximum Subarray | Easy | Solution |
| 4 | Merge Sorted Array | Easy | Solution |
| 4 | Merge Intervals | Medium | Solution |
| 3 | Palindrome Pairs | Hard | Solution |
| 3 | Number of Islands | Medium | Solution |
| 3 | Construct Binary Tree from Inorder and Postorder Traversal | Medium | Solution |
| 3 | Merge Two Sorted Lists | Easy | Solution |
| 2 | LRU Cache | Hard | Solution |
| 2 | Merge k Sorted Lists | Hard | Solution |

Snapchat [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------|:--------------|:------------------------------------------------------------------------------------------------------------------------------|
| 7 | Word Search | Medium | Solution |
| 5 | Decode String | Medium | Solution |
| 5 | Sparse Matrix Multiplication | Medium | Solution |
| 5 | Evaluate Division | Medium | Solution |
| 5 | Word Search II | Hard | Solution |
| 5 | Merge Intervals | Medium | Solution |
| 4 | Find Peak Element | Medium | Solution |
| 4 | Implement Trie | Medium | Solution |
| 4 | Word Ladder | Hard | Solution |
| 3 | Add Binary | Easy | Solution |
| 3 | Binary Tree Maximum Path Sum | Hard | Solution |
| 3 | Basic Calculator II | Medium | Solution |
| 3 | LRU Cache | Hard | Solution |
| 3 | Word Break II | Hard | Solution |
| 3 | Word Ladder II | Hard | Solution |
| 3 | Shortest Path in Binary Matrix | Medium | Solution |
| 3 | Android Unlock Patterns | Medium | Solution |
| 3 | Subarray Sums Divisible by K | Medium | Solution |
| 3 | LFU Cache | Hard | Solution |
| 3 | The Maze | Medium | Solution |

Softwire [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Second Largest Digit in a String | Easy | Solution |

Sony [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------|
| 2 | Reducing Dishes | Hard | Solution |

Splunk [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Search in Rotated Sorted Array | Medium | Solution |

Spotify [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 16 | Moving Average from Data Stream | Easy | Solution |
| 13 | Valid Parentheses | Easy | Solution |
| 10 | Ransom Note | Easy | Solution |
| 10 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 8 | Valid Anagram | Easy | Solution |
| 8 | Two Sum | Easy | Java , C++ , Javascript |
| 6 | Analyze User Website Visit Pattern | Medium | Solution |
| 4 | Sliding Window Median | Hard | Solution |
| 4 | Linked List Cycle | Easy | Solution |
| 4 | Balanced Binary Tree | Easy | Solution |
| 4 | Maximum Depth of Binary Tree | Easy | Solution |
| 3 | Longest Consecutive Sequence | Hard | Solution |
| 2 | Lowest Common Ancestor of a Binary Tree III | Medium | Solution |
| 2 | Remove All Adjacent Duplicates in String II | Medium | Solution |

Sprinklr [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Max Points on a Line | Hard | Solution |
| 2 | Binary Tree Maximum Path Sum | Hard | Solution |
| 2 | Merge k Sorted Lists | Hard | Solution |
| 1 | Next Greater Numerically Balanced Number | Medium | Java |
| 1 | Maximum Subarray Sum After One Operation | Medium | Solution |

Square [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Design Snake Game | Medium | Solution |
| 3 | Text Justification | Hard | Solution |
| 2 | Rank Teams by Votes | Medium | Solution |
| 2 | Available Captures for Rook | Easy | Solution |
| 2 | Unique Morse Code Words | Easy | Solution |
| 2 | The Maze | Medium | Solution |
| 2 | Number of Islands | Medium | Solution |
| 2 | Minimum Path Sum | Medium | Solution |
| 1 | Squirrel Simulation | Medium | Solution |

Sumologic [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Lowest Common Ancestor of a Binary Tree | Medium | Solution |
| 2 | Lowest Common Ancestor of a Binary Tree III | Medium | Solution |
| 2 | K Closest Points to Origin | Easy | Solution |
| 2 | Decode String | Medium | Solution |

Swiggy [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 4 | Container With Most Water | Medium | Solution |
| 3 | Letter Combinations of a Phone Number | Medium | Solution |
| 2 | Rabbits in Forest | Medium | Solution |

T System [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Grid Game | Medium | Java |

TIAA [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Find Greatest Common Divisor of Array | Easy | Solution |

Tencent [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Minimum Factorization | Medium | Solution |

Tesla [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | Largest Perimeter Triangle | Easy | Solution |
| 3 | Trapping Rain Water | Hard | Solution |
| 2 | Minimum Changes To Make Alternating Binary String | Easy | Solution |
| 2 | Find Winner on a Tic Tac Toe Game | Easy | Solution |
| 2 | Spiral Matrix III | Medium | Solution |
| 2 | Basic Calculator II | Medium | Solution |
| 2 | Jump Game II | Hard | Solution |
| 2 | Generate Parentheses | Medium | Solution |
| 2 | 3Sum | Medium | Solution , C++ |

Thumbtack [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Vowel Spellchecker | Medium | Solution |

Tiger Analytics [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Find Closest Number to Zero | Easy | Java |

Toptal [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 3 | Reformat Phone Number | Easy | Solution |

TripleByte [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 4 | Minimum ASCII Delete Sum for Two Strings | Medium | Solution |

TuSimple [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Binary Tree Postorder Traversal | Easy | Solution |
| 2 | Binary Tree Maximum Path Sum | Hard | Solution |
| 2 | Merge k Sorted Lists | Hard | Solution |

Twilio [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 7 | LRU Cache | Hard | Solution |
| 5 | Course Schedule II | Medium | Solution |
| 3 | Subarray Sums Divisible by K | Medium | Solution |
| 3 | Sliding Window Maximum | Hard | Solution |
| 3 | Text Justification | Hard | Solution |
| 3 | Group Anagrams | Medium | Solution |
| 2 | Sort Array by Increasing Frequency | Easy | Solution |
| 2 | Knight Dialer | Medium | Solution |
| 2 | Binary Tree Maximum Path Sum | Hard | Solution |

Twitch [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 7 | Battleships in a Board | Medium | Solution |
| 2 | Number of Islands | Medium | Solution |

Twitter [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------|
| 23 | Word Search | Medium | Solution |
| 10 | Finding the Users Active Minutes | Medium | Solution |
| 7 | Design Hit Counter | Medium | Solution |
| 6 | Tweet Counts Per Frequency | Medium | Solution |
| 6 | Implement Trie | Medium | Solution |
| 5 | Insert Delete GetRandom O(1) | Medium | Solution |
| 5 | Word Search II | Hard | Solution |
| 5 | Design Log Storage System | Medium | Solution |
| 4 | Paint House | Medium | Solution |
| 4 | Reconstruct Itinerary | Medium | Solution |
| 3 | Minimum Moves to Equal Array Elements | Easy | Solution |
| 3 | Subsets | Medium | Solution |
| 3 | Flatten Nested List Iterator | Medium | Solution |
| 3 | Rearrange String k Distance Apart | Hard | Solution |
| 3 | Merge Intervals | Medium | Solution |
| 3 | Accounts Merge | Medium | Solution |
| 2 | Random Pick with Weight | Medium | Solution |
| 2 | Design Search Autocomplete System | Hard | Solution |
| 2 | Palindromic Substrings | Medium | Solution |
| 2 | Maximum Frequency Stack | Hard | Solution |

Two Sigma [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Maximum Product of Splitted Binary Tree | Medium | Solution |
| 4 | Multiply Strings | Medium | Solution |
| 2 | Top K Frequent Words | Medium | Solution |

Uber [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 36 | Word Search II | Hard | Solution |
| 17 | Word Search | Medium | Solution |
| 17 | Top K Frequent Words | Medium | Solution |
| 14 | Evaluate Division | Medium | Solution |
| 14 | Merge Intervals | Medium | Solution |
| 14 | Longest Continuous Subarray With Absolute Diff Less Than or Equal to Limit | Medium | Solution |
| 13 | Random Pick with Weight | Medium | Solution |
| 12 | Dungeon Game | Hard | Solution |
| 11 | Two Sum | Easy | Java , C++ , Javascript |
| 11 | Kth Smallest Element in a BST | Medium | Solution |
| 10 | Leftmost Column with at Least a One | Medium | Solution |
| 10 | Reconstruct Itinerary | Medium | Solution |
| 9 | Letter Combinations of a Phone Number | Medium | Solution |
| 8 | Product of Array Except Self | Medium | Solution |
| 8 | Optimal Account Balancing | Hard | Solution |
| 7 | Construct K Palindrome Strings | Medium | Solution |
| 7 | Number of Islands | Medium | Solution |
| 7 | Best Time to Buy and Sell Stock | Easy | Solution |
| 7 | Insert Delete GetRandom O(1) | Medium | Solution |
| 7 | Minesweeper | Medium | Solution |

United Health Group [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:---------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Check If All 1's Are at Least Length K Places Away | Medium | Solution |
| 1 | Day of the Week | Easy | Solution |

VMware [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 5 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 5 | Valid Parentheses | Easy | Solution |
| 5 | Merge Intervals | Medium | Solution |
| 4 | Maximum Subarray | Easy | Solution |
| 4 | Course Schedule II | Medium | Solution |
| 4 | Merge k Sorted Lists | Hard | Solution |
| 4 | Verify Preorder Sequence in Binary Search Tree | Medium | Solution |
| 4 | Rotting Oranges | Medium | Solution |
| 3 | 3Sum | Medium | Solution , C++ |
| 3 | Max Stack | Hard | Solution |
| 3 | Search in Rotated Sorted Array | Medium | Solution |
| 3 | Number of Islands | Medium | Solution |
| 3 | Sort Colors | Medium | Solution |
| 2 | Remove Duplicates from Sorted Array | Easy | Solution |
| 2 | Longest Valid Parentheses | Hard | Solution |
| 2 | Search Insert Position | Easy | Solution |
| 2 | Group Anagrams | Medium | Solution |
| 2 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 2 | Rearrange Words in a Sentence | Medium | Solution |
| 2 | Validate Binary Search Tree | Medium | Solution |

Valve [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Dota2 Senate | Medium | Solution |

Virtu Financial [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 2 | Hexspeak | Easy | Solution |
| 2 | Array Transformation | Easy | Solution |
| 2 | How Many Apples Can You Put into the Basket | Easy | Solution |
| 2 | Count Substrings with Only One Distinct Letter | Easy | Solution |
| 1 | Count Number of Homogenous Substrings | Medium | Solution |

Visa [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 11 | Last Stone Weight | Easy | Solution |
| 6 | Maximal Square | Medium | Solution |
| 4 | Two Sum | Easy | Java , C++ , Javascript |
| 4 | Subarray Sum Equals K | Medium | Solution |
| 4 | String Compression | Easy | Solution |
| 3 | Group Anagrams | Medium | Solution |
| 3 | Backspace String Compare | Easy | Solution |
| 3 | Minimum Moves to Equal Array Elements | Easy | Solution |
| 3 | Meeting Rooms II | Medium | Solution |
| 3 | Linked List Cycle | Easy | Solution |
| 3 | Search in Rotated Sorted Array | Medium | Solution |
| 2 | Add Two Numbers | Medium | Solution |
| 2 | Reverse Integer | Easy | Solution , C++ |
| 2 | Trapping Rain Water | Hard | Solution |
| 2 | Reverse Words in a String | Medium | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Longest Consecutive Sequence | Hard | Solution |
| 2 | Remove All Adjacent Duplicates in String II | Medium | Solution |
| 2 | Largest Number | Medium | Solution |
| 1 | Create Target Array in the Given Order | Easy | Solution |

Walmart Global Tech [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 9 | Remove Colored Pieces if Both Neighbors are the Same Color | Medium | Java |
| 7 | 3Sum | Medium | Solution , C++ |
| 5 | Search in Rotated Sorted Array | Medium | Solution |
| 5 | Group Anagrams | Medium | Solution |
| 5 | Merge Intervals | Medium | Solution |
| 4 | Strange Printer | Hard | Solution |
| 4 | Best Time to Buy and Sell Stock II | Easy | Solution |
| 4 | LRU Cache | Hard | Solution |
| 3 | Two Sum | Easy | Java , C++ , Javascript |
| 3 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 3 | Remove Nth Node From End of List | Medium | Solution |
| 3 | Generate Parentheses | Medium | Solution |
| 3 | Gas Station | Medium | Solution |
| 3 | Design HashMap | Easy | Solution |
| 3 | Best Time to Buy and Sell Stock | Easy | Solution |
| 3 | Degree of an Array | Easy | Solution |
| 2 | Maximum Subarray | Easy | Solution |
| 2 | Median of Two Sorted Arrays | Hard | Solution , C++ |
| 2 | Reformat Date | Easy | Solution |
| 2 | Merge k Sorted Lists | Hard | Solution |

Wayfair [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 20 | Add Strings | Easy | Solution |
| 9 | Subdomain Visit Count | Easy | Solution |
| 7 | Maximum Length of Repeated Subarray | Medium | Solution |
| 5 | Alert Using Same Key-Card Three or More Times in a One Hour Period | Medium | Solution |
| 4 | The Number of Full Rounds You Have Played | Medium | Solution |
| 4 | Course Schedule II | Medium | Solution |
| 3 | Valid Sudoku | Medium | Solution , Javascript |
| 2 | Max Chunks To Make Sorted | Medium | Solution |
| 2 | Valid Palindrome | Easy | Solution |
| 1 | Minimum Moves to Reach Target Score | Medium | Java |
| 1 | Delete Characters to Make Fancy String | Easy | Solution |

Wealthfront [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Minimum Number of Steps to Make Two Strings Anagram II | Medium | Java |

Wish [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 2 | Find Right Interval | Medium | Solution |

Works Applications [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | K Inverse Pairs Array | Hard | Solution |

Yahoo [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 6 | Add Two Numbers | Medium | Solution |
| 5 | Restore IP Addresses | Medium | Solution |
| 5 | 3Sum | Medium | Solution , C++ |
| 4 | Two Sum | Easy | Java , C++ , Javascript |
| 4 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 4 | Top K Frequent Words | Medium | Solution |
| 4 | Roman to Integer | Easy | Solution |
| 4 | Intersection of Two Arrays II | Easy | Solution |
| 3 | Find All Anagrams in a String | Easy | Solution |
| 3 | Climbing Stairs | Easy | Solution |
| 3 | Pow(x, n) | Medium | Solution |
| 3 | First Unique Character in a String | Easy | Solution |
| 3 | LRU Cache | Hard | Solution |
| 3 | Merge Two Binary Trees | Easy | Solution |
| 3 | Valid Parenthesis String | Medium | Solution |
| 2 | Divide Two Integers | Medium | Solution |
| 2 | Group Anagrams | Medium | Solution |
| 2 | One Edit Distance | Medium | Solution |
| 2 | 4Sum | Medium | Solution |
| 2 | Palindrome Number | Easy | Java , C++ |

Yandex [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:---------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 8 | Maximize Distance to Closest Person | Easy | Solution |
| 6 | Move Zeroes | Easy | Solution |
| 6 | Zigzag Iterator | Medium | Solution |
| 6 | Summary Ranges | Medium | Solution |
| 4 | Permutation in String | Medium | Solution |
| 4 | Merge Intervals | Medium | Solution |
| 3 | Valid Parentheses | Easy | Solution |
| 3 | Merge k Sorted Lists | Hard | Solution |
| 3 | Reverse Linked List | Easy | Solution |
| 3 | Longest Subarray of 1's After Deleting One Element | Medium | Solution |
| 3 | Is Subsequence | Medium | Solution |
| 3 | String Compression | Easy | Solution |
| 3 | Max Stack | Hard | Solution |
| 3 | Number of Recent Calls | Easy | Solution |
| 2 | Number of Students Doing Homework at a Given Time | Easy | Solution |
| 2 | Line Reflection | Medium | Solution |
| 2 | Evaluate Reverse Polish Notation | Medium | Solution |
| 2 | Valid Palindrome | Easy | Solution |
| 2 | Reverse Words in a String III | Easy | Solution |
| 2 | Median of Two Sorted Arrays | Hard | Solution , C++ |

Yelp [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 5 | Destination City | Easy | Solution |
| 4 | Filter Restaurants by Vegan-Friendly, Price and Distance | Medium | Solution |
| 1 | Minimum Index Sum of Two Lists | Easy | Solution |

ZScaler [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Day of the Year | Easy | Solution |

Zenefits [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:----------------------------------------------------------------------------------------------------------------------|
| 1 | Verify Preorder Sequence in Binary Search Tree | Medium | Solution |
| 1 | N-Queens II | Hard | Solution |

Zillow [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 2 | Spiral Matrix | Medium | Solution |
| 2 | Two Sum | Easy | Java , C++ , Javascript |

Zoho [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Two Sum | Easy | Java , C++ , Javascript |
| 2 | K-diff Pairs in an Array | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Longest Substring Without Repeating Characters | Medium | Solution , C++ |
| 1 | Sort Even and Odd Indices Independently | Easy | Java |

Zomato [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 2 | Two Sum | Easy | Java , C++ , Javascript |

Zoom [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 4 | Rotate String | Easy | Solution |
| 3 | Fibonacci Number | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Reverse Nodes in k-Group | Hard | Solution |
| 2 | Two Sum | Easy | Java , C++ , Javascript |

Zopsmart [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Reverse Nodes in Even Length Groups | Medium | Java |

eBay [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Group Anagrams | Medium | Solution |
| 3 | Reorganize String | Medium | Solution |
| 2 | Rotating the Box | Medium | Solution |
| 2 | Candy Crush | Medium | Solution |
| 2 | Odd Even Linked List | Medium | Solution |
| 2 | Implement Queue using Stacks | Medium | Solution |
| 2 | Number of Islands | Medium | Solution |
| 2 | Longest Consecutive Sequence | Hard | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |

edabit [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Calculate Money in Leetcode Bank | Easy | Solution |

instacart [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 2 | Check if Every Row and Column Contains All Numbers | Easy | Java |

payu [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:---------------------------------------------------------------------------------------------------------------------|
| 2 | Jump Game II | Hard | Solution |
| 1 | Count Operations to Obtain Zero | Easy | Java |

peak6 [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 1 | Coordinate With Maximum Network Quality | Medium | Solution |

persistent systems [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:----------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Minimum Bit Flips to Convert Number | Easy | Java |

razorpay [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Two Best Non-Overlapping Events | Medium | Java |

tcs [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------|:-------------|:-----------------------------------------------------------------------------------------------------------------------|
| 4 | Move Zeroes | Easy | Solution |
| 3 | Reverse String | Easy | Solution |
| 2 | Contains Duplicate | Easy | Solution |
| 2 | Rotate Array | Easy | Solution |
| 2 | Best Time to Buy and Sell Stock | Easy | Solution |
| 2 | Valid Parentheses | Easy | Solution |
| 1 | Maximum Ascending Subarray Sum | Easy | Solution |
| 1 | Palindrome Partitioning IV | Hard | Solution |

tiktok [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 13 | Design a Stack With Increment Operation | Medium | Solution |
| 10 | Reformat Date | Easy | Solution |
| 8 | Maximum Swap | Medium | Solution |
| 7 | Course Schedule | Medium | Solution |
| 6 | LRU Cache | Hard | Solution |
| 6 | Split Array Largest Sum | Hard | Solution |
| 6 | Number of Islands | Medium | Solution |
| 6 | Spiral Matrix II | Medium | Solution |
| 6 | Binary Tree Maximum Path Sum | Hard | Solution |
| 5 | Gas Station | Medium | Solution |
| 5 | Decode String | Medium | Solution |
| 5 | Subarray Sum Equals K | Medium | Solution |
| 5 | Asteroid Collision | Medium | Solution |
| 5 | Walls and Gates | Medium | Solution |
| 5 | Roman to Integer | Easy | Solution |
| 4 | Path Sum III | Easy | Solution |
| 4 | Search in Rotated Sorted Array | Medium | Solution |
| 4 | Count Complete Tree Nodes | Medium | Solution |
| 4 | Evaluate Division | Medium | Solution |
| 4 | 3Sum | Medium | Solution , C++ |

zeta suite [back to top]

| Occurrence | Problem | Difficulty | Solution |
|------------:|:--------------------------------------------------------------------------------------------------------------------------|:-------------|:-------------------------------------------------------------------------------------------------------------------|
| 1 | Count Equal and Divisible Pairs in an Array | Easy | Java |

Lists of company-wise questions available on LeetCode Premium. Every CSV file in the companies directory corresponds to a list of questions on LeetCode for a specific company, based on the LeetCode company tags. Updated as of May 2022. | google-interview,interview,leetcode,amazon-interview,facebook-interview,microsoft-interview,leetcode-company-questions | 0 | 1 | 1 | 37 | 3 | 2 | 0 |
facebook/memlab | MemLab memlab is an end-to-end testing and analysis framework for identifying
JavaScript memory leaks and optimization opportunities. Online Resources: [ Website and Demo ] | [ Documentation ] | [ Meta Engineering Blog Post ] Features:
* Browser memory leak detection - Write test scenarios with the Puppeteer
API, and memlab will automatically compare JavaScript heap snapshots, filter
out memory leaks, and aggregate the results
* Object-oriented heap traversing API - Supports the creation of
self-defined memory leak detectors, and enables programmatic analysis of JS heap
snapshots taken from Chromium-based browsers, Node.js, Electron.js, and Hermes
* Memory CLI toolbox - Built-in toolbox and APIs for finding memory
optimization opportunities (not necessarily just memory leaks)
* Memory assertions in Node.js - Enables unit tests or running Node.js
programs to take a heap snapshot of their own state, perform self memory
checks, or write advanced memory assertions (a minimal sketch follows this list)
npm install -g memlab Find Memory Leaks To find memory leaks in Google Maps, you can create a scenario file defining how
to interact with Google Maps; let's name it test-google-maps.js : ```javascript
// initial page load url: Google Maps
function url() {
return 'https://www.google.com/maps/@37.386427,-122.0428214,11z';
} // action where we want to detect memory leaks: click the Hotels button
async function action(page) {
// puppeteer page API
await page.click('button[aria-label="Hotels"]');
} // action where we want to go back to the step before: click clear search
async function back(page) {
// puppeteer page API
await page.click('[aria-label="Close"]');
} module.exports = {action, back, url};
``` Now run memlab with the scenario file; memlab will interact with
the web page and detect memory leaks with built-in leak detectors: bash
memlab run --scenario test-google-maps.js memlab will print memory leak results showing one representative
retainer trace for each cluster of leaked objects. Retainer traces : This is the result from an example website ; the retainer trace is an object reference chain from the GC root to a leaked
the retainer trace is an object reference chain from the GC root to a leaked
object. The trace shows why and how a leaked object is still kept alive in
memory. Breaking the reference chain means the leaked object will no longer
be reachable from the GC root, and therefore can be garbage collected.
By following the leak trace one step at a time, you will be able to find
a reference that should be set to null (but it wasn't due to a bug). bash
MemLab found 46 leak(s)
--Similar leaks in this run: 4--
--Retained size of leaked objects: 8.3MB--
[Window] (native) @35847 [8.3MB]
--20 (element)---> [InternalNode] (native) @130981728 [8.3MB]
--8 (element)---> [InternalNode] (native) @130980288 [8.3MB]
--1 (element)---> [EventListener] (native) @131009888 [8.3MB]
--1 (element)---> [V8EventListener] (native) @224808192 [8.3MB]
--1 (element)---> [eventHandler] (closure) @168079 [8.3MB]
--context (internal)---> [<function scope>] (object) @181905 [8.3MB]
--bigArray (variable)---> [Array] (object) @182925 [8.3MB]
--elements (internal)---> [(object elements)] (array) @182929 [8.3MB]
... To get a readable trace, the website under test needs to serve non-minified code (or at least minified code
with readable variable, function, and property names). Alternatively, you can debug the leak by loading the heap snapshot taken by memlab (saved in $(memlab get-default-work-dir)/data/cur )
in Chrome DevTools and searching for the leaked object ID ( @182929 ). View Retainer Trace Interactively View memory issues detected by memlab based on a single JavaScript
heap snapshot taken from Chromium, Hermes, memlab, or any node.js
or Electron.js program: bash
memlab view-heap --snapshot <PATH TO .heapsnapshot FILE> You can optionally pass the object's id ( --node-id @28173 ) to pinpoint a specific heap object. Self-defined leak detector : If you want to use a self-defined leak detector, add a leakFilter callback
( doc )
in the scenario file. leakFilter will be called for every unreleased heap
object ( node ) allocated by the target interaction. javascript
function leakFilter(node, heap) {
// ... your leak detector logic
// return true to mark the node as a memory leak
}; heap is the graph representation of the final JavaScript heap snapshot.
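For example, here is a minimal sketch of a custom detector, assuming memlab's documented IHeapNode fields name and retainedSize , and a hypothetical BigLeakedObject class name: ```javascript
// Sketch only: flag every unreleased object of a suspicious class
// that retains more than 1MB. 'BigLeakedObject' is a hypothetical name;
// substitute a class your application actually allocates.
function leakFilter(node, heap) {
  // node.name is the object's constructor/class name;
  // node.retainedSize is its retained size in bytes
  return node.name === 'BigLeakedObject' && node.retainedSize > 1024 * 1024;
}
```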
For more details, view the doc site . Heap Analysis and Investigation View which object keeps growing in size during interaction in the previous run: bash
memlab analyze unbound-object Analyze pre-fetched V8/hermes .heapsnapshot files: bash
memlab analyze unbound-object --snapshot-dir <DIR_OF_SNAPSHOT_FILES> Use memlab analyze to view all built-in memory analyses.
For extension, view the doc site . View retainer trace of a particular object: bash
memlab trace --node-id <HEAP_OBJECT_ID> Use memlab help to view all CLI commands. APIs Use the memlab npm package to start a E2E run in browser and detect memory leaks. ```javascript
const memlab = require('memlab'); const scenario = {
// initial page load url
url: () => 'https://www.google.com/maps/@37.386427,-122.0428214,11z', // action where we want to detect memory leaks
action: async (page) => await page.click('button[aria-label="Hotels"]'),
// action where we want to go back to the step before
back: async (page) => await page.click('[aria-label="Close"]'), }
memlab.run({scenario});
``` Memory Assertions memlab makes it possible for a unit test or a running node.js program
to take a heap snapshot of its own state and write advanced memory assertions: ```typescript
// save as example.test.ts
import type {IHeapSnapshot, Nullable} from '@memlab/core';
import {config, takeNodeMinimalHeap} from '@memlab/core'; class TestObject {
public arr1 = [1, 2, 3];
public arr2 = ['1', '2', '3'];
} test('memory test with heap assertion', async () => {
config.muteConsole = true; // no console output let obj: Nullable<TestObject> = new TestObject();
// get a heap snapshot of the current program state
let heap: IHeapSnapshot = await takeNodeMinimalHeap(); // call some function that may add references to obj
rabbitHole(obj); expect(heap.hasObjectWithClassName('TestObject')).toBe(true);
obj = null; heap = await takeNodeMinimalHeap();
// if rabbitHole does not have any side effect that
// adds new references to obj, then obj can be GCed
expect(heap.hasObjectWithClassName('TestObject')).toBe(false); }, 30000);
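// Note: rabbitHole is assumed to be defined elsewhere in the test file;
// a minimal, side-effect-free sketch of it could look like:
// function rabbitHole(o: TestObject): void { console.log(o.arr1.length); }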
``` For other APIs check out the API documentation . Development Use node version 16 or above. To build on Windows, please use Git Bash. First build the project as follows: bash
npm install
npm run build Then keep this helper script running to ensure that local changes are picked up
and compiled automatically during development: bash
npm run dev NOTE: To run the memlab cli locally, make sure to prefix the memlab command with
npx from within the memlab repo, e.g. npx memlab . Run tests: bash
npm run test License memlab is MIT licensed, as found in the LICENSE file. Contributing Check our contributing guide to learn about how to
contribute to the project. Code of Conduct Check our Code Of Conduct to learn more about our
contributor standards and expectations. | A framework for finding JavaScript memory leaks and analyzing heap snapshots | facebook,javascript,memory,detector,heap,leak,nodejs,snapshot,v8,e2e | 0 | 124 | 27 | 359 | 10 | 13 | 1 |
readysettech/readyset | ReadySet is a transparent database cache for Postgres & MySQL that gives you the performance and scalability of an in-memory key-value store without requiring that you rewrite your app or manually handle cache invalidation. ReadySet sits between your application and database and turns even the most complex SQL reads into lightning-fast lookups. Unlike other caching solutions, it keeps cached query results in sync with your database automatically by utilizing your database’s replication stream. It is wire-compatible with Postgres and MySQL and can be used along with your current ORM or database client. :star: If you find ReadySet useful, please consider giving us a star on GitHub! Your support helps us continue to innovate and deliver exciting new features. Quickstart To get started in five minutes or less, run: bash -c "$(curl -sSL https://launch.readyset.io)" You can also install via a Docker image or with a Linux binary . See our getting started guide for more details! ReadySet Cloud is a managed service that scales your database with ease. If you're interested in trying out ReadySet Cloud, try it today ! Useful Links Interactive demo : an interactive walkthrough of ReadySet’s features. Getting started guide : instructions for how to connect ReadySet to your database and start caching queries. Why ReadySet : explains the motivation behind ReadySet and how it compares to traditional database caching. Documentation : more in-depth information about how to use ReadySet. Blog : articles from the ReadySet universe. Community support For general help using ReadySet, see our official docs . For additional help, you can use one of these channels to ask questions, or give us feedback:
* Slack : Discussions with the community and the team.
* GitHub : For bug reports and feature requests.
* 𝕏 (Twitter) : For product updates and other news. Contributing We welcome contributions! Here are a few helpful links to get you started:
* Guide to build ReadySet from source * Good first issues for first-time contributors * Github issues link to suggest bug fixes and features * #source-code channel in Slack to discuss larger projects License ReadySet is licensed under the BSL 1.1 license, converting to the open-source Apache 2.0 license after 4 years. It is free to use on any number of nodes. | Readyset is a MySQL and Postgres wire-compatible caching layer that sits in front of existing databases to speed up queries and horizontally scale read throughput. Under the hood, ReadySet caches the results of cached select statements and incrementally updates these results over time as the underlying data changes. | caching,caching-proxy,databases,mysql,postgres,postgresql,rust,streaming-data,sql,backend | 19 | 35 | 702 | 9,855 | 273 | 911 | 0 |
sensity-ai/dot | the Deepfake Offensive Toolkit [![stars](https://img.shields.io/github/stars/sensity-ai/dot)](https://github.com/sensity-ai/dot/stargazers)
[![license](https://img.shields.io/badge/License-BSD_3--Clause-blue.svg)](https://github.com/sensity-ai/dot/blob/main/LICENSE)
[![Python 3.8](https://img.shields.io/badge/python-3.8-blue.svg)](https://www.python.org/downloads/release/python-3812/)
[![build-dot](https://github.com/sensity-ai/dot/actions/workflows/build_dot.yaml/badge.svg)](https://github.com/sensity-ai/dot/actions/workflows/build_dot.yaml)
[![code-check](https://github.com/sensity-ai/dot/actions/workflows/code_check.yaml/badge.svg)](https://github.com/sensity-ai/dot/actions/workflows/code_check.yaml) dot (aka Deepfake Offensive Toolkit) makes real-time, controllable deepfakes ready for virtual camera injection. dot is created for performing penetration testing against e.g. identity verification and video conferencing systems, for use by security analysts, Red Team members, and biometrics researchers. If you want to learn more about how dot is used for penetration tests with deepfakes in the industry, read these articles by The Verge and Biometric Update . dot is developed for research and demonstration purposes. As an end user, you have the responsibility to obey all applicable laws when using this program. Authors and contributing developers assume no liability and are not responsible for any misuse or damage caused by the use of this program. How it works In a nutshell, dot works like this: mermaid
flowchart LR;
A(your webcam feed) --> B(suite of realtime deepfakes);
B(suite of realtime deepfakes) --> C(virtual camera injection); None of the deepfakes supported by dot require additional training. They can be applied
in real time, on the fly, to a photo that becomes the target of the face impersonation.
Supported methods: face swap (via SimSwap ) at resolutions 224 and 512, with the option of face superresolution (via GPen ) at resolutions 256 and 512; lower quality face swap (via OpenCV); FOMM , First Order Motion Model for image animation. Running dot Graphical interface GUI Installation Download and run the dot executable for your OS: Windows (Tested on Windows 10 and 11): Download dot.zip from here , unzip it and then run dot.exe . Ubuntu: ToDo. Mac (Tested on Apple M2 Sonoma 14.0): Download dot-m2.zip from here and unzip it. Open a terminal and run xattr -cr dot-executable.app to remove any extended attributes. In case of a camera reading error: Right click and choose Show Package Contents, then execute dot-executable from the Contents/MacOS folder
sudo apt install ffmpeg cmake MacOS bash
brew install ffmpeg cmake Windows Download and install Visual Studio Community from here Install Desktop development with C++ from the Visual studio installer Create Conda Environment The instructions assumes that you have Miniconda installed on your machine. If you don't, you can refer to this link for installation instructions. With GPU Support bash
conda env create -f envs/environment-gpu.yaml
conda activate dot Install the torch and torchvision dependencies based on the CUDA version installed on your machine: Install CUDA 11.8 from link Install cudatoolkit from conda : conda install cudatoolkit=<cuda_version_no> (replace <cuda_version_no> with the version on your machine) Install torch and torchvision dependencies: pip install torch==2.0.1+<cuda_tag> torchvision==0.15.2+<cuda_tag> torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118 , where <cuda_tag> is the CUDA tag defined by Pytorch. For example, pip install torch==2.0.1+cu118 torchvision==0.15.2+cu118 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118 for CUDA 11.8. Note: torch1.9.0+cu111 can also be used. To check that torch and torchvision are installed correctly, run the following command: python -c "import torch; print(torch.cuda.is_available())" . If the output is True , the dependencies are installed with CUDA support. With MPS Support(Apple Silicon) bash
conda env create -f envs/environment-apple-m2.yaml
conda activate dot To check that torch and torchvision are installed correctly, run the following command: python -c "import torch; print(torch.backends.mps.is_available())" . If the output is True , the dependencies are installed with Metal programming framework support. With CPU Support (slow, not recommended) bash
conda env create -f envs/environment-cpu.yaml
conda activate dot Install dot bash
pip install -e . Download Models Download dot model checkpoints from here Unzip the downloaded file in the root of this project CLI Usage Run dot --help to get a full list of available options. Simswap bash
dot -c ./configs/simswap.yaml --target 0 --source "./data" --use_gpu SimSwapHQ bash
dot -c ./configs/simswaphq.yaml --target 0 --source "./data" --use_gpu FOMM bash
dot -c ./configs/fomm.yaml --target 0 --source "./data" --use_gpu FaceSwap CV2 ```bash
dot -c ./configs/faceswap_cv2.yaml --target 0 --source "./data" --use_gpu ``` Note : To enable face superresolution, use the flag --gpen_type gpen_256 or --gpen_type gpen_512 . To use dot on CPU (not recommended), do not pass the --use_gpu flag. Controlling dot with CLI Disclaimer : We use the SimSwap technique for the following demonstration. Running dot via any of the above methods generates a real-time deepfake on the input video feed using source images from the data/ folder. When running dot, a list of available control options appears in the terminal window as shown above. You can toggle through and select different source images by pressing the associated control key. Watch the following demo video for a better understanding of the control options: Docker Setting up docker Build the container docker-compose up --build -d Access the container docker-compose exec dot "/bin/bash" Connect docker to the webcam Ubuntu Build the container docker build -t dot -f Dockerfile . Run the container xhost +
docker run -ti --gpus all \
-e NVIDIA_DRIVER_CAPABILITIES=compute,utility \
-e NVIDIA_VISIBLE_DEVICES=all \
-e PYTHONUNBUFFERED=1 \
-e DISPLAY \
-v .:/dot \
-v /tmp/.X11-unix:/tmp/.X11-unix:rw \
--runtime nvidia \
--entrypoint /bin/bash \
-p 8080:8080 \
--device=/dev/video0:/dev/video0 \
dot Windows Follow the instructions here under Windows to set up the webcam with docker. Build the container docker build -t dot -f Dockerfile . 3. Run the container docker run -ti --gpus all \
-e NVIDIA_DRIVER_CAPABILITIES=compute,utility \
-e NVIDIA_VISIBLE_DEVICES=all \
-e PYTHONUNBUFFERED=1 \
-e DISPLAY=192.168.99.1:0 \
-v .:/dot \
--runtime nvidia \
--entrypoint /bin/bash \
-p 8080:8080 \
--device=/dev/video0:/dev/video0 \
-v /tmp/.X11-unix:/tmp/.X11-unix \
dot macOS Follow the instructions here to set up the webcam with docker. Build the container docker build -t dot -f Dockerfile . 3. Run the container docker run -ti --gpus all \
-e NVIDIA_DRIVER_CAPABILITIES=compute,utility \
-e NVIDIA_VISIBLE_DEVICES=all \
-e PYTHONUNBUFFERED=1 \
-e DISPLAY=$IP:0 \
-v .:/dot \
-v /tmp/.X11-unix:/tmp/.X11-unix \
--runtime nvidia \
--entrypoint /bin/bash \
-p 8080:8080 \
--device=/dev/video0:/dev/video0 \
dot Virtual Camera Injection Instructions vary depending on your operating system. Windows Install OBS Studio . Run OBS Studio. In the Sources section, press the Add button ("+" sign), select Windows Capture and press OK. In the window that appears,
choose "[python.exe]: fomm" in the Window drop-down menu and press OK.
Then select Edit -> Transform -> Fit to screen. In OBS Studio, go to Tools -> VirtualCam. Check AutoStart, set Buffered Frames to 0 and press Start. The OBS-Camera camera should now be available in Zoom (or other videoconferencing software). Ubuntu bash
sudo apt update
sudo apt install v4l-utils v4l2loopback-dkms v4l2loopback-utils
sudo modprobe v4l2loopback devices=1 card_label="OBS Cam" exclusive_caps=1
v4l2-ctl --list-devices
sudo add-apt-repository ppa:obsproject/obs-studio
sudo apt install obs-studio Open OBS Studio and check if tools --> v4l2sink exists.
If it doesn't, follow these instructions: bash
mkdir -p ~/.config/obs-studio/plugins/v4l2sink/bin/64bit/
ln -s /usr/lib/obs-plugins/v4l2sink.so ~/.config/obs-studio/plugins/v4l2sink/bin/64bit/ Use the virtual camera with OBS Studio : Open OBS Studio Go to tools --> v4l2sink Select /dev/video2 and YUV420 Click on start Join a meeting and select OBS Cam MacOS Download and install OBS Studio for MacOS from here Open OBS and follow the first-time setup (you might be required to enable certain permissions in System Preferences ) Run dot with --use_cam flag to enable camera feed Click the "+" button in the sources section → select "Windows Capture", create a new source and enter "OK" → select window with "python" included in the name and enter OK Click "Start Virtual Camera" button in the controls section Select "OBS Cam" as default camera in the video settings of the application target of the injection Run dot with an Android emulator If you are performing a test against a mobile app, virtual cameras are much harder to inject. An alternative is to use mobile emulators and still resort to virtual camera injection. Run dot . Check running dot for more information. Run OBS Studio and set up the virtual camera. Check virtual-camera-injection for more information. Download and Install Genymotion . Open Genymotion and set up the Android emulator. Set up dot with the Android emulator: Open the Android emulator. Click on camera and select OBS-Camera as front and back cameras. A preview of the dot window should appear.
In case there is no preview, restart OBS and the emulator and try again.
If that doesn't work, use different virtual camera software like e2eSoft VCam or ManyCam . The dot deepfake output should now be the emulator's phone camera. Speed With GPU Tested on an AMD Ryzen 5 2600 Six-Core Processor with one NVIDIA GeForce RTX 2070 example
Simswap: FPS 13.0
Simswap + gpen 256: FPS 7.0
SimswapHQ: FPS 11.0
FOMM: FPS 31.0 With Apple Silicon Tested on Macbook Air M2 2022 16GB example
Simswap: FPS 3.2
Simswap + gpen 256: FPS 1.8
SimswapHQ: FPS 2.7
FOMM: FPS 2.0 License This is not a commercial Sensity product, and it is distributed freely with no warranties. The software is distributed under BSD 3-Clause . dot utilizes several open source libraries. If you use dot , make sure you agree with their
licenses too. In particular, this codebase is built on top of the following research projects: https://github.com/AliaksandrSiarohin/first-order-model https://github.com/alievk/avatarify-python https://github.com/neuralchen/SimSwap https://github.com/yangxy/GPEN Contributing If you have ideas for improving dot , feel free to open relevant Issues and PRs. Please read CONTRIBUTING.md before contributing to the repository. Maintainers @ghassen1302 @vassilispapadop @giorgiop @AjinkyaIndulkar @kjod Contributors Run dot on pre-recorded image and video files Run dot on image and video files instead of camera feed FAQ dot is very slow and I can't run it in real time Make sure that you are running it on a GPU card by using the --use_gpu flag. CPU is not recommended.
If you still find it too slow, it may be because you are running it on an old GPU model with less than 8GB of RAM. Does dot only work with a webcam feed or also with a pre-recorded video? You can use dot on a pre-recorded video file using these scripts or try it directly on Colab . | The Deepfake Offensive Toolkit | [] | 4 | 6 | 66 | 64 | 13 | 6 | 2 |
htmlstreamofficial/preline | Preline UI is an open-source set of prebuilt UI components based on the utility-first Tailwind CSS framework. Why use Preline UI? Based on the Tailwind CSS utility classes, Preline UI's prebuilt components and UI elements help you quickly design and customize responsive mobile-first websites with the components a website needs, including buttons, dropdowns, navigation bars, modals, and more. What's in the box? Components are grouped by visual usability criteria (components, navigation, forms, etc.) and styled directly on top of Tailwind CSS, making them easy to extend and customize. This is a lifesaver for developers looking to create a unique and eye-catching design system without the hassle of creating each component by hand. Getting Started Quick Setup This guide will help you get started with Preline UI, including how to run, customize, update, and integrate your project! First, you need to make sure that you have a working Tailwind CSS project installed and that you also have Node and NPM installed on your machine. Require via NPM Install preline via npm npm i preline Include Preline UI as a plugin in the tailwind.config.js file module.exports = {
content: [
'node_modules/preline/dist/*.js'
],
plugins: [
require('preline/plugin')
],
} Include the JavaScript, e.g. by adding `<script src="./node_modules/preline/dist/preline.js"></script>` to your page (the same dist folder referenced in the content glob above). Documentation For full documentation of the Preline options, visit preline.co . The site also contains information on the wide variety of plugins that are available for TailwindCSS projects. Community For help, discussion about best practices, or any other conversation that would benefit from being searchable, use GitHub Discussions License Preline UI is an Open Source project licensed under MIT . Preline UI Figma is free for both commercial and personal projects; learn more here . All brand icons are trademarks of their respective owners. The use of these trademarks does not indicate endorsement of the trademark holder by Preline UI, nor vice versa. A product of Htmlstream Preline UI is built and maintained by the Htmlstream team. Over the last decade at Htmlstream, our journey has involved crafting UI Components and Templates. This process has allowed us to understand and explore a range of strategies for developing versatile UI designs that can adapt to a variety of needs. Share your thoughts about Preline on Twitter or leave a supportive review on ProductHunt . | Preline UI is an open-source set of prebuilt UI components based on the utility-first Tailwind CSS framework. | css,html,javascript,tailwindcss,tailwindcss-plugin,ui-components,typescript | 20 | 18 | 24 | 73 | 23 | 1 | 0 |
facundoolano/software-papers | Papers for Software Engineers A curated list of papers that may be of interest to Software Engineering students or professionals.
See the sources and selection criteria below. List of papers by topic 1. **Von Neumann's First Computer Program. [Knuth (1970)](https://dl.acm.org/doi/pdf/10.1145/356580.356581).**
\ Computer History; Early Programming * The Education of a Computer. [Hopper (1952)](https://people.cs.umass.edu/~emery/classes/cmpsci691st/readings/PL/p243-hopper.pdf).
* Recursive Programming. [Dijkstra (1960)](https://www.ics.uci.edu/~jajones/INF102-S18/readings/07_dijkstra.pdf).
* Programming Considered as a Human Activity. [Dijkstra (1965)](https://www.cs.utexas.edu/~EWD/transcriptions/EWD01xx/EWD117.html).
* Goto Statement Considered Harmful. [Dijkstra (1968)](https://homepages.cwi.nl/~storm/teaching/reader/Dijkstra68.pdf).
* Program development by stepwise refinement. [Wirth (1971)](https://dl.acm.org/doi/pdf/10.1145/362575.362577).
* The Humble Programmer. [Dijkstra (1972)](http://rkka21.ru/docs/turing-award/ed1972e.pdf).
* Computer Programming as an Art. [Knuth (1974)](http://www.cs.bilkent.edu.tr/~canf/knuth1974.pdf).
* The paradigms of programming. [Floyd (1979)](https://dl.acm.org/doi/pdf/10.1145/1283920.1283934).
* Literate Programming. [Knuth (1984)](http://www.literateprogramming.com/knuthweb.pdf).
1. **Computing Machinery and Intelligence. [Turing (1950)](https://www.csee.umbc.edu/courses/471/papers/turing.pdf).**
\ Early Artificial Intelligence * Some Moral and Technical Consequences of Automation. [Wiener (1960)](https://nissenbaum.tech.cornell.edu/papers/Wiener.pdf).
* Steps towards Artificial Intelligence. [Minsky (1960)](http://worrydream.com/refs/Minsky%20-%20Steps%20Toward%20Artificial%20Intelligence.pdf).
* ELIZA—a computer program for the study of natural language communication between man and machine. [Weizenbaum (1966)](http://web.stanford.edu/class/cs124/p36-weizenabaum.pdf).
* A Theory of the Learnable. [Valiant (1984)](https://people.mpi-inf.mpg.de/~mehlhorn/SeminarEvolvability/ValiantLearnable.pdf).
1. **A Method for the Construction of Minimum-Redundancy Codes. [Huffman (1952)](http://compression.ru/download/articles/huff/huffman_1952_minimum-redundancy-codes.pdf).**
\ Information Theory * A Universal Algorithm for Sequential Data Compression. [Ziv, Lempel (1977)](https://courses.cs.duke.edu/spring03/cps296.5/papers/ziv_lempel_1977_universal_algorithm.pdf).
* Fifty Years of Shannon Theory. [Verdú (1998)](https://monoskop.org/images/7/78/Verdu_Sergio_1998_Fifty_Years_of_Shannon_Theory.pdf).
1. **Engineering a Sort Function. [Bentley, McIlroy (1993)](https://cs.fit.edu/~pkc/classes/writing/samples/bentley93engineering.pdf).**
\ Data Structures; Algorithms * On the Shortest Spanning Subtree of a Graph and the Traveling Salesman Problem. [Kruskal (1956)](https://www.ams.org/proc/1956-007-01/S0002-9939-1956-0078686-7/S0002-9939-1956-0078686-7.pdf).
* A Note on Two Problems in Connexion with Graphs. [Dijkstra (1959)](https://jmvidal.cse.sc.edu/library/dijkstra59a.pdf).
* Quicksort. [Hoare (1962)](https://academic.oup.com/comjnl/article-pdf/5/1/10/1111445/050010.pdf).
* Space/Time Trade-offs in Hash Coding with Allowable Errors. [Bloom (1970)](https://dl.acm.org/doi/pdf/10.1145/362686.362692).
* The Ubiquitous B-Tree. [Comer (1979)](http://carlosproal.com/ir/papers/p121-comer.pdf).
* Programming pearls: Algorithm design techniques. [Bentley (1984)](https://dl.acm.org/doi/pdf/10.1145/358234.381162).
* Programming pearls: The back of the envelope. [Bentley (1984)](https://dl.acm.org/doi/pdf/10.1145/357994.381168).
* Making data structures persistent. [Driscoll et al (1986)](https://dl.acm.org/doi/pdf/10.1145/12130.12142).
1. **A Design Methodology for Reliable Software Systems. [Liskov (1972)](https://dl.acm.org/doi/pdf/10.1145/1479992.1480018).**
\ Software Design * On the Criteria To Be Used in Decomposing Systems into Modules. [Parnas (1971)](https://www.win.tue.nl/~wstomv/edu/2ip30/references/criteria_for_modularization.pdf).
* Information Distribution Aspects of Design Methodology. [Parnas (1972)](https://cseweb.ucsd.edu/~wgg/CSE218/Parnas-IFIP71-information-distribution.PDF).
* Designing Software for Ease of Extension and Contraction. [Parnas (1979)](https://courses.cs.washington.edu/courses/cse503/08wi/parnas-1979.pdf).
* Programming as Theory Building. [Naur (1985)](https://pages.cs.wisc.edu/~remzi/Naur.pdf).
* Software Aging. [Parnas (1994)](https://dl.acm.org/doi/pdf/10.5555/257734.257788).
* Towards a Theory of Conceptual Design for Software. [Jackson (2015)](https://groups.csail.mit.edu/sdg/pubs/2015/concept-essay.pdf).
1. **Programming with Abstract Data Types. [Liskov, Zilles (1974)](https://dl.acm.org/doi/pdf/10.1145/942572.807045).**
\ Abstract Data Types; Object-Oriented Programming * The Smalltalk-76 Programming System Design and Implementation. [Ingalls (1978)](https://dl.acm.org/doi/pdf/10.1145/512760.512762).
* A Theory of Type Polymorphism in Programming. [Milner (1978)](https://homepages.inf.ed.ac.uk/wadler/papers/papers-we-love/milner-type-polymorphism.pdf).
* On understanding types, data abstraction, and polymorphism. [Cardelli, Wegner (1985)](https://dl.acm.org/doi/pdf/10.1145/6041.6042).
* SELF: The Power of Simplicity. [Ungar, Smith (1991)](https://people.eecs.berkeley.edu/~fateman/264/papers/selfpower.ps).
1. **Why Functional Programming Matters. [Hughes (1990)](https://www.cs.kent.ac.uk/people/staff/dat/miranda/whyfp90.pdf).**
\ Functional Programming * Recursive Functions of Symbolic Expressions and Their Computation by Machine. [McCarthy (1960)](http://jmc.stanford.edu/articles/recursive/recursive.pdf).
* The Semantics of Predicate Logic as a Programming Language. [Van Emden, Kowalski (1976)](https://dl.acm.org/doi/pdf/10.1145/321978.321991).
* Can Programming Be Liberated from the von Neumann Style? [Backus (1978)](https://dl.acm.org/doi/pdf/10.1145/359576.359579).
* The Semantic Elegance of Applicative Languages. [Turner (1981)](http://nsl.com/misc/sasl/paraffins-turner.pdf).
* The essence of functional programming. [Wadler (1992)](https://dl.acm.org/doi/pdf/10.1145/143165.143169).
* QuickCheck: A Lightweight Tool for Random Testing of Haskell Programs. [Claessen, Hughes (2000)](https://www.cs.tufts.edu/~nr/cs257/archive/john-hughes/quick.pdf).
* Church's Thesis and Functional Programming. [Turner (2006)](https://kar.kent.ac.uk/88944/1/ctfp.pdf_nocoversheet).
1. **An Incremental Approach to Compiler Construction. [Ghuloum (2006)](http://scheme2006.cs.uchicago.edu/11-ghuloum.pdf).**
\ Language Design; Compilers * The Next 700 Programming Languages. [Landin (1966)](https://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf).
* Programming pearls: little languages. [Bentley (1986)](https://dl.acm.org/doi/pdf/10.1145/6424.315691).
* The Essence of Compiling with Continuations. [Flanagan et al (1993)](https://dl.acm.org/doi/pdf/10.1145/173262.155113).
* A Brief History of Just-In-Time. [Aycock (2003)](https://user.it.uu.se/~kostis/Teaching/KT2-04/jit_survey.pdf).
* LLVM: A Compilation Framework for Lifelong Program Analysis & Transformation. [Lattner, Adve (2004)](https://llvm.org/pubs/2004-01-30-CGO-LLVM.pdf).
* A Unified Theory of Garbage Collection. [Bacon, Cheng, Rajan (2004)](https://courses.cs.washington.edu/courses/cse590p/05au/p50-bacon.pdf).
* A Nanopass Framework for Compiler Education. [Sarkar, Waddell, Dybvig (2005)](https://legacy.cs.indiana.edu/~dyb/pubs/nano-jfp.pdf).
* Bringing the Web up to Speed with WebAssembly. [Haas (2017)](https://dl.acm.org/doi/pdf/10.1145/3062341.3062363).
1. **No Silver Bullet: Essence and Accidents of Software Engineering. [Brooks (1987)](http://worrydream.com/refs/Brooks-NoSilverBullet.pdf).**
\ Software Engineering; Project Management * How do committees invent? [Conway (1968)](https://www.melconway.com/Home/pdf/committees.pdf).
* Managing the Development of Large Software Systems. [Royce (1970)](https://www.praxisframework.org/files/royce1970.pdf).
* The Mythical Man Month. [Brooks (1975)](https://www.cs.virginia.edu/~evans/greatworks/mythical.pdf).
* On Building Systems That Will Fail. [Corbató (1991)](https://dl.acm.org/doi/pdf/10.1145/114669.114686).
* The Cathedral and the Bazaar. [Raymond (1998)](http://users.ece.utexas.edu/~perry/education/382v-s08/papers/raymond.pdf).
* Out of the Tar Pit. [Moseley, Marks (2006)](http://curtclifton.net/papers/MoseleyMarks06a.pdf).
1. **Communicating sequential processes. [Hoare (1978)](https://www.cs.cmu.edu/~crary/819-f09/Hoare78.pdf).**
\ Concurrency * Solution Of a Problem in Concurrent Program Control. [Dijkstra (1965)](https://dl.acm.org/doi/pdf/10.1145/365559.365617).
* Monitors: An operating system structuring concept. [Hoare (1974)](https://dl.acm.org/doi/pdf/10.1145/355620.361161).
* On the Duality of Operating System Structures. [Lauer, Needham (1978)](https://dl.acm.org/doi/pdf/10.1145/850657.850658).
* Software Transactional Memory. [Shavit, Touitou (1997)](https://groups.csail.mit.edu/tds/papers/Shavit/ShavitTouitou.pdf).
1. **The UNIX Time-Sharing System. [Ritchie, Thompson (1974)](https://dsf.berkeley.edu/cs262/unix.pdf).**
\ Operating Systems * An Experimental Time-Sharing System. [Corbató, Merwin Daggett, Daley (1962)](http://larch-www.lcs.mit.edu:8001/~corbato/sjcc62/).
* The Structure of the "THE"-Multiprogramming System. [Dijkstra (1968)](https://www.eecs.ucf.edu/~eurip/papers/dijkstra-the68.pdf).
* The nucleus of a multiprogramming system. [Hansen (1970)](http://www.brinch-hansen.net/papers/1970a.pdf).
* Reflections on Trusting Trust. [Thompson (1984)](https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_ReflectionsonTrustingTrust.pdf).
* The Design and Implementation of a Log-Structured File System. [Rosenblum, Ousterhout (1991)](https://people.eecs.berkeley.edu/~brewer/cs262/LFS.pdf).
1. **A Relational Model of Data for Large Shared Data Banks. [Codd (1970)](https://www.seas.upenn.edu/~zives/03f/cis550/codd.pdf).**
\ Databases * Granularity of Locks and Degrees of Consistency in a Shared Data Base. [Gray et al (1975)](https://www.cs.cmu.edu/~natassa/courses/15-721/papers/GrayLocks.pdf).
* Access Path Selection in a Relational Database Management System. [Selinger et al (1979)](https://courses.cs.duke.edu/compsci516/cps216/spring03/papers/selinger-etal-1979.pdf).
* The Transaction Concept: Virtues and Limitations. [Gray (1981)](https://jimgray.azurewebsites.net/papers/theTransactionConcept.pdf).
* The design of POSTGRES. [Stonebraker, Rowe (1986)](https://dl.acm.org/doi/pdf/10.1145/16856.16888).
* Rules of Thumb in Data Engineering. [Gray, Shenoy (1999)](http://research.microsoft.com/en-us/um/people/gray/papers/ms_tr_99_100_rules_of_thumb_in_data_engineering.pdf).
1. **A Protocol for Packet Network Intercommunication. [Cerf, Kahn (1974)](https://www.cs.princeton.edu/courses/archive/fall06/cos561/papers/cerf74.pdf).**
\ Networking * Ethernet: Distributed packet switching for local computer networks. [Metcalfe, Boggs (1978)](https://dl.acm.org/doi/pdf/10.1145/360248.360253).
* End-To-End Arguments in System Design. [Saltzer, Reed, Clark (1984)](https://groups.csail.mit.edu/ana/Publications/PubPDFs/End-to-End%20Arguments%20in%20System%20Design.pdf).
* An algorithm for distributed computation of a Spanning Tree in an Extended LAN. [Perlman (1985)](https://dl.acm.org/doi/pdf/10.1145/319056.319004).
* The Design Philosophy of the DARPA Internet Protocols. [Clark (1988)](http://ccr.sigcomm.org/archive/1995/jan95/ccr-9501-clark.pdf).
* TOR: The second generation onion router. [Dingledine et al (2004)](https://svn-archive.torproject.org/svn/projects/design-paper/tor-design.pdf).
* Why the Internet only just works. [Handley (2006)](http://www0.cs.ucl.ac.uk/staff/m.handley/papers/only-just-works.pdf).
* The Network is Reliable. [Bailis, Kingsbury (2014)](https://queue.acm.org/detail.cfm?id=2655736).
1. **New Directions in Cryptography. [Diffie, Hellman (1976)](https://ee.stanford.edu/~hellman/publications/24.pdf).**
\ Cryptography * A Method for Obtaining Digital Signatures and Public-Key Cryptosystems. [Rivest, Shamir, Adleman (1978)](https://dl.acm.org/doi/pdf/10.1145/359340.359342).
* How To Share A Secret. [Shamir (1979)](https://web.mit.edu/6.857/OldStuff/Fall03/ref/Shamir-HowToShareASecret.pdf).
* A Digital Signature Based on a Conventional Encryption Function. [Merkle (1987)](https://people.eecs.berkeley.edu/~raluca/cs261-f15/readings/merkle.pdf).
* The Salsa20 family of stream ciphers. [Bernstein (2007)](https://cr.yp.to/snuffle/salsafamily-20071225.pdf).
1. **Time, Clocks, and the Ordering of Events in a Distributed System. [Lamport (1978)](https://lamport.azurewebsites.net/pubs/time-clocks.pdf).**
\ Distributed Systems * Self-stabilizing systems in spite of distributed control. [Dijkstra (1974)](https://dl.acm.org/doi/pdf/10.1145/361179.361202).
* The Byzantine Generals Problem. [Lamport, Shostak, Pease (1982)](https://lamport.azurewebsites.net/pubs/byz.pdf).
* Impossibility of Distributed Consensus With One Faulty Process. [Fischer, Lynch, Paterson (1985)](https://groups.csail.mit.edu/tds/papers/Lynch/jacm85.pdf).
* Implementing Fault-Tolerant Services Using the State Machine Approach: A Tutorial. [Schneider (1990)](https://www.cs.cornell.edu/fbs/publications/SMSurvey.pdf).
* Practical Byzantine Fault Tolerance. [Castro, Liskov (1999)](https://pmg.csail.mit.edu/papers/osdi99.pdf).
* Paxos made simple. [Lamport (2001)](https://lamport.azurewebsites.net/pubs/paxos-simple.pdf).
* Paxos made live - An Engineering Perspective. [Chandra, Griesemer, Redstone (2007)](https://www.cs.utexas.edu/users/lorenzo/corsi/cs380d/papers/paper2-1.pdf).
* In Search of an Understandable Consensus Algorithm. [Ongaro, Ousterhout (2014)](https://raft.github.io/raft.pdf).
1. **Designing for Usability: Key Principles and What Designers Think. [Gould, Lewis (1985)](https://dl.acm.org/doi/pdf/10.1145/3166.3170).**
\ Human-Computer Interaction; User Interfaces * As We May Think. [Bush (1945)](https://web.mit.edu/STS.035/www/PDFs/think.pdf).
* Man-Computer symbiosis. [Licklider (1960)](http://worrydream.com/refs/Licklider%20-%20Man-Computer%20Symbiosis.pdf).
* Some Thoughts About the Social Implications of Accessible Computing. [David, Fano (1965)](https://dl.acm.org/doi/pdf/10.1145/1463891.1463917).
* Tutorials for the First-Time Computer User. [Al-Awar, Chapanis, Ford (1981)](https://drive.google.com/file/d/1zA4LkSHoanjjhOVCwYzrkdkdzgPbKWJ9/view?usp=sharing).
* The star user interface: an overview. [Smith, Irby, Kimball (1982)](https://www.tech-insider.org/star/research/acrobat/8206.pdf).
* Design Principles for Human-Computer Interfaces. [Norman (1983)](https://dl.acm.org/doi/pdf/10.1145/800045.801571).
* Human-Computer Interaction: Psychology as a Science of Design. [Carroll (1997)](https://home.cs.colorado.edu/~martin/Csci6402/Papers/carroll97.pdf).
1. **The anatomy of a large-scale hypertextual Web search engine. [Brin, Page (1998)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/334.pdf).**
\ Information Retrieval; World-Wide Web * A Statistical Interpretation of Term Specificity in Retrieval. [Spärck Jones (1972)](http://openlib.org/home/krichel/courses/lis618/readings/spaerk-jones72.pdf).
* World-Wide Web: Information Universe. [Berners-Lee et al (1992)](https://www.w3.org/History/1992/ENRAP/Article_9202.pdf).
* The PageRank Citation Ranking: Bringing Order to the Web. [Page, Brin, Motwani (1998)](http://www.eecs.harvard.edu/~michaelm/CS222/pagerank.pdf).
1. **Dynamo, Amazon’s Highly Available Key-value store. [DeCandia et al (2007)](https://www.allthingsdistributed.com/files/amazon-dynamo-sosp2007.pdf).**
\ Internet Scale Data Systems * The Google File System. [Ghemawat, Gobioff, Leung (2003)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/035fc972c796d33122033a0614bc94cff1527999.pdf).
* MapReduce: Simplified Data Processing on Large Clusters. [Dean, Ghemawat (2004)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/16cb30b4b92fd4989b8619a61752a2387c6dd474.pdf).
* Bigtable: A Distributed Storage System for Structured Data. [Chang et al (2006)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/68a74a85e1662fe02ff3967497f31fda7f32225c.pdf).
* ZooKeeper: wait-free coordination for internet scale systems. [Hunt et al (2010)](https://www.usenix.org/legacy/event/atc10/tech/full_papers/Hunt.pdf).
* The Hadoop Distributed File System. [Shvachko et al (2010)](https://cse.buffalo.edu/~okennedy/courses/papers/hdfs.pdf).
* Kafka: a Distributed Messaging System for Log Processing. [Kreps, Narkhede, Rao (2011)](http://notes.stephenholiday.com/Kafka.pdf).
* CAP Twelve Years Later: How the "Rules" Have Changed. [Brewer (2012)](https://sites.cs.ucsb.edu/~rich/class/cs293b-cloud/papers/brewer-cap.pdf).
* Amazon Aurora: Design Considerations for High Throughput Cloud-Native Relational Databases. [Verbitski et al (2017)](https://pages.cs.wisc.edu/~yxy/cs764-f20/papers/aurora-sigmod-17.pdf).
1. **On Designing and Deploying Internet Scale Services. [Hamilton (2007)](https://s3.amazonaws.com/systemsandpapers/papers/hamilton.pdf).**
\ Operations; Reliability; Fault-tolerance * Ironies of Automation. [Bainbridge (1983)](https://ckrybus.com/static/papers/Bainbridge_1983_Automatica.pdf).
* Why do computers stop and what can be done about it? [Gray (1985)](https://jimgray.azurewebsites.net/papers/TandemTR85.7_WhyDoComputersStop.pdf).
* Recovery Oriented Computing (ROC): Motivation, Definition, Techniques, and Case Studies. [Patterson et al (2002)](http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/CSD-02-1175.pdf).
* Crash-Only Software. [Candea, Fox (2003)](https://research.cs.wisc.edu/areas/os/ReadingGroup/os-old/Papers/HotOSIX/Candea-CrashOnlySoftware.pdf).
* Building on Quicksand. [Helland, Campbell (2009)](https://arxiv.org/ftp/arxiv/papers/0909/0909.1788.pdf).
1. **Thinking Methodically about Performance. [Gregg (2012)](https://queue.acm.org/detail.cfm?id=2413037).**
\ Performance * Performance Anti-Patterns. [Smaalders (2006)](https://queue.acm.org/detail.cfm?id=1117403).
* Thinking Clearly about Performance. [Millsap (2010)](https://queue.acm.org/detail.cfm?id=1854041).
1. **Bitcoin, A peer-to-peer electronic cash system. [Nakamoto (2008)](https://bitcoin.org/bitcoin.pdf).**
\ Decentralized Distributed Systems; Peer-to-peer systems * Operational transformation in real-time group editors: issues, algorithms, and achievements. [Sun, Ellis (1998)](https://dl.acm.org/doi/pdf/10.1145/289444.289469).
* Kademlia: A Peer-to-Peer Information System Based on the XOR Metric. [Maymounkov, Mazières (2002)](https://pdos.csail.mit.edu/~petar/papers/maymounkov-kademlia-lncs.pdf).
* Incentives Build Robustness in BitTorrent. [Cohen (2003)](https://www.bittorrent.org/bittorrentecon.pdf).
* Conflict-free Replicated Data Types. [Shapiro et al (2011)](https://pages.lip6.fr/Marc.Shapiro/papers/RR-7687.pdf).
* IPFS - Content Addressed, Versioned, P2P File System. [Benet (2014)](https://raw.githubusercontent.com/ipfs/papers/master/ipfs-cap2pfs/ipfs-p2p-file-system.pdf).
* Ethereum: A Next-Generation Smart Contract and Decentralized Application Platform. [Buterin (2014)](https://ethereum.org/content/whitepaper/whitepaper-pdf/Ethereum_Whitepaper_-_Buterin_2014.pdf).
* Local-First Software: You Own Your Data, in spite of the Cloud. [Kleppmann et al (2019)](https://www.inkandswitch.com/local-first/static/local-first.pdf).
1. **A Few Useful Things to Know About Machine Learning. [Domingos (2012)](https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf).**
\ Machine Learning * Statistical Modeling: The Two Cultures. [Breiman (2001)](https://projecteuclid.org/journalArticle/Download?urlId=10.1214%2Fss%2F1009213726).
* The Unreasonable Effectiveness of Data. [Halevy, Norvig, Pereira (2009)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/35179.pdf).
* ImageNet Classification with Deep Convolutional Neural Networks. [Krizhevsky, Sutskever, Hinton (2012)](https://papers.nips.cc/paper_files/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf).
* Playing Atari with Deep Reinforcement Learning. [Mnih et al (2013)](https://arxiv.org/pdf/1312.5602.pdf).
* Generative Adversarial Nets. [Goodfellow et al (2014)](https://proceedings.neurips.cc/paper_files/paper/2014/file/5ca3e9b122f61f8f06494c97b1afccf3-Paper.pdf).
* Deep Learning. [LeCun, Bengio, Hinton (2015)](https://www.cs.toronto.edu/~hinton/absps/NatureDeepReview.pdf).
* Attention Is All You Need. [Vaswani et al (2017)](https://arxiv.org/pdf/1706.03762.pdf). Top-level papers only 1. **Von Neumann's First Computer Program. [Knuth (1970)](https://dl.acm.org/doi/pdf/10.1145/356580.356581).**
1. **Computing Machinery and Intelligence. [Turing (1950)](https://www.csee.umbc.edu/courses/471/papers/turing.pdf).**
1. **A Method for the Construction of Minimum-Redundancy Codes. [Huffman (1952)](http://compression.ru/download/articles/huff/huffman_1952_minimum-redundancy-codes.pdf).**
1. **Engineering a Sort Function. [Bentley, McIlroy (1993)](https://cs.fit.edu/~pkc/classes/writing/samples/bentley93engineering.pdf).**
1. **A Design Methodology for Reliable Software Systems. [Liskov (1972)](https://dl.acm.org/doi/pdf/10.1145/1479992.1480018).**
1. **Programming with Abstract Data Types. [Liskov, Zilles (1974)](https://dl.acm.org/doi/pdf/10.1145/942572.807045).**
1. **Why Functional Programming Matters. [Hughes (1990)](https://www.cs.kent.ac.uk/people/staff/dat/miranda/whyfp90.pdf).**
1. **An Incremental Approach to Compiler Construction. [Ghuloum (2006)](http://scheme2006.cs.uchicago.edu/11-ghuloum.pdf).**
1. **No Silver Bullet: Essence and Accidents of Software Engineering. [Brooks (1987)](http://worrydream.com/refs/Brooks-NoSilverBullet.pdf).**
1. **Communicating sequential processes. [Hoare (1978)](https://www.cs.cmu.edu/~crary/819-f09/Hoare78.pdf).**
1. **The UNIX Time-Sharing System. [Ritchie, Thompson (1974)](https://dsf.berkeley.edu/cs262/unix.pdf).**
1. **A Relational Model of Data for Large Shared Data Banks. [Codd (1970)](https://www.seas.upenn.edu/~zives/03f/cis550/codd.pdf).**
1. **A Protocol for Packet Network Intercommunication. [Cerf, Kahn (1974)](https://www.cs.princeton.edu/courses/archive/fall06/cos561/papers/cerf74.pdf).**
1. **New Directions in Cryptography. [Diffie, Hellman (1976)](https://ee.stanford.edu/~hellman/publications/24.pdf).**
1. **Time, Clocks, and the Ordering of Events in a Distributed System. [Lamport (1978)](https://lamport.azurewebsites.net/pubs/time-clocks.pdf).**
1. **Designing for Usability: Key Principles and What Designers Think. [Gould, Lewis (1985)](https://dl.acm.org/doi/pdf/10.1145/3166.3170).**
1. **The anatomy of a large-scale hypertextual Web search engine. [Brin, Page (1998)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/334.pdf).**
1. **Dynamo, Amazon’s Highly Available Key-value store. [DeCandia et al (2007)](https://www.allthingsdistributed.com/files/amazon-dynamo-sosp2007.pdf).**
1. **On Designing and Deploying Internet Scale Services. [Hamilton (2007)](https://s3.amazonaws.com/systemsandpapers/papers/hamilton.pdf).**
1. **Thinking Methodically about Performance. [Gregg (2012)](https://queue.acm.org/detail.cfm?id=2413037).**
1. **Bitcoin, A peer-to-peer electronic cash system. [Nakamoto (2008)](https://bitcoin.org/bitcoin.pdf).**
1. **A Few Useful Things to Know About Machine Learning. [Domingos (2012)](https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf).** All papers in chronological order 1. As We May Think. [Bush (1945)](https://web.mit.edu/STS.035/www/PDFs/think.pdf).
1. **Computing Machinery and Intelligence. [Turing (1950)](https://www.csee.umbc.edu/courses/471/papers/turing.pdf).**
1. The Education of a Computer. [Hopper (1952)](https://people.cs.umass.edu/~emery/classes/cmpsci691st/readings/PL/p243-hopper.pdf).
1. **A Method for the Construction of Minimum-Redundancy Codes. [Huffman (1952)](http://compression.ru/download/articles/huff/huffman_1952_minimum-redundancy-codes.pdf).**
1. On the Shortest Spanning Subtree of a Graph and the Traveling Salesman Problem. [Kruskal (1956)](https://www.ams.org/proc/1956-007-01/S0002-9939-1956-0078686-7/S0002-9939-1956-0078686-7.pdf).
1. A Note on Two Problems in Connexion with Graphs. [Dijkstra (1959)](https://jmvidal.cse.sc.edu/library/dijkstra59a.pdf).
1. Man-Computer symbiosis. [Licklider (1960)](http://worrydream.com/refs/Licklider%20-%20Man-Computer%20Symbiosis.pdf).
1. Recursive Programming. [Dijkstra (1960)](https://www.ics.uci.edu/~jajones/INF102-S18/readings/07_dijkstra.pdf).
1. Some Moral and Technical Consequences of Automation. [Wiener (1960)](https://nissenbaum.tech.cornell.edu/papers/Wiener.pdf).
1. Steps towards Artificial Intelligence. [Minsky (1960)](http://worrydream.com/refs/Minsky%20-%20Steps%20Toward%20Artificial%20Intelligence.pdf).
1. Recursive Functions of Symbolic Expressions and Their Computation by Machine. [McCarthy (1960)](http://jmc.stanford.edu/articles/recursive/recursive.pdf).
1. Quicksort. [Hoare (1962)](https://academic.oup.com/comjnl/article-pdf/5/1/10/1111445/050010.pdf).
1. An Experimental Time-Sharing System. [Corbató, Merwin Daggett, Daley (1962)](http://larch-www.lcs.mit.edu:8001/~corbato/sjcc62/).
1. Programming Considered as a Human Activity. [Dijkstra (1965)](https://www.cs.utexas.edu/~EWD/transcriptions/EWD01xx/EWD117.html).
1. Solution Of a Problem in Concurrent Program Control. [Dijkstra (1965)](https://dl.acm.org/doi/pdf/10.1145/365559.365617).
1. Some Thoughts About the Social Implications of Accessible Computing. [David, Fano (1965)](https://dl.acm.org/doi/pdf/10.1145/1463891.1463917).
1. ELIZA—a computer program for the study of natural language communication between man and machine. [Weizenbaum (1966)](http://web.stanford.edu/class/cs124/p36-weizenabaum.pdf).
1. The Next 700 Programming Languages. [Landin (1966)](https://www.cs.cmu.edu/~crary/819-f09/Landin66.pdf).
1. Goto Statement Considered Harmful. [Dijkstra (1968)](https://homepages.cwi.nl/~storm/teaching/reader/Dijkstra68.pdf).
1. How do committees invent? [Conway (1968)](https://www.melconway.com/Home/pdf/committees.pdf).
1. The Structure of the "THE"-Multiprogramming System. [Dijkstra (1968)](https://www.eecs.ucf.edu/~eurip/papers/dijkstra-the68.pdf).
1. **Von Neumann's First Computer Program. [Knuth (1970)](https://dl.acm.org/doi/pdf/10.1145/356580.356581).**
1. Space/Time Trade-offs in Hash Coding with Allowable Errors. [Bloom (1970)](https://dl.acm.org/doi/pdf/10.1145/362686.362692).
1. Managing the Development of Large Software Systems. [Royce (1970)](https://www.praxisframework.org/files/royce1970.pdf).
1. The nucleus of a multiprogramming system. [Hansen (1970)](http://www.brinch-hansen.net/papers/1970a.pdf).
1. **A Relational Model of Data for Large Shared Data Banks. [Codd (1970)](https://www.seas.upenn.edu/~zives/03f/cis550/codd.pdf).**
1. Program development by stepwise refinement. [Wirth (1971)](https://dl.acm.org/doi/pdf/10.1145/362575.362577).
1. On the Criteria To Be Used in Decomposing Systems into Modules. [Parnas (1971)](https://www.win.tue.nl/~wstomv/edu/2ip30/references/criteria_for_modularization.pdf).
1. The Humble Programmer. [Dijkstra (1972)](http://rkka21.ru/docs/turing-award/ed1972e.pdf).
1. **A Design Methodology for Reliable Software Systems. [Liskov (1972)](https://dl.acm.org/doi/pdf/10.1145/1479992.1480018).**
1. Information Distribution Aspects of Design Methodology. [Parnas (1972)](https://cseweb.ucsd.edu/~wgg/CSE218/Parnas-IFIP71-information-distribution.PDF).
1. A Statistical Interpretation of Term Specificity in Retrieval. [Spärck Jones (1972)](http://openlib.org/home/krichel/courses/lis618/readings/spaerk-jones72.pdf).
1. Computer Programming as an Art. [Knuth (1974)](http://www.cs.bilkent.edu.tr/~canf/knuth1974.pdf).
1. **Programming with Abstract Data Types. [Liskov, Zilles (1974)](https://dl.acm.org/doi/pdf/10.1145/942572.807045).**
1. Monitors: An operating system structuring concept. [Hoare (1974)](https://dl.acm.org/doi/pdf/10.1145/355620.361161).
1. **The UNIX Time-Sharing System. [Ritchie, Thompson (1974)](https://dsf.berkeley.edu/cs262/unix.pdf).**
1. **A Protocol for Packet Network Intercommunication. [Cerf, Kahn (1974)](https://www.cs.princeton.edu/courses/archive/fall06/cos561/papers/cerf74.pdf).**
1. Self-stabilizing systems in spite of distributed control. [Dijkstra (1974)](https://dl.acm.org/doi/pdf/10.1145/361179.361202).
1. The Mythical Man Month. [Brooks (1975)](https://www.cs.virginia.edu/~evans/greatworks/mythical.pdf).
1. Granularity of Locks and Degrees of Consistency in a Shared Data Base. [Gray et al (1975)](https://www.cs.cmu.edu/~natassa/courses/15-721/papers/GrayLocks.pdf).
1. The Semantics of Predicate Logic as a Programming Language. [Van Emden, Kowalski (1976)](https://dl.acm.org/doi/pdf/10.1145/321978.321991).
1. **New Directions in Cryptography. [Diffie, Hellman (1976)](https://ee.stanford.edu/~hellman/publications/24.pdf).**
1. A Universal Algorithm for Sequential Data Compression. [Ziv, Lempel (1977)](https://courses.cs.duke.edu/spring03/cps296.5/papers/ziv_lempel_1977_universal_algorithm.pdf).
1. The Smalltalk-76 Programming System Design and Implementation. [Ingalls (1978)](https://dl.acm.org/doi/pdf/10.1145/512760.512762).
1. A Theory of Type Polymorphism in Programming. [Milner (1978)](https://homepages.inf.ed.ac.uk/wadler/papers/papers-we-love/milner-type-polymorphism.pdf).
1. Can Programming Be Liberated from the von Neumann Style? [Backus (1978)](https://dl.acm.org/doi/pdf/10.1145/359576.359579).
1. **Communicating sequential processes. [Hoare (1978)](https://www.cs.cmu.edu/~crary/819-f09/Hoare78.pdf).**
1. On the Duality of Operating System Structures. [Lauer, Needham (1978)](https://dl.acm.org/doi/pdf/10.1145/850657.850658).
1. Ethernet: Distributed packet switching for local computer networks. [Metcalfe, Boggs (1978)](https://dl.acm.org/doi/pdf/10.1145/360248.360253).
1. A Method for Obtaining Digital Signatures and Public-Key Cryptosystems. [Rivest, Shamir, Adleman (1978)](https://dl.acm.org/doi/pdf/10.1145/359340.359342).
1. **Time, Clocks, and the Ordering of Events in a Distributed System. [Lamport (1978)](https://lamport.azurewebsites.net/pubs/time-clocks.pdf).**
1. The paradigms of programming. [Floyd (1979)](https://dl.acm.org/doi/pdf/10.1145/1283920.1283934).
1. The Ubiquitous B-Tree. [Comer (1979)](http://carlosproal.com/ir/papers/p121-comer.pdf).
1. Designing Software for Ease of Extension and Contraction. [Parnas (1979)](https://courses.cs.washington.edu/courses/cse503/08wi/parnas-1979.pdf).
1. Access Path Selection in a Relational Database Management System. [Selinger et al (1979)](https://courses.cs.duke.edu/compsci516/cps216/spring03/papers/selinger-etal-1979.pdf).
1. How To Share A Secret. [Shamir (1979)](https://web.mit.edu/6.857/OldStuff/Fall03/ref/Shamir-HowToShareASecret.pdf).
1. The Semantic Elegance of Applicative Languages. [Turner (1981)](http://nsl.com/misc/sasl/paraffins-turner.pdf).
1. The Transaction Concept: Virtues and Limitations. [Gray (1981)](https://jimgray.azurewebsites.net/papers/theTransactionConcept.pdf).
1. Tutorials for the First-Time Computer User. [Al-Awar, Chapanis, Ford (1981)](https://drive.google.com/file/d/1zA4LkSHoanjjhOVCwYzrkdkdzgPbKWJ9/view?usp=sharing).
1. The Byzantine Generals Problem. [Lamport, Shostak, Pease (1982)](https://lamport.azurewebsites.net/pubs/byz.pdf).
1. The star user interface: an overview. [Smith, Irby, Kimball (1982)](https://www.tech-insider.org/star/research/acrobat/8206.pdf).
1. Design Principles for Human-Computer Interfaces. [Norman (1983)](https://dl.acm.org/doi/pdf/10.1145/800045.801571).
1. Ironies of Automation. [Bainbridge (1983)](https://ckrybus.com/static/papers/Bainbridge_1983_Automatica.pdf).
1. Literate Programming. [Knuth (1984)](http://www.literateprogramming.com/knuthweb.pdf).
1. A Theory of the Learnable. [Valiant (1984)](https://people.mpi-inf.mpg.de/~mehlhorn/SeminarEvolvability/ValiantLearnable.pdf).
1. Programming pearls: Algorithm design techniques. [Bentley (1984)](https://dl.acm.org/doi/pdf/10.1145/358234.381162).
1. Programming pearls: The back of the envelope. [Bentley (1984)](https://dl.acm.org/doi/pdf/10.1145/357994.381168).
1. Reflections on Trusting Trust. [Thompson (1984)](https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_ReflectionsonTrustingTrust.pdf).
1. End-To-End Arguments in System Design. [Saltzer, Reed, Clark (1984)](https://groups.csail.mit.edu/ana/Publications/PubPDFs/End-to-End%20Arguments%20in%20System%20Design.pdf).
1. Programming as Theory Building. [Naur (1985)](https://pages.cs.wisc.edu/~remzi/Naur.pdf).
1. On understanding types, data abstraction, and polymorphism. [Cardelli, Wegner (1985)](https://dl.acm.org/doi/pdf/10.1145/6041.6042).
1. An algorithm for distributed computation of a Spanning Tree in an Extended LAN. [Perlman (1985)](https://dl.acm.org/doi/pdf/10.1145/319056.319004).
1. Impossibility of Distributed Consensus With One Faulty Process. [Fisher, Lynch, Patterson (1985)](https://groups.csail.mit.edu/tds/papers/Lynch/jacm85.pdf).
1. **Designing for Usability: Key Principles and What Designers Think. [Gould, Lewis (1985)](https://dl.acm.org/doi/pdf/10.1145/3166.3170).**
1. Why do computers stop and what can be done about it? [Gray (1985)](https://jimgray.azurewebsites.net/papers/TandemTR85.7_WhyDoComputersStop.pdf).
1. Making data structures persistent. [Driscoll et al (1986)](https://dl.acm.org/doi/pdf/10.1145/12130.12142).
1. Programming pearls: little languages. [Bentley (1986)](https://dl.acm.org/doi/pdf/10.1145/6424.315691).
1. The design of POSTGRES. [Stonebraker, Rowe (1986)](https://dl.acm.org/doi/pdf/10.1145/16856.16888).
1. **No Silver Bullet: Essence and Accidents of Software Engineering. [Brooks (1987)](http://worrydream.com/refs/Brooks-NoSilverBullet.pdf).**
1. A Digital Signature Based on a Conventional Encryption Function. [Merkle (1987)](https://people.eecs.berkeley.edu/~raluca/cs261-f15/readings/merkle.pdf).
1. The Design Philosophy of the DARPA Internet Protocols. [Clark (1988)](http://ccr.sigcomm.org/archive/1995/jan95/ccr-9501-clark.pdf).
1. **Why Functional Programming Matters. [Hughes (1990)](https://www.cs.kent.ac.uk/people/staff/dat/miranda/whyfp90.pdf).**
1. Implementing Fault-Tolerant Services Using the State Machine Approach: A Tutorial. [Schneider (1990)](https://www.cs.cornell.edu/fbs/publications/SMSurvey.pdf).
1. SELF: The Power of Simplicity. [Ungar, Smith (1991)](https://people.eecs.berkeley.edu/~fateman/264/papers/selfpower.ps).
1. On Building Systems That Will Fail. [Corbató (1991)](https://dl.acm.org/doi/pdf/10.1145/114669.114686).
1. The Design and Implementation of a Log-Structured File System. [Rosenblum, Ousterhout (1991)](https://people.eecs.berkeley.edu/~brewer/cs262/LFS.pdf).
1. The essence of functional programming. [Wadler (1992)](https://dl.acm.org/doi/pdf/10.1145/143165.143169).
1. World-Wide Web: Information Universe. [Berners-Lee et al (1992)](https://www.w3.org/History/1992/ENRAP/Article_9202.pdf).
1. **Engineering a Sort Function. [Bentley, McIlroy (1993)](https://cs.fit.edu/~pkc/classes/writing/samples/bentley93engineering.pdf).**
1. The Essence of Compiling with Continuations. [Flanagan et al (1993)](https://dl.acm.org/doi/pdf/10.1145/173262.155113).
1. Software Aging. [Parnas (1994)](https://dl.acm.org/doi/pdf/10.5555/257734.257788).
1. Software Transactional Memory. [Shavit, Touitou (1997)](https://groups.csail.mit.edu/tds/papers/Shavit/ShavitTouitou.pdf).
1. Human-Computer Interaction: Psychology as a Science of Design. [Carroll (1997)](https://home.cs.colorado.edu/~martin/Csci6402/Papers/carroll97.pdf).
1. Fifty Years of Shannon Theory. [Verdú (1998)](https://monoskop.org/images/7/78/Verdu_Sergio_1998_Fifty_Years_of_Shannon_Theory.pdf).
1. The Cathedral and the Bazaar. [Raymond (1998)](http://users.ece.utexas.edu/~perry/education/382v-s08/papers/raymond.pdf).
1. **The anatomy of a large-scale hypertextual Web search engine. [Brin, Page (1998)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/334.pdf).**
1. The PageRank Citation Ranking: Bringing Order to the Web. [Page, Brin, Motwani (1998)](http://www.eecs.harvard.edu/~michaelm/CS222/pagerank.pdf).
1. Operational transformation in real-time group editors: issues, algorithms, and achievements. [Sun, Ellis (1998)](https://dl.acm.org/doi/pdf/10.1145/289444.289469).
1. Rules of Thumb in Data Engineering. [Gray, Shenay (1999)](http://research.microsoft.com/en-us/um/people/gray/papers/ms_tr_99_100_rules_of_thumb_in_data_engineering.pdf).
1. Practical Byzantine Fault Tolerance. [Castro, Liskov (1999)](https://pmg.csail.mit.edu/papers/osdi99.pdf).
1. QuickCheck: A Lightweight Tool for Random Testing of Haskell Programs. [Claessen, Hughes (2000)](https://www.cs.tufts.edu/~nr/cs257/archive/john-hughes/quick.pdf).
1. Paxos made simple. [Lamport (2001)](https://lamport.azurewebsites.net/pubs/paxos-simple.pdf).
1. Statistical Modeling: The Two Cultures. [Breiman (2001)](https://projecteuclid.org/journalArticle/Download?urlId=10.1214%2Fss%2F1009213726).
1. Recovery Oriented Computing (ROC): Motivation, Definition, Techniques, and Case Studies. [Patterson et al (2002)](http://www2.eecs.berkeley.edu/Pubs/TechRpts/2002/CSD-02-1175.pdf).
1. Kademlia: A Peer-to-Peer Information System Based on the XOR Metric. [Maymounkov, Mazières (2002)](https://pdos.csail.mit.edu/~petar/papers/maymounkov-kademlia-lncs.pdf).
1. A Brief History of Just-In-Time. [Aycock (2003)](https://user.it.uu.se/~kostis/Teaching/KT2-04/jit_survey.pdf).
1. The Google File System. [Ghemawat, Gobioff, Leung (2003)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/035fc972c796d33122033a0614bc94cff1527999.pdf).
1. Crash-Only Software. [Candea, Fox (2003)](https://research.cs.wisc.edu/areas/os/ReadingGroup/os-old/Papers/HotOSIX/Candea-CrashOnlySoftware.pdf).
1. Incentives Build Robustness in BitTorrent. [Cohen (2003)](https://www.bittorrent.org/bittorrentecon.pdf).
1. LLVM: A Compilation Framework for Lifelong Program Analysis & Transformation. [Lattner, Adve (2004)](https://llvm.org/pubs/2004-01-30-CGO-LLVM.pdf).
1. A Unified Theory of Garbage Collection. [Bacon, Cheng, Rajan (2004)](https://courses.cs.washington.edu/courses/cse590p/05au/p50-bacon.pdf).
1. TOR: The second generation onion router. [Dingledine et al (2004)](https://svn-archive.torproject.org/svn/projects/design-paper/tor-design.pdf).
1. MapReduce: Simplified Data Processing on Large Clusters. [Dean, Ghemawat (2004)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/16cb30b4b92fd4989b8619a61752a2387c6dd474.pdf).
1. A Nanopass Framework for Compiler Education. [Sarkar, Waddell, Dybvig (2005)](https://legacy.cs.indiana.edu/~dyb/pubs/nano-jfp.pdf).
1. Church's Thesis and Functional Programming. [Turner (2006)](https://kar.kent.ac.uk/88944/1/ctfp.pdf_nocoversheet).
1. **An Incremental Approach to Compiler Construction. [Ghuloum (2006)](http://scheme2006.cs.uchicago.edu/11-ghuloum.pdf).**
1. Out of the Tar Pit. [Moseley, Marks (2006)](http://curtclifton.net/papers/MoseleyMarks06a.pdf).
1. Why the Internet only just works. [Handley (2006)](http://www0.cs.ucl.ac.uk/staff/m.handley/papers/only-just-works.pdf).
1. Bigtable: A Distributed Storage System for Structured Data. [Chang et al (2006)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/68a74a85e1662fe02ff3967497f31fda7f32225c.pdf).
1. Performance Anti-Patterns. [Smaalders (2006)](https://queue.acm.org/detail.cfm?id=1117403).
1. The Salsa20 family of stream ciphers. [Bernstein (2007)](https://cr.yp.to/snuffle/salsafamily-20071225.pdf).
1. Paxos made live - An Engineering Perspective. [Chandra, Griesemer, Redstone (2007)](https://www.cs.utexas.edu/users/lorenzo/corsi/cs380d/papers/paper2-1.pdf).
1. **Dynamo, Amazon’s Highly Available Key-value store. [DeCandia et al (2007)](https://www.allthingsdistributed.com/files/amazon-dynamo-sosp2007.pdf).**
1. **On Designing and Deploying Internet Scale Services. [Hamilton (2007)](https://s3.amazonaws.com/systemsandpapers/papers/hamilton.pdf).**
1. **Bitcoin, A peer-to-peer electronic cash system. [Nakamoto (2008)](https://bitcoin.org/bitcoin.pdf).**
1. Building on Quicksand. [Helland, Campbell (2009)](https://arxiv.org/ftp/arxiv/papers/0909/0909.1788.pdf).
1. The Unreasonable Effectiveness of Data. [Halevy, Norvig, Pereira (2009)](https://storage.googleapis.com/gweb-research2023-media/pubtools/pdf/35179.pdf).
1. ZooKeeper: wait-free coordination for internet scale systems. [Hunt et al (2010)](https://www.usenix.org/legacy/event/atc10/tech/full_papers/Hunt.pdf).
1. The Hadoop Distributed File System. [Shvachko et al (2010)](https://cse.buffalo.edu/~okennedy/courses/papers/hdfs.pdf).
1. Thinking Clearly about Performance. [Millsap (2010)](https://queue.acm.org/detail.cfm?id=1854041).
1. Kafka: a Distributed Messaging System for Log Processing. [Kreps, Narkhede, Rao (2011)](http://notes.stephenholiday.com/Kafka.pdf).
1. Conflict-free Replicated Data Types. [Shapiro et al (2011)](https://pages.lip6.fr/Marc.Shapiro/papers/RR-7687.pdf).
1. CAP Twelve Years Later: How the "Rules" Have Changed. [Brewer (2012)](https://sites.cs.ucsb.edu/~rich/class/cs293b-cloud/papers/brewer-cap.pdf).
1. **Thinking Methodically about Performance. [Gregg (2012)](https://queue.acm.org/detail.cfm?id=2413037).**
1. **A Few Useful Things to Know About Machine Learning. [Domingos (2012)](https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf).**
1. ImageNet Classification with Deep Convolutional Neural Networks. [Krizhevsky, Sutskever, Hinton (2012)](https://papers.nips.cc/paper_files/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf).
1. Playing Atari with Deep Reinforcement Learning. [Mnih et al (2013)](https://arxiv.org/pdf/1312.5602.pdf).
1. The Network is Reliable. [Bailis, Kingsbury (2014)](https://queue.acm.org/detail.cfm?id=2655736).
1. In Search of an Understandable Consensus Algorithm. [Ongaro, Ousterhout (2014)](https://raft.github.io/raft.pdf).
1. IPFS - Content Addressed, Versioned, P2P File System. [Benet (2014)](https://raw.githubusercontent.com/ipfs/papers/master/ipfs-cap2pfs/ipfs-p2p-file-system.pdf).
1. Ethereum: A Next-Generation Smart Contract and Decentralized Application Platform. [Buterin (2014)](https://ethereum.org/content/whitepaper/whitepaper-pdf/Ethereum_Whitepaper_-_Buterin_2014.pdf).
1. Generative Adversarial Nets. [Goodfellow et al (2014)](https://proceedings.neurips.cc/paper_files/paper/2014/file/5ca3e9b122f61f8f06494c97b1afccf3-Paper.pdf).
1. Towards a Theory of Conceptual Design for Software. [Jackson (2015)](https://groups.csail.mit.edu/sdg/pubs/2015/concept-essay.pdf).
1. Deep Learning. [LeCun, Bengio, Hinton (2015)](https://www.cs.toronto.edu/~hinton/absps/NatureDeepReview.pdf).
1. Bringing the Web up to Speed with WebAssembly. [Haas (2017)](https://dl.acm.org/doi/pdf/10.1145/3062341.3062363).
1. Amazon Aurora: Design Considerations for High Throughput Cloud-Native Relational Databases. [Verbitski et al (2017)](https://pages.cs.wisc.edu/~yxy/cs764-f20/papers/aurora-sigmod-17.pdf).
1. Attention Is All You Need. [Vaswani et al (2017)](https://arxiv.org/pdf/1706.03762.pdf).
1. Local-First Software: You Own Your Data, in spite of the Cloud. [Kleppmann et al (2019)](https://www.inkandswitch.com/local-first/static/local-first.pdf).

Sources

This list was inspired by (and draws from) several books and paper collections:

- Papers We Love
- Ideas That Created the Future
- The Innovators
- The morning paper
- Distributed systems for fun and profit
- Readings in Database Systems (the Red Book)
- Fermat's Library
- Classics in Human-Computer Interaction
- Awesome Compilers
- Distributed Consensus Reading List
- The Decade of Deep Learning

Meta reads

A few interesting resources about reading papers, from Papers We Love and elsewhere:

- Should I read papers?
- How to Read an Academic Article
- How to Read a Paper. Keshav (2007).
- Efficient Reading of Papers in Science and Technology. Hanson (1999).
- On ICSE's "Most Influential Papers". Parnas (1995).

Selection criteria

- The list should stay short. Let's say no more than 30 papers. The idea is not to include every interesting paper that I come across but rather to keep a representative list that's possible to read from start to finish with a similar level of effort as reading a technical book from cover to cover.
- I tried to include one paper per major topic and author. Since in the process I found a lot of noteworthy alternative, related or follow-up papers that I wanted to keep track of, I included them as sublist items.
- The papers shouldn't be too long. For the same reasons as the previous item, I try to avoid papers longer than 20 or 30 pages.
- They should be self-contained and readable enough to be approachable by the casual technical reader.
- They should be freely available online.
- Although historical relevance was taken into account, I omitted seminal papers in cases where I found them hard to approach, or where the main subject of the paper wasn't the thing that made them influential. Examples of this are classic works by Von Neumann, Turing and Shannon. That said, where possible I preferred the original paper on each subject over modern updates or survey papers.
- I tended to prefer topics that I can relate to my professional practice, typically papers that originated in industry or describe innovations that later saw wide adoption. Similarly, I tended to skip more theoretical papers, such as those focusing on the mathematical foundations of Computer Science, the electronic aspects of hardware, etc.
- I sorted the list by a mix of topic relatedness and rough chronological relevance, so that it makes sense to read it in the suggested order. For example, historical and seminal topics go first, contemporary internet-era developments last, networking precedes distributed systems, etc. | 📚 A curated list of papers for Software Engineers | computer-science,papers,research,software-engineering,awesome-list | 0 | 4 | 12 | 150 | 2 | 1 | 1 |
WeMakeDevs/roadmaps | 🛣 Mentorship Roadmaps 🛣 Are you prepared to embark on your quest to receive mentoring and land a job? You've arrived exactly where you need to be. Follow the roadmaps, educate yourself, and then apply for positions. This repository contains the list of communities and job portals you can join and apply to. You can also add the job boards and other portals you know to help us in our mission: quality education, free for all. Roadmaps: Android-Development iOS-Development Backend-Development Blockchain Data-Science DevOps DevRel Flutter-Development Frontend-Development Fullstack-Development Open-Source Contribution: Are you aware of any community, job or project that has not been added to our list yet? You can now add the communities, jobs and projects you know to the list and help your fellow community members. Please read the Contributing Guide and Code of Conduct before proceeding. Connect with us ❤️ Thanks to all contributors ❤ | This repository contains the list of communities and job portals you can join and apply to. | community,roadmaps,backend,devops,devrel,frontend,full-stack,open-source,hacktoberfest | 0 | 168 | 402 | 559 | 23 | 3 | 1 |
MuhammedKalkan/OpenLens | [!CAUTION]
Lens closed its source code, so please do not expect any more updates. OpenLens Build Repo Build Repo Only This repo ONLY PROVIDES SIGNED BINARIES AND DOES NOT ALTER SOURCE CODE for the OpenLens repo. For software issues regarding OpenLens or the Lens IDE, go to the Lens repo and open an issue there. Extensions Starting with 6.3.0, some extensions were removed from Lens. To install these most-used extensions, simply type @alebcay/openlens-node-pod-menu into the Extensions page in the OpenLens menu and it should install automatically. If you see the extension rapidly toggling between enabled and disabled, restart OpenLens and enable it on the Extensions page.
For sources, please refer here. Overview This is the binary build repo for the Lens repository, aka OpenLens. This build only includes the open-source part of the Lens IDE and does not require a login. This repo was created due to lensapp/lens#5444. Download and use as is. If you have trouble seeing pod logs, remove the old config files / extensions remaining from the old Lens app. The auto-updater is now live for every binary downloaded via the Latest tag or the corresponding release assets. OpenLens vs Lens IDE Paraphrasing from the OpenLens README OpenLens The OpenLens repository is where Team Lens (Mirantis) develops the Lens IDE product together with the community. It is backed by a number of Kubernetes and cloud-native ecosystem pioneers. This source code is available to everyone under the MIT license. Lens IDE The Kubernetes management tool Lens IDE is a distribution of the OpenLens repository with Team Lens specific customizations, released under a traditional EULA. Installation Manual Go to Releases and download the relevant binary for your system. MacOS Homebrew brew install --cask openlens Linux Download the appropriate package (.rpm, .deb or .AppImage) and install it with your package manager. Windows Scoop scoop bucket add extras
scoop install openlens Winget winget install openlens Chocolatey choco install -y openlens For alpha/beta builds: choco install -y openlens --pre Thanks Big thanks to Ebby Peter, Xaver Lohmüller and everyone who supported the fundraising, for their contributions to getting the app signed for the community | OpenLens Binary Build Repository | [] | 63 | 11 | 53 | 298 | 6 | 3 | 0 |
OFA-Sys/Chinese-CLIP | 中文说明 | English ModelScope | Demo | Paper |  Blog This project is the **Chinese** version of the CLIP model, trained on large-scale Chinese data (~200 million image-text pairs). It aims to help users quickly implement Chinese-domain tasks such as [image/text feature & similarity computation](#api-quick-start), [cross-modal retrieval](#cross-modal-retrieval) and [zero-shot image classification](#zero-shot-image-classification). The code is built on the [open_clip project](https://github.com/mlfoundations/open_clip), with optimizations for Chinese-domain data and for better performance on Chinese data. This project provides the API, training code and evaluation code, all described in detail below. # News
* 2023.11.30 Chinese-CLIP added a [conversion script](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/deploy/pytorch_to_coreml.py) that converts Pytorch models into the coreml format for deployment. (Thanks to [@manymuch](https://github.com/manymuch) for contributing the code ❤️)
* 2023.9.8 Chinese-CLIP now supports [knowledge-distillation fine-tuning](distillation.md) based on the [ModelScope](https://github.com/modelscope/modelscope) library. (Thanks to [@wuziheng](https://github.com/wuziheng) and [@Jaskr616](https://github.com/Jaskr616) from the Alibaba Cloud PAI team for [contributing the code](https://github.com/OFA-Sys/Chinese-CLIP/pull/195) ❤️)
* 2023.5.9 Chinese-CLIP is now compatible with Pytorch 2.0.
* 2023.3.20 Added [gradient accumulation](#gradient_accumulation) support for contrastive learning, simulating training with a larger batch size.
* 2023.2.16 Added [FlashAttention](https://github.com/HazyResearch/flash-attention) support to speed up training and reduce memory usage; see [flash_attention.md](flash_attention.md) for details.
* 2023.1.15 Added support for deploying [ONNX](https://onnx.ai/) and [TensorRT](https://developer.nvidia.com/tensorrt) models (pretrained TensorRT models are provided) to speed up feature inference and meet deployment needs; see [deployment.md](deployment.md).
* 2022.12.12 Implemented the [FLIP](https://arxiv.org/abs/2212.00794) training strategy, which can be [activated during fine-tuning](#FLIP). (Thanks to [@zwkkk](https://github.com/zwkkk) for [contributing the code](https://github.com/OFA-Sys/Chinese-CLIP/pull/26) ❤️)
* 2022.12.3 Released the Chinese versions of the [ELEVATER](https://eval.ai/web/challenges/challenge-page/1832) image classification datasets; see the [dataset documentation](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/zeroshot_dataset.md).
* 2022.12.1 The Chinese-CLIP model code & feature extraction API were merged into the Huggingface transformers🤗 codebase.
* 2022.11.22 Added [zero-shot image classification](#zero-shot-image-classification) code, supporting the [ELEVATER benchmark](https://eval.ai/web/challenges/challenge-page/1832) zero-shot classification evaluation tasks.
* 2022.11.3 Added RN50 and ViT-H-14 models and released the [technical report](https://arxiv.org/pdf/2211.01335.pdf).
* 2022.9.22 Added ViT-L-14 and ViT-L-14-336 models.
* 2022.7.13 Added a [quick image/text feature extraction API](#api-quick-start): call the Chinese CLIP model in a few lines of code to compute image/text features & similarities.
* 2022.7.8 The Chinese-CLIP project was officially open-sourced, starting with the [image-text retrieval](#cross-modal-retrieval) code.

# Models & Experiments

## Model Scale & Download Links
Chinese-CLIP currently releases 5 model scales; their details and download links are listed in the table below:

| Model | Download | #Params | Vision backbone | Vision params | Text backbone | Text params | Resolution |
|---|---|---|---|---|---|---|---|
| CN-CLIP RN50 | Download | 77M | ResNet50 | 38M | RBT3 | 39M | 224 |
| CN-CLIP ViT-B/16 | Download | 188M | ViT-B/16 | 86M | RoBERTa-wwm-Base | 102M | 224 |
| CN-CLIP ViT-L/14 | Download | 406M | ViT-L/14 | 304M | RoBERTa-wwm-Base | 102M | 224 |
| CN-CLIP ViT-L/14@336px | Download | 407M | ViT-L/14 | 304M | RoBERTa-wwm-Base | 102M | 336 |
| CN-CLIP ViT-H/14 | Download | 958M | ViT-H/14 | 632M | RoBERTa-wwm-Large | 326M | 224 |

## Experimental Results
For image-text retrieval, we ran zero-shot and fine-tuned experiments on [MUGE Retrieval](https://tianchi.aliyun.com/muge), [Flickr30K-CN](https://github.com/li-xirong/cross-lingual-cap) and [COCO-CN](https://github.com/li-xirong/coco-cn). For zero-shot image classification, we experimented on 10 datasets from [ELEVATER](https://eval.ai/web/challenges/challenge-page/1832). The results are shown in the tables below. Due to space constraints, we only list the baseline models and the best-performing Chinese-CLIP scale here; for detailed results of every Chinese-CLIP scale, please see [Results.md](Results.md).
**MUGE Text-to-Image Retrieval (Official Validation Set)**:

| Model | Zero-shot R@1 | Zero-shot R@5 | Zero-shot R@10 | Zero-shot MR | Finetune R@1 | Finetune R@5 | Finetune R@10 | Finetune MR |
|---|---|---|---|---|---|---|---|---|
| Wukong | 42.7 | 69.0 | 78.0 | 63.2 | 52.7 | 77.9 | 85.6 | 72.1 |
| R2D2 | 49.5 | 75.7 | 83.2 | 69.5 | 60.1 | 82.9 | 89.4 | 77.5 |
| CN-CLIP | 63.0 | 84.1 | 89.2 | 78.8 | 68.9 | 88.7 | 93.1 | 83.6 |

**Flickr30K-CN Retrieval (Official Test Set)** (Text-to-Image / Image-to-Text, zero-shot and fine-tuned):

| Model | T2I ZS R@1 | T2I ZS R@5 | T2I ZS R@10 | T2I FT R@1 | T2I FT R@5 | T2I FT R@10 | I2T ZS R@1 | I2T ZS R@5 | I2T ZS R@10 | I2T FT R@1 | I2T FT R@5 | I2T FT R@10 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Wukong | 51.7 | 78.9 | 86.3 | 77.4 | 94.5 | 97.0 | 76.1 | 94.8 | 97.5 | 92.7 | 99.1 | 99.6 |
| Taiyi | 60.8 | 85.0 | 91.0 | - | - | - | - | - | - | - | - | - |
| R2D2 | 60.9 | 86.8 | 92.7 | 84.4 | 96.7 | 98.4 | 77.6 | 96.7 | 98.9 | 95.6 | 99.8 | 100.0 |
| CN-CLIP | 71.2 | 91.4 | 95.5 | 83.8 | 96.9 | 98.6 | 81.6 | 97.5 | 98.8 | 95.3 | 99.7 | 100.0 |

**COCO-CN Retrieval (Official Test Set)** (same column layout as above):

| Model | T2I ZS R@1 | T2I ZS R@5 | T2I ZS R@10 | T2I FT R@1 | T2I FT R@5 | T2I FT R@10 | I2T ZS R@1 | I2T ZS R@5 | I2T ZS R@10 | I2T FT R@1 | I2T FT R@5 | I2T FT R@10 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Wukong | 53.4 | 80.2 | 90.1 | 74.0 | 94.4 | 98.1 | 55.2 | 81.0 | 90.6 | 73.3 | 94.0 | 98.0 |
| Taiyi | 60.0 | 84.0 | 93.3 | - | - | - | - | - | - | - | - | - |
| R2D2 | 56.4 | 85.0 | 93.1 | 79.1 | 96.5 | 98.9 | 63.3 | 89.3 | 95.7 | 79.3 | 97.1 | 98.7 |
| CN-CLIP | 69.2 | 89.9 | 96.1 | 81.5 | 96.9 | 99.1 | 63.0 | 86.6 | 92.9 | 83.5 | 97.3 | 99.2 |

**Zero-shot Image Classification**:

| Model | CIFAR10 | CIFAR100 | DTD | EuroSAT | FER | FGVC | KITTI | MNIST | PC | VOC |
|---|---|---|---|---|---|---|---|---|---|---|
| GIT | 88.5 | 61.1 | 42.9 | 43.4 | 41.4 | 6.7 | 22.1 | 68.9 | 50.0 | 80.2 |
| ALIGN | 94.9 | 76.8 | 66.1 | 52.1 | 50.8 | 25.0 | 41.2 | 74.0 | 55.2 | 83.0 |
| CLIP | 94.9 | 77.0 | 56.0 | 63.0 | 48.3 | 33.3 | 11.5 | 79.0 | 62.3 | 84.0 |
| Wukong | 95.4 | 77.1 | 40.9 | 50.3 | - | - | - | - | - | - |
| CN-CLIP | 96.0 | 79.7 | 51.2 | 52.0 | 55.1 | 26.2 | 49.9 | 79.4 | 63.5 | 84.9 |

# Getting Started!
## Installation Requirements
Before starting, please check that your environment meets the following requirements:
* python >= 3.6.4
* pytorch >= 1.8.0 (with torchvision >= 0.9.0)
* CUDA Version >= 10.2
Run the following command to install the third-party libraries required by this project.
```bash
pip install -r requirements.txt
```
## API Quick Start
Below is a simple code example showing how to use the Chinese CLIP API. Before starting, please install cn_clip:
```bash
# install via pip
pip install cn_clip
# or install from source
cd Chinese-CLIP
pip install -e .
```
Once installed, you can easily call the API as follows: pass in a given image ([example](examples/pokemon.jpeg)) and texts, extract the image/text feature vectors and compute their similarity:
```python
import torch
from PIL import Image

import cn_clip.clip as clip
from cn_clip.clip import load_from_name, available_models
print("Available models:", available_models())
# Available models: ['ViT-B-16', 'ViT-L-14', 'ViT-L-14-336', 'ViT-H-14', 'RN50']

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = load_from_name("ViT-B-16", device=device, download_root='./')
model.eval()
image = preprocess(Image.open("examples/pokemon.jpeg")).unsqueeze(0).to(device)
text = clip.tokenize(["杰尼龟", "妙蛙种子", "小火龙", "皮卡丘"]).to(device)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize the features; please use the normalized image/text features for downstream tasks
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)

    logits_per_image, logits_per_text = model.get_similarity(image, text)
    probs = logits_per_image.softmax(dim=-1).cpu().numpy()

print("Label probs:", probs)  # [[1.268734e-03 5.436878e-02 6.795761e-04 9.436829e-01]]
```
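Under the hood, `get_similarity` is just the cosine similarity of the normalized features scaled by the model's learned temperature. A minimal sketch of reproducing the probabilities by hand, assuming the variables from the snippet above and a CLIP-style `logit_scale` parameter on the model (an assumption, not confirmed by this README):

```python
import torch

with torch.no_grad():
    # the features above are already L2-normalized, so a dot product is cosine similarity
    scale = model.logit_scale.exp()  # learned temperature (assumed CLIP-style attribute)
    logits_per_image = scale * image_features @ text_features.t()
    probs = logits_per_image.softmax(dim=-1).cpu().numpy()
```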
We also provide support for deploying ONNX and TensorRT models; see [deployment.md](deployment.md) for the workflow.
If the API alone is not enough for you, read on to learn how to use this project to train and evaluate CLIP models. # Tutorials
The following covers the [cross-modal retrieval tutorial](#cross-modal-retrieval) (fine-tuning, inference, KNN computation, etc.) and the [zero-shot image classification tutorial](#zero-shot-image-classification).
## Cross-Modal Retrieval
### Code Organization
After downloading this project, please create a new folder ```${DATAPATH}``` to store the datasets, pretrained checkpoints, and the logs & checkpoints produced by fine-tuning. The recommended workspace layout is:
```
Chinese-CLIP/
├── run_scripts/
│ ├── muge_finetune_vit-b-16_rbt-base.sh
│ ├── flickr30k_finetune_vit-b-16_rbt-base.sh
│ └── ... # more fine-tuning or evaluation scripts...
└── cn_clip/
├── clip/
├── eval/
├── preprocess/
└── training/
${DATAPATH}
├── pretrained_weights/
├── experiments/
├── deploy/ # for storing ONNX & TensorRT deployment models
└── datasets/
├── MUGE/
├── Flickr30k-CN/
└── .../ # more custom datasets...
```
### Preparation
Here we explain how to download the pretrained model weights and how to preprocess the data before fine-tuning.
#### Pretrained Checkpoints
Please refer to the [Model Scale & Download Links](#model_card) section above to download the checkpoint of the corresponding model. We recommend storing the downloaded checkpoint files under the `${DATAPATH}/pretrained_weights/` directory.
#### Dataset Preprocessing
To work with the Chinese-CLIP code and to keep data processing and loading efficient, we recommend organizing the image-text datasets used for training & evaluation as follows:
```
${DATAPATH}
└── datasets/
└── ${dataset_name}/
├── train_imgs.tsv # image ids & image content
├── train_texts.jsonl # text ids & text content, plus the list of matching image ids
├── valid_imgs.tsv
├── valid_texts.jsonl
├── test_imgs.tsv
└── test_texts.jsonl
```
where `${dataset_name}` is the dataset name (e.g. MUGE).
For file-processing efficiency, we do not store images as a large number of small files; instead, the train/valid/test images are stored in base64 form in the corresponding `${split}_imgs.tsv` files. Each line of the file represents one image and contains the image id (int) and the image base64, separated by a tab, in the following format:
```
1000002 /9j/4AAQSkZJ...YQj7314oA//2Q==
```
Converting an image file to base64 is straightforward; just run the following Python code:
```python
from PIL import Image
from io import BytesIO
import base64
img = Image.open(file_name) # path to the image
img_buffer = BytesIO()
img.save(img_buffer, format=img.format)
byte_data = img_buffer.getvalue()
base64_str = base64.b64encode(byte_data) # bytes
base64_str = base64_str.decode("utf-8") # str
```
The text information and the image-text matching relations are stored in the `${split}_texts.jsonl` file. Each line of the file is one JSON object, in the following format:
```
{"text_id": 8428, "text": "高级感托特包斜挎", "image_ids": [1076345, 517602]}
```
For test sets that contain only texts with unknown image-text matching relations, simply set the `image_ids` field of each line to an empty list, i.e. `"image_ids": []`.
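As a minimal end-to-end sketch of the format above (hypothetical ids, text and file names), one image row and one text row could be written like this:

```python
import base64
import json

# one line of ${split}_imgs.tsv: image_id <tab> base64
with open('example.jpg', 'rb') as f:
    base64_str = base64.b64encode(f.read()).decode('utf-8')
with open('train_imgs.tsv', 'a') as f:
    f.write(f'1000002\t{base64_str}\n')

# one line of ${split}_texts.jsonl: text id, text and matching image ids
with open('train_texts.jsonl', 'a') as f:
    f.write(json.dumps({'text_id': 8428, 'text': 'a tote bag', 'image_ids': [1000002]},
                       ensure_ascii=False) + '\n')
```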
Finally, we serialize the tsv and jsonl files together into memory-indexed LMDB database files, which enables fast random access during training:
```
python cn_clip/preprocess/build_lmdb_dataset.py \
--data_dir ${DATAPATH}/datasets/${dataset_name} \
--splits train,valid,test
```
For example, for the MUGE dataset, set `${dataset_name}` to MUGE; `--splits` specifies the dataset splits to convert, separated by commas without spaces. After conversion, the following LMDB serialization files are added under the dataset folder:
```
${DATAPATH}
└── datasets/
└── ${dataset_name}/
└── lmdb/
├── train
│ ├── imgs
│ └── pairs
├── valid
└── test
```
To lower the entry barrier, we also provide the MUGE data ([download link](https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/datasets/MUGE.zip)) and the Flickr30K-CN data ([download link](https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/datasets/Flickr30k-CN.zip)) preprocessed following the steps above as zip archives; simply download, unzip and place them under the `${DATAPATH}/datasets/` directory. If you need the [COCO-CN](https://github.com/li-xirong/coco-cn) data, please apply to the original authors for permission first, then contact us by email.
### Model Fine-tuning
Here we describe the training procedure, so that users can understand the model details and fine-tune with the Chinese CLIP pretrained models we provide. Based on the two downstream retrieval datasets MUGE and Flickr30K-CN, we provide the example training scripts `run_scripts/muge_finetune_vit-b-16_rbt-base.sh` and `run_scripts/flickr30k_finetune_vit-b-16_rbt-base.sh`. The scripts support both single-node (single- or multi-GPU) and multi-node distributed training. Before running, please fill in the distributed configuration following the comments at the top of the script, then run the command below to start training (for multi-node training, run the command on every node). If GPU memory is insufficient, consider activating the [gradient checkpointing strategy](#checkpointing) in the configuration. The logs and checkpoint files produced by training are saved automatically under the user-specified directory:
```bash
cd Chinese-CLIP/
bash run_scripts/muge_finetune_vit-b-16_rbt-base.sh ${DATAPATH}
```
The training configuration options include:
+ Distributed
  + `WORKER_CNT`: the number of machines used for training
  + `GPUS_PER_NODE`: the number of GPUs per machine
+ Training/validation data
  + `train-data`: LMDB directory of the training data; see above for the preprocessing workflow.
  + `val-data`: LMDB directory of the validation data; if set to None, no validation is performed during training.
  + `num-workers`: number of DataLoader worker processes for the training set, default 4.
  + `valid-num-workers`: number of DataLoader worker processes for the validation set (if validation is enabled), default 1.
+ Training hyper-parameters
  + `vision-model`: the vision backbone, chosen from `["ViT-B-16", "ViT-L-14", "ViT-L-14-336", "ViT-H-14", "RN50"]`.
  + `text-model`: the text backbone, chosen from `["RoBERTa-wwm-ext-base-chinese", "RoBERTa-wwm-ext-large-chinese", "RBT3-chinese"]`.
  + `context-length`: length of the text input sequence.
  + `warmup`: number of warmup steps.
  + `batch-size`: per-GPU batch size during training. (Please make sure that `total number of training samples > batch-size * number of GPUs`, so there is at least one training batch.)
  + `lr`: learning rate.
  + `wd`: weight decay.
  + `max-steps`: number of training steps; the number of training epochs can also be specified via `max-epochs`.
  + `freeze-vision`: whether to freeze the vision backbone.
  + `use-augment`: whether to use [AutoAugment](https://arxiv.org/abs/1805.09501) for image data augmentation.
  + `valid-batch-size`: per-machine batch size during validation. (Please make sure that `total number of validation samples > batch-size * number of GPUs`, so there is at least one validation batch.)
  + `valid-step-interval` and `valid-epoch-interval`: validation frequency in steps/epochs; set to -1 to disable validation during training.
  + `grad-checkpointing`: use the [gradient checkpointing strategy](https://pytorch.org/docs/stable/checkpoint.html), which does not keep intermediate results in the forward pass, trading training time for a smaller memory footprint; useful when GPU memory is insufficient. (A `store_true` option; simply add `--grad-checkpointing` to the script. Currently requires Pytorch > 1.8.0.)
  + `mask-ratio`: following the [FLIP](https://arxiv.org/abs/2212.00794) strategy, randomly mask a given ratio of image patches during fine-tuning to reduce memory usage and speed up training. Default 0.0, i.e. disabled.
  + `use-flash-attention`: use [FlashAttention](https://arxiv.org/abs/2205.14135) to significantly speed up Chinese-CLIP fine-tuning and reduce memory usage without affecting performance. (A `store_true` option; after setting up the environment, add `--use-flash-attention` to the script. See [flash_attention.md](flash_attention.md) for details.)
  + `accum-freq`: gradient accumulation frequency, default 1. Setting an integer greater than 1 enables gradient accumulation for contrastive learning, simulating a larger batch size. With a per-GPU batch size of `m`, the total batch size is `accum_freq * m * number of GPUs`.
  + `gather-with-grad`: whether to gather features with full gradients during distributed training, default off.
+ Output options
  + `name`: the output path. Hyper-parameter logs, training logs and produced checkpoints are all saved under `${DATAPATH}/experiments/${name}/`.
  + `save-step-frequency` and `save-epoch-frequency`: intervals (in steps or epochs) for saving checkpoints.
  + `report-training-batch-acc`: whether the logs report the training image-to-text & text-to-image batch accuracy.
+ Checkpoint-loading options
  + `resume`: the checkpoint path to load weights from. The example scripts set it to the pretrained checkpoint path; you can also point it to your own fine-tuned checkpoint to continue training.
  + `reset-data-offset`: whether to resume from the previous data breakpoint. If the batch size or the number of GPUs has changed, it is recommended to enable this option.
  + `reset-optimizer`: whether to load the optimizer state.
After training finishes, the log is automatically saved at `${DATAPATH}/experiments/${name}/out_${timestamp}.log`. The training log format is as follows:
```
2022-12-11,20:40:34 | INFO | Rank 0 | Global Steps: 1/735 | Train Epoch: 1 [1024/250880 (0%)] | Loss: 2.371020 | Image2Text Acc: 49.90 | Text2Image Acc: 48.73 | Data Time: 1.039s | Batch Time: 3.625s | LR: 0.000000 | logit_scale: 4.605 | Global Batch Size: 1024
```
The validation log format is as follows:
```
2022-12-11,20:42:47 | INFO | Rank 0 | Validation Result (epoch 1 @ 150 steps) | Valid Loss: 0.502810 | Image2Text Acc: 84.95 | Text2Image Acc: 84.26 | logit_scale: 4.605 | Valid Batch Size: 128
```
**Note**: the convergence and stability of contrastive-learning training are related to the total batch size. If you use a smaller batch size (compared to the default 128 per-GPU \* 8 GPUs), we suggest using a smaller learning rate. We recommend using more GPUs and a larger batch size for better results.
### Prediction and Evaluation
We provide workflows for feature extraction and for evaluating image-text retrieval, as follows:
#### Image/Text Feature Extraction
Currently the code supports image/text feature extraction on a single GPU; please refer to the commands below. We also support deploying ONNX and TensorRT models to speed up feature inference; see [deployment.md](deployment.md).
```bash
cd Chinese-CLIP/
export CUDA_VISIBLE_DEVICES=0
export PYTHONPATH=${PYTHONPATH}:`pwd`/cn_clip
split=valid # compute features for the valid or test split
resume=${DATAPATH}/pretrained_weights/clip_cn_vit-b-16.pt
python -u cn_clip/eval/extract_features.py \
--extract-image-feats \
--extract-text-feats \
--image-data="${DATAPATH}/datasets/${dataset_name}/lmdb/${split}/imgs" \
--text-data="${DATAPATH}/datasets/${dataset_name}/${split}_texts.jsonl" \
--img-batch-size=32 \
--text-batch-size=32 \
--context-length=52 \
--resume=${resume} \
--vision-model=ViT-B-16 \
--text-model=RoBERTa-wwm-ext-base-chinese
```
By default, the produced image/text features are saved under the `${DATAPATH}/datasets/${dataset_name}` directory. Image features are saved in the `${split}_imgs.img_feat.jsonl` file, one image's features per line as JSON, in the following format:
```
{"image_id": 1000002, "feature": [0.0198, ..., -0.017, 0.0248]}
```
Text features are saved in `${split}_texts.txt_feat.jsonl`, in the following format:
```
{"text_id": 248816, "feature": [0.1314, ..., 0.0018, -0.0002]}
```
#### KNN Retrieval
For small-scale academic retrieval datasets, we provide a simple KNN retrieval implementation to compute the top-k recall results for text-to-image and image-to-text retrieval. (Tip: if you want to build a [retrieval demo](https://www.modelscope.cn/studios/damo/chinese_clip_applications/summary) like the one in this project, we suggest producing image/text features with the Chinese CLIP model and then building the frontend/backend services with the open-source framework [clip-retrieval](https://github.com/rom1504/clip-retrieval).)
For text-to-image retrieval (recalling relevant images for texts), run the following command:
```bash
cd Chinese-CLIP/
split=valid # compute features for the valid or test split
python -u cn_clip/eval/make_topk_predictions.py \
--image-feats="${DATAPATH}/datasets/${dataset_name}/${split}_imgs.img_feat.jsonl" \
--text-feats="${DATAPATH}/datasets/${dataset_name}/${split}_texts.txt_feat.jsonl" \
--top-k=10 \
--eval-batch-size=32768 \
--output="${DATAPATH}/datasets/${dataset_name}/${split}_predictions.jsonl"
```
The results are saved in the specified jsonl file; each line contains the top-k image ids recalled for one text, in the following format:
```json
{"text_id": 153915, "image_ids": [5791244, 1009692167, 7454547004, 3564007203, 38130571, 2525270674, 2195419145, 2503091968, 4966265765, 3690431163]}
```
Similarly, for image-to-text retrieval (recalling relevant texts for images), run the following command:
```bash
split=valid # compute features for the valid or test split
python -u cn_clip/eval/make_topk_predictions_tr.py \
--image-feats="${DATAPATH}/datasets/${dataset_name}/${split}_imgs.img_feat.jsonl" \
--text-feats="${DATAPATH}/datasets/${dataset_name}/${split}_texts.txt_feat.jsonl" \
--top-k=10 \
--eval-batch-size=32768 \
--output="${DATAPATH}/datasets/${dataset_name}/${split}_tr_predictions.jsonl"
```
Each line of the output contains the top-k text ids recalled for one image, in the following format:
```json
{"image_id": 977856234, "text_ids": [156914, 157914, 158914, 155914, 156179, 158907, 157179, 154179, 154914, 154723]}
```
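Since the features are L2-normalized, the KNN here boils down to a matrix product plus a top-k sort. A minimal numpy sketch of text-to-image retrieval over the feature files produced above (for small datasets only; the provided scripts batch this for you, and the file names here assume the valid split):

```python
import json
import numpy as np

def load_feats(path, id_key):
    ids, feats = [], []
    with open(path) as f:
        for line in f:
            obj = json.loads(line)
            ids.append(obj[id_key])
            feats.append(obj['feature'])
    return np.array(ids), np.array(feats, dtype=np.float32)

text_ids, text_feats = load_feats('valid_texts.txt_feat.jsonl', 'text_id')
image_ids, image_feats = load_feats('valid_imgs.img_feat.jsonl', 'image_id')

# normalized features: dot product == cosine similarity
sims = text_feats @ image_feats.T               # [n_texts, n_images]
topk = np.argsort(-sims, axis=1)[:, :10]        # top-10 image indices per text
for tid, idx in zip(text_ids, topk):
    print({'text_id': int(tid), 'image_ids': image_ids[idx].tolist()})
```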
#### Recall Computation
We provide evaluation scripts that compute Recall@1/5/10 for the retrieval task, along with mean recall (the average of Recall@1/5/10). Run the following commands to get the scores:
For text-to-image retrieval, run:
```bash
split=valid # compute features for the valid or test split
python cn_clip/eval/evaluation.py \
${DATAPATH}/datasets/${dataset_name}/${split}_texts.jsonl \
${DATAPATH}/datasets/${dataset_name}/${split}_predictions.jsonl \
output.json
cat output.json
```
For image-to-text retrieval, first run the command below to convert the jsonl file with image-text pair annotations from the text-to-image format to the image-to-text format:
```bash
python cn_clip/eval/transform_ir_annotation_to_tr.py \
--input ${DATAPATH}/datasets/${dataset_name}/${split}_texts.jsonl
```
Once done, run:
```bash
split=valid # compute features for the valid or test split
python cn_clip/eval/evaluation_tr.py \
${DATAPATH}/datasets/${dataset_name}/${split}_texts.tr.jsonl \
${DATAPATH}/datasets/${dataset_name}/${split}_tr_predictions.jsonl \
output.json
cat output.json
```
The printed results look like the following:
```json
{"success": true, "score": 85.67, "scoreJson": {"score": 85.67, "mean_recall": 85.67, "r1": 71.2, "r5": 90.5, "r10": 95.3}}
```
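For reference, Recall@k here counts a text query as a hit if any of its ground-truth image ids appears in the top-k predictions, and mean recall averages R@1/5/10. A minimal sketch of the computation (file names assume the valid split produced by the commands above):

```python
import json

with open('valid_texts.jsonl') as f:
    gt = {o['text_id']: set(o['image_ids']) for o in map(json.loads, f)}
with open('valid_predictions.jsonl') as f:
    pred = {o['text_id']: o['image_ids'] for o in map(json.loads, f)}

scores = {}
for k in (1, 5, 10):
    hits = sum(1 for tid, imgs in pred.items() if gt[tid] & set(imgs[:k]))
    scores[f'r{k}'] = 100.0 * hits / len(pred)
scores['mean_recall'] = (scores['r1'] + scores['r5'] + scores['r10']) / 3
print(scores)
```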
Taking the MUGE retrieval dataset (the [Multimodal E-commerce Image-Text Challenge](https://tianchi.aliyun.com/competition/entrance/532031/introduction)) as an example, we also provide a runnable Jupyter Notebook ([download link](https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/others/Chinese-CLIP-on-MUGE-Retrieval.ipynb)) covering the entire cross-modal retrieval training and evaluation workflow described above; you are welcome to try it out. ## Zero-Shot Image Classification
This section describes how to perform zero-shot image classification with Chinese-CLIP, using the datasets of the zero-shot image classification benchmark ELEVATER as an example. ELEVATER is an evaluation suite consisting of several well-known classification datasets (including CIFAR-10, CIFAR-100, MNIST, etc.) that measures models' zero-shot performance on them. In our experiments, for each dataset we prepared Chinese versions of the prompts and class labels along with the original images (see the [dataset documentation](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/zeroshot_dataset.md)) to test the Chinese-CLIP models. For more details about the benchmark, please see this [link](https://eval.ai/web/challenges/challenge-page/1832/overview). You can also follow our workflow to prepare your own Chinese classification dataset and run the test. ### Preparation
First, prepare the data in the following format. Since zero-shot image classification only requires testing, you only need the test set and the pretrained model weights, placed under a user-specified `${DATAPATH}` with the following layout:
```
${DATAPATH}
├── pretrained_weights/
└── datasets/
└── ${dataset_name}/
├── label_cn.txt
└── test/
├── 000/ # label id; if there are more than 10 labels, pad left with zeros to 3 digits to keep lexicographic order
│ ├── image_0003.jpg # image samples, no special naming requirements
│ ├── image_0005.jpg
│ └── ...
├── 001/
│ ├── image_0001.jpg
│ ├── image_0002.jpg
│ └── ...
└── 002/
├── image_0003.jpg
├── image_0005.jpg
└── ...
...
```
Make sure the data in the test folder is split by label id, and that the ids are in lexicographic order (multi-digit numbers of 10 and above need left zero-padding, `label.zfill(3)`, e.g. 001, 002, etc.). `label_cn.txt` contains the data labels, one label name per line, as follows:
```
手风琴
飞机
锚
...
```
The label id of each line's label is `line number - 1`; e.g. the label on line 1 has id 0 and the label on line 2 has id 1. If there are more than 10 labels in total, all ids are uniformly padded left with zeros to 3 digits; e.g. with 100 labels, the label ids are `000-099`. You need to create a folder for each label id and put the samples annotated with that label inside. We take the **CIFAR-100 dataset** from ELEVATER as an example; please click this [link](http://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/datasets/cifar-100.zip) to download the processed data. If you want to test Chinese-CLIP on other datasets included in ELEVATER, please see our [dataset documentation](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/zeroshot_dataset.md).
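A minimal sketch (hypothetical paths) of laying out such a test set programmatically, following the label-id rules above:

```python
import os

with open('label_cn.txt', encoding='utf-8') as f:
    labels = [line.strip() for line in f]

for label_id, _ in enumerate(labels):  # label id = line number - 1
    # pad left with zeros to 3 digits (as described above) to keep lexicographic order
    os.makedirs(os.path.join('test', str(label_id).zfill(3)), exist_ok=True)
    # then copy each sample annotated with this label into that folder
```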
### Prediction and Evaluation
We provide a prediction script; see `run_scripts/zeroshot_eval.sh`. An example run:
```bash
bash run_scripts/zeroshot_eval.sh 0 \
${DATAPATH} ${dataset_name} \
${vision_model} ${text_model} \
${ckpt_path} ${index_file}
```
The parameters are:
+ the first argument `0` is the GPU id
+ `DATAPATH`: see the Preparation section above; fill in the actual path
+ `dataset_name`: see the Preparation section above; the directory name of the evaluated dataset, e.g. `cifar-100`
+ `vision_model`: the model type, chosen from `["ViT-B-32", "ViT-B-16", "ViT-L-14", "ViT-L-14-336", "RN50", "ViT-H-14"]`
+ `text_model`: one of `["RoBERTa-wwm-ext-base-chinese", "RoBERTa-wwm-ext-large-chinese", "RBT3-chinese"]`
+ `ckpt_path`: the full path of the pretrained model checkpoint
+ `index_file` (optional, only needed when submitting to the official ELEVATER evaluation): see the [dataset documentation](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/zeroshot_dataset.md)
For example, to evaluate CIFAR-100 with the ViT-B/16 pretrained model, run (replace `${DATAPATH}` according to your setup):
```bash
bash run_scripts/zeroshot_eval.sh 0 \
${DATAPATH} cifar-100 \
ViT-B-16 RoBERTa-wwm-ext-base-chinese \
${DATAPATH}/pretrained_weights/clip_cn_vit-b-16.pt
```
The result prints the top-1 accuracy.
```
Result:
zeroshot-top1: 0.6444
```
On CIFAR-100, Chinese-CLIP at the ViT-B/16 scale is expected to reach 64.4%. For our zero-shot classification results at other scales and on other ELEVATER datasets, please see [Results.md](https://github.com/OFA-Sys/Chinese-CLIP/blob/master/Results.md#zeroshot_results).
The program also saves a json file for submission to the official ELEVATER evaluation; its content looks like:
```json
{"model_name": "CN-CLIP-ViT-B-16", "dataset_name": "cifar-100", "num_trainable_params": 0, "num_params": 188262913, "num_visual_params": 86192640, "num_backbone_params": 188262913, "n_shot": 0, "rnd_seeds": [123], "predictions": "prediction probability tensor [size: (1, 10000, 100)]"}
```
It includes model meta-information such as the model name `model_name`, the dataset name `dataset_name`, the total number of parameters `num_params` and the number of vision-tower parameters `num_visual_params`, as well as the model output, i.e. the predicted probability tensor of size `[1, number of samples, number of labels]`.
### Online Zero-Shot Classification Demo
Based on our feature extraction API integrated into Huggingface transformers, we provide online demos (Hosted inference API) for quickly trying zero-shot image classification on the Huggingface Model Hub🤗. The demo links for each model scale are listed below; feel free to try them!
- [OFA-Sys/chinese-clip-vit-base-patch16](https://huggingface.co/OFA-Sys/chinese-clip-vit-base-patch16)
- [OFA-Sys/chinese-clip-vit-large-patch14](https://huggingface.co/OFA-Sys/chinese-clip-vit-large-patch14)
- [OFA-Sys/chinese-clip-vit-large-patch14-336px](https://huggingface.co/OFA-Sys/chinese-clip-vit-large-patch14-336px)
- [OFA-Sys/chinese-clip-vit-huge-patch14](https://huggingface.co/OFA-Sys/chinese-clip-vit-huge-patch14)
- **(Updated on Dec 10🔥)** [**New demo deployed on Huggingface Spaces**](https://huggingface.co/spaces/OFA-Sys/chinese-clip-zero-shot-image-classification): the demo page offers all 4 model scales above and supports custom prompt templates; feel free to try it. # Citation
If you find this project useful, please give us a star, share it with people around you, and cite the related work; thanks for your support!
```
@article{chinese-clip,
title={Chinese CLIP: Contrastive Vision-Language Pretraining in Chinese},
author={Yang, An and Pan, Junshu and Lin, Junyang and Men, Rui and Zhang, Yichang and Zhou, Jingren and Zhou, Chang},
journal={arXiv preprint arXiv:2211.01335},
year={2022}
}
``` | Chinese version of CLIP which achieves Chinese cross-modal retrieval and representation generation. | chinese,computer-vision,multi-modal-learning,nlp,pytorch,vision-and-language-pre-training,image-text-retrieval,clip,pretrained-models,vision-language | 0 | 10 | 25 | 374 | 122 | 3 | 1 |
jina-ai/discoart | Create compelling Disco Diffusion artworks in one line DiscoArt is an elegant way of creating compelling Disco Diffusion [*] artworks for generative artists, AI enthusiasts and hard-core developers. DiscoArt has a modern & professional API with a beautiful codebase, ensuring high usability and maintainability. It introduces handy features such as result recovery and persistence, gRPC/HTTP serving w/o TLS, and post-analysis, easing integration into larger cross-modal or multi-modal applications. [*] Disco Diffusion is a Google Colab Notebook that leverages CLIP-Guided Diffusion to allow one to create compelling and beautiful images from text prompts. 💯 Best-in-class: industry-level engineering, top-notch code quality, lean dependencies, small RAM/VRAM footprint; important bug fixes and feature improvements vs. the original DD5.6. 👼 Available to all: smooth install for self-hosting, Google Colab free tier, non-GUI (IPython) environments, and CLI! No brainfuck, no dependency hell, no stackoverflow. 🎨 Focus on create, not code: one-liner create() with a Pythonic interface, autocompletion in IDE, and powerful features. Fetch real-time results anywhere anytime, no more worrying about session outages on Google Colab. Set the initial state easily for more efficient parameter exploration. 🏭 Ready for integration & production: built on top of the DocArray data structure, enjoy smooth integration with Jina, CLIP-as-service and other cross-/multi-modal applications. ☁️ As-a-service: simply python -m discoart serve, and DiscoArt is now a high-performance low-latency service that supports gRPC/HTTP/websockets and TLS. Scaling up/down takes one line; cloud-native features, e.g. Kubernetes, Prometheus and Grafana, take one line. Unbelievably simple, thanks to Jina. Gallery with prompts Do you see the discoart-id in each tweet? To get the config & prompts, simply: ```python
from discoart import show_config show_config('discoart-id')
``` Install Python 3.7+ and CUDA-enabled PyTorch are required. bash
pip install discoart This applies to self-hosting, Google Colab, system integration and non-GUI environments. Self-hosted Jupyter: to run a Jupyter Notebook on your own GPU machine, the easiest way is to use our prebuilt Docker image. Use it from the CLI: python -m discoart create and python -m discoart config are CLI commands. Use it as a service: python -m discoart serve allows one to run it as a gRPC/HTTP/websockets service. GUI DiscoArt is the infrastructure for creating Disco Diffusion artworks. The built-in Jupyter Notebook support gives you a basic yet limited user experience, e.g. it does not offer any intuitive GUI for prompt scheduling. Note that DiscoArt is developer-centric and API-first, hence improving the consumer-facing experience is out of scope. There are services, platforms and products (not Jina AI affiliated) that already integrate DiscoArt as a service and provide a nice GUI on top of it, e.g. Fever Dreams, Replicate, RunPod and Renderflux. Click to see third-party GUI
- [Replicate](https://replicate.com/nightmareai/disco-diffusion): a free form-based GUI of DiscoArt with sandbox user experience and the visualizations.
- [RunPod](https://www.runpod.io/blog/accelerate-your-generate-art-with-disco-diffusion-and-runpod): a paid GPU cloud provider that runs DiscoArt container with a simple and clean GUI to visualize the configs and creations.
- [Renderflux](https://beta.renderflux.com/register?invite=bughunting): a paid creative art platform that wraps DiscoArt and provides end-to-end GUI for creation management.
**Please be aware that these platforms, products or companies are not affiliated with Jina AI.** They define their own terms of services, paywall and data and privacy policies, which are not in the scope of DiscoArt MIT License. Get Started Create artworks ```python
from discoart import create da = create()
``` That's it! It will create with the default text prompts and parameters . Set prompts and parameters Supported parameters are listed here . You can specify them in create() : ```python
from discoart import create da = create(
text_prompts='A painting of sea cliffs in a tumultuous storm, Trending on ArtStation.',
init_image='https://d2vyhzeko0lke5.cloudfront.net/2f4f6dfa5a05e078469ebe57e77b72f0.png',
skip_steps=100,
)
``` In case you forgot a parameter, just lookup the cheatsheet at anytime: ```python
from discoart import cheatsheet cheatsheet()
``` The difference on the parameters between DiscoArt and DD5.6 is explained here . Visualize results Final results and intermediate results are created under the current working directory, i.e. text
./{name-docarray}/{i}-done.png
./{name-docarray}/{i}-step-{j}.png
./{name-docarray}/{i}-progress.png
./{name-docarray}/{i}-progress.gif
./{name-docarray}/da.protobuf.lz4

where:

- name-docarray is the name of the run; you can specify it, otherwise it is a random name.
- i-* is up to the value of n_batches.
- *-done-* is the final image on done.
- *-step-* is the intermediate image at a certain step, updated in real-time.
- *-progress.png is the sprite image of all intermediate results so far, updated in real-time.
- *-progress.gif is the animated gif of all intermediate results so far, updated in real-time.
- da.protobuf.lz4 is the compressed protobuf of all intermediate results so far, updated in real-time.

The save frequency is controlled by save_rate. Moreover, create() returns da, a DocumentArray-type object. It contains the following information:
- All arguments passed to create() function, including seed, text prompts and model parameters.
- 4 generated images and their intermediate-step images, where 4 is determined by n_batches and is the default value. This allows you to further post-process, analyze and export the results with the powerful DocArray API. Images are stored as Data URIs in .uri; to save the first image as a local file: python
da[0].save_uri_to_file('discoart-result.png') To save all final images: python
for idx, d in enumerate(da):
d.save_uri_to_file(f'discoart-result-{idx}.png') You can also display all four final images in a grid: python
da.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True) Or display them one by one: python
for d in da:
d.display() Or take one particular run: python
da[0].display() Visualize intermediate steps You can also zoom into a run (say the first run) and check out intermediate steps: python
da[0].chunks.plot_image_sprites(
skip_empty=True, show_index=True, keep_aspect_ratio=True
) You can .display() the chunks one by one, or save one via .save_uri_to_file() , or save all intermediate steps as a GIF: python
da[0].chunks.save_gif(
'lighthouse.gif', show_index=True, inline_display=True, size_ratio=0.5
) Note that since >=0.7.14, a 20 FPS gif is generated which includes all intermediate steps. Show/save/load configs To show the config of a Document/DocumentArray, ```python
from discoart import show_config show_config(da) # show the config of the first run
show_config(da[3]) # show the config of the fourth run
show_config(
'discoart-06030a0198843332edc554ffebfbf288'
) # show the config of the run with a known DocArray ID
``` To save the config of a Document/DocumentArray, ```python
from discoart import save_config save_config(da, 'my.yml') # save the config of the first run
save_config(da[3], 'my.yml') # save the config of the fourth run
``` To run create from a YAML config of Document/DocumentArray, ```python
from discoart import create, load_config config = load_config('my.yml')
create(**config)
``` You can also export the config as an SVG image: ```python
from discoart.config import save_config_svg save_config_svg(da)
``` One can also generate runnable Python code directly from the config: ```python
from discoart.config import export_python export_python(da)
``` Pull results anywhere anytime If you are a free-tier Google Colab user, one annoying thing is losing sessions from time to time. Or sometimes you stop the run early because the first image is not good enough, and a keyboard interrupt will prevent .create() from returning any result. In either case, you can easily recover the results by pulling the last session ID. Find the session ID. It appears on top of the image. Pull the result via that ID on any machine at any time, not necessarily on Google Colab:
```python
from docarray import DocumentArray da = DocumentArray.pull('discoart-3205998582')
``` Reuse a Document as initial state Considering a Document is self-contained data with config and image, one can use it as the initial state for a future run. Its .tags will be used as the initial parameters; its .uri, if present, will be used as the initial image. ```python
from discoart import create
from docarray import DocumentArray da = DocumentArray.pull('discoart-3205998582') create(
init_document=da[0],
cut_ic_pow=0.5,
tv_scale=600,
cut_overview='[12] 1000',
cut_innercut='[12] 1000',
use_secondary_model=False,
)
``` If you just want to initialize from a known DocArray ID, then simply: ```python
from discoart import create create(init_document='discoart-3205998582')
``` Environment variables You can set environment variables to control the meta-behavior of DiscoArt. The environment variables must be set before importing DiscoArt, either in Bash or in Python via os.environ . bash
DISCOART_LOG_LEVEL='DEBUG' # more verbose logs
DISCOART_OPTOUT_CLOUD_BACKUP='1' # opt-out from cloud backup
DISCOART_DISABLE_IPYTHON='1' # disable ipython dependency
DISCOART_DISABLE_RESULT_SUMMARY='1' # disable result summary after the run ends
DISCOART_DEFAULT_PARAMETERS_YAML='path/to/your-default.yml' # use a custom default parameters file
DISCOART_CUT_SCHEDULES_YAML='path/to/your-schedules.yml' # use a custom cut schedules file
DISCOART_MODELS_YAML='path/to/your-models.yml' # use a custom list of models file
DISCOART_OUTPUT_DIR='path/to/your-output-dir' # use a custom output directory for all images and results
DISCOART_CACHE_DIR='path/to/your-cache-dir' # use a custom cache directory for models and downloads
DISCOART_DISABLE_REMOTE_MODELS='1' # disable the listing of diffusion models on Github, remote diffusion models allows user to use latest models without updating the codebase.
DISCOART_REMOTE_MODELS_URL='https://yourdomain/models.yml' # use a custom remote URL for fetching models list
DISCOART_DISABLE_CHECK_MODEL_SHA='1' # disable checking local model SHA matches the remote model SHA
DISCOART_DISABLE_TQDM='1' # disable the tqdm progress bar during diffusion
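Since the variables must be set before importing DiscoArt, the Python equivalent looks like this (a minimal sketch using two of the variables above):

```python
import os

# must be set before `import discoart`, as noted above
os.environ['DISCOART_LOG_LEVEL'] = 'DEBUG'
os.environ['DISCOART_OUTPUT_DIR'] = '/tmp/discoart-results'

from discoart import create

da = create()
```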
CLI DiscoArt provides two commands, create and config, that allow you to run DiscoArt from the CLI. bash
python -m discoart create my.yml which creates artworks from the YAML config file my.yml. You can also do: bash
cat config.yml | python -m discoart create So how can I have my own my.yml and what does it look like? That's the second command: bash
python -m discoart config my.yml which forks the default YAML config and exports it to my.yml. Now you can modify it and run it with the python -m discoart create command. If no output path is specified, then python -m discoart config will print the default config to stdout. To get help on a command, add --help at the end, e.g.: bash
python -m discoart create --help ```text
usage: python -m discoart create [-h] [YAML_CONFIG_FILE] positional arguments:
YAML_CONFIG_FILE The YAML config file to use, default is stdin. optional arguments:
-h, --help show this help message and exit
``` Serving Serving DiscoArt is super easy. Simply run the following command: bash
python -m discoart serve You shall see: Now send request to the server via curl/Javascript, e.g. bash
curl \
-X POST http://0.0.0.0:51001/post \
-H 'Content-Type: application/json' \
-d '{"execEndpoint":"/create", "parameters": {"text_prompts": ["A beautiful painting of a singular lighthouse", "yellow color scheme"]}}' That's it. You can of course pass all parameters that are accepted by the create() function in the JSON. Polling intermediate results We already know that the create function is slow: even on a GPU, it could take 10 minutes to finish an artwork. This means that after sending the above request, the client will have to wait 10 minutes for the response. There is nothing wrong with this behavior given that everything runs synchronously. However, in practice, the client may expect progress or intermediate results along the way instead of waiting for the end. The /result endpoint is designed for this purpose. It will return the intermediate results as soon as they are available. All you need is to specify name_docarray in the request parameters, as you specified it in the /create endpoint. Here is an example: Let's create mydisco-123 by sending the following request to the /create endpoint: bash
curl \
-X POST http://0.0.0.0:51001/post \
-H 'Content-Type: application/json' \
-d '{"execEndpoint":"/create", "parameters": {"name_docarray": "mydisco-123", "text_prompts": ["A beautiful painting of a singular lighthouse", "yellow color scheme"]}}' Now that the above request is being processed on the server, you can periodically check mydisco-123 progress by sending the following request to /result endpoint: bash
curl \
-X POST http://0.0.0.0:51001/post \
-H 'Content-Type: application/json' \
-d '{"execEndpoint":"/result", "parameters": {"name_docarray": "mydisco-123"}}' A JSON will be returned with the up-to-date progress, including the image as a DataURI, loss, steps, etc. The JSON Schema of Document/DocumentArray is described here. Note, /result won't be blocked by /create, thanks to the smart routing of the Jina Gateway. To learn/play more with those endpoints, you can check the ReDoc or the Swagger UI embedded in the server. Skip & Cancel Send to /skip to skip the current run and move to the next run as defined in n_batches: bash
curl \
-X POST http://0.0.0.0:51001/post \
-H 'Content-Type: application/json' \
-d '{"execEndpoint":"/skip"}' Send to /stop to stop the current run and cancel all runs in n_batches: bash
curl \
-X POST http://0.0.0.0:51001/post \
-H 'Content-Type: application/json' \
-d '{"execEndpoint":"/stop"}' Unblocking /create request It is possible to have an unblocked /create endpoint: the client request to /create will return immediately, without waiting for the results to be finished. You then have to fully rely on /result to poll the result. To enable this feature: copy-paste the default flow.yml file to myflow.yml; change floating: false to floating: true under the discoart executor section; run the following command: bash
python -m discoart serve myflow.yml Beware that the request velocity is now under your control. That is, if the client sends 10 /create requests in a second, then the server will start 10 create() runs in parallel! This can easily lead to OOM. Hence, we suggest enabling this feature only if you are sure that the client is not sending too many requests, e.g. you control the client request rate, or you are using DiscoArt behind a BFF (backend for frontend). Scaling out If you have multiple GPUs and you want to run multiple DiscoArt instances in parallel by leveraging GPUs in a time-multiplexed fashion, you can copy-paste the default flow.yml file and modify it as follows: yaml
jtype: Flow
with:
protocol: http
monitoring: true
port: 51001
port_monitoring: 51002 # prometheus monitoring port
env:
JINA_LOG_LEVEL: debug
DISCOART_DISABLE_IPYTHON: 1
DISCOART_DISABLE_RESULT_SUMMARY: 1
executors:
- name: discoart
uses: DiscoArtExecutor
env:
CUDA_VISIBLE_DEVICES: RR0:3 # change this if you have multiple GPU
replicas: 3 # change this if you have larger VRAM
- name: poller
uses: ResultPoller Here replicas: 3 says spawning three DiscoArt instances, CUDA_VISIBLE_DEVICES: RR0:3 makes sure they use the first three GPUs in a round-robin fashion. Name it as myflow.yml and then run bash
python -m discoart serve myflow.yml Customization Thanks to Jina , there are tons of things you can customize! You can change the port number; change protocol to gRPC/Websockets; add TLS encryption; enable/disable Prometheus monitoring; you can also export it to Kubernetes deployment bundle simply via: bash
jina export kubernetes myflow.yml For more features and YAML configs, please check out Jina docs . Use gRPC gateway To switch from HTTP to gRPC gateway is simple: yaml
jtype: Flow
with:
protocol: grpc
... and then restart the server. There are multiple advantages of using gRPC gateway:
- Much faster and smaller network overhead.
- Feature-rich, like compression, status monitoring, etc. In general, if you are using the DiscoArt server behind a BFF (backend for frontend), or your DiscoArt server does not directly serve HTTP traffic from end-users, then you should use gRPC protocol. To communicate with a gRPC DiscoArt server, one can use a Jina Client: ```python !pip install jina from jina import Client c = Client(host='grpc://0.0.0.0:51001') da = c.post(
'/create',
parameters={
'name_docarray': 'mydisco-123',
'text_prompts': [
'A beautiful painting of a singular lighthouse',
'yellow color scheme',
],
},
) check intermediate results da = c.post('/result', parameters={'name_docarray': 'mydisco-123'})
``` To use an existing Document/DocumentArray as init Document for create : ```python
from jina import Client c = Client(host='grpc://0.0.0.0:51001') old_da = create(...) da = c.post(
'/create',
old_da, # this can be a DocumentArray or a single Document
parameters={
'width_height': [1024, 768],
},
)
``` This equals to run create(init_document=old_da, width_height=[1024, 768]) on the server. Note:
- follow-up parameters have higher priorities than the parameters in init_document .
- if init_document is a DocumentArray, then the first Document in the array will be used as the init Document.
- there is no need to do any serialization before sending, Jina automatically handles it. Hosting on Google Colab Though not recommended, it is also possible to use Google Colab to host DiscoArt server.
Please check out the following tutorials:
- https://docs.jina.ai/how-to/google-colab/
- https://clip-as-service.jina.ai/hosting/colab/ Run in Docker We provide a prebuilt Docker image for running DiscoArt out of the box. To update Docker image to latest version: bash
docker pull jinaai/discoart:latest Use Jupyter notebook The default entrypoint is starting a Jupyter notebook ```bash docker build . -t jinaai/discoart # if you want to build yourself docker run -p 51000:8888 -v $(pwd):/home/jovyan/ -v $HOME/.cache:/root/.cache --gpus all jinaai/discoart
``` Now you can visit http://127.0.0.1:51000 to access the notebook. Enable GPU in Docker on Windows You can use it on the Windows Subsystem for Linux (WSL); check the official guide here. ```bash Make sure you install Windows 11 or Windows 10, version 21H2 docker run -p 8888:8888 -v $HOME/.cache:/root/.cache --gpus all jinaai/discoart
``` Use as a service ```bash docker build . -t jinaai/discoart # if you want to build yourself docker run --entrypoint "python" -p 51001:51001 -v $(pwd):/home/jovyan/ -v $HOME/.cache:/root/.cache --gpus all jinaai/discoart -m discoart serve
``` Your DiscoArt server is now running at http://127.0.0.1:51001 . Release cycle Docker images are built on every release , so one can lock it to a specific version, say 0.5.1 : bash
docker run -p 51000:8888 -v $(pwd):/home/jovyan/ -v $HOME/.cache:/root/.cache --gpus all jinaai/discoart:0.5.1 What's next? Next is create. 😎 If you are already a DD user: you are ready to go! There is no extra learning; DiscoArt respects the same parameter semantics as DD5.6. So just unleash your creativity! Read more about their differences here. You can always do from discoart import cheatsheet; cheatsheet() to check all new/modified parameters. 👶 If you are a DALL·E Flow or new user: you may want to take it step by step, as Disco Diffusion works in a very different way than DALL·E. It is much more advanced and powerful: e.g. Disco Diffusion can take weighted & structured text prompts; it can initialize from an image with controlled noise; and there are way more parameters one can tweak. An impatient prompt like "armchair avocado" will give you nothing but confusion and frustration. I highly recommend checking out the following resources before trying your own prompt:
- Zippy's Disco Diffusion Cheatsheet v0.3
- EZ Charts - Diffusion Parameter Studies
- Disco Diffusion 70+ Artist Studies
- A Traveler’s Guide to the Latent Space
- Disco Diffusion Illustrated Settings
- Coar’s Disco Diffusion Guide

Support Join our Discord community and chat with other community members about ideas. Join our Engineering All Hands meet-up to discuss your use case and learn Jina's new features. When? The second Tuesday of every month Where? Zoom ( see our public events calendar / .ical )
    and live stream on YouTube
- Subscribe to the latest video tutorials on our YouTube channel

Join Us

DiscoArt is backed by Jina AI and licensed under the MIT License. We are actively hiring AI engineers and solution engineers to build the next neural search ecosystem in open source. | 🪩 Create Disco Diffusion artworks in one line | creative-ai,disco-diffusion,cross-modal,dalle,generative-art,multimodal,diffusion,prompts,midjourney,imgen | 111 | 9 | 88 | 385 | 25 | 20 | 5 |
nilaoda/N_m3u8DL-RE | N_m3u8DL-RE

A cross-platform DASH/HLS/MSS downloader, supporting both VOD and live streams (DASH/HLS).

If you run into a BUG, first make sure you are on the latest version (for Release builds, consider downloading the latest auto-built binary from the Actions page to check whether the problem has already been fixed). If the version is current and the problem persists, search the Issues for similar reports before asking.

The terminal that ships with older Windows versions may not support this program; as a workaround, run it inside cmder.

On Arch Linux you can install from the AUR: n-m3u8dl-re-bin, n-m3u8dl-re-git

```bash
# Install the N_m3u8DL-RE release build on Arch Linux and derivatives
yay -Syu n-m3u8dl-re-bin

# Install the N_m3u8DL-RE development build on Arch Linux and derivatives
yay -Syu n-m3u8dl-re-git
```

Command-line options

```
Description:
  N_m3u8DL-RE (Beta version) 20230628

Usage:
  N_m3u8DL-RE <input> [options]

Arguments:
  <input>  Link or file

Options:
  --tmp-dir                    Set the directory for temporary files
  --save-dir                   Set the output directory
  --save-name                  Set the saved file name
  --base-url                   Set the BaseURL
  --thread-count               Set the number of download threads [default: 16]
  --download-retry-count       Number of retries when a segment download fails [default: 3]
  --auto-select                Automatically select the best tracks of all types [default: False]
  --skip-merge                 Skip merging segments [default: False]
  --skip-download              Skip the download [default: False]
  --check-segments-count       Check whether the number of downloaded segments matches the expected count [default: True]
  --binary-merge               Binary merge [default: False]
  --del-after-done             Delete temporary files when done [default: True]
  --no-date-info               Do not write date information when muxing [default: False]
  --no-log                     Disable log file output [default: False]
  --write-meta-json            Write the parsed information to a JSON file [default: True]
  --append-url-params          Append the input URL's query parameters to each segment; useful for some sites, e.g. kakao.com [default: False]
  -mt, --concurrent-download   Download the selected audio, video and subtitles concurrently [default: False]
  -H, --header                 Set custom headers for HTTP requests, e.g.:
                               -H "Cookie: mycookie" -H "User-Agent: iOS"
  --sub-only                   Select subtitle tracks only [default: False]
  --sub-format                 Subtitle output format [default: SRT]
  --auto-subtitle-fix          Automatically fix subtitles [default: True]
  --ffmpeg-binary-path         Full path to the ffmpeg binary, e.g. C:\Tools\ffmpeg.exe
  --log-level                  Set the log level [default: INFO]
  --ui-language                Set the UI language
  --urlprocessor-args          This string is passed through to the URL Processor
  --key                        Set decryption keys; the program invokes mp4decrypt/shaka-packager to decrypt. Format:
                               --key KID1:KEY1 --key KID2:KEY2
  --key-text-file              Set a key file; the program looks up the KEY by KID in the file to decrypt (very large files are not recommended)
  --decryption-binary-path     Full path to the MP4 decryption tool, e.g. C:\Tools\mp4decrypt.exe
  --use-shaka-packager         Use shaka-packager instead of mp4decrypt for decryption [default: False]
  --mp4-real-time-decryption   Decrypt MP4 segments in real time [default: False]
  -M, --mux-after-done         Try to mux the separated audio and video when all work is done. Enter "--morehelp mux-after-done" for details
  --custom-hls-method          Specify the HLS encryption method (AES_128|AES_128_ECB|CENC|CHACHA20|NONE|SAMPLE_AES|SAMPLE_AES_CTR|UNKNOWN)
  --custom-hls-key             Specify the HLS decryption KEY; may be a file, HEX or Base64
  --custom-hls-iv              Specify the HLS decryption IV; may be a file, HEX or Base64
  --use-system-proxy           Use the system default proxy [default: True]
  --custom-proxy               Set a request proxy, e.g. http://127.0.0.1:8888
  --custom-range               Download only part of the segments. Enter "--morehelp custom-range" for details
  --task-start-at              Do not start the task before this time
  --live-perform-as-vod        Download a live stream as VOD [default: False]
  --live-real-time-merge       Merge in real time while recording a live stream [default: False]
  --live-keep-segments         Keep the segments when recording live with real-time merge enabled [default: True]
  --live-pipe-mux              When recording live with real-time merge enabled, mux to a TS file in real time via pipe + ffmpeg [default: False]
  --live-fix-vtt-by-audio      Fix VTT subtitles using the start time read from the audio file [default: False]
  --live-record-limit          Recording duration limit when recording a live stream
  --live-wait-time             Manually set the live playlist refresh interval
  --mux-import                 Import external media files when muxing. Enter "--morehelp mux-import" for details
  -sv, --select-video          Select matching video streams by regular expression. Enter "--morehelp select-video" for details
  -sa, --select-audio          Select matching audio streams by regular expression. Enter "--morehelp select-audio" for details
  -ss, --select-subtitle       Select matching subtitle streams by regular expression. Enter "--morehelp select-subtitle" for details
  -dv, --drop-video            Drop matching video streams by regular expression.
  -da, --drop-audio            Drop matching audio streams by regular expression.
  -ds, --drop-subtitle         Drop matching subtitle streams by regular expression.
  --morehelp                   Show detailed help for a given option
  --version                    Show version information
  -?, -h, --help               Show help and usage information
```

Click to view More Help

```
More Help:
--mux-after-done
  Try to mux the separated audio and video when all work is done. You can pass the following colon-separated parameters:
  * format=FORMAT: mux container: mkv, mp4
  * muxer=MUXER: mux program: ffmpeg, mkvmerge (default: ffmpeg)
  * bin_path=PATH: program path (default: auto-detect)
  * skip_sub=BOOL: whether to skip subtitle files (default: false)
  * keep=BOOL: whether to keep the files after muxing: true, false (default: false)
  Examples:
  # mux into an mp4 container
  -M format=mp4
  # use mkvmerge, auto-detect the binary
  -M format=mkv:muxer=mkvmerge
  # use mkvmerge with a custom binary path
  -M format=mkv:muxer=mkvmerge:bin_path="C\:\Program Files\MKVToolNix\mkvmerge.exe"
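  # (editor's hypothetical addition, combining the documented parameters above)
  # mux into an mp4 container and keep the separated files afterwards
  # -M format=mp4:keep=true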
```
```
More Help:
--mux-import
  Import external media files when muxing. You can pass the following colon-separated parameters:
  * path=PATH: media file path
  * lang=CODE: media file language code (optional)
  * name=NAME: media file description (optional)
  Examples:
  # import an external subtitle
  --mux-import path=zh-Hans.srt:lang=chi:name="中文 (简体)"
  # import an external audio track + subtitle
  --mux-import path="D\:\media\atmos.m4a":lang=eng:name="English Description Audio" --mux-import path="D\:\media\eng.vtt":lang=eng:name="English (Description)"
```
```
More Help:
--select-video
  Select matching video streams by regular expression. You can pass the following colon-separated parameters:
  id=REGEX:lang=REGEX:name=REGEX:codec=REGEX:res=REGEX:frame=REGEX
  segsMin=number:segsMax=number:ch=REGEX:range=REGEX:url=REGEX
  plistDurMin=hms:plistDurMax=hms:for=FOR
  * for=FOR: selection mode. best[number], worst[number], all (default: best)
  Examples:
  # select the best video
  -sv best
  # select a 4K + HEVC video
  -sv res="3840*":codec=hvc1:for=best
  # select a video longer than 1 hour 20 minutes 30 seconds
  -sv plistDurMin="1h20m30s":for=best
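  # (editor's hypothetical addition, using the documented res/for parameters)
  # select the best 1080p video
  # -sv res="1920*":for=best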
```
```
More Help:
--select-audio
  Select matching audio streams by regular expression. See --select-video
  Examples:
  # select all audio tracks
  -sa all
  # select the best English audio track
  -sa lang=en:for=best
  # select the best 2 English (or Japanese) audio tracks
  -sa lang="ja|en":for=best2
```
```
More Help:
--select-subtitle
  Select matching subtitle streams by regular expression. See --select-video
  Examples:
  # select all subtitles
  -ss all
  # select all subtitles whose name contains "中文"
  -ss name="中文":for=all
```
```
More Help:
--custom-range
  When downloading VOD content, download only part of the segments.
  Examples:
  # download segments [0,10], i.e. 11 segments in total
  --custom-range 0-10
  # download all segments starting from index 10
  --custom-range 10-
  # download the first 100 segments
  --custom-range -99
  # download the content from minute 5 to minute 20
  --custom-range 05:00-20:00
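  # (editor's hypothetical end-to-end example, not part of the original help text;
  #  the URL is a placeholder)
  # N_m3u8DL-RE "https://example.com/master.m3u8" --auto-select --custom-range 0-10 -M format=mp4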
```

Screenshots

VOD: parallel download + automatic muxing is also supported.

Live: recording a TS live source (click to show gif); recording an MPD live source (click to show gif).

During recording, ffmpeg can mux the audio and video in real time:

ffmpeg -readrate 1 -i 2022-09-21_19-54-42_V.mp4 -i 2022-09-21_19-54-42_V.chi.m4a -c copy 2022-09-21_19-54-42_V.ts

In newer versions (>=v0.1.5) you can try enabling live-pipe-mux instead of the command above.

Note: do not enable live-pipe-mux if your network is unstable. Reading data from the pipe is handled by ffmpeg, and live data is easily lost in some environments.

In newer versions (>=v0.1.8) some of the ffmpeg options used during live-pipe-mux can be changed via the RE_LIVE_PIPE_OPTIONS environment variable: https://github.com/nilaoda/N_m3u8DL-RE/issues/162#issuecomment-1592462532

Sponsor | Cross-Platform, modern and powerful stream downloader for MPD/M3U8/ISM. English/简体中文/繁體中文. | m3u8,m3u8-downloader,mpd,hls,dash,ffmpeg,live,recorder,ism | 17 | 7 | 18 | 451 | 207 | 1 | 1 |
rocboss/paopao-ce | PaoPao

🔥 A fresh, artistic micro-community.

View Demo · Pull Request · Features

Preview

Web: for more demos, visit the official site (please don't spam).

Desktop: (back to top)

🛠 Tech stack

PaoPao is built mainly on the following excellent open-source projects/tools.

Backend: Go, Gin, Mir, Buf, gRPC, Zinc

Frontend: Naive UI, Vue.js, Vite.js, tauri

🏗 Quick start

Requirements

- Go (1.20+)
- Node.js (14+)
- MySQL (5.7+)
- Redis
- Zinc — a lightweight full-text search engine; see https://zincsearch.com/

Installation

The versions above are what PaoPao is officially developed against and are for reference only; other versions have not been fully tested.

Installation guide

Option 1. Manual installation (recommended)

Clone the repository:

```sh
git clone https://github.com/rocboss/paopao-ce.git
```

Backend

1. Import the scripts/paopao.sql file from the project root into your MySQL database.
2. Copy config.yaml.sample from the project root to config.yaml and edit the configuration following the comments.

Build the backend

Build the API service:

```sh
make build
```

Build the API service with the web frontend UI embedded:

```sh
make build
```

You can also build in slim mode, without embedding the web frontend UI:

```sh
make build TAGS='slim embed'
```

After the build, the executable can be found in the release directory:

```sh
release/paopao
```

Run the backend directly

Run the API service:

```sh
make run
```

Run the API service together with the web frontend UI:

```sh
make run TAGS='embed'
```

Tip: if you want to embed the web frontend UI, build the web frontend first (setting VITE_HOST="" in web/.env is recommended).

Use the built-in Migrate mechanism to upgrade and maintain the SQL DDL automatically:
```sh
# Add "Migration" to Features to enable the migrate feature
vim config.yaml
# file: config.yaml
# Features:
# Default: ["Base", "MySQL", "Zinc", "MinIO", "LoggerZinc", "Migration"]

# Build with the migration tag to produce an executable with migrate support
make build TAGS='migration'
release/paopao

# Or run directly with the migration tag
make run TAGS='migration'
```

Note: by default the built executable does not include the migrate feature; you must build with the migration tag to get built-in migrate support.

Frontend

Enter the frontend directory web, copy .env to .env.local, edit the backend service address and other options in .env.local, then install the dependencies:

```sh
cd ./web && cp .env .env.local
vim .env.local
yarn
```

Build the frontend:

```sh
yarn build
```

After the build finishes, the output is in the dist directory; point nginx at that directory.

Desktop

Enter the frontend directory web, copy .env to .env.local, edit the backend service address and other options in .env.local, then install the dependencies:

```sh
cd ./web && cp .env .env.local
vim .env.local
yarn
```

Build the frontend:

```sh
yarn build
```

Build the desktop client:

```sh
yarn tauri build
```

The desktop client is written with Rust + tauri, so you need tauri's dependencies installed; see https://tauri.studio/v1/guides/getting-started/prerequisites .

Option 2. Build and run with Docker

Backend:
```sh
# Build with default arguments: embeds the web UI and sets the API host to empty
docker build -t your/paopao-ce:tag .

# Embed the web UI with a custom API host
docker build -t your/paopao-ce:tag --build-arg API_HOST=http://api.paopao.info .

# Embed the web UI using the API host from the local web/.env
docker build -t your/paopao-ce:tag --build-arg USE_API_HOST=no .

# Embed the web UI using the locally built web/dist
docker build -t your/paopao-ce:tag --build-arg USE_DIST=yes .

# Build the API server only
docker build -t your/paopao-ce:tag --build-arg EMBED_UI=no .

# Run
mkdir custom && docker run -d -p 8008:8008 -v ${PWD}/custom:/app/paopao-ce/custom -v ${PWD}/config.yaml.sample:/app/paopao-ce/config.yaml your/paopao-ce:tag

# Or run the prebuilt docker image directly
mkdir custom && docker run -d -p 8008:8008 -v ${PWD}/custom:/app/paopao-ce/custom -v ${PWD}/config.yaml.sample:/app/paopao-ce/config.yaml bitbus/paopao-ce:latest
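
# (editor's hypothetical smoke test, not in the original README)
# curl -I http://127.0.0.1:8008   # the API service should respond on the published port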
```

Frontend:
```sh
cd web

# Build with default arguments
docker build -t your/paopao-ce:web .

# Build with a custom API host
docker build -t your/paopao-ce:web --build-arg API_HOST=http://api.paopao.info .

# Build using the locally built dist
docker build -t your/paopao-ce:web --build-arg USE_DIST=yes .

# Run
docker run -d -p 8010:80 your/paopao-ce:web
```

Option 3. Run with docker-compose

```sh
git clone https://github.com/rocboss/paopao-ce.git
cd paopao-ce && docker compose up -d

# visit http://localhost:8008 👀 paopao-ce
# visit http://localhost:8001 👀 RedisInsight
# visit http://localhost:8080 👀 phpMyAdmin
```

By default the config.yaml.sample configuration is used. If you need a custom configuration, copy the default config file (e.g. to config.yaml), modify it, and then sync the configuration into docker-compose.yaml as follows:

```
# file: docker-compose.yaml
...
backend:
image: bitbus/paopao-ce:latest
restart: always
depends_on:
- db
- redis
- zinc
# modify below to reflect your custom configuration
volumes:
- ./config.yaml:/app/paopao-ce/config.yaml
ports:
- 8008:8008
networks:
- paopao-network
....
```

Note: the provided docker-compose.yaml is intended for a local development/debugging environment. For production deployments exposed to the public internet, tune the configuration yourself or deploy by other means.

Development docs

Docs overview

The docs directory provides various development documents, including:
* deploy - paopao-ce deployment docs
* discuss - discussions of development-related questions
* openapi - API docs exported from the paopao-ce backend
* proposal - feature proposal documents for paopao-ce

For example, for the design positioning of paopao-ce, see docs/proposal/001-关于paopao-ce的设计定位, which briefly explains how paopao-ce positions itself.

API docs

Developers can start the Docs service locally to browse the API documentation exported from the backend.

* Add the Docs feature to config.yaml:

```yaml
...
Features:
Default: ["Base", "MySQL", "Option", "LocalOSS", "LoggerFile", "Docs"]
Docs: ["Docs:OpenAPI"]
...
```

* Add docs to TAGS when building:
```sh
make run TAGS='docs'

# visit http://127.0.0.1:8011/docs/openapi
```

Configuration

config.yaml.sample is a complete configuration template. On startup, paopao-ce reads either ./custom/config.yaml or ./config.yaml (whichever is found first takes priority).

```sh
cp config.yaml.sample config.yaml
vim config.yaml  # edit the parameters
paopao serve
```

The Features section of the config file declares which features paopao-ce enables at runtime:

```yaml
...

Features:
Default: ["Base", "MySQL", "Option", "LocalOSS", "LoggerFile"]
Develop: ["Base", "MySQL", "Option", "Sms", "AliOSS", "LoggerZinc"]
Demo: ["Base", "MySQL", "Option", "Sms", "MinIO", "LoggerZinc"]
Slim: ["Base", "Sqlite3", "LocalOSS", "LoggerFile"]
Base: ["Zinc", "Redis", "Alipay",]
Option: ["SimpleCacheIndex"]
Sms: "SmsJuhe" ...
```

As shown above:
Default/Develop/Demo/Slim are different feature suites; Base/Option are sub-suites; Sms is a parameter option for the SMS verification-code feature.

Here the Default suite means: use the features in Base/Option, plus MySQL/LocalOSS/LoggerFile — that is, the 7 features Zinc/Redis/Alipay/SimpleCacheIndex/MySQL/LocalOSS/LoggerFile are enabled. The Develop suite follows the same pattern.

Using Features:

```sh
release/paopao serve --help
Usage of release/paopao:
-features value
use special features
-no-default-features
        whether use default features

# Use the Default feature suite (the default)
release/paopao serve

# Skip the features in default and use only the features declared in develop
release/paopao serve --no-default-features --features develop

# Use the features in default, plus the sms feature
release/paopao serve --features sms

# Specify the enabled feature set manually
release/paopao serve --no-default-features --features sqlite3,localoss,loggerfile,redis
```

Currently supported features:
| Feature | Category | Status | Notes |
| ----- | ----- | ----- | ----- |
| Web | Sub-service | Alpha | Enable the Web service |
| Admin | Sub-service | WIP | Enable the Admin operations backend service |
| SpaceX | Sub-service | WIP | Enable the SpaceX service |
| Bot | Sub-service | WIP | Enable the Bot service |
| NativeOBS | Sub-service | WIP | Enable the NativeOBS service |
| Docs | Sub-service | WIP | Enable the developer docs service |
| Frontend:Web | Sub-service | Stable | Enable the standalone frontend service |
| Frontend:EmbedWeb | Sub-service | Stable | Enable the frontend embedded in the backend Web API service |
| Gorm | Database | Stable (default) | Use gorm as the database ORM; the default combination is Gorm + MySQL |
| Sqlx | Database | WIP | Use sqlx as the database ORM |
| Sqlc | Database | WIP | Use sqlc to generate ORM code automatically |
| MySQL | Database | Stable (default) | Use MySQL as the database |
| Postgres | Database | Stable | Use PostgreSQL as the database |
| Sqlite3 | Database | Stable | Use Sqlite3 as the database |
| AliOSS | Object storage | Stable (recommended) | Alibaba Cloud object storage service |
| COS | Object storage | Alpha | Tencent Cloud object storage service |
| HuaweiOBS | Object storage | Alpha | Huawei Cloud object storage service |
| MinIO | Object storage | Stable | MinIO object storage service |
| S3 | Object storage | Alpha | AWS S3-compatible object storage service |
| LocalOSS | Object storage | Alpha | Use a local directory as object storage; for development/debugging only |
| OSS:Retention | Object storage | Alpha | Create a temporary object first, then persist it, using the object store's automatic expiry feature |
| OSS:TempDir | Object storage | Alpha | Create a temporary object first, then persist it, using the object store's copy/move feature |
| Redis | Cache | Stable | Redis caching |
| SimpleCacheIndex | Cache | Deprecated | Simple caching of the square timeline |
| BigCacheIndex | Cache | Deprecated | Cache the square timeline with BigCache, per user and per page, for simple personalization |
| RedisCacheIndex | Cache | Deprecated | Cache the square timeline with Redis, per user and per page, for simple personalization |
| Zinc | Search | Stable (recommended) | Tweet search backed by the Zinc search engine |
| Meili | Search | Stable (recommended) | Tweet search backed by the Meilisearch search engine |
| Bleve | Search | WIP | Tweet search backed by the Bleve search engine |
| Sentry | Monitoring | Alpha | Error tracking and performance monitoring with Sentry |
| LoggerFile | Logging | Stable | Write logs to files |
| LoggerZinc | Logging | Stable (recommended) | Write logs to Zinc |
| LoggerMeili | Logging | Alpha | Write logs to Meilisearch |
| LoggerOpenObserve | Logging | Alpha | Write logs to OpenObserve |
| Friendship | Relationship model | Builtin | Weak-tie friend mode, similar to WeChat Moments |
| Followship | Relationship model | Builtin | Follower mode, similar to Twitter's Follow model |
| Lightship | Relationship model | Deprecated | Open mode: all tweets are publicly visible |
| Alipay | Payment | Stable | Enable the wallet feature based on the Alipay Open Platform |
| Sms | SMS verification | Stable | Enable SMS verification codes to check that a phone belongs to the registrant when binding; if disabled, any verification code is accepted when binding a phone |
| Docs:OpenAPI | Developer docs | Stable | Enable the openapi docs feature, serving web API docs (visit http://127.0.0.1:8008/docs/openapi) |
| Pyroscope | Performance | Alpha | Enable Pyroscope for performance debugging |
| Pprof | Performance | Alpha | Enable Pprof to collect profile data |
| PhoneBind | Other | Stable | Phone binding |
| UseAuditHook | Other | Alpha | Use the audit hook feature |
| DisableJobManager | Other | Alpha | Disable the JobManager feature |
| Web:DisallowUserRegister | Feature | Stable | Disallow user registration |

See features-status for details on feature status.

Setting up the dependent services

Zinc search engine:

Run Zinc
```sh
# Create the directory that will hold zinc data
mkdir -p data/zinc/data

# Run zinc with Docker
docker run -d --name zinc --user root -v ${PWD}/data/zinc/data:/data -p 4080:4080 -e ZINC_FIRST_ADMIN_USER=admin -e ZINC_FIRST_ADMIN_PASSWORD=admin -e DATA_PATH=/data public.ecr.aws/zinclabs/zinc:latest

# Check the zinc status
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
41465feea2ff getmeili/meilisearch:v0.27.0 "tini -- /bin/sh -c …" 20 hours ago Up 20 hours 0.0.0.0:7700->7700/tcp paopao-ce-meili-1
7daf982ca062 public.ecr.aws/prabhat/zinc:latest "/go/bin/zinc" 3 weeks ago Up 6 days 0.0.0.0:4080->4080/tcp zinc

# Run with docker compose
docker compose up -d zinc

# visit http://localhost:4080 to open the built-in management UI
```

Modify the Zinc configuration
```yaml
# add Zinc and LoggerZinc to Features
Features:
Default: ["Zinc", "LoggerZinc", "Base", "Sqlite3", "BigCacheIndex","MinIO"]
...
LoggerZinc: # write logs to Zinc
  Host: 127.0.0.1:4080 # the zinc host reachable from paopao-ce
  Index: paopao-log
  User: admin
  Password: admin
  Secure: False # set to True if zinc is accessed over https
...
Zinc: # Zinc search configuration
  Host: 127.0.0.1:4080
  Index: paopao-data
  User: admin
  Password: admin
  Secure: False
```

Meilisearch search engine:

Run Meili
```sh
mkdir -p data/meili/data

# Run with Docker
docker run -d --name meili -v ${PWD}/data/meili/data:/meili_data -p 7700:7700 -e MEILI_MASTER_KEY=paopao-meilisearch getmeili/meilisearch:v0.29.0

# visit http://localhost:7700 to open the built-in search frontend UI

# Run with docker compose (uncomment the meili entries in docker-compose.yaml first)
docker compose up -d meili

# Check the meili status
docker compose ps
NAME COMMAND SERVICE STATUS PORTS
paopao-ce-meili-1 "tini -- /bin/sh -c …" meili running 0.0.0.0:7700->7700/tcp
```

Modify the Meili configuration
```yaml
# add Meili and LoggerMeili to Features
Features:
Default: ["Meili", "LoggerMeili", "Base", "Sqlite3", "BigCacheIndex","MinIO"]
...
LoggerMeili: # write logs to Meili
  Host: 127.0.0.1:7700
  Index: paopao-log
  ApiKey: paopao-meilisearch
  Secure: False
  MinWorker: 5 # minimum background workers, range [5, 100], default 5
  MaxLogBuffer: 100 # maximum buffered log entries, range [10, 10000], default 100
...
Meili: # Meili search configuration
  Host: 127.0.0.1:7700 # the meili host reachable from paopao-ce
  Index: paopao-data
  ApiKey: paopao-meilisearch
  Secure: False # set to True if meili is accessed over https
```

MinIO object storage service

Run MinIO
```sh
mkdir -p data/minio/data

# Run with Docker
docker run -d --name minio -v ${PWD}/data/minio/data:/data -p 9000:9000 -p 9001:9001 -e MINIO_ROOT_USER=minio-root-user -e MINIO_ROOT_PASSWORD=minio-root-password -e MINIO_DEFAULT_BUCKETS=paopao:public bitnami/minio:latest

# Run with docker compose (uncomment the minio entries in docker-compose.yaml first)
docker compose up -d minio
```

Modify the MinIO configuration
```yaml
# add MinIO to Features
Features:
Default: ["MinIO", "Meili", "LoggerMeili", "Base", "Sqlite3", "BigCacheIndex"]
...
MinIO: # MinIO storage configuration
  AccessKey: Q3AM3UQ867SPQQA43P2F # AccessKey/SecretKey must be created manually in the minio admin UI at http://127.0.0.1:9001
  SecretKey: zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG
  Secure: False
  Endpoint: 127.0.0.1:9000 # change this to match the host where minio is deployed
  Bucket: paopao # as above, create this bucket in the admin UI and grant external read/write access
  Domain: 127.0.0.1:9000 # the address for external access to minio (set this to a publicly reachable minio host if you want external access)
...
```

OpenObserve — log collection, metrics and tracing

Run OpenObserve
```sh
# Run with Docker
mkdir data && docker run -v $PWD/data:/data -e ZO_DATA_DIR="/data" -p 5080:5080 \
-e ZO_ROOT_USER_EMAIL="root@paopao.info" -e ZO_ROOT_USER_PASSWORD="paopao-ce" \
  public.ecr.aws/zinclabs/openobserve:latest

# Run with docker compose (uncomment the openobserve entries in docker-compose.yaml first)
docker compose up -d openobserve

# visit http://localhost:5080
```

Modify the LoggerOpenObserve configuration
```yaml
# add LoggerOpenObserve to Features
Features:
Default: ["Meili", "LoggerOpenObserve", "Base", "Sqlite3", "BigCacheIndex"]
...
LoggerOpenObserve: # write logs to OpenObserve
Host: 127.0.0.1:5080
Organization: paopao-ce
Stream: default
User: root@paopao.info
Password: tiFEI8UeJWuYA7kN
Secure: False
...
```

Pyroscope profiling

Run Pyroscope
```sh
mkdir -p data/minio/data

# Run with Docker
docker run -it -p 4040:4040 pyroscope/pyroscope:latest server

# Run with docker compose (uncomment the pyroscope entries in docker-compose.yaml first)
docker compose up -d pyroscope

# visit http://localhost:4040
```yaml
# add Pyroscope to Features
Features:
Default: ["Meili", "LoggerMeili", "Base", "Sqlite3", "BigCacheIndex", "Pyroscope"]
...
Pyroscope: # Pyroscope configuration
AppName: "paopao-ce"
Endpoint: "http://localhost:4040" # Pyroscope server address
AuthToken: # Pyroscope authentication token
Logger: none # Pyroscope logger (standard | logrus | none)
...
```

Source branch management

Main repository: github.com/rocboss/paopao-ce

```bash
git branch
main
beta
dev
feature/bleve
feature/followship
feature/mir
feature/localoss
jc/alimy
r/paopao-plus
r/paopao-pro
x/sqlc
x/sqlx
```

Branch descriptions

| Name | Description | Notes |
| ----- | ----- | ----- |
| main | Main branch | main is the primary branch and the stable release branch of paopao-ce; only stable code that has passed internal testing without major bugs is promoted here. It evolves mainly from the beta branch and, in principle, only accepts bug-fix PRs. RC and stable releases are made from main. |
| beta | Public beta branch | beta is the public beta branch, the candidate for promotion to main. It evolves mainly from alpha and accepts bug-fix and feature-polish PRs, but in principle no new-feature PRs. Beta releases are made from beta. |
| alpha | Internal testing branch | alpha is the internal testing branch, the candidate for promotion to beta. It evolves mainly from dev and accepts both bug-fix and new-feature PRs. Once the branch reaches a milestone, all new features are frozen and the code is merged into beta for the next stage. Alpha releases are made from alpha. |
| dev | Development branch | dev is the development branch, updated frequently; it accepts new-feature, code-improvement and bug-fix PRs. New-feature PRs should be submitted to dev first; after a phase of bug fixing / feature development / optimization is frozen, the code is merged into alpha. |
| feature/* | Feature branches | feature/* branches carry new features and are usually forked from dev. A feature branch focuses only on developing/polishing that feature; when it is close to internal testing, a PR is submitted to dev for review/merge. Once the feature reaches beta, the branch can in principle be deleted, or kept until the stable release. Such a branch accepts only bug-fix/optimization PRs for its own feature. |
| jc/* | Maintainer branches | jc/* are the maintainers' development branches, usually containing local optimizations or bug fixes. They may sometimes be merged directly into dev/beta, but in principle never directly into main. |
| x/* | Experimental branches | x/* are technical experiment branches. When a technology needs a concrete implementation and real-world evaluation before adoption, it is tried here; if it suits paopao-ce, a feature/* branch is forked and the feature is introduced. Typically a bolder idea is forked from dev into an x/* branch, explored and evaluated, and then either dropped or brought into paopao-ce. |
| t/* | Temporary branches | t/* are temporary release branches. When beta has evolved to the last beta before a stable release (e.g. v0.2.0-beta), a t/* branch is forked from beta to open a PR against main for review; after the PR is reviewed and merged into main, the branch can be deleted. This exists because reviewing the merge into main can take a while, while dev urgently needs to advance into beta to publish the next alpha for internal testing — it frees the way for the next test release. |
| r/* | Release branches | r/* are branches for the different release editions, each with its own focus; choose the edition that suits your needs. |

Release branch descriptions

| Name | Description | Maintainer | Notes |
| ----- | ----- | ----- | ----- |
| paopao-ce | paopao-ce main edition | ROC | The data logic layer uses gorm as its ORM framework and supports MySQL/PostgreSQL/Sqlite3. |
| r/paopao-ce | Preview of the main edition | ROC, 北野 | The data logic layer uses gorm as its ORM framework and supports MySQL/PostgreSQL/Sqlite3. The code is newer than main and serves as a forward-looking preview of the main edition. |
| r/paopao-ce-plus | paopao-ce-plus edition | 北野 | The data logic layer uses sqlx as its ORM framework, focusing on better-optimized queries for MySQL/PostgreSQL/Sqlite3 to improve retrieval efficiency. Developers familiar with sqlx are encouraged to base secondary development on this edition. |
| r/paopao-ce-pro | paopao-ce-pro edition | 北野 | The data logic layer uses sqlc as an SQL statement generator to produce ORM code automatically, with query optimizations targeting MySQL/PostgreSQL specifically; developers familiar with sqlc can base secondary development on this edition. (Note: this branch currently supports only PostgreSQL via pgx-v5; MySQL/TiDB support may follow.) |
| r/paopao-ce-xtra | paopao-ce-xtra edition | 北野 | A combination of r/paopao-ce, r/paopao-ce-plus and r/paopao-ce-pro. |
| r/paopao-ce-mini | paopao-ce-mini edition | 北野 | The minimal usable edition of paopao-ce, focused on personal deployment with the simplest one-click setup. |

Branch evolution diagram

Deployed sites

Official: paopao.info

For details on deployed sites, see deployed-sites. Site owners are welcome to add their deployed PaoPao instances to the deployed-sites list.

Collaborators' paopao accounts
| ----- | ----- | ----- |
| ROC | ROC | ROC |
| 北野 | Michael Li | alimy |
| orzi!| orzi! || 其他说明 建议后端服务使用 supervisor 守护进程,并通过 nginx 反向代理后,提供API给前端服务调用。 短信通道使用的 聚合数据 ,如果申请不下来,可以考虑替换其他服务商。 代码结构比较简单,很方便扩展,开发文档请参阅 docs . 👯♀️ 贡献 paopao-ce 是一个利用 业余时间 本着 "Just for fun just do it." 的心态 持续有序 开发/优化/维护 的开源项目,没有KPI考核、没有Roadmap进度压力、没有技术支持日程安排,或许有些许不足之处,但是重在精神可嘉。 借用网络中的话 "F*k talk, f*k of tech innovation, Shut up and show me your code." 一切都因更好的体验,一切都是为了爱好,一切都在代码里;期待老铁们加入,一起开发、一起折腾、一起快乐。 喜欢的朋友记得给个Star,欢迎贡献PR。 License Distributed under the MIT License. See LICENSE for more information. | 🔥An artistic "twitter like" community built on gin+zinc+vue+ts 清新文艺微社区 | forum,go,vue3,bbs,gin,zinc,naive,twitter | 58 | 14 | 505 | 1,516 | 0 | 36 | 1 |
FlagAI-Open/FlagAI | 简体中文 FlagAI (Fast LArge-scale General AI models) is a fast, easy-to-use and extensible toolkit for large-scale model. Our goal is to support training, fine-tuning, and deployment of large-scale models on various downstream tasks with multi-modality. Why should I use FlagAI? Quickly Download Models via API FlagAI provides an API that allows you to quickly download pre-trained models and fine-tune them on a wide range of datasets collected from SuperGLUE and CLUE benchmarks for both Chinese and English text. FlagAI now supports over 30 mainstream models, including Language Model Aquila , multilingual text and image representation model AltCLIP , text-to-image generation model AltDiffusion , WuDao GLM (with a maximum of 10 billion parameters), EVA-CLIP , OPT , BERT , RoBERTa , GPT2 , T5 , ALM , and models from Huggingface Transformers , etc. Parallel train with fewer than 10 lines of code Backed by the four most popular data/model parallel libraries -- PyTorch , Deepspeed , Megatron-LM , BMTrain -- FlagAI allows for seamless integration between them, enabling users to parallel their training/testing process with fewer than ten lines of code. Conveniently use the few-shot learning toolkits FlagAI also provides prompt-learning toolkit for few-shot tasks. Particularly good at Chinese tasks These models can be applied to (Chinese/English) Text, for tasks like text classification, information extraction, question answering, summarization, and text generation, with a particular focus on Chinese tasks. Toolkits and Pre-trained Models The code is partially based on GLM , Transformers , timm and DeepSpeedExamples . Toolkits | Name | Description | Examples |
|:-------------- |:---------- |:------------------------------------------------------ |
| GLM_custom_pvp | Customizing PET templates | README.md |
| GLM_ptuning | p-tuning tool | —— |
| BMInf-generate | Accelerating generation | README.md |

Pre-trained Models

| Model | Task | Train | Finetune | Inference/Generate | Examples |
| :---------------- | :------- | :-- |:-- | :-- | :--------------------------------------------- |
| Aquila | Natural Language Processing | ✅ | ✅ | ✅ | README.md |
| ALM | Arabic Text Generation | ✅ | ❌ | ✅ | README.md |
| AltCLIP | Image-Text Matching | ✅ | ✅ | ✅ | README.md |
| AltCLIP-m18 | Image-Text Matching | ✅ | ✅ | ✅ | README.md |
| AltDiffusion | Text-to-Image Generation | ❌ | ❌ | ✅ | README.md |
| AltDiffusion-m18 | Text-to-Image Generation,supporting 18 languages | ❌ | ❌ | ✅ | README.md |
| BERT-title-generation-english | English Title Generation | ✅ | ❌ | ✅ | README.md |
| CLIP | Image-Text Matching | ✅ | ❌ | ✅ | —— |
| CPM3-finetune | Text Continuation | ❌ | ✅ | ❌ | —— |
| CPM3-generate | Text Continuation | ❌ | ❌ | ✅ | —— |
| CPM3_pretrain | Text Continuation | ✅ | ❌ | ❌ | —— |
| CPM_1 | Text Continuation | ❌ | ❌ | ✅ | README.md |
| EVA-CLIP | Image-Text Matching | ✅ | ✅ | ✅ | README.md |
| Galactica | Text Continuation | ❌ | ❌ | ✅ | —— |
| GLM-large-ch-blank-filling | Blank Filling | ❌ | ❌ | ✅ | TUTORIAL |
| GLM-large-ch-poetry-generation | Poetry Generation | ✅ | ❌ | ✅ | TUTORIAL |
| GLM-large-ch-title-generation | Title Generation | ✅ | ❌ | ✅ | TUTORIAL |
| GLM-pretrain | Pre-Train | ✅ | ❌ | ❌ | —— |
| GLM-seq2seq | Generation | ✅ | ❌ | ✅ | —— |
| GLM-superglue | Classification | ✅ | ❌ | ❌ | —— |
| GPT-2-text-writting | Text Continuation | ❌ | ❌ | ✅ | TUTORIAL |
| GPT2-text-writting | Text Continuation | ❌ | ❌ | ✅ | —— |
| GPT2-title-generation | Title Generation | ❌ | ❌ | ✅ | —— |
| OPT | Text Continuation | ❌ | ❌ | ✅ | README.md |
| RoBERTa-base-ch-ner | Named Entity Recognition | ✅ | ❌ | ✅ | TUTORIAL |
| RoBERTa-base-ch-semantic-matching |Semantic Similarity Matching | ✅ | ❌ | ✅ | TUTORIAL |
| RoBERTa-base-ch-title-generation | Title Generation | ✅ | ❌ | ✅ | TUTORIAL |
| RoBERTa-faq | Question-Answer | ❌ | ❌ | ✅ | README.md |
| Swinv1 | Image Classification | ✅ | ❌ | ✅ | —— |
| Swinv2 | Image Classification | ✅ | ❌ | ✅ | —— |
| T5-huggingface-11b | Train | ✅ | ❌ | ❌ | TUTORIAL |
| T5-title-generation | Title Generation | ❌ | ❌ | ✅ | TUTORIAL |
| T5-flagai-11b | Pre-Train | ✅ | ❌ | ❌ | —— |
| ViT-cifar100 | Pre-Train | ✅ | ❌ | ❌ | —— |

More examples in ./examples. More tutorials in ./docs.

Contributing

Thanks for your interest in contributing! There are many ways to get involved;
start with our contributor guidelines and then
check these open issues for specific tasks. Contact us Welcome to raise your questions or feature requests on GitHub Issues , and share your experience on the Discussions board. Official email: open.platform@baai.ac.cn. Zhihu: FlagAI Scan the qrcode to join the WeChat group for communication: Quick Start We provide many models which are trained to perform different tasks. You can load these models by AutoLoader to make prediction. See more in FlagAI/quickstart . Requirements and Installation Python version >= 3.8 PyTorch version >= 1.8.0 [Optional] For training/testing models on GPUs, you'll also need to install CUDA and NCCL To install FlagAI with pip: shell
pip install -U flagai [Optional] To install FlagAI and develop locally: shell
git clone https://github.com/FlagAI-Open/FlagAI.git
python setup.py install [Optional] For faster training, install NVIDIA's apex git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./ [Optional] For ZeRO optimizers, install DEEPSPEED (>= 0.7.7) git clone https://github.com/microsoft/DeepSpeed
cd DeepSpeed
DS_BUILD_CPU_ADAM=1 DS_BUILD_AIO=1 DS_BUILD_UTILS=1 pip install -e .
ds_report  # check the DeepSpeed status

[Optional] For BMTrain training, install BMTrain (>= 0.2.2)

git clone https://github.com/OpenBMB/BMTrain
cd BMTrain
python setup.py install

[Optional] For BMInf low-resource inference, install BMInf

```
pip install bminf
```

[Optional] For Flash Attention, install [Flash-attention](https://github.com/HazyResearch/flash-attention) (>=1.0.2)

```
pip install flash-attn
``` [Tips] For single-node docker environments, we need to set up ports for your ssh. e.g., root@127.0.0.1 with port 711
``` vim ~/.ssh/config
Host 127.0.0.1
Hostname 127.0.0.1
Port 7110
User root
``` [Tips] For multi-node docker environments, generate ssh keys and copy the public key to all nodes (in ~/.ssh/ )
``` ssh-keygen -t rsa -C "xxx@xxx.com"
``` Load model and tokenizer We provide the AutoLoad class to load the model and tokenizer quickly, for example:
```python
from flagai.auto_model.auto_loader import AutoLoader

auto_loader = AutoLoader(
task_name="title-generation",
model_name="BERT-base-en"
)
model = auto_loader.get_model()
tokenizer = auto_loader.get_tokenizer()
```
This example is for the `title-generation` task; you can also target other tasks by modifying the `task_name`.
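For instance, a hypothetical variation (an editor's sketch assuming the same AutoLoader API; the task/model names and the class_num argument mirror the NER example later in this README):

```python
from flagai.auto_model.auto_loader import AutoLoader

# named entity recognition instead of title generation
target = ["O", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-PER", "I-PER"]
auto_loader = AutoLoader("ner",
                         model_name="RoBERTa-base-ch",
                         load_pretrain_params=True,
                         class_num=len(target))
model = auto_loader.get_model()
tokenizer = auto_loader.get_tokenizer()
```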
Then you can use the model and tokenizer to fine-tune or test.

Examples

1. Predictor

We provide the Predictor class to predict for different tasks, for example:

```python
from flagai.model.predictor.predictor import Predictor
predictor = Predictor(model, tokenizer)
test_data = [
"Four minutes after the red card, Emerson Royal nodded a corner into the path of the unmarked Kane at the far post, who nudged the ball in for his 12th goal in 17 North London derby appearances. Arteta's misery was compounded two minutes after half-time when Kane held the ball up in front of goal and teed up Son to smash a shot beyond a crowd of defenders to make it 3-0.The goal moved the South Korea talisman a goal behind Premier League top scorer Mohamed Salah on 21 for the season, and he looked perturbed when he was hauled off with 18 minutes remaining, receiving words of consolation from Pierre-Emile Hojbjerg.Once his frustrations have eased, Son and Spurs will look ahead to two final games in which they only need a point more than Arsenal to finish fourth.",
]

for text in test_data:
print(
predictor.predict_generate_beamsearch(text,
out_max_length=50,
          beam_size=3))
```
This example is for the seq2seq task, where we can get beam-search results by calling the `predict_generate_beamsearch` function. In addition, we also support prediction for tasks such as NER and title generation.

2. NER

```python
from flagai.auto_model.auto_loader import AutoLoader
from flagai.model.predictor.predictor import Predictor

task_name = "ner"
model_name = "RoBERTa-base-ch"
target = ["O", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-PER", "I-PER"]
maxlen = 256

auto_loader = AutoLoader(task_name,
model_name=model_name,
load_pretrain_params=True,
                         class_num=len(target))

model = auto_loader.get_model()
tokenizer = auto_loader.get_tokenizer()

predictor = Predictor(model, tokenizer)

test_data = [
"6月15日,河南省文物考古研究所曹操高陵文物队公开发表声明承认:“从来没有说过出土的珠子是墓主人的",
"4月8日,北京冬奥会、冬残奥会总结表彰大会在人民大会堂隆重举行。习近平总书记出席大会并发表重要讲话。在讲话中,总书记充分肯定了北京冬奥会、冬残奥会取得的优异成绩,全面回顾了7年筹办备赛的不凡历程,深入总结了筹备举办北京冬奥会、冬残奥会的宝贵经验,深刻阐释了北京冬奥精神,对运用好冬奥遗产推动高质量发展提出明确要求。",
"当地时间8日,欧盟委员会表示,欧盟各成员国政府现已冻结共计约300亿欧元与俄罗斯寡头及其他被制裁的俄方人员有关的资产。",
"这一盘口状态下英国必发公司亚洲盘交易数据显示博洛尼亚热。而从欧赔投注看,也是主队热。巴勒莫两连败,",
]

for t in test_data:
entities = predictor.predict_ner(t, target, maxlen=maxlen)
result = {}
for e in entities:
if e[2] not in result:
result[e[2]] = [t[e[0]:e[1] + 1]]
else:
result[e[2]].append(t[e[0]:e[1] + 1])
print(f"result is {result}")
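    # Illustrative note (editor's addition, not from the original README):
    # `result` maps each entity type (e.g. "LOC", "ORG", "PER") to the list
    # of spans extracted from `t`; the exact output depends on the checkpoint.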
```

3. Semantic Matching example

```python
from flagai.auto_model.auto_loader import AutoLoader
from flagai.model.predictor.predictor import Predictor

maxlen = 256

auto_loader = AutoLoader("semantic-matching",
model_name="RoBERTa-base-ch",
load_pretrain_params=True,
class_num=2)
model = auto_loader.get_model()
tokenizer = auto_loader.get_tokenizer()

predictor = Predictor(model, tokenizer)

test_data = [["后悔了吗", "你有没有后悔"], ["打开自动横屏", "开启移动数据"],
["我觉得你很聪明", "你聪明我是这么觉得"]] for text_pair in test_data:
    print(predictor.predict_cls_classifier(text_pair))
```

LICENSE

The majority of FlagAI is licensed under the Apache 2.0 license; however, portions of the project are available under separate license terms:
- Megatron-LM is licensed under the Megatron-LM license
- GLM is licensed under the MIT license
- AltDiffusion is licensed under the CreativeML Open RAIL-M license

News
- [9 June 2023] release v1.7.0, support Aquila #324
- [31 Mar 2023] release v1.6.3, support AltCLIP-m18 #303 and AltDiffusion-m18 #302
- [17 Mar 2023] release v1.6.2, support application of new optimizers #266, and added a new gpt model name 'GPT2-base-en' for English
- [2 Mar 2023] release v1.6.1, support Galactica model #234; BMInf, a low-resource inference package #238, and examples for p-tuning #227
- [12 Jan 2023] release v1.6.0, support a new parallel lib called BMTrain and integrate Flash Attention to speed up training of BERT and ViT models, examples in FlashAttentionBERT and FlashAttentionViT. Also add the contrastive-search text generation method SimCTG and DreamBooth finetuning based on AltDiffusion, examples in AltDiffusionNaruto.
- [28 Nov 2022] release v1.5.0, support 1.1B EVA-CLIP and ALM: A large Arabic Language Model based on GLM, examples in ALM
- [10 Nov 2022] release v1.4.0, support AltCLIP: Altering the Language Encoder in CLIP for Extended Language Capabilities, examples in AltCLIP and AltDiffusion
- [29 Aug 2022] release v1.3.0, added the CLIP module and redesigned tokenizer APIs in #81
- [21 Jul 2022] release v1.2.0, ViTs are supported in #71
- [29 Jun 2022] release v1.1.0, support OPT downloading and inference/fine-tuning #63
- [17 May 2022] made our first contribution in #1

Platforms supported

Misc

↳ Stargazers, thank you for your support!
↳ Forkers, thank you for your support!
↳ Star History

![Star History Chart](https://api.star-history.com/svg?repos=FlagAI-Open/FlagAI&type=Date) | FlagAI (Fast LArge-scale General AI models) is a fast, easy-to-use and extensible toolkit for large-scale model. | [] | 37 | 35 | 354 | 1,416 | 17 | 3 | 2 |
yandex/YaLM-100B | YaLM 100B

YaLM 100B is a GPT-like neural network for generating and processing text. It can be used freely by developers and researchers from all over the world.

The model leverages 100 billion parameters. It took 65 days to train the model on a cluster of 800 A100 graphics cards and 1.7 TB of online texts, books, and countless other sources in both English and Russian.

Training details and best practices on acceleration and stabilization can be found in the Medium (English) and Habr (Russian) articles.

We used DeepSpeed to train the model and drew inspiration from the Megatron-LM example. However, the code in this repo is not the same code that was used to train the model; rather, it is the stock example from the DeepSpeed repo with the minimal changes needed to run inference with our model.

Setup

Make sure to have 200GB of free disk space before downloading weights. The model (code is based on microsoft/DeepSpeedExamples/Megatron-LM-v1.1.5-ZeRO3) is supposed to run on multiple GPUs with tensor parallelism. It was tested on 4 (A100 80g) and 8 (V100 32g) GPUs, but it can work with other configurations that total ≈200GB of GPU memory and divide the weight dimensions correctly (e.g. 16, 64, 128).

Downloading checkpoint

Run bash download/download.sh to download model weights and vocabulary. By default, weights will be downloaded to ./yalm100b_checkpoint/weights/ and vocabulary to ./yalm100b_checkpoint/vocab/. As another option, you can clone our HF repo and pull the checkpoint.

Docker

We published an image on Docker Hub; it can be pulled with docker/pull.sh. It is compatible with A100 and V100. Alternatively, you can build the docker image from source using docker/build.sh (which simply builds the image from docker/Dockerfile). To run the container, use docker/run.sh (volumes, name and other parameters can be changed).

Usage

You can start with the following scripts:
* examples/generate_interactive.sh : interactive generation from command line, the simplest way to try the model.
* examples/generate_conditional_sampling.sh : conditional generation with sampling strategy. Top-p is used by default, feel free to change temperature or use top-k. Input is jsonlines (example: examples/example_cond_input.json ), output will be the same jsonlines with generated text field added to each line.
* examples/generate_conditional_greedy.sh : same as previous, but generation is greedy. Suitable for solving problems with few-shot.
* examples/generate_unconditional.sh : unconditional generation. No input is used; output will be jsonlines.

License

The model is published under the Apache 2.0 license, which permits both research and commercial use. Megatron-LM is licensed under the Megatron-LM license.

Training details

Dataset composition

The dataset used for the training of YaLM-100B is comprised of the following parts (rough percentages are measured in tokens seen by the model):

- 25% The Pile — open English dataset by the Eleuther AI team
- 75% Texts in Russian collected by our team (percentages of the whole dataset are given):
  - 49% Russian web pages from the Yandex Search index, filtered from ~100Tb to ~1Tb by the following heuristics:
    - LSH Deduplication — clusters of similar texts were truncated to just one text each
    - Length filtration — too short or too long texts, or texts with too few natural sentences, were discarded
    - Entropy filtration — texts with too high or too low entropy were discarded
    - Domain filtration — domains with repetitive texts (like online retail) were discarded
    - Classifier filtration — a dataset of good texts was collected, in a manner similar to WebText, from pages linked in Russian tweets with at least one reply. A classifier was then trained to distinguish those good texts from random pages in the dataset, and texts from the original crawled dataset with low classifier scores were discarded
  - 12% News from various sources from the Yandex Search index
  - 10% Books from the dataset used in the Russian Distributional Thesaurus
  - 3% Misc texts from the Taiga Dataset
  - 1.5% Dialogues from social media, preprocessed in a manner similar to how Reddit is processed in The Pile
  - 0.5% Russian portion of Wikipedia

Some subsets were traversed up to 3 times during the training.

Training process

The model was trained on a cluster of 800 A100 GPUs for ~65 days. In that time it consumed 300B tokens. You can see a TensorBoard with the LR and ramp-up schedule, training metrics and our "thermometers" on the HF page. | Pretrained language model with 100B parameters | [] | 0 | 13 | 5 | 10 | 16 | 1 | 0 |
ansh/jiffyreader.com | ========================================================================================================== THIS PROJECT IS NO LONGER MAINTAINED. THIS REPOSITORY IS AN ARCHIVE FOR EDUCATIONAL PURPOSES ONLY. ========================================================================================================== Jiffy Reader A Browser Extension for Faster Reading on ANY website! How it works Below is a screenshot demonstrating how the extension works by bolding out the initial parts of all text on any page when clicked. There are toggles and sliders to customize it to your preference so you can enjoy your time reading. You must agree this is awesome right? The best way to install this extension is to follow the instructions below. Table of Contents Jiffy Reader How it works Table of Contents Installation Instructions Chrome Firefox Safari Firefox Nightly / Fennec F-droid / Mull (Android) Opera Edge Android (kiwi Browser) Bookmarklet First Installation Welcome Notes on the extension Notes on this page FAQ How to access the extension settings/popup ui Desktop Android (kiwi browser) What are the functions of the buttons and sliders Global preferences button Site preferences button Enable reading mode button Saccades interval slider Fixation strength slider Fixation edge opacity Saccades colors Saccades styles Line height buttons Always on/off button Reset Defaults PDF and Epub support Google Play Books Native (Epub) Upload Epubs to Google PlayBooks (Epubs) PFD Support (convert pdf files to epub or html) Google Docs support (publish method) Google Docs support (html download method) Enable file url permissions (chrome html) Customizations Shortcut What is Faster Reading? Reporting Issues, bugs and feature request How to Contribute Help with Translations Working with the translation files. Submitting your translations Supported languages Development Configure vscode to run the project when it is opened Release a new version Installation Instructions Chrome Download via Chrome Store or follow the instructions below Click here to download the latest jiffyReader-chrome.zip release Extract the file Open Chrome Enter chrome://extensions in the address bar Enable developer mode with the toggle on the top right side of the page if it is not enabled already Click load unpacked on the left side of the page Find and select the extracted folder, this extension should now be installed To pin the extension, click the puzzle icon on the top right of Chrome, then pin the extension . The extensions default reading mode is set to off when installed See the faq section on how to use the extension, customize it (global and per site settings) and excluding sites from always on Firefox Download via the Mozilla Firefox Plugin/Add-on Store or follow the instructions below Download jiffyReader-firefox.xpi by right clicking here and choose Save link as to download the latest jiffyReader-firefox.xpi release Open Firefox Enter about:debugging#/runtime/this-firefox in the address bar Click Load Temporary Add-on... and navigate to the path of the downloaded jiffyReader-firefox.xpi and select it to install it The extensions default reading mode is set to off when installed See the faq section on how to use the extension, customize it (global and per site settings) and excluding sites from always on Firefox will remove the extension when the browser is closed if the extension is not downloaded from the store. Safari Download via the App Store here or TestFlight here . This works for both macOS and iOS. 
We are working on getting it approved to download directly via the App Store. If you want to build the app yourself, follow the instructions below We will be converting the web extension for Safari usage. This will require a macOS computer and the latest version of XCode installed. Use git clone to clone the Jiffy Reader repo locally. Run pnpm build:xcode or pnpm build:xcode:all to convert the extension. Open the Safari app on your Mac and make sure to click Develop -> Allow Unsigned Extensions in the top menu bar. Open the project in XCode and click run! Firefox Nightly / Fennec F-droid / Mull (Android) Go to settings Scroll to the bottom and select About {browser name} Tap the browser logo five times Go back to settings and in the Advanced section, select Custom Add-on collection Type 17432789 as the collection owner (user ID) Type jiffyreader as the collection name. The browser will close to apply the settings. Go to Add-ons/Add-ons manager to install the add-on. For convenience you may want to enable the extension by default by clicking on the Turn On Always button in the add-on's menu. Opera Download: Click here to download the latest jiffyReader-opera.crx release Extract the file Open Opera Enter opera://extensions in the address bar Enable developer mode with the toggle on the top right side of the page if it is not enabled already Click load unpacked on the left side of the page Find and select the extracted folder, this extension should now be installed and listed on the screen To pin the extension, click the cube icon on the top right of Chrome, then pin the extension . The extensions default reading mode is set to off when installed See the faq section on how to use the extension, customize it (global and per site settings) and excluding sites from always on Edge Please follow the steps for chrome above Android (kiwi Browser) Download the kiwi browser if you do not already have it installed Open kiwi browser Navigate to the extension listing on Chrome Store and Click the Add to Chrome button to install the extension The extensions default reading mode is set to off when installed See the faq section on how to use the extension, customize it (global and per site settings) and excluding sites from always on Bookmarklet (Note: Bookmarklet is not in active support and may break when new updates are released)
1. To install the bookmarklet, head over to this link First Installation Welcome Thank you for installing JiffyReader. Read the 8 points below which will help you the most in getting you started with JiffyReader Notes on the extension Why did the browser open this page? because this is the first time you installed JiffyReader. The extension is on the default settings and optimal for most websites. Changes to settings are saved instantly and can be restored to the default optimal settings by clicking the Reset Settings button at the bottom of the extension. If confused on how to use the buttons and sliders check out the section on what are the functions of the buttons and sliders . Notes on this page You can find important resources such as the FAQ section , how to contribute and how to report issues on this page. You can always get to this page by clicking the FAQ link in the footer of the extension popup. For further help, check the table of contents or open an issue ticket using the links at the very top of this page. You can close this page and return at anytime to find more help or clarification. FAQ How to access the extension settings/popup ui Desktop Click on the (on chrome: puzzle icon | on edge puzzle icon | on opera cube icon | on brave puzzle icon ) Note: Firefox will auto pin the extension Click on the pin icon next to jiffy reader to pin it next the address bar Click on the pinned icon to access the settings/popup menu Android (kiwi browser) Click on the more (3 vertical dots) button and scroll down Click on Jiffy Reader to open the settings/popup ui What are the functions of the buttons and sliders Global preferences button clicking this button enters global mode where your preferences are saved and applied to all other sites when you open them afterwards Site preferences button Clicking this buttons activates and saves preferences only for the site you are presently on. Any changes you make with the other buttons and sliders persist for only this site. Enable reading mode button Click this button to turn on/off the emphasis (bionification) of the text on the page. Press ALT + B on chrome and ALT + W on firefox to achieve the same effect as clicking this button on chrome. see the shortcut-section for more info. Saccades interval slider Use this slider to set how many words are left untouched/unbolded or un-emphasized after the first emphasized word or the first word. 0 means there will not be a single or any untouched words, all words are emphasized. 1 means exactly 1 word is left untouched before the next successive emphasized word. 2 means 2 words are left untouched so does 3 and 4. Fixation strength slider Use this to control how much or how little of each word is emphasized you your liking . Fixation edge opacity Use this to control how faint(weakly visible) or strongly visible you want the edge(un-emphasized) part of words to appear. Saccades colors Use this to select a means of creating emphasis using colors. Saccades styles Use this to select a means of creating emphasis using bold variations or underline variations. Line height buttons Use these buttons to increase or decrease line height to strain and improve the comfort of reading. Always on/off button Use this button to controls the default behavior which is if words on pages are or aren't emphasized when loaded by default. Reset Defaults Resets preferences of the currently engaged preference mode. 
PDF and Epub support Google Play Books Native (Epub) This extension works with google play books Open or navigate to google play books Click on any book in your library to read it and turn on the extension if not on already You can search for new books (paid or free) and add to your library to start reading Upload Epubs to Google PlayBooks (Epubs) Upload your Epub ebooks to Google Play Books reads to be able to read it with JiffyReader. Open google play books Click the upload button Select your epub file to complete the upload Click on the uploaded file to open it in the Google Play Books web reader. Have fun. PFD Support (convert pdf files to epub or html) Open cloud convert to Upload your pdf file Select your output format (html or epub) Click convert to start the process Click download Open your downloaded html file in your browser and turn on JiffyReader For chrome permission issues, follow the steps in Enable file url permissions (Chrome) For epub files follow the steps in Upload Epub to Google PlayBooks JiffyReader does not collaborate with cloudconvert. Please consult their privacy policy for any privacy concerns. Google Docs support (publish method) Open the google docs document in your browser Click File > click share > click publish to web Click publish and copy the published link. Alternatively you can replace edit in the address bar with pub to access the published document Open the published link in a new tab and turn on JiffyReader Note: the document will be accessible to anyone on the internet as long as they have the correct link.
If you want do not want to publish the document to the web then please follow the alternative steps in Google Docs support (download method) Google Docs support (html download method) Click on File > click download Download the document as an html (preferred) or epub optional Google Play Books Native (Epub) Open the downloaded html with your browser and turn on JiffyReader You may be required to enable permissions to access file urls for chrome. To do so follow Enable file permissions Enable file url permissions (chrome html) (Chrome) Enable JiffyReader to work with tabs that have file urls by right clicking on the JiffyReader icon Click manage extension Find and enable work with file urls Customizations Shortcut Alt+B is the default toggle shortcut to turn on or off the extension If preferred you may customize the extension shortcut with the help of the resources below Chrome, Firefox and Edge Opera: open the extension management tab and click the Keyboard shortcuts link to access the page for customizing opera shortcuts What is Faster Reading? This extension provides faster reading through facilitating the reading process by bolding half the words.
As a result, the reader focuses only on the bolded initial letters and lets the brain autocomplete the words. This allows you to read faster.

Reporting issues, bugs and feature requests

Visit the issues page to report bugs or tell us about a feature you would like to see, and hopefully we will get to you.
Kindly allow some time after submitting an issue for someone to get back to you.
You can also browse the list of open issues that you may contribute to, either by commenting to help someone out with a challenge or by developing a fix and opening a PR. See the contribution section.

How to Contribute

Anyone is welcome to contribute to this project by submitting a PR (Pull Request), and it will be happily merged to provide features and fixes to the incredible people using the extension.

Help with Translations

JiffyReader needs translation help for whatever language you can offer.
To help:
1. Please check that the language you would like to help with has not already been taken up by someone else by looking through both the open and closed tickets for translations.
2. Open an issue ticket and add the translation label to it, along with the name of the language you want to translate. Use this shortcut link to open a new ticket.
3. Copy either the English locale JSON (click here) or the Spanish locale JSON (click here) and translate it into the language you can assist with, using your preferred editor or even MS Word.
4. Indicate the language you would like to help translate in the ticket title. This helps to eliminate duplicate work.

Attach any questions or updates to the ticket you are working on and someone will try to get to them within a day or two.

## Working with the translation files
The translation files are in JSON format. You only need to translate the text associated with the `message` key.

<!-- example -->
```json
"exampleText":{
"message": "this is the text to translate",
"description": "it is not required to translate this text"
}
```

- result after translation into Spanish:
```
"exampleText":{
"message": "esto es el texto a traducir",
"description": "it is not required to translate this text"
}
```

## Submitting your translations
- You can email the translated file or paste the entire translation as a new comment in the ticket you opened and we will take it from there.
- Don't forget to indicate your name for attribution.

# Supported languages
1. English: by JiffyReader maintainer
2. Spanish: by JiffyReader maintainer
3. Others coming soon: contributor name

JiffyReader has been updated to support displaying information in multiple languages, thanks to strong interest and constant emails and enquiries about it.
We have implemented the required mechanisms to support displaying the extension in the language of your choice. The challenge now is to get as many translations as possible.

Development

1. Clone the project.
2. Open it in VS Code or your favorite editor.
3. Run yarn or npm i to install dependencies.
4. Install pnpm if you don't already have it: npm i -g pnpm
5. Run pnpm dev:chrome or pnpm run dev:chrome to build the development version. Substitute chrome with firefox if that is your preferred browser.
6. Follow the installation instructions for your preferred browser, but navigate to projectRootFolder/build/ and choose the folder that corresponds to your browser.

Configure vscode to run the project when it is opened

1. Copy .vscode/tasks.json.example to .vscode/tasks.json, or run cp .vscode/tasks.json.example .vscode/tasks.json in a terminal from the project root.
2. Open the VS Code command palette.
3. Type and select "Tasks: Manage Automatic Tasks in Folder".
4. Click "Allow Automatic Tasks in Folder".
5. Reload VS Code.

Release a new version

1. Change the version in package.json, e.g. "version": "1.0.0"
2. Push a new tag to GitHub, e.g. git tag 1.0.0 && git push --tags
3. The Workflow should be running here.
4. Check the release version here and edit the release notes. | A Browser Extension for faster reading on ANY website! | reading,chrome-extension,web-extension,brave-extension,browser-extension,firefox-extension,opera-extension,safari-extension,bookmark,reader | 53 | 24 | 116 | 586 | 31 | 4 | 4 |
Coder-World04/Complete-System-Design | Complete System Design with Implemented Case Studies and Code

This repository contains everything you need to become proficient in System Design.

- Youtube for all the projects and tech interview resources - Ignito Youtube Channel
- Complete Cheat Sheet for Tech Interviews - How to prepare efficiently
- Mega Launch - 200+ System Design Case Studies
- System Design Most Important Terms
- System Design Template
- Complete System Design Case Studies
- How to solve any System Design Question (the approach that you should take)
- ML System Design Case Studies Series

For Data Structures and Algorithms, start here: Day 1 of 30 days of Data Structures and Algorithms and System Design Simplified: DSA and System Design made Easy

Topics you should know in System Design

- System design basics
- Horizontal and vertical scaling
- Load balancing and Message queues
- High level design and low level design, Consistent Hashing, Monolithic and Microservices architecture
- Caching, Indexing, Proxies
- Networking, How Browsers work, Content Network Delivery (CDN)
- Database Sharding, CAP Theorem, Database schema Design
- Concurrency, API, Components + OOP + Abstraction
- Estimation and Planning, Performance
- Map Reduce, Patterns and Microservices

System Design Case Studies
| --- | --- |
| Design Instagram | Link |
| Design Messenger App | Link |
| Design Twitter | Link |
| Design URL Shortner | Link |
| Design Dropbox | Link |
| Design Youtube | Link |
| Design Tinder | Link |
| Design Google Drive | Link |
| Design Messenger App | Link |
| Design Instagram | Link |
| Design Twitter | Link |
| Design Ticketmaster | Link |
| Design Quora | Link |
| Design Flipkart | Link |
| Design Flickr | Link |
| Design TikTok | Link |
| Design Netflix | Link |
| Design Foursquare | Link |
| Design Uber | Link |
| Design Youtube | Link |
| Design Reddit | Link |
| Design Facebook’s Newsfeed | Link |
| Design Amazon Prime Video | Link |
| Design Web Crawler | Link |
| Design API Rate Limiter | Link |
| Design Dropbox | Link |
| Design Yelp | Link |
| Design Whatsapp | Link |
| Design URL Shortener | Link |
| Design Bookmyshow | Link |
| Design Linkedin | Link |
| Design Telegram | Link |
| Design Snapchat | Link |
| Design One Drive | Link |
| Design BookmyShow | Link |
| Design Google Maps | Link |
| Design Quora | Link |
| Design Foursquare | Link |
| Design Tiny URL | Link |
| Design Flipkart | Link | Design Instagram Design Messenger App Design Twitter Design URL Shortner Design Dropbox Design Youtube Design Tinder Design Google Drive Design Messenger App Design Instagram Design Twitter Design Ticketmaster Design Quora Design Flipkart Design Flickr Design TikTok Design Netflix Design Foursquare Design Uber Design Youtube Design Reddit Design Facebook’s Newsfeed Design Amazon Prime Video Design Web Crawler Design API Rate Limiter Design Dropbox Design Yelp Design Whatspp Design URL shortener Design Bookmyshow Design Linkedin Design Telegram Design Snapchat Design One Drive Design BookmyShow Design Google Maps Design Quora Design Foursquare Design Tiny URL Design Flipkart Part 1 of System Design Made Easy Series In the part 1, we covered what and why of System Design and the important topics that you should know. System Design is mostly an open ended concept and most of the questions can be answered in different degrees and aptitudes. In layman’s language, system design is about — Architecture + Data + Applications Learn what is system design and why it’s important, how they work and how to use them in your system design interviews in part 1 Part 2 of System Design Made Easy Series In the part 2, we covered— System design basics Horizontal and Vertical Scaling with an example In technical words, scalability is a the technique/process of adding/removing infrastructure/resources required by applications to better serve/accommodate increased/decreased demand/growth. We also covered the tradeoffs of horizontal and vertical scaling. Learn why good understanding of system design basics and horizontal and vertical scaling are important, how they work and how to use them in your system design interviews in part 2 Part 3 of System Design Made Easy Series In the part 3, we covered system design’s most important concepts — Load Balancing Message Queues Load balancing is a technique of distributing tasks over a set of servers/machines to improve the performance,throughput, high availability, redundancy and reliability of the system. Not just it enables horizontal scaling but also dynamic resizing/scaling. Message queues are nothing bit temporary buffers placed between users/applications and servers to store the message requests and process them in FIFO order asynchronously until the requests/messages are delivered to the desired server. We also covered the tradeoffs of the different techniques. Learn why load balancing and message queues are important, how they work and how to use them in your system design interviews in part 3 Part 4 of System Design Made Easy Series In the part 4, we covered- High level design and Low level design Monolithic and microservices architecture and which one to choose and when ? Consistent Hashing High level Design(HLD) describes the overall architecture of the application and covers functionality of each module of the system very briefly. LLD details the functional logic of the each module in the system. Monolithic architecture — consists of single code base with multiple modules and it’s easier and faster to deploy. Microservices architecture — consists of individual service units with each service being responsible for exactly one functionality. It’s relatively complex and time taking to deploy. Consistent hashing is a technique to divide keys/data between multiple servers/machines using a hash function ( key — value). 
Learn why high level design and low level design, Monolith and microservices architecture and consistent hashing are important, how they work and how to use them in your system design interviews in part 4 Part 5 of System Design Made Easy Series In part 5, we covered — Caching Indexing Proxies Caching is a technique based on the principle of locality — it stores copies of the most frequently used/accessed data in a small and faster memory to improve Data retrieval times, Compute costs, User Experience and Throughput. Indexing helps in Improving the speed of data access/retrieval, Reducing the number of expensive I/O operations, Providing better organization and management of multilevel data records. Proxies play an important role in coordinating user requests, handling concurrent requests, filtering user requests, transforming user requests by adding an additional layer of encryption or header information or compression information and then forwarding the user request to the server. Learn why Caching, Indexing and Proxies are important, how they work and how to use them in your system design interviews in part 5 Part 6 of System Design Made Easy Series In part 6, we covered — Networking How Browsers work Content Delivery Networks (CDN) Networking is nothing but interconnected devices that can exchange data-messages and share resources amongst themselves/with the outside world based on the system protocols/rules, technologies and algorithms that govern these devices’ inner workings. Browsers are used to present the website/resource you would like to visit, say google.com, by sending the request to the server and displaying the result in the browser window. A Content Delivery Network caters to users by serving their requests and quickly transferring the data back and forth. Learn why networking, how browsers work and CDNs are important, how they work and how to use them in your system design interviews in part 6 Part 7 of System Design Made Easy Series In part 7, we covered — Database Sharding CAP Theorem Database Schema Design Sharding is a database partitioning technique that separates large and complex databases into smaller, faster and distributed databases for higher throughput operations. The CAP theorem lets you determine how you want to handle your distributed databases when there is a possibility of inconsistencies, unavailability and connection errors/failures/outages. Database schema design lets you organize data into separate entities and establish and organize the relationships between different entities. Learn why database sharding, the CAP theorem and Database Schema Design are important, how they work and how to use them in your system design interviews in part 7 **Part 8 of System Design Made Easy Series
** In part 8, we covered — Concurrency API Components + OOP + Abstraction Concurrency is the process in which multiple computations/operations/processes happen/execute in parallel/concurrently. API is an acronym for application programming interface, which provides a way for two or more programs to communicate and work together despite different configurations, architectures, resources, etc. Components in system design are building blocks designed to coordinate, cooperate, reuse and work well with other components of the same/different systems. They can be as simple as visual components or internal components/backend components. Learn why concurrency, APIs and Components + OOP + Abstraction are important, how they work and how to use them in your system design interviews in part 8 Part 9 of System Design Made Easy Series In part 9, we covered — Planning and Estimation Performance Planning and estimation (numbers) and performance are very important concepts (concepts that you should be able to demonstrate well when asked). Learn why planning, estimation (numbers) and performance are important, how they work and how to use them in your system design interviews in part 9 Part 10 of System Design Made Easy Series In part 10, we covered — Map Reduce Patterns and Microservices In system design, map reduce ( Hadoop systems) is a batch processing technique in which the engine takes huge amounts of data, processes it (map and reduce) and gives the output. In system design, microservices architecture is used to build enterprise-level applications, which helps in structuring the whole application as a collection of tiny autonomous, self-contained services for each task (service) that you want/are allowed to perform. Learn why map reduce, patterns and microservices are important, how they work and how to use them in your system design interviews in part 10 Most Popular System Design Questions — Mega Compilation In this post, we covered the most popular/important system design questions that you should practice to build a thorough understanding of how large systems are designed. Popular Questions : Link Some of the other best Series - Complete 60 Days of Data Science and Machine Learning Series 30 days of Machine Learning Ops 30 Days of Natural Language Processing (NLP) Series Data Science and Machine Learning Research (papers) Simplified 30 days of Data Engineering with projects Series 60 days of Data Science and ML Series with projects 100 days : Your Data Science and Machine Learning Degree Series with projects 23 Data Science Techniques You Should Know Tech Interview Series — Curated List of coding questions Complete System Design with most popular Questions Series Complete Data Visualization and Pre-processing Series with projects Complete Python Series with Projects Complete Advanced Python Series with Projects Kaggle Best Notebooks that will teach you the most Complete Developers Guide to Git Exceptional Github Repos — Part 1 Exceptional Github Repos — Part 2 All the Data Science and Machine Learning Resources 210 Machine Learning Projects 6 Highly Recommended Data Science and Machine Learning Courses that you MUST take (with certificate) - Complete Data Scientist : https://bit.ly/3wiIo8u Learn to run data pipelines, design experiments, build recommendation systems, and deploy solutions to the cloud.
Complete Data Engineering : https://bit.ly/3A9oVs5 Learn to design data models, build data warehouses and data lakes, automate data pipelines, and work with massive datasets. Complete Machine Learning Engineer : https://bit.ly/3Tir8ub Learn advanced machine learning techniques and algorithms, including how to package and deploy your models to a production environment. Complete Data Product Manager : https://bit.ly/3QGUtwi Leverage data to build products that deliver the right experiences, to the right users, at the right time. Lead the development of data-driven products that position businesses to win in their market. Complete Natural Language Processing : https://bit.ly/3T7J8qY Build models on real data, and get hands-on experience with sentiment analysis, machine translation, and more. Complete Deep Learning: https://bit.ly/3T5ppIo Learn to implement Neural Networks using the deep learning framework PyTorch. | This repository contains everything you need to become proficient in System Design | [] | 0 | 1 | 1 | 89 | 0 | 1 | 0 |
unjs/magic-regexp | 🦄 magic-regexp A compiled-away, type-safe, readable RegExp alternative ✨ Changelog 📖 Documentation ▶️ Online playground Features Runtime is zero-dependency and ultra-minimal Ships with transform to compile to pure RegExp Automatically typed capture groups Natural language syntax Generated RegExp displays on hover 📖 Read more 💻 Development Clone this repository Enable Corepack using corepack enable (use npm i -g corepack for Node.js < 16.10) Install dependencies using pnpm install Run interactive tests using pnpm dev Similar packages verbal-expressions typed-regex License Made with ❤️ Published under MIT License . | A compiled-away, type-safe, readable RegExp alternative | regexp,typescript,regex,regular-expression,hacktoberfest | 18 | 25 | 361 | 590 | 19 | 11 | 5 |
google-research/multinerf | MultiNeRF: A Code Release for Mip-NeRF 360, Ref-NeRF, and RawNeRF This is not an officially supported Google product. This repository contains the code release for three CVPR 2022 papers: Mip-NeRF 360 , Ref-NeRF , and RawNeRF .
This codebase was written by
integrating our internal implementations of Ref-NeRF and RawNeRF into our
mip-NeRF 360 implementation. As such, this codebase should exactly
reproduce the results shown in mip-NeRF 360, but may differ slightly when
reproducing Ref-NeRF or RawNeRF results. This implementation is written in JAX , and
is a fork of mip-NeRF .
This is research code, and should be treated accordingly. Setup ``` Clone the repo. git clone https://github.com/google-research/multinerf.git
cd multinerf Make a conda environment. conda create --name multinerf python=3.9
conda activate multinerf Prepare pip. conda install pip
pip install --upgrade pip Install requirements. pip install -r requirements.txt Manually install rmbrualla's pycolmap (don't use pip's! It's different). git clone https://github.com/rmbrualla/pycolmap.git ./internal/pycolmap Confirm that all the unit tests pass. ./scripts/run_all_unit_tests.sh
```
You'll probably also need to update your JAX installation to support GPUs or TPUs. Running Example scripts for training, evaluating, and rendering can be found in scripts/ . You'll need to change the paths to point to wherever the datasets
are located. Gin configuration files
for our model and some ablations can be found in configs/ .
After evaluating on the test set of each scene in one of the datasets, you can
use scripts/generate_tables.ipynb to produce error metrics across all scenes
in the same format as was used in tables in the paper. OOM errors You may need to reduce the batch size ( Config.batch_size ) to avoid out of memory
errors. If you do this, but want to preserve quality, be sure to increase the number
of training iterations and decrease the learning rate by whatever scale factor you
decrease batch size by. Using your own data Summary: first, calculate poses. Second, train MultiNeRF. Third, render a result video from the trained NeRF model. Calculating poses (using COLMAP): DATA_DIR=my_dataset_dir
bash scripts/local_colmap_and_resize.sh ${DATA_DIR} Training MultiNeRF: python -m train \
--gin_configs=configs/360.gin \
--gin_bindings="Config.data_dir = '${DATA_DIR}'" \
--gin_bindings="Config.checkpoint_dir = '${DATA_DIR}/checkpoints'" \
--logtostderr Rendering MultiNeRF: python -m render \
--gin_configs=configs/360.gin \
--gin_bindings="Config.data_dir = '${DATA_DIR}'" \
--gin_bindings="Config.checkpoint_dir = '${DATA_DIR}/checkpoints'" \
--gin_bindings="Config.render_dir = '${DATA_DIR}/render'" \
--gin_bindings="Config.render_path = True" \
--gin_bindings="Config.render_path_frames = 480" \
--gin_bindings="Config.render_video_fps = 60" \
--logtostderr Your output video should now exist in the directory my_dataset_dir/render/ . See below for more detailed instructions on either using COLMAP to calculate poses or writing your own dataset loader (if you already have pose data from another source, like SLAM or RealityCapture). Running COLMAP to get camera poses In order to run MultiNeRF on your own captured images of a scene, you must first run COLMAP to calculate camera poses. You can do this using our provided script scripts/local_colmap_and_resize.sh . Just make a directory my_dataset_dir/ and copy your input images into a folder my_dataset_dir/images/ , then run: bash scripts/local_colmap_and_resize.sh my_dataset_dir This will run COLMAP and create 2x, 4x, and 8x downsampled versions of your images. These lower resolution images can be used in NeRF by setting, e.g., the Config.factor = 4 gin flag. By default, local_colmap_and_resize.sh uses the OPENCV camera model, which is a perspective pinhole camera with k1, k2 radial and t1, t2 tangential distortion coefficients. To switch to another COLMAP camera model, for example OPENCV_FISHEYE, you can run bash scripts/local_colmap_and_resize.sh my_dataset_dir OPENCV_FISHEYE If you have a very large capture of more than around 500 images, we recommend switching from the exhaustive matcher to the vocabulary tree matcher in COLMAP (see the script for a commented-out example). Our script is simply a thin wrapper for COLMAP--if you have run COLMAP yourself, all you need to do to load your scene in NeRF is ensure it has the following format: my_dataset_dir/images/ <--- all input images
my_dataset_dir/sparse/0/ <--- COLMAP sparse reconstruction files (cameras, images, points) Writing a custom dataloader If you already have poses for your own data, you may prefer to write your own custom dataloader. MultiNeRF includes a variety of dataloaders, all of which inherit from the
base Dataset class . The job of this class is to load all image and pose information from disk, then
create batches of ray and color data for training or rendering a NeRF model. Any inherited subclass is responsible for loading images and camera poses from
disk by implementing the _load_renderings method (which is marked as
abstract by the decorator @abc.abstractmethod ). This data is then used to
generate train and test batches of ray + color data for feeding through the NeRF
model. The ray parameters are calculated in _make_ray_batch . Existing data loaders To work from an example, you can see how this function is overloaded for the
different dataloaders we have already implemented: Blender DTU dataset Tanks and Temples ,
as processed by the NeRF++ paper Tanks and Temples ,
as processed by the Free View Synthesis paper The main data loader we rely on is LLFF (named for historical reasons), which is the loader for a dataset that has been
posed by COLMAP. Making your own loader by implementing _load_renderings To make a new dataset, make a class inheriting from Dataset and overload the _load_renderings method:
class MyNewDataset(Dataset):
def _load_renderings(self, config):
...

In this function, you must set the following public attributes:
- images
- camtoworlds
- pixtocams
- height, width

Many of our dataset loaders also set other useful attributes, but these are the
critical ones for generating rays. You can see how they are used (along with a batch of pixel coordinates) to create rays in camera_utils.pixels_to_rays . Images images = [N, height, width, 3] numpy array of RGB images. Currently we
require all images to have the same resolution. Extrinsic camera poses camtoworlds = [N, 3, 4] numpy array of extrinsic pose matrices. camtoworlds[i] should be in camera-to-world format, such that we can run pose = camtoworlds[i]
x_world = pose[:3, :3] @ x_camera + pose[:3, 3:4] to convert a 3D camera space point x_camera into a world space point x_world . These matrices must be stored in the OpenGL coordinate system convention for camera rotation:
x-axis to the right, y-axis upward, and z-axis backward along the camera's focal
axis. The most common conventions are [right, up, backwards] : OpenGL, NeRF, most graphics code. [right, down, forwards] : OpenCV, COLMAP, most computer vision code. Fortunately switching from OpenCV/COLMAP to NeRF is simple :
you just need to right-multiply the OpenCV pose matrices by np.diag([1, -1, -1, 1]) ,
which will flip the sign of the y-axis (from down to up) and z-axis (from
forwards to backwards): camtoworlds_opengl = camtoworlds_opencv @ np.diag([1, -1, -1, 1]) You may also want to scale your camera pose translations such that they all
lie within the [-1, 1]^3 cube for best performance with the default mipnerf360
config files. We provide a useful helper function camera_utils.transform_poses_pca that computes a translation/rotation/scaling transform for the input poses that aligns the world space x-y plane with the ground (based on PCA) and scales the scene so that all input pose positions lie within [-1, 1]^3 . (This function is applied by default when loading mip-NeRF 360 scenes with the LLFF data loader.) For a scene where this transformation has been applied, camera_utils.generate_ellipse_path can be used to generate a nice elliptical camera path for rendering videos. Intrinsic camera poses pixtocams = [N, 3, 4] numpy array of inverse intrinsic matrices, OR [3, 4]
numpy array of a single shared inverse intrinsic matrix. These should be in OpenCV format, e.g. camtopix = np.array([
[focal, 0, width/2],
[ 0, focal, height/2],
[ 0, 0, 1],
])
pixtocam = np.linalg.inv(camtopix) Given a focal length and image size (and assuming a centered principal point),
this matrix can be created using camera_utils.get_pixtocam . Alternatively, it can be created by using camera_utils.intrinsic_matrix and inverting the resulting matrix. Resolution height = int, height of images. width = int, width of images. Distortion parameters (optional) distortion_params = dict, camera lens distortion model parameters. This
dictionary must map from strings -> floats, and the allowed keys are ['k1',
'k2', 'k3', 'k4', 'p1', 'p2'] (up to four radial coefficients and up to two
tangential coefficients). By default, this is set to the empty dictionary {} ,
in which case undistortion is not run.
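Putting the attributes above together, here is a minimal, hypothetical _load_renderings sketch. The on-disk layout (images/*.png plus a poses.npy of OpenCV-convention camera-to-world matrices), the placeholder focal length, and the camera_utils import path are illustrative assumptions rather than MultiNeRF requirements; get_pixtocam's exact signature should be checked against internal/camera_utils.py .

```python
import os

import numpy as np
from PIL import Image

from internal import camera_utils  # assumed import path within this repo

class MyNewDataset(Dataset):
  def _load_renderings(self, config):
    # Load all RGB images as one [N, height, width, 3] float array in [0, 1].
    image_dir = os.path.join(config.data_dir, 'images')
    filenames = sorted(os.listdir(image_dir))
    self.images = np.stack([
        np.asarray(Image.open(os.path.join(image_dir, f)), dtype=np.float32)
        for f in filenames]) / 255.

    # poses.npy is assumed to hold [N, 3, 4] OpenCV-convention matrices;
    # right-multiplying by diag([1, -1, -1, 1]) flips them to OpenGL/NeRF.
    camtoworlds = np.load(os.path.join(config.data_dir, 'poses.npy'))
    self.camtoworlds = camtoworlds @ np.diag([1., -1., -1., 1.])

    self.height, self.width = self.images.shape[1:3]
    focal = 0.5 * self.width  # placeholder; use your calibrated focal length
    self.pixtocams = camera_utils.get_pixtocam(focal, self.width, self.height)
```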
Details of the inner workings of Dataset The public interface mimics the behavior of a standard machine learning pipeline dataset provider that can provide infinite batches of data to the
training/testing pipelines without exposing any details of how the batches are
loaded/created or how this is parallelized. Therefore, the initializer runs all
setup, including data loading from disk using _load_renderings , and begins
the thread using its parent start() method. After the initializer returns, the
caller can request batches of data straight away. The internal self._queue is initialized as queue.Queue(3) , so the infinite
loop in run() will block on the call self._queue.put(self._next_fn()) once
there are 3 elements. The main thread training job runs in a loop that pops 1
element at a time off the front of the queue. The Dataset thread's run() loop
will populate the queue with 3 elements, then wait until a batch has been
removed and push one more onto the end. This repeats indefinitely until the main thread's training loop completes
(typically hundreds of thousands of iterations), then the main thread will exit
and the Dataset thread will automatically be killed since it is a daemon. Citation If you use this software package, please cite whichever constituent paper(s)
you build upon, or feel free to cite this entire codebase as: @misc{multinerf2022,
title={{MultiNeRF}: {A} {Code} {Release} for {Mip-NeRF} 360, {Ref-NeRF}, and {RawNeRF}},
author={Ben Mildenhall and Dor Verbin and Pratul P. Srinivasan and Peter Hedman and Ricardo Martin-Brualla and Jonathan T. Barron},
year={2022},
url={https://github.com/google-research/multinerf},
} | A Code Release for Mip-NeRF 360, Ref-NeRF, and RawNeRF | nerf,neural-radiance-fields | 0 | 8 | 10 | 61 | 102 | 1 | 0 |
THUDM/CogVideo | CogVideo This is the official repo for the paper: CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers . News! The demo for CogVideo is available! It's also integrated into Huggingface Spaces 🤗 using Gradio . Try out the Web Demo News! The code and model for text-to-video generation are now available! Currently we only support simplified Chinese input . https://user-images.githubusercontent.com/48993524/170857367-2033c514-3c9f-4297-876f-2468592a254b.mp4 Read our paper CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers on ArXiv for a formal introduction. Try our demo at https://models.aminer.cn/cogvideo/ Run our pretrained models for text-to-video generation. Please use an A100 GPU. Cite our paper if you find our work helpful @article{hong2022cogvideo,
title={CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers},
author={Hong, Wenyi and Ding, Ming and Zheng, Wendi and Liu, Xinghan and Tang, Jie},
journal={arXiv preprint arXiv:2205.15868},
year={2022}
} Web Demo The demo for CogVideo is at https://models.aminer.cn/cogvideo/ , where you can get hands-on practice on text-to-video generation. The original input is in Chinese. Generated Samples Video samples generated by CogVideo . The actual text inputs are in Chinese. Each sample is a 4-second clip of 32 frames, and here we sample 9 frames uniformly for display purposes. CogVideo is able to generate relatively high-frame-rate videos. A 4-second clip of 32 frames is shown below. Getting Started Setup Hardware: Linux servers with Nvidia A100s are recommended, but it is also okay to run the pretrained models with smaller --max-inference-batch-size and --batch-size or to train smaller models on less powerful GPUs. Environment: install dependencies via pip install -r requirements.txt . LocalAttention: Make sure you have CUDA installed and compile the local attention kernel. shell
pip install git+https://github.com/Sleepychord/Image-Local-Attention Docker Alternatively, you can use Docker to handle all dependencies. Run ./build_image.sh Run ./run_image.sh Run ./install_image_local_attention Optionally, after that you can recommit the image to avoid having to install image local attention again. Download Our code will automatically download or detect the models into the path defined by the environment variable SAT_HOME . You can also manually download CogVideo-Stage1 , CogVideo-Stage2 and CogView2-dsr and place them under SAT_HOME (with folders named cogvideo-stage1 , cogvideo-stage2 and cogview2-dsr ) Text-to-Video Generation ./script/inference_cogvideo_pipeline.sh Arguments useful in inference are mainly: --input-source [path or "interactive"] . The path of the input file with one query per line. A CLI is launched when using "interactive". --output-path [path] . The folder containing the results. --batch-size [int] . The number of samples that will be generated per query. --max-inference-batch-size [int] . Maximum batch size per forward pass. Reduce it if OOM. --stage1-max-inference-batch-size [int] Maximum batch size per forward pass in Stage 1. Reduce it if OOM. --both-stages . Run both stage1 and stage2 sequentially. --use-guidance-stage1 Use classifier-free guidance in stage1, which is strongly suggested to get better results. You should set the environment variable SAT_HOME to specify the path where the downloaded models are stored. Currently only Chinese input is supported. | Text-to-video generation. The repo for ICLR2023 paper "CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers" | [] | 0 | 13 | 4 | 24 | 11 | 1 | 0 |
lancedb/lance | **Modern columnar data format for ML. Convert from Parquet in 2 lines of code for 100x faster random access, a vector index, data versioning, and more. **
**Compatible with pandas, DuckDB, Polars, and pyarrow with more integrations on the way.** Documentation • Blog • Discord • Twitter [CI]: https://github.com/lancedb/lance/actions/workflows/rust.yml
[CI Badge]: https://github.com/lancedb/lance/actions/workflows/rust.yml/badge.svg
[Docs]: https://lancedb.github.io/lance/
[Docs Badge]: https://img.shields.io/badge/docs-passing-brightgreen
[crates.io]: https://crates.io/crates/lance
[crates.io badge]: https://img.shields.io/crates/v/lance.svg
[Python versions]: https://pypi.org/project/pylance/
[Python versions badge]: https://img.shields.io/pypi/pyversions/pylance
[![CI Badge]][CI]
[![Docs Badge]][Docs]
[![crates.io badge]][crates.io]
[![Python versions badge]][Python versions] Lance is a modern columnar data format that is optimized for ML workflows and datasets. Lance is perfect for: Building search engines and feature stores. Large-scale ML training requiring high performance IO and shuffles. Storing, querying, and inspecting deeply nested data for robotics or large blobs like images, point clouds, and more. The key features of Lance include: High-performance random access: 100x faster than Parquet without sacrificing scan performance. Vector search: find nearest neighbors in milliseconds and combine OLAP-queries with vector search. Zero-copy, automatic versioning: manage versions of your data without needing extra infrastructure. Ecosystem integrations: Apache Arrow, Pandas, Polars, DuckDB and more on the way. Quick Start Installation shell
pip install pylance To install a preview release: shell
pip install --pre --extra-index-url https://pypi.fury.io/lancedb/ pylance [!TIP]
Preview releases are released more often than full releases and contain the
latest features and bug fixes. They receive the same level of testing as full releases.
We guarantee they will remain published and available for download for at
least 6 months. When you want to pin to a specific version, prefer a stable release. Converting to Lance ```python
import lance import pandas as pd
import pyarrow as pa
import pyarrow.dataset df = pd.DataFrame({"a": [5], "b": [10]})
uri = "/tmp/test.parquet"
tbl = pa.Table.from_pandas(df)
pa.dataset.write_dataset(tbl, uri, format='parquet') parquet = pa.dataset.dataset(uri, format='parquet')
lance.write_dataset(parquet, "/tmp/test.lance")
``` Reading Lance data python
dataset = lance.dataset("/tmp/test.lance")
assert isinstance(dataset, pa.dataset.Dataset) Pandas python
df = dataset.to_table().to_pandas()
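# Lance also tracks dataset versions automatically; you can list snapshots
# and time-travel to an earlier one. The method names below reflect the
# pylance API as we understand it; verify against your installed version.
print(dataset.versions())
old = lance.dataset("/tmp/test.lance", version=1)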
df DuckDB ```python
import duckdb If this segfaults, make sure you have duckdb v0.7+ installed duckdb.query("SELECT * FROM dataset LIMIT 10").to_df()
``` Vector search Download the sift1m subset shell
wget ftp://ftp.irisa.fr/local/texmex/corpus/sift.tar.gz
tar -xzf sift.tar.gz Convert it to Lance ```python
import lance
from lance.vector import vec_to_table
import numpy as np
import struct nvecs = 1000000
ndims = 128
with open("sift/sift_base.fvecs", mode="rb") as fobj:
buf = fobj.read()
data = np.array(struct.unpack("<128000000f", buf[4 : 4 + 4 * nvecs * ndims])).reshape((nvecs, ndims))
dd = dict(zip(range(nvecs), data)) table = vec_to_table(dd)
uri = "vec_data.lance"
sift1m = lance.write_dataset(table, uri, max_rows_per_group=8192, max_rows_per_file=1024*1024)
``` Build the index python
sift1m.create_index("vector",
index_type="IVF_PQ",
num_partitions=256, # IVF
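# num_partitions controls the coarse IVF clustering and num_sub_vectors the
# product-quantization split; larger values trade index build time and
# memory for recall (rule-of-thumb guidance, tune for your own data).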
num_sub_vectors=16) # PQ Search the dataset ```python Get top 10 similar vectors import duckdb dataset = lance.dataset(uri) Sample 100 query vectors. If this segfaults, make sure you have duckdb v0.7+ installed sample = duckdb.query("SELECT vector FROM dataset USING SAMPLE 100").to_df()
query_vectors = np.array([np.array(x) for x in sample.vector]) Get nearest neighbors for all of them rs = [dataset.to_table(nearest={"column": "vector", "k": 10, "q": q})
for q in query_vectors]
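# Each element of `rs` is a pyarrow Table holding the 10 nearest rows for
# one query vector; Lance adds a `_distance` score column to nearest-
# neighbor results (column name per pylance behavior at time of writing,
# worth verifying on your version).
print(rs[0].to_pandas().head())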
``` Directory structure | Directory | Description |
|--------------------|--------------------------|
| rust | Core Rust implementation |
| python | Python bindings (pyo3) |
| docs | Documentation source | What makes Lance different Here we will highlight a few aspects of Lance’s design. For more details, see the full Lance design document . Vector index : Vector index for similarity search over embedding space.
Supports both CPUs ( x86_64 and arm ) and GPUs ( Nvidia (cuda) and Apple Silicon (mps) ). Encodings : To achieve both fast columnar scans and sub-linear point queries, Lance uses custom encodings and layouts. Nested fields : Lance stores each subfield as a separate column to support efficient filters like “find images where detected objects include cats”. Versioning : A Manifest can be used to record snapshots. Currently we support creating new versions automatically via appends, overwrites, and index creation . Fast updates (ROADMAP): Updates will be supported via write-ahead logs. Rich secondary indices (ROADMAP):
- Inverted index for fuzzy search over many label / annotation fields. Benchmarks Vector search We used the SIFT dataset to benchmark our results with 1M vectors of 128D For 100 randomly sampled query vectors, we get <1ms average response time (on a 2023 m2 MacBook Air) ANNs are always a trade-off between recall and performance Vs. parquet We create a Lance dataset using the Oxford Pet dataset to do some preliminary performance testing of Lance as compared to Parquet and raw image/XMLs. For analytics queries, Lance is 50-100x better than reading the raw metadata. For batched random access, Lance is 100x better than both parquet and raw files. Why are you building yet another data format?! The machine learning development cycle involves the steps: mermaid
graph LR
A[Collection] --> B[Exploration];
B --> C[Analytics];
C --> D[Feature Engineer];
D --> E[Training];
E --> F[Evaluation];
F --> C;
E --> G[Deployment];
G --> H[Monitoring];
H --> A; People use different data representations to varying stages for the performance or limited by the tooling available.
Academia mainly uses XML / JSON for annotations and zipped images/sensors data for deep learning, which
is difficult to integrated into data infrastructure and slow to train over cloud storage.
While industry uses data lakes (Parquet-based techniques, i.e., Delta Lake, Iceberg) or data warehouses (AWS Redshift
or Google BigQuery) to collect and analyze data, they have to convert the data into training-friendly formats, such
as Rikai / Petastorm or TFRecord .
Multiple single-purpose data transforms, as well as syncing copies between cloud storage to local training
instances have become a common practice. While each of the existing data formats excels at the workload it was originally designed for, we need a new data format
tailored for multistage ML development cycles to reduce and data silos. A comparison of different data formats in each stage of ML development cycle. | | Lance | Parquet & ORC | JSON & XML | TFRecord | Database | Warehouse |
|---------------------|-------|---------------|------------|----------|----------|-----------|
| Analytics | Fast | Fast | Slow | Slow | Decent | Fast |
| Feature Engineering | Fast | Fast | Decent | Slow | Decent | Good |
| Training | Fast | Decent | Slow | Fast | N/A | N/A |
| Exploration | Fast | Slow | Fast | Slow | Fast | Decent |
| Infra Support | Rich | Rich | Decent | Limited | Rich | Rich | Community Highlights Lance is currently used in production by:
* LanceDB , a serverless, low-latency vector database for ML applications
* A self-driving car company for large-scale storage, retrieval and processing of multi-modal data.
* An e-commerce company for billion-scale+ vector personalized search.
* and more. Presentations and Talks Lance Deep Dive . July 2023. Lance: A New Columnar Data Format , Scipy 2022, Austin, TX . July, 2022. | Modern columnar data format for ML and LLMs implemented in Rust. Convert from parquet in 2 lines of code for 100x faster random access, vector index, and data versioning. Compatible with Pandas, DuckDB, Polars, Pyarrow, with more integrations coming.. | machine-learning,computer-vision,data-format,deep-learning,python,apache-arrow,duckdb,mlops,data-analysis,data-analytics | 154 | 54 | 1,665 | 1,650 | 330 | 108 | 21 |
kuprel/min-dalle | min(DALL·E) YouTube Walk-through by The AI Epiphany This is a fast, minimal port of Boris Dayma's DALL·E Mini (with mega weights). It has been stripped down for inference and converted to PyTorch. The only third party dependencies are numpy, requests, pillow and torch. To generate a 3x3 grid of DALL·E Mega images it takes:
- 55 sec with a T4 in Colab
- 33 sec with a P100 in Colab
- 15 sec with an A10G on Hugging Face Here's a more detailed breakdown of performance on an A100. Credit to @technobird22 and his NeoGen discord bot for the graph. The flax model and code for converting it to torch can be found here . Install bash
$ pip install min-dalle Usage Load the model parameters once and reuse the model to generate multiple images. ```python
import torch
from min_dalle import MinDalle

model = MinDalle(
models_root='./pretrained',
dtype=torch.float32,
device='cuda',
is_mega=True,
is_reusable=True
)
``` The required models will be downloaded to models_root if they are not already there. Set the dtype to torch.float16 to save GPU memory. If you have an Ampere architecture GPU you can use torch.bfloat16 . Set the device to either "cuda" or "cpu". Once everything has finished initializing, call generate_image with some text as many times as you want. Use a positive seed for reproducible results. Higher values for supercondition_factor result in better agreement with the text but a narrower variety of generated images. Every image token is sampled from the top_k most probable tokens. The largest logit is subtracted from the logits to avoid infs. The logits are then divided by the temperature . If is_seamless is true, the image grid will be tiled in token space not pixel space. ```python
image = model.generate_image(
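# Parameter notes (from the explanation above): a positive seed makes
# results reproducible, higher supercondition_factor follows the text more
# closely at the cost of variety, and top_k/temperature shape the sampling.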
text='Nuclear explosion broccoli',
seed=-1,
grid_size=4,
is_seamless=False,
temperature=1,
top_k=256,
supercondition_factor=32,
is_verbose=False
) display(image)
``` Credit to @hardmaru for the example Saving Individual Images The images can also be generated as a FloatTensor in case you want to process them manually. python
images = model.generate_images(
text='Nuclear explosion broccoli',
seed=-1,
grid_size=3,
is_seamless=False,
temperature=1,
top_k=256,
supercondition_factor=16,
is_verbose=False
) To get an image into PIL format you will have to first move the images to the CPU and convert the tensor to a numpy array. python
images = images.to('cpu').numpy() Then image $i$ can be converted to a PIL.Image and saved python
from PIL import Image

# If the array is float-valued, you may need images[i].astype('uint8') first.
image = Image.fromarray(images[i])
image.save('image_{}.png'.format(i)) Progressive Outputs If the model is being used interactively (e.g. in a notebook) generate_image_stream can be used to generate a stream of images as the model is decoding. The detokenizer adds a slight delay for each image. Set progressive_outputs to True to enable this. An example is implemented in the colab. ```python
image_stream = model.generate_image_stream(
text='Dali painting of WALL·E',
seed=-1,
grid_size=3,
progressive_outputs=True,
is_seamless=False,
temperature=1,
top_k=256,
supercondition_factor=16,
is_verbose=False
) for image in image_stream:
display(image)
``` Command Line Use image_from_text.py to generate images from the command line. bash
$ python image_from_text.py --text='artificial intelligence' --no-mega | min(DALL·E) is a fast, minimal port of DALL·E Mini to PyTorch | artificial-intelligence,deep-learning,pytorch,text-to-image | 5 | 15 | 28 | 376 | 22 | 1 | 0 |
Ph0enixKM/Amber | Amber Programming language that compiles to Bash. It's a high level programming language that makes it easy to create shell scripts. It's particularly well suited for cloud services. [!Warning]
This software is not ready for extended usage. Join our Discord! Install Amber compiler currently works on:
- Linux x86 and ARM
- macOS x86 and ARM (Apple Silicon)
- Nix (NixOS) macOS / Linux Make sure that the operating system meets the following prerequisites
- Bourne-again shell (Bash)
- Curl tool for downloading the installation script
- Basic calculator bc command (On Debian run sudo apt install bc ) system-wide install bash
curl -s "https://raw.githubusercontent.com/Ph0enixKM/AmberNative/master/setup/install.sh" | /bin/bash local-user install bash
curl -s "https://raw.githubusercontent.com/Ph0enixKM/AmberNative/master/setup/install.sh" | /bin/bash -s -- --user Via a package manager Amber is packaged in the following distros: Arch (AUR) - amber-bash-bin Nix See NIX.md Windows support As windows does not come with bash installed it makes no sense to support it. Please install WSL 2 on your windows machine and install Linux version of Amber compiler inside. In order for it to work you may need to run the following code that pulls all the prerequisites. bash
sudo apt install curl bc
sudo mkdir /opt /usr/local/bin Contributing In order to contribute, you have to add a couple of build targets: bash
rustup target add x86_64-unknown-linux-musl
rustup target add x86_64-apple-darwin
rustup target add x86_64-pc-windows-gnu
rustup target add aarch64-apple-darwin And linkers (macos): bash
brew install messense/macos-cross-toolchains/aarch64-unknown-linux-musl
brew install messense/macos-cross-toolchains/x86_64-unknown-linux-gnu Finally, in order to build bash
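# build.ab is Amber's own build script, written in Amber itself; running it
# through the amber CLI builds the compiler. (Per our reading of the CLI,
# `amber script.ab` executes a script directly, while `amber input.ab
# output.sh` compiles it to a Bash file; verify with `amber --help`.)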
amber build.ab Debugging Amber:
```bash
# Shows the AST
AMBER_DEBUG_PARSER=true cargo run
# Shows the time it took to compile each phase
AMBER_DEBUG_TIME=true cargo run
# Flamegraph is a profiling tool that is used to visualize the time each function took to execute
sudo cargo flamegraph -- ``` Github Actions We are using cargo-dist to build the binaries for all the platforms. The binaries are then uploaded to the release page once a new release tag is created. | 💎 Amber the programming language compiled to bash | bash-scripting,bash,compilers | 7 | 19 | 118 | 360 | 52 | 6 | 4 |
eludadev/ui-buttons | 🚀️ We're on Product Hunt! If you want me to keep making amazing free resources for you, I would really appreciate your feedback and support on my Product Hunt launch! 🤗️ 🤖️ To See Code, Click on One of The Links | Preview | Link | Description |
| --- | --- | --- |
| | Basic | CSS Button that changes color on click or hover. |
| | Inverted Triangles | CSS Button slides its two inverted triangles to the middle on click or hover. |
| | Line Slide | CSS Button that slides its pseudo-element underline on hover or click. |
| | Don't Cross The Line | CSS Button that crosses over itself and expands on hover or click. |
| | Slicer And Marquee | CSS Button that slices its background and cycles its content vertically on click or hover. |
| | Zoom In And Text Rotate | CSS Button that slides two inward-pointing pseudo-element triangles to the center on hover or click. |
| | Alternate Blocks And Text Flip | CSS Button that slides its four alternate blocks and flips its text vertically on click or hover. |
| | Slide Right | CSS Button with background that slides right on click or hover. |
| | Tilted diagonal | CSS Button that slides its increasingly tilted diagonal to the right on click or hover. |
| | In And Out | CSS Button that slides its background to the right on click or hover and more to the right again on click or hover. |
| | Bubble Right | CSS Button that slides its circular background to the right on click or hover. |
| | Marquee Sign | CSS Button that moves copies of its text horizontally and at an angle on click or hover. |
| | ShapeShifter | CSS Button that morphs one side of its border into a triangle pseudo-element on click or hover. |
| | Click To Fill | CSS Button with background that fills it up vertically on click. |
| | Double ShapeShifter | CSS Button that morphs both sides into a triangle pseudo-element on click or hover. |
| | X ShapeShifter | CSS Button that morphs into an X using pseudo-elements on click or hover. |
| | Fold Middle | CSS Button that folds from the middle using CSS 3D Transforms on hover or click. |
| | Fold One Side | CSS Button that folds from one side using CSS 3D Transforms on hover or click. |
| | Arrow Slide + Text Rotate | CSS Button that slides its triangular background from the left to the right and rotates its text on hover or click. |
| | Slide Down | CSS Button with backgrounds that slides down on click or hover. |
| | Bubble Up + Text Rotate | CSS Button that slides its bubbly radial background to the bottom and rotates its text on hover or click. |
| | OverFold | CSS Button that moves one corner from the bottom right to the top left to reveal its background on hover or click. |
| | Focus In | CSS Button that focuses its border in on hover or click. |
| | Cover Over | CSS Button that has a pseudo-element background going over it and out on hover or click. |
| | Enlarge | CSS Button that fills up its background radially from the center and scales up on hover or click. |
| | Slanted | CSS Button that tilts its background from the top left corner on hover or click. |
| | Split Reveal | CSS Button that reveals new text by splitting it horizontally from the center on hover or click. |
| | Split Reveal Alternate | CSS Button that reveals new text by splitting it alternately from the center on hover or click. |
| | Split Reveal Horizontal | CSS Button that reveals new text by splitting it horizontally from the center on hover or click. |
| | Slide Reveal | CSS Button that reveals new text by sliding it to the right on hover or click. |
| | Diagonal Swipe | CSS Button that slides its diagonal background to the right on click or hover. |
| | Slide Reveal + Text Down | CSS Button that reveals new text by sliding it to the right and sliding the old text down on hover or click. |
| | Pill Shrink | CSS Button that scales its pill-like background down on hover or click. |
| | Pill Halo | CSS Button that focuses in its pill-like border on hover or click. |
| | Glow Button | CSS Button that has a moving and glowing border on hover or click. |
| | Rotate Reveal | CSS Button that reveals new text by rotating it in from the bottom left on hover or click. |
| | Double Slide Down | CSS Button that slides its two backgrounds successively to the bottom on hover or click. |
| | Double Slide Right | CSS Button that slides its two backgrounds successively to the right on hover or click. |
| | 3D Rotate Down | CSS Button that rotates down using 3D Transforms on hover or click. |
| | 3D Rotate Right | CSS Button that rotates right using 3D Transforms on hover or click. |
| | 3D Rotate Left | CSS Button that rotates left using 3D Transforms on hover or click. |
| | 3D Rotate Down | CSS Button that rotates up using 3D Transforms on hover or click. |
| | Rush Triangle | CSS Button that slides its triangular background to the right on click or hover. |
| | 3D Float | CSS Button that has a large box shadow and that tilts down using 3D Transforms on hover or click. |
| | 3D Button Click | CSS Button that pushes itself down in 3D space on hover or click. |
| | Striped Zebra | CSS Button with striped background that scrolls on click or hover. |
| | Letter Dance | CSS Button that slides its characters down successively one after the other on hover or click. |
| | Letter Dance 2 | CSS Button that slides its characters up and down alternately on hover or click. |
| | 3D Button 2 | CSS Button that simulates 3D using html elements and that pushes down on hover or click. |
| | Rainbow Fill | CSS Button that reveals its fun rainbow gradient background sitting inside of its rainbow gradient image border on hover or click. |
| | Pulse | CSS Button that pulsates on hover or click. |
| | Offset | CSS Button that moves its background back into-place on hover or click. |
| | Backdrop Blur | CSS Button that overlays a blurry layer on its background on hover or click. |
| | Tada | CSS Button that plays the TADA animation from animate.css on hover or click. |
| | Double Horizontal | CSS Button that slides its two backgrounds horizontally to the middle on click or hover. |
| | Jello | CSS Button that plays the jello animation from animate.css on hover or click. |
| | Alternate Blocks | CSS Button with four blocks on alternate sides that move to the center on click or hover. |
| | Rubberband | CSS Button that plays the rubberband animation from animate.css on hover or click. |
| | Wobble | CSS Button that plays the wobble animation from animate.css on hover or click. |
| | Head Shake | CSS Button that plays the head-shake animation from animate.css on hover or click. |
| | Heart Beat | CSS Button that plays the heart-beat animation from animate.css on hover or click. |
| | Flash | CSS Button that plays the flash animation from animate.css on hover or click. |
| | Text Slide | CSS Button that slides a copy of its text vertically with another color on hover or click. |
| | Border Snake | CSS Button that has borders that fill-up one after another on hover or click. |
| | Snakes Alternate | CSS Button that has borders filling up from the parallel sides on hover or click. |
| | Snakes Meet | CSS Button that has borders filling up to meet at 2 points on hover or click. |
| | Double Vertical | CSS Button with two backgrounds that slide vertically to the center on click or hover. |
| | Quadruple Corners | CSS Button with 4 corners that all converge to the middle on click or hover. |
| | Snakes Center | CSS Button that has borders filling up from the center on hover or click. |
| | Material Ripple | CSS Button that fills up its background radially from the center then fades out on hover or click. |
| | Neumorphism 1 | CSS Button that has a fluffy shadow that moves to the inside on hover or click. |
| | Neumorphism 2 | CSS Button that has a fluffy shadow and text with a 3D effect using text shadows and that moves to the inside on hover or click. |
| | Neumorphism 3 | CSS Button that has a fluffy shadow that smoothly moves to the inside on hover or click. |
| | Neumorphism 4 | CSS Button that moves down on hover or click. |
| | Neon | CSS Button that has borders filling up slowly then revealing a large neon shadow on hover or click. |
| | I Want Attention | CSS Button that keeps pulsing on hover or click. |
| | Hug | CSS Button that moves its background from the outside to the inside on hover or click. |
| | Hug 2 | CSS Button that moves its background closer from the outside to the inside on hover or click. |
| | Float Up | CSS Button that floats up with a box shadow below it on click or hover. |
| | Double Diagonal | CSS Button that slides its two diagonal backgrounds horizontally to the center on click or hover. |
| | Progress Fill Right | CSS Button that has a background that slowly fills up with a progress animation on hover or click. |
| | Progress Fill Up | CSS Button that has a background that slowly fills up vertically on hover or click. |
| | Progress Shrink Vertical | CSS Button that shrinks into a progress-bar vertically on hover or click. |
| | 3D Progress | CSS Button that tilts in 3D space to reveal a horizontal progress-bar on hover or click. |
| | Elastic Progress | CSS Button that shrinks into a horizontal progress-bar in a smooth and elastic animation on hover or click. |
| | Letter Dance 3 | CSS Button that double-fills its background and plays an elastic animation with its characters on hover or click. |
| | Circular Charge | CSS Button that has a circular border that is clipped and fills up then fills up the background on hover or click. |
| | Icon Pulse | CSS Button that scales its background like a pulse on hover or click. |
| | Slicer | CSS Button that slices its background in half on click or hover. |
| | Icon Slide | CSS Button that slides vertically inside its borders on hover or click. |
| | Double Triangle | CSS Button that slides its two triangular backgrounds horizontally to the center on click or hover. |
| | Gooey | CSS Button that moves two circles closer to each other that have a gooey and slimy effect on hover or click. |
| | Seizure Glitch | CSS Button that plays an RGB split animation on hover or click. |
| | HandDrawn 1 | CSS Button that has borders mimicking hand-drawn edges on hover or click. |
| | HandDrawn 2 | CSS Button that has borders mimicking hand-drawn edges and floats up on hover or click. |
| | Icon Zoom | CSS Button that scales down inside its borders on hover or click. |
| | Icon Focus | CSS Button that has a border scaling in on it on hover or click. |
| | Progress Fold | CSS Button that paper-folds one side to reveal a progress-bar on hover or click. |
| | Sandwish | CSS Button that moves up many shadows successively on hover or click. | | 100 Modern CSS Buttons. Every style that you can imagine. | css,frontend,angular,awesome,awesome-list,design,design-system,framework,free,npm | 0 | 2 | 32 | 19 | 1 | 1 | 1 |
loov/lensm | lensm A tool for viewing assembly and source. Install with the usual Go commands: go install loov.dev/lensm@main For Linux you may need to add some additional dependencies . You can use go install --tags nowayland loov.dev/lensm@main or go install --tags nox11 loov.dev/lensm@main respectively to skip building Wayland or X11 version. To run the program provide a regular expression filter for the
function you want to inspect. -watch allows you to automatically
reload the executable and information when it changes. lensm -watch -filter Fibonacci lensm Note: The program requires a binary that is built on your computer, otherwise the source code for the functions cannot be loaded. Why? I wrote a blog post at https://www.storj.io/blog/lensm on why and how the core functionality works. | Go assembly and source viewer | [] | 1 | 4 | 7 | 87 | 6 | 3 | 0 |
EmergeTools/Pow | Pow Delightful SwiftUI effects for your app. Check out other open source projects from Emerge Tools Installation To add a package dependency to your Xcode project, select File > Add Package and enter this repository's URL (https://github.com/EmergeTools/Pow). To add a package dependency to Swift Package, add this repository to your list of dependencies. swift
.package(url: "https://github.com/EmergeTools/Pow", from: Version(1, 0, 0)) And to your target as a product: swift
.product(name: "Pow", package: "Pow") If you are moving from the previously closed source Pow framework to the new open source package, please refer to our Transition Guide . If you have any problems please file an issue . Overview Pow features a selection of SwiftUI transitions as well as Change Effects that trigger every time a value is updated. You can find previews of all effects on the Pow website . If you have an iOS Developer Environment, you can check out the Pow Example App . Feedback & Contribution This project provides multiple forms of delivering feedback to maintainers. If you are figuring out how to use about Pow or one of it's effects we ask that you first consult the effects examples page . If you still have a question, enhancement, or a way to improve Pow, this project leverages GitHub's Issues to manage your requests. If you find a bug and wish to report it, an issue would be greatly appreciated. Requirements iOS 15.0+ macOS 12.0 Mac Catalyst 15.0+ visionOS beta 6 (requires Xcode 15.1 beta 3) Change Effects Change Effects are effects that will trigger a visual or haptic every time a value changes. Use the changeEffect modifier and pass in an AnyChangeEffect as well as a value to watch for changes. swift
Button {
post.toggleLike()
} label: {
Label(post.likes.formatted(), systemImage: "heart.fill")
}
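// The spray fires each time `post.likes` changes, but only while
// `isEnabled` is true, so un-liking does not emit hearts. (`heart` is a
// view assumed to be defined elsewhere in this sample.)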
.changeEffect(.spray { heart }, value: post.likes, isEnabled: post.isLiked)
.tint(post.isLiked ? .red : .gray) You can choose from the following Change Effects: Spray , Haptic Feedback , Jump , Ping , Rise , Shake , Shine , and Spin . Spray Preview An effect that emits multiple particles in different shades and sizes moving up from the origin point. swift
likeButton
.changeEffect(
.spray(origin: .center) { Image(systemName: "heart.fill") },
value: likes
) Parameters: origin : The origin of the particles. layer : The ParticleLayer on which to render the effect, default is local . particles : The particles to emit. swift
static func spray(origin: UnitPoint = .center, layer: ParticleLayer = .local, @ViewBuilder _ particles: () -> some View) -> AnyChangeEffect Haptic Feedback Triggers haptic feedback to communicate successes, failures, and warnings whenever a value changes. notification : The feedback type to trigger. swift
static func feedback(hapticNotification type: UINotificationFeedbackGenerator.FeedbackType) -> AnyChangeEffect Triggers haptic feedback to simulate physical impacts whenever a value changes. impact : The feedback style to trigger. swift
static func feedback(hapticImpact style: UIImpactFeedbackGenerator.FeedbackStyle) -> AnyChangeEffect Triggers haptic feedback to indicate a change in selection whenever a value changes. swift
static var feedbackHapticSelection: AnyChangeEffect Jump Preview Makes the view jump the given height and then bounces a few times before settling. height : The height of the jump. swift
static func jump(height: CGFloat) -> AnyChangeEffect Ping Preview Adds one or more shapes that slowly grow and fade-out behind the view. The shape will be colored by the current tint style. Parameters: shape : The shape to use for the effect. count : The number of shapes to emit. swift
static func ping(shape: some InsettableShape, count: Int) -> AnyChangeEffect An effect that adds one or more shapes that slowly grow and fade-out behind the view. Parameters: shape : The shape to use for the effect. style : The style to use for the effect. count : The number of shapes to emit. swift
static func ping(shape: some InsettableShape, style: some ShapeStyle, count: Int) -> AnyChangeEffect Rise Preview An effect that emits the provided particles from the origin point and slowly float up while moving side to side. Parameters: origin : The origin of the particle. layer : The ParticleLayer on which to render the effect, default is local . particles : The particles to emit. swift
static func rise(origin: UnitPoint = .center, layer: ParticleLayer = .local, @ViewBuilder _ particles: () -> some View) -> AnyChangeEffect Shake Preview Shakes the view when a change happens. swift
static var shake: AnyChangeEffect An effect that shakes the view when a change happens. rate : The rate of the shake. swift
static func shake(rate: ShakeRate) -> AnyChangeEffect Shine Preview Highlights the view with a shine moving over the view. The shine moves from the top leading edge to bottom trailing edge. swift
static var shine: AnyChangeEffect Highlights the view with a shine moving over the view. The shine moves from the top leading edge to bottom trailing edge. swift
static func shine(duration: Double) -> AnyChangeEffect Highlights the view with a shine moving over the view. The angle is relative to the current layoutDirection , such that 0° represents sweeping towards the trailing edge and 90° represents sweeping towards the bottom edge. Parameters: angle : The angle of the animation. duration : The duration of the animation. swift
static func shine(angle: Angle, duration: Double = 1.0) -> AnyChangeEffect Sound Effect Feedback Triggers a sound effect as feedback whenever a value changes. This effect will not interrupt or duck any other audio that may be currently playing. This effect is not guaranteed to be triggered; the effect running depends on the user's silent switch position and the current playback device. To relay important information to the user, you should always accompany audio effects with visual cues. soundEffect : The sound effect to trigger. swift
static func feedback(_ soundEffect: SoundEffect) -> AnyChangeEffect Spin Preview Spins the view around the given axis when a change happens. swift
static var spin: AnyChangeEffect Spins the view around the given axis when a change happens. Parameters: axis: The x, y and z elements that specify the axis of rotation. anchor: The location with a default of center that defines a point in 3D space about which the rotation is anchored. anchorZ: The location with a default of 0 that defines a point in 3D space about which the rotation is anchored. perspective: The relative vanishing point with a default of 1 / 6 for this rotation. swift
static func spin(axis: (x: CGFloat, y: CGFloat, z: CGFloat), anchor: UnitPoint = .center, anchorZ: CGFloat = 0, perspective: CGFloat = 1 / 6) -> AnyChangeEffect Delay Every change effect can be delayed to trigger the effect after some time. swift
Button("Submit") {
<#code#>
}
.buttonStyle(.borderedProminent)
.disabled(name.isEmpty)
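// With .delay(1), the shine plays one second after `name` becomes
// non-empty, drawing attention to the now-enabled button.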
.changeEffect(.shine.delay(1), value: name.isEmpty, isEnabled: !name.isEmpty) Parameters: delay : The delay in seconds. swift
func delay(_ delay: Double) -> AnyChangeEffect Particle Layer A particle layer is a context in which particle effects draw their particles. The particleLayer(name:) view modifier wraps the view in a particle layer with the given name. Particle effects such as AnyChangeEffect.spray can render their particles on this position in the view tree to avoid being clipped by their immediate ancestor. For example, certain List styles may clip their rows. Use particleLayer(_:) to render particles on top of the entire List or even its enclosing NavigationStack . swift
func particleLayer(name: AnyHashable) -> some View Transitions All transitions are namespaced under the movingParts static variable, e.g. swift
myView.transition(.movingParts.anvil) Anvil Preview A transition that drops the view down from the top with matching haptic feedback. The transition is only performed on insertion and takes 1.4 seconds. swift
static var anvil: AnyTransition Blinds Preview A transition that reveals the view as if it was behind window blinds. swift
static var blinds: AnyTransition A transition that reveals the view as if it was behind window blinds. Parameters:
- slatWidth : The width of each slat.
- style : The style of blinds, either .venetian or .vertical .
- isStaggered : Whether all slats open at the same time or in sequence. swift
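// A hedged usage sketch (view and parameter values are illustrative):
// banner.transition(.movingParts.blinds(slatWidth: 20, isStaggered: true))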
static func blinds(slatWidth: CGFloat, style: BlindsStyle = .venetian, isStaggered: Bool = false) -> AnyTransition Blur Preview A transition from blurry to sharp on insertion, and from sharp to blurry
Blur Preview A transition from blurry to sharp on insertion, and from sharp to blurry on removal. swift
static var blur: AnyTransition Boing Preview A transition that moves the view down with any overshoot resulting in an
elastic deformation of the view. swift
static var boing: AnyTransition A transition that moves the view from the specified edge on insertion, and towards it on removal, with any overshoot resulting in an elastic deformation of the view. swift
static func boing(edge: Edge) -> AnyTransition
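For example, a banner might drop in from the top edge; BannerView and showBanner are placeholder names: swift
if showBanner {
    BannerView()
        // Enter from the top with an elastic overshoot.
        .transition(.movingParts.boing(edge: .top))
}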
Clock Preview A transition using a clockwise sweep around the centerpoint of the view. swift
static var clock: AnyTransition

A transition using a clockwise sweep around the centerpoint of the view. Parameter blurRadius : The radius of the blur applied to the mask. swift
static func clock(blurRadius: CGFloat) -> AnyTransition Flicker Preview A transition that toggles the visibility of the view multiple times
before settling. swift
static var flicker: AnyTransition A transition that toggles the visibility of the view multiple times
before settling. Parameter count : The number of times the visibility is toggled. swift
static func flicker(count: Int) -> AnyTransition
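A short sketch, with WarningLabel and showWarning as placeholder names: swift
if showWarning {
    WarningLabel()
        // Toggle visibility three times before settling.
        .transition(.movingParts.flicker(count: 3))
}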
Film Exposure Preview A transition from completely dark to fully visible on insertion, and from fully visible to completely dark on removal. swift
static var filmExposure: AnyTransition Flip Preview A transition that inserts by rotating the view towards the viewer, and
removes by rotating the view away from the viewer. Note: Any overshoot of the animation will result in the view continuing the rotation past the view's normal state before eventually settling. swift
static var flip: AnyTransition Glare Preview A transition that shows the view by combining a diagonal wipe with a
white streak. swift
static var glare: AnyTransition A transition that shows the view by combining a wipe with a colored
streak. The angle is relative to the current layoutDirection , such that 0°
represents sweeping towards the trailing edge on insertion and 90°
represents sweeping towards the bottom edge. In this example, the removal of the view uses a glare with an
exponential ease-in curve, combined with an anticipating scale animation,
making for a more dramatic exit. swift
infoBox
  .transition(
    .asymmetric(
      insertion: .movingParts.glare(angle: .degrees(225)),
      removal: .movingParts.glare(angle: .degrees(45))
        .animation(.movingParts.easeInExponential(duration: 0.9))
        .combined(with:
          .scale(scale: 1.4)
            .animation(.movingParts.anticipate(duration: 0.9).delay(0.1))
        )
    )
  ) Parameters: angle : The angle of the wipe. color : The color of the glare effect. swift
static func glare(angle: Angle, color: Color = .white) -> AnyTransition Iris Preview A transition that takes the shape of a growing circle when inserting,
and a shrinking circle when removing. Parameters: origin : The center point of the circle as it grows or shrinks. blurRadius : The radius of the blur applied to the mask. swift
static func iris(origin: UnitPoint = .center, blurRadius: CGFloat = 0) -> AnyTransition
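For example, revealing a photo from its center with a softened mask edge; PhotoView and showPhoto are placeholder names: swift
if showPhoto {
    PhotoView()
        // Grow a circular reveal from the center, with a blurred rim.
        .transition(.movingParts.iris(origin: .center, blurRadius: 8))
}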
Move Preview A transition that moves the view from the specified edge on insertion and towards it on removal. swift
static func move(edge: Edge) -> AnyTransition A transition that moves the view at the specified angle. The angle is relative to the current layoutDirection , such that 0° represents animating towards the trailing edge on insertion and 90° represents inserting towards the bottom edge. In this example, the view insertion is animated by moving it towards the top trailing corner and the removal is animated by moving it towards the bottom edge. swift
Text("Hello")
.transition(
.asymmetric(
insertion: .movingParts.move(angle: .degrees(45)),
removal: .movingParts.move(angle: .degrees(90))
)
) Parameter angle : The direction of the animation. swift
static func move(angle: Angle) -> AnyTransition Pop Preview A transition that shows a view with a ripple effect and a flurry of
tint-colored particles. The transition is only performed on insertion and takes 1.2 seconds. swift
static var pop: AnyTransition A transition that shows a view with a ripple effect and a flurry of
colored particles. In this example, the star uses the pop effect only when transitioning
from starred == false to starred == true : swift
Button {
starred.toggle()
} label: {
if starred {
Image(systemName: "star.fill")
.foregroundStyle(.orange)
.transition(.movingParts.pop(.orange))
} else {
Image(systemName: "star")
.foregroundStyle(.gray)
.transition(.identity)
}
} The transition is only performed on insertion. Parameter style : The style to use for the effect. swift
static func pop<S: ShapeStyle>(_ style: S) -> AnyTransition Poof Preview A transition that removes the view in a dissolving cartoon style cloud. The transition is only performed on removal and takes 0.4 seconds. swift
static var poof: AnyTransition Rotate3D A transition that inserts by rotating from the specified rotation, and
removes by rotating to the specified rotation in three dimensions. In this example, the view is rotated 90˚ about the x axis around
its bottom edge as if it was rising from lying on its back face: swift
Text("Hello")
.transition(.movingParts.rotate3D(
.degrees(90),
axis: (1, 0, 0),
anchor: .bottom,
perspective: 1.0 / 6.0)
) Note: Any overshoot of the animation will result in the view continuing the rotation past the view's normal state before eventually settling. Parameters: angle : The angle from which to rotate the view. axis : The x, y and z elements that specify the axis of rotation. anchor : The location with a default of center that defines a point
in 3D space about which the rotation is anchored. anchorZ : The location with a default of 0 that defines a point in 3D
space about which the rotation is anchored. perspective : The relative vanishing point with a default of 1 for
this rotation. swift
static func rotate3D(_ angle: Angle, axis: (x: CGFloat, y: CGFloat, z: CGFloat), anchor: UnitPoint = .center, anchorZ: CGFloat = 0, perspective: CGFloat = 1) -> AnyTransition Snapshot Preview A transition from completely bright to fully visible on insertion, and
from fully visible to completely bright on removal. swift
static var snapshot: AnyTransition Skid Preview A transition that moves the view in from its leading edge with any
overshoot resulting in an elastic deformation of the view. swift
static var skid: AnyTransition A transition that moves the view in from the specified edge during
insertion and towards it during removal with any overshoot resulting
in an elastic deformation of the view. Parameter direction : The direction of the transition. swift
static func skid(direction: SkidDirection) -> AnyTransition
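A usage sketch, assuming SkidDirection exposes a .leading case (check the library for the exact cases); ToastView and showToast are placeholder names: swift
if showToast {
    ToastView()
        // Slide in from the leading edge with elastic deformation.
        .transition(.movingParts.skid(direction: .leading))
}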
Swoosh Preview A three-dimensional transition from the back towards the front during insertion and from the front towards the back during removal. swift
static var swoosh: AnyTransition Vanish Preview A transition that dissolves the view into many small particles. The transition is only performed on removal. Note: This transition will use an ease-out animation with a duration of 900ms if the current Animation is .default . swift
static var vanish: AnyTransition A transition that dissolves the view into many small particles. The transition is only performed on removal. Note: This transition will use an ease-out animation with a duration of 900ms if the current Animation is .default . Parameter style : The style to use for the particles. swift
static func vanish<S: ShapeStyle>(_ style: S) -> AnyTransition A transition that dissolves the view into many small particles following a given shape. The transition is only performed on removal. Note: This transition will use an ease-out animation with a duration of 900ms if the current Animation is .default . Parameter style : The style to use for the particles. Parameter mask : The mask that determines where particles should be placed. Parameter eoFill : A Boolean that indicates whether the shape is interpreted with the even-odd winding number rule. swift
static func vanish<T: ShapeStyle, S: Shape>(_ style: T, mask: S, eoFill: Bool = false) -> AnyTransition
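For example, dissolving a view into pink particles constrained to a circular mask; StickerView and showSticker are placeholder names: swift
if showSticker {
    StickerView()
        // Particles are emitted only where the circular mask covers the view.
        .transition(.movingParts.vanish(.pink, mask: Circle()))
}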
Wipe Preview A transition using a sweep from the specified edge on insertion, and towards it on removal. Parameters: edge : The edge at which the sweep starts or ends. blurRadius : The radius of the blur applied to the mask. swift
static func wipe(edge: Edge, blurRadius: CGFloat = 0) -> AnyTransition A transition using a sweep at the specified angle. The angle is relative to the current layoutDirection , such that 0°
represents sweeping towards the trailing edge on insertion and 90°
represents sweeping towards the bottom edge. Parameters: angle : The angle of the animation. blurRadius : The radius of the blur applied to the mask. swift
static func wipe(angle: Angle, blurRadius: CGFloat = 0) -> AnyTransition | Delightful SwiftUI effects for your app | ios,swiftui,animation,swift,transitions,effects,particles | 14 | 12 | 31 | 51 | 12 | 2 | 1 |
grafana/oncall | Grafana OnCall Developer-friendly incident response with brilliant Slack integration. Android & iOS : Collect and analyze alerts from multiple monitoring systems On-call rotations based on schedules Automatic escalations Phone calls, SMS, Slack, Telegram notifications Getting Started We prepared multiple environments: production developer hobby (described in the following steps) Download docker-compose.yml : bash
curl -fsSL https://raw.githubusercontent.com/grafana/oncall/dev/docker-compose.yml -o docker-compose.yml Set variables: bash
echo "DOMAIN=http://localhost:8080
# Remove 'with_grafana' below if you want to use existing grafana
# Add 'with_prometheus' below to optionally enable a local prometheus for oncall metrics
# e.g. COMPOSE_PROFILES=with_grafana,with_prometheus
COMPOSE_PROFILES=with_grafana
# to setup an auth token for prometheus exporter metrics:
# PROMETHEUS_EXPORTER_SECRET=my_random_prometheus_secret
# also, make sure to enable the /metrics endpoint:
# FEATURE_PROMETHEUS_EXPORTER_ENABLED=True
SECRET_KEY=my_random_secret_must_be_more_than_32_characters_long" > .env (Optional) If you want to enable/setup the prometheus metrics exporter
(besides the changes above), create a prometheus.yml file (replacing my_random_prometheus_secret accordingly), next to your docker-compose.yml : ```bash
echo "global:
scrape_interval: 15s
evaluation_interval: 15s scrape_configs:
- job_name: prometheus
metrics_path: /metrics/
authorization:
credentials: my_random_prometheus_secret
static_configs:
- targets: [\"host.docker.internal:8080\"]" > prometheus.yml
``` NOTE: you will need to setup a Prometheus datasource using http://prometheus:9090 as the URL in the Grafana UI. Launch services: bash
docker-compose pull && docker-compose up -d Go to OnCall Plugin Configuration , using log in credentials
as defined above: admin / admin (or find OnCall plugin in configuration->plugins) and connect OnCall plugin with OnCall backend : text
OnCall backend URL: http://engine:8080 Enjoy! Check our OSS docs if you want to set up
Slack, Telegram, Twilio or SMS/calls through Grafana Cloud. Update version To update your Grafana OnCall hobby environment: ```shell Update Docker image docker-compose pull engine Re-deploy docker-compose up -d
``` After updating the engine, you'll also need to click the "Update" button on the plugin version page .
See Grafana docs for more
info on updating Grafana plugins. Join community Have a question, comment or feedback? Don't be afraid to open an issue ! Stargazers over time Further Reading Automated migration from other on-call tools - Migrator Documentation - Grafana OnCall Overview Webinar - YouTube How To Add Integration - How to Add Integration Blog Post - Announcing Grafana OnCall, the easiest way to do on-call management Presentation - Deep dive into the Grafana, Prometheus, and Alertmanager stack for alerting and on-call management | Developer-friendly incident response with brilliant Slack integration | alert,alerting,oncall,oncall-schedule,slack,telegram,grafana | 282 | 967 | 3,084 | 4,165 | 420 | 33 | 18 |
CrowdDotDev/crowd.dev | Effortlessly centralize community, product, and customer data 🌐 Cloud version (beta) · 📖 Docs · ❤️ Discord · 📣 Newsletter · 🗺️ Roadmap Table of Contents About crowd.dev Features Getting started Roadmap Stay up-to-date Contribution License Security Book a call About crowd.dev crowd.dev is the Developer Data Platform (DDP) that allows companies to centralize all touch points developers have with their product and brand, whether in the community (e.g., Stack Overflow or Reddit), product (open-source or SaaS), or commercial channels (e.g., HubSpot). The platform pulls data from various sources, normalizes it, matches identities across platforms, and enriches it with third-party data. The result is a unified 360-degree view of the developers who engage with your product and community, the companies they work for, and their position in their personal customer journey. crowd.dev is open-source, built with developers in mind, available for both hosted and self-hosted deployments, open to extensions, and offers full control over your data. To our users : - You can get actively involved, contribute to our roadmap, and turn crowd.dev into the tool you've always wanted.
- We are open about what we are building, allowing you to take a look inside, and ensuring that we handle your data in a privacy-preserving way.
- Our interests as a company are aligned with yours, and we need to ensure that we always deliver enough value to you with our commercial offering in relation to our pricing. To our developer community: - You can self-host crowd.dev to centralize data for your community or company while keeping full control over your data.
- Our product is built for extensibility. If you can think of any use cases that you want to build with the data we collect and store for you, please go ahead and build them! We will be here to help out if you need us.
- You can actively contribute to crowd.dev (e.g. integrations), and we will be supporting you along the journey. Just take a look at our Contributing guide . Features Plug & play integrations to tie all relevant platforms - like GitHub, Discord, Slack, or LinkedIn - together. ( all integrations ) Identity resolution & automated segmentation to effortlessly understand activities and profiles across platforms. Opinionated analytics & reports on topics like product-market-fit and open-source community activity to further inform your GTM strategy. Workflows automation with webhooks. 2-way CRM sync & Slack alerts to get notified about intent events in real-time. [cloud only] User enrichment with 25+ attributes, including emails, social profiles, work experience, and technical skills. [cloud only] Organization enrichment with 50+ attributes, including industry, headcount, and revenue. [cloud only] Sentiment analysis and conversation detection to stay on top of what's going on in your open-source community. [cloud only] Eagle Eye : Monitor dev-focused community platforms to find relevant content to engage with, helping you to gain developers’ mindshare and grow your community organically [cloud only] Getting started Cloud version Our cloud version is a fast, easy, and free way to get started with crowd.dev. Self-hosted version To get started with self-hosting, take a look at our self-hosting docs . Integrations We currently support all our integrations for self-hosting. For each one of them, you will need to create your own application. Development environment Requirements Node v16.16.0 Docker and docker-compose Getting started Get the mono repo from GitHub shell
git clone git@github.com:CrowdDotDev/crowd.dev.git Run the start script shell
cd scripts
./cli start For hot reloading, you can run shell
cd scripts
./cli clean-start-dev This app will be available at http://localhost:8081 For more information on development, you can check our docs . Roadmap You can find more features on our public roadmap . Feel free to also open an issue for anything you're missing. Stay up-to-date crowd.dev is still in beta and we ship new features every week. To stay in the loop, leave us a star and subscribe to our monthly newsletter . Thanks a lot! ❤️ Contribution There are many ways you can contribute to crowd.dev! Here are a few options: Star this repo Create issues every time you feel something is missing or goes wrong Upvote issues with 👍 reaction so we know what's the demand for a particular issue to prioritize it within the roadmap If you would like to contribute to the development of the project, please refer to our Contributing guide . All contributions are highly appreciated. 🙏 License Distributed under the Apache 2.0 License. See LICENSE for more information. Our self-hosted version can be run and deployed by default under the permissive Apache 2.0 license. All premium components will be hidden and inactive with the default configuration. You can run, deploy, and contribute to the app without fearing a violation of the premium license. Check out the premium self-hosted features docs to know more about the premium self-hosted features. Security We take security very seriously. If you come across any security vulnerabilities, please disclose them by sending an email to security@crowd.dev. We appreciate your help in making our platform as secure as possible and are committed to working with you to resolve any issues quickly and efficiently. Book a call Schedule a call with a crowd.dev team member to learn more about our product and ensure you get the most out of it. | ⚡️ The developer data platform to centralize community, product, and customer data | community-led-growth,devrel,developer-advocacy,community,developer-marketing,developer-relations,developer-led-growth,cdp,customer-data-platform,analytics | 51 | 46 | 2,043 | 2,214 | 137 | 181 | 49 |
loro-dev/loro | Loro Reimagine state management with CRDTs 🦜 Make your app state synchronized and collaborative effortlessly. Documentation | Getting Started | Rust Doc https://github.com/loro-dev/loro/assets/18425020/fe246c47-a120-44b3-91d4-1e7232a5b4ac ⚠️ Notice : The current API and encoding schema of Loro are experimental and subject to change . You should not use it in production. Loro is a CRDT (Conflict-free Replicated Data Type) library that makes building local-first apps easier. It is currently available for JavaScript (via WASM) and Rust developers. Explore our vision in our blog: ✨ Reimagine State Management with CRDTs . Features Basic Features Provided by CRDTs P2P Synchronization Automatic Merging Local Availability Scalability Delta Updates Supported CRDT Algorithms 📝 Text Editing with Fugue 📙 Peritext-like Rich Text CRDT 🌲 Moveable Tree 🚗 Moveable List 🗺️ Last-Write-Wins Map 🔄 Replayable Event Graph Advanced Features in Loro 📖 Preserve Editing History in a Replayable Event Graph ⏱️ Fast Time Travel Through History https://github.com/loro-dev/loro/assets/18425020/ec2d20a3-3d8c-4483-a601-b200243c9792 Example ```ts
import { expect, test } from 'vitest';
import { Loro, LoroList } from 'loro-crdt'; /** Demonstrates synchronization of two documents with two rounds of exchanges. */
// Initialize document A
const docA = new Loro();
const listA: LoroList = docA.getList('list');
listA.insert(0, 'A');
listA.insert(1, 'B');
listA.insert(2, 'C'); // Export the state of document A as a byte array
const bytes: Uint8Array = docA.exportFrom(); // Simulate sending bytes across the network to another peer, B
const docB = new Loro();
// Peer B imports the updates from A
docB.import(bytes); // Verify that B's state matches A's state
expect(docB.toJSON()).toStrictEqual({
list: ['A', 'B', 'C'],
}); // Get the current operation log version of document B
const version = docB.oplogVersion(); // Simulate editing at B: delete item 'B'
const listB: LoroList = docB.getList('list');
listB.delete(1, 1); // Export the updates from B since the last synchronization point
const bytesB: Uint8Array = docB.exportFrom(version); // Simulate sending bytesB back across the network to A
// A imports the updates from B
docA.import(bytesB); // Verify that the list at A now matches the list at B after merging
expect(docA.toJSON()).toStrictEqual({
list: ['A', 'C'],
});
``` Credits Loro draws inspiration from the innovative work of the following projects and individuals: Ink & Switch : The principles of Local-first Software have greatly influenced this project. The Peritext project has also shaped our approach to rich text CRDTs. Diamond-types : The Replayable Event Graph (REG) algorithm from @josephg has been adapted to reduce the computation and space usage of CRDTs. Automerge : Their use of columnar encoding for CRDTs has informed our strategies for efficient data encoding. Yjs : We have incorporated a similar algorithm for effectively merging collaborative editing operations, thanks to their pioneering works. Matthew Weidner : His work on the Fugue algorithm has been invaluable, enhancing our text editing capabilities. Martin Kleppmann : His work on CRDTs has significantly influenced our comprehension of the field. | Reimagine state management with CRDTs. Make your app collaborative effortlessly. | crdt,local-first,offline-first,p2p,rich-text,collaborative-editing,privacy-first | 28 | 11 | 297 | 1,338 | 15 | 13 | 2 |
WilliamStar007/ClashX-V2Ray-TopFreeProxy | ClashX Setup Tutorial Tutorial for setting up a ClashX proxy with free subscription links. Feel free to submit an Issue or make a Pull Request ! For V2Ray users, see V2Ray Setup Tutorial . For a tutorial in Chinese, see 中文版教程 . The original Clash repo has been deleted. Table of Contents Installation Subscription Links Setup Credits Disclaimer Installation Clash · Salute Dreamacro. * Download Clash for Windows from the CFW page.
* Download ClashX from WannaFlix page.
* Download ClashX Pro with enhanced mode and native Apple Silicon support at 糖糖のWIKI . Subscription Links NodeFree: https://nodefree.org/dy/2024/06/20240623.yaml ★ Mfuu: https://raw.githubusercontent.com/mfuu/v2ray/master/clash.yaml Anaer: https://raw.githubusercontent.com/anaer/Sub/main/clash.yaml ★ Ermaozi: https://raw.githubusercontent.com/ermaozi/get_subscribe/main/subscribe/clash.yml ★ Learnhard-cn: https://cdn.jsdelivr.net/gh/vxiaov/free_proxies@main/clash/clash.provider.yaml OpenRunner: https://freenode.openrunner.net/uploads/20240617-clash.yaml ★ Xrayfree: https://tt.vg/freeclash Free Node Pool * Zu1k: https://github.com/zu1k/proxypool/releases ★ Setup Open ClashX Click ClashX icon in the status bar Click Config and then Remote Config Click Manage and then Add Paste a Subscription Link to the url field OK!! (manually select a node if necessary) https://user-images.githubusercontent.com/89805831/179545223-69177f8e-5f2d-4bd3-ba27-b68018418e5a.mp4 Credits NodeFree (https://nodefree.org) Mfuu (https://github.com/mfuu/v2ray) Anaer (https://github.com/anaer/Sub) Ermaozi (https://github.com/ermaozi/get_subscribe) Learnhard-cn (https://github.com/vxiaov/free_proxies) OpenRunner (https://github.com/openRunner/clash-freenode) Xrayfree (https://github.com/xrayfree/free-ssr-ss-v2ray-vpn-clash) Zu1k (https://github.com/zu1k/proxypool) Disclaimer This project is meant for personal and educational uses only. Please follow relevant laws and regulations when using this project. Project owner is not responsible or liable in any manner for the use of the content. | Top free VPN (ClashX & V2Ray proxy) with subscription links. [免费VPN、免费梯子、免费科学上网、免费订阅链接、免费节点、精选、ClashX & V2Ray 教程] | clash,clashx,clash-for-windows,free-node,proxy,proxy-pool,tutorial,free-proxy,free-vpn,vpn | 0 | 2 | 1 | 27 | 0 | 1 | 3 |
farm-fe/farm | Extremely fast Vite-compatible web building tool written in Rust English | 简体中文 Intro Farm is an extremely fast Vite-compatible web-building tool written in Rust. It's designed to be fast, powerful, and consistent, and aims to provide the best experience for web development, making it a real next-generation build tool. Online experience Why Farm? See Why Farm for details. In short, tools like webpack are too slow, but new tools like Vite are not perfect; Vite has a lot of drawbacks when it comes to a large project: A huge number of requests during development : when there are hundreds or thousands of modules per page, loading performance is severely degraded; it may take seconds or more to refresh the page. Inconsistency between development and production : Using different strategies and tools in development and production is really inconsistent, and it's hard to debug online issues. Inflexible Code Splitting : It's hard to control the output of your bundles. Farm can solve these problems perfectly, and it's really fast because it's written in Rust. Farm aims to be fast, consistent, and flexible: the real next-generation build tool. Features [!NOTE] Since Farm v0.13, Vite plugins can be used directly in Farm. Refer to Using vite plugins in Farm Since Farm v0.14, persistent disk cache is enabled by default. Refer to Incremental Building Now Farm is 1.0 stable and production ready! See Farm official website to get started. ⚡ Extremely Fast : Written in Rust, start a React / Vue project in milliseconds and perform an HMR update within 20ms for most situations. ⚡ Incremental Building : Supports persistent cache, with module-level cache enabled by default; any module won't be compiled twice until it's changed! 🧰 Fully Pluggable and Vite Compatible : Everything inside Farm is powered by plugins, and Vite plugins are supported out of the box. Supports Farm compilation plugins (both Rust and JavaScript plugins, and SWC plugins), Farm runtime plugins, and Farm server plugins. ⚙️ Powerful : Compiles JS/TS/JSX/TSX, CSS, CSS Modules, HTML, and static assets out of the box. Supports official compilation plugins for popular frameworks/tools like React, Vue, SolidJS, Sass, Less, PostCSS, and so on. ⏱️ Lazy Compilation : Dynamically imported resources are compiled only when requested, speeding up compilation for large-scale projects. Just write a dynamic import and the imported module won't be compiled until it is executed. 📦 Partial Bundling : Bundles your project into a few reasonable bundles automatically, speeding up resource loading without losing caching granularity. Refer to RFC-003 Partial Bundling for details. 🔒 Consistency : What you see in development will be the same as what you get in production. 🌳 Compatibility : Supports both legacy (ES5) and modern browsers. Farm has implemented all features of a web build tool, including production optimizations like tree shaking and minification. It's now 1.0 stable. We have already migrated enterprise projects to Farm, and it works great! See RFC-001 Architecture for design motivation and architecture. Getting Started Create a new Farm project (supports both React and Vue) with your favorite package manager: ```bash with npm npm create farm@latest with yarn yarn create farm@latest with pnpm pnpm create farm@latest
``` Visit Farm Documentation to learn more about Farm. Benchmark Farm is much faster than similar tools, 20x faster than webpack and 10x faster than Vite in the benchmark. See Benchmark for details. Contribution See Contributing Guide . Chat With Us Author Twitter , Official Twitter With Discord Wechat group QQ group Contributors Credits Thanks to: The SWC project created by @kdy1 , which powers Farm's code parsing, transformation and minification. The NAPI-RS project created by @Brooooooklyn , which powers Farm's node-binding implementation. The Rollup project created by @lukastaegert , which inspired Farm's plugin system implementation. The Vite project created by Evan You , which inspired Farm's ecosystem compatibility design. Author & Maintainer Author: brightwu (吴明亮), who worked at ByteDance. Twitter Maintainer: ErKeLost shulandmimi | Extremely fast Vite-compatible web build tool written in Rust | build-tool,compiler,rust,hmr,frontend,farm,bundler,typescript,vite | 674 | 33 | 997 | 928 | 70 | 53 | 8